Wednesday, December 3, 2014

Writing a large update/delete script that doesn't cause lock contention


While running a T-SQL script that performed a massive number of deletes, I had the unpleasant experience of causing lock contention that blocked other connections from inserting data into some of the tables. I tried to remedy this by breaking the work into smaller transactions, hoping to avoid one overarching transaction and the locks it would hold:



-- Autocommit mode (this is the default); each explicit begin/commit pair
-- below is intended to be its own independent transaction.
set implicit_transactions off

declare @idsToDelete table (id bigint)
declare @currentId bigint
declare @deleteCursor cursor

-- Snapshot the ids to delete up front so the loop reads from the table
-- variable instead of holding a scan open on someTable.
begin transaction
    insert into @idsToDelete
    select someId from someTable with (nolock) where someCondition = 'some value'
commit

set @deleteCursor = cursor for select id from @idsToDelete
open @deleteCursor
fetch next from @deleteCursor into @currentId

while @@fetch_status = 0
begin
    -- One small transaction per id: delete the child rows, then the parent.
    begin transaction

        delete from relatedTable
        where id = @currentId

        delete from someTable
        where id = @currentId

    commit

    fetch next from @deleteCursor into @currentId
end

close @deleteCursor
deallocate @deleteCursor


My understanding is that set implicit_transactions off should keep the script from running under a single top-level transaction, so each delete would run, commit, and be done with. I could see progress, with sets of rows being deleted piecemeal, so I thought it was all good. In reality, however, inserts into this table were being blocked. Worse, when I noticed and cancelled the run, the whole thing rolled back, continuing to block while it did so!
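
One thing I have not ruled out (this is a diagnostic sketch on my part, not something from the script above) is that the client tool had already opened an outer transaction around the batch. In that case every begin transaction in the script only increments the nesting level and every commit only decrements it, so locks are held until the outermost transaction commits, and a cancel rolls everything back. Something like this at the top of the script would reveal it:

-- Diagnostic sketch: check whether the session is already inside a transaction.
-- Nested COMMITs only decrement @@trancount; nothing is really committed
-- (or released) until the outermost transaction commits.
if @@trancount > 0
    print 'Outer transaction already open, nesting level = '
        + cast(@@trancount as varchar(10))
else
    print 'No outer transaction: each begin/commit pair commits for real'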


Is there a way for me to get this thing to run as independent transactions and without locks (other than row locks, which are fine)? I need to write a big update script that could run into similar problems if I'm not careful.
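
One approach I am considering is to drop the cursor and delete in small top (n) batches, so that each transaction holds only a few hundred row locks and releases them at commit, staying well below the point where SQL Server tries to escalate row locks to a table lock. This is only a sketch: the table and column names are the same placeholders as above, and it assumes relatedTable.id references someTable.someId.

-- Batched delete sketch; placeholder names mirror the script above.
declare @batchSize int = 500
declare @rows int = 1

-- Phase 1: delete child rows in small committed batches.
while @rows > 0
begin
    begin transaction

    delete top (@batchSize) from relatedTable
    where id in (select someId from someTable
                 where someCondition = 'some value')

    set @rows = @@rowcount   -- capture the count before any later statement changes @@rowcount

    commit                   -- row locks taken by this batch are released here
end

-- Phase 2: delete the parent rows the same way once no children remain.
set @rows = 1
while @rows > 0
begin
    begin transaction

    delete top (@batchSize) from someTable
    where someCondition = 'some value'

    set @rows = @@rowcount

    commit
end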




