This tip covers archiving, deleting, or updating large volumes of data in a single table while the database remains in use by other users.
Imagine a scenario where you are required to delete financial data older than seven years, but the table contains both current and historical data that users access frequently.
The problem with large DELETE transactions is that the DBMS can lock the table, blocking read requests until the transaction commits. A lack of suitable indexes or partitions can also make a single DELETE transaction take a very long time.
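For contrast, the straightforward approach might look like the single statement below (using the same [Sales].[SalesOrderDetail] table and seven-year cutoff as the sample that follows). Every qualifying row is removed in one transaction, so locks are held until the whole statement commits:

-- single large DELETE: all qualifying rows are removed in one transaction,
-- so locks (possibly escalated to the table level) are held until it commits
delete from [Sales].[SalesOrderDetail]
where ModifiedDate < dateadd(yy, -7, getutcdate());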
The Nibbling Delete construct resolves this issue by breaking one large DELETE transaction into many smaller DELETE transactions with a short delay in between, allowing read transactions to continue. This assumes that the remaining records awaiting deletion are not accidentally read in the meantime.
I’ve added a time check to the sample code for situations where you don’t want processing to continue past a specific time, such as the start of business hours. This check is entirely optional for your situation.
The same algorithm can also be adapted to perform a nibbling INSERT or nibbling UPDATE; a sketch of a nibbling UPDATE follows the main example below.
declare @batchsize int, @numYearsToRetain int, @rowsDeleted int
declare @cutoffTime varchar(10), @currentTime varchar(10)

set @batchsize = 100000
set @numYearsToRetain = -7
set @cutoffTime = '08:00:00'

while (1 = 1)
begin
    -- delete one batch of rows older than the retention cutoff
    delete top (@batchsize)
    from [Sales].[SalesOrderDetail]
    where ModifiedDate < dateadd(yy, @numYearsToRetain, getutcdate());

    -- capture the count right away: any subsequent statement resets @@rowcount
    set @rowsDeleted = @@rowcount;

    select @currentTime = convert(varchar(8), getutcdate(), 8);

    /* use the time check as a failsafe if you need to stop processing
       during a critical time */
    if @rowsDeleted < @batchsize or @currentTime >= @cutoffTime
    begin
        break;
    end

    -- short pause so concurrent read transactions can get through
    waitfor delay '00:00:00.010';
end
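As a rough sketch of the nibbling UPDATE variant mentioned above (the table dbo.Accounts, its IsArchived flag, and its ModifiedDate column are hypothetical, used only for illustration): the loop structure is the same, but the WHERE clause must exclude rows already updated so that each batch makes progress.

declare @batchsize int, @rowsUpdated int
set @batchsize = 100000

while (1 = 1)
begin
    -- update one batch; the IsArchived = 0 predicate excludes rows already
    -- processed, so every pass works on a fresh batch (hypothetical schema)
    update top (@batchsize) dbo.Accounts
    set IsArchived = 1
    where IsArchived = 0
      and ModifiedDate < dateadd(yy, -7, getutcdate());

    set @rowsUpdated = @@rowcount;

    if @rowsUpdated < @batchsize
    begin
        break;
    end

    waitfor delay '00:00:00.010';
end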