I have a large data table. There are 10 million records in this table.
What is the best way to run this query?
Delete LargeTable where readTime < dateadd(MONTH,-7,GETDATE())
If you are deleting all the rows in the table, the simplest option is to truncate it, something like:
TRUNCATE TABLE LargeTable
GO
TRUNCATE TABLE simply empties the table: you cannot use a WHERE clause to limit the rows being deleted, and no triggers will be fired.
On the other hand, if you are deleting more than 80-90 percent of the data, say you have a total of 11 million rows and want to delete 10 million, another way is to insert the 1 million rows you want to keep into a staging table, truncate the large table, and then insert those 1 million rows back.
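A minimal sketch of that staging-table approach, assuming the table lives in the dbo schema, has no IDENTITY column, and has no foreign keys referencing it (otherwise you would need SET IDENTITY_INSERT and to deal with the constraints before truncating); dbo.LargeTable_Staging is a placeholder name:

-- Copy the rows you want to keep (the complement of the delete condition)
SELECT *
INTO dbo.LargeTable_Staging
FROM dbo.LargeTable
WHERE readTime >= DATEADD(MONTH, -7, GETDATE());

-- Empty the large table quickly with minimal logging
TRUNCATE TABLE dbo.LargeTable;

-- Put the kept rows back and clean up
INSERT INTO dbo.LargeTable
SELECT * FROM dbo.LargeTable_Staging;

DROP TABLE dbo.LargeTable_Staging;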
Or, if permissions, views, or other objects that have this large table as their underlying table are not affected by dropping it, you can copy this relatively small number of rows into another table, drop the large table, create a new table with the same schema, and import the rows back into this ex-large table.
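The drop-and-recreate variant differs from the sketch above only after the rows to keep have been copied out; the staging table name is again a placeholder:

DROP TABLE dbo.LargeTable;

-- Re-run the original CREATE TABLE script for dbo.LargeTable here
-- (including indexes and constraints), then reload the kept rows:
INSERT INTO dbo.LargeTable
SELECT * FROM dbo.LargeTable_Staging;

DROP TABLE dbo.LargeTable_Staging;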
One last option I can think of is to change your database's recovery model to SIMPLE and then delete rows in smaller batches using a WHILE loop, something like this:
DECLARE @Deleted_Rows INT;
SET @Deleted_Rows = 1;

WHILE (@Deleted_Rows > 0)
BEGIN
    -- Delete a small batch of rows at a time to keep the transaction log small
    DELETE TOP (10000) LargeTable
    WHERE readTime < DATEADD(MONTH, -7, GETDATE());

    SET @Deleted_Rows = @@ROWCOUNT;
END
And don't forget to change the recovery model back to FULL afterwards; I think you also have to take a full backup for the change of recovery models to be fully effective.
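A sketch of wrapping the batched delete in that recovery-model switch; YourDatabase and the backup path are placeholders:

ALTER DATABASE YourDatabase SET RECOVERY SIMPLE;
GO

-- ... run the batched DELETE loop shown above ...

ALTER DATABASE YourDatabase SET RECOVERY FULL;
GO

-- Switching back to FULL breaks the log backup chain,
-- so take a full backup to restart it
BACKUP DATABASE YourDatabase TO DISK = N'D:\Backups\YourDatabase.bak';
GO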