I currently have a lot of triggers running on most of my tables: INSERT, UPDATE and DELETE triggers on each of them, all logging into a separate audit table. However, the processing time of the software has increased because of this. It is barely noticeable for small changes, but for big changes a job can go from 10-15 minutes to an hour.
I would like to change my triggers so that they stop inserting new log records after, say, 250 log records in one minute (a bulk action), delete the newly created logs, and create a single record mentioning the bulk action and the query used. The problem is that I can't get the trigger to stop once it has been activated.
I have already created the conditions needed for this:
CREATE TRIGGER AUDIT_LOGGING_INSERT_ACTUALISERINGSCOEFFICIENT ON ACTUALISERINGSCOEFFICIENT FOR INSERT AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @Group_ID INT = (SELECT COALESCE(MAX(Audit_Trail_Group_ID), 0) FROM NST.dbo.Audit_Trail) + 1;

    -- Count this user's log records from the last minute. The original predicate
    -- "GETDATE() >= DATEADD(MINUTE, -1, GETDATE())" is always true; it has to compare a
    -- timestamp column of Audit_Trail (assumed here to be named Audit_Trail_Date) instead.
    DECLARE @BulkCount INT = (SELECT COUNT(*) FROM NST.dbo.Audit_Trail
        WHERE Audit_Trail_User = CONCAT('7090-LOCAL-', UPPER(SUSER_SNAME()))
          AND Audit_Trail_Date >= DATEADD(MINUTE, -1, GETDATE()));

    IF @BulkCount < 250
    BEGIN
        INSERT ...
    END
    ELSE
    BEGIN
        -- Write a single BULK marker, but only if none exists for the last 60 minutes.
        DECLARE @BulkRecordCount INT = (SELECT COUNT(*) FROM NST.dbo.Audit_Trail
            WHERE Audit_Trail_User = CONCAT('7090-LOCAL-', UPPER(SUSER_SNAME()))
              AND Audit_Trail_Date >= DATEADD(MINUTE, -60, GETDATE())
              AND Audit_Trail_Action LIKE '%BULK%');

        IF @BulkRecordCount = 0
        BEGIN
            INSERT ...
        END
    END
END
However, when I execute a query that changes 10,000-plus records, the trigger still inserts all 10,000 log rows. When I execute it again right after, it inserts 10,000 BULK records. Probably because the first time it fires it goes through the function 10,000 times?
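Note that in SQL Server a FOR INSERT trigger fires once per statement, not once per row: the `inserted` pseudo-table holds every affected row, so a single `INSERT ... SELECT ... FROM inserted` in the trigger body writes all 10,000 log rows in one firing. That means the size of the change can be read directly from `inserted`, which avoids the time-window bookkeeping entirely. A minimal sketch of that approach; the audit column lists are assumptions (the real INSERT is elided above), only Audit_Trail_User and Audit_Trail_Action appear in the question:

```sql
CREATE TRIGGER AUDIT_LOGGING_INSERT_ACTUALISERINGSCOEFFICIENT
ON ACTUALISERINGSCOEFFICIENT FOR INSERT AS
BEGIN
    SET NOCOUNT ON;

    -- One trigger firing covers the whole statement, so this is the size of the change.
    DECLARE @RowCount INT = (SELECT COUNT(*) FROM inserted);

    IF @RowCount < 250
    BEGIN
        -- Small change: one audit row per inserted row.
        INSERT INTO NST.dbo.Audit_Trail (Audit_Trail_User, Audit_Trail_Action)
        SELECT CONCAT('7090-LOCAL-', UPPER(SUSER_SNAME())), 'INSERT'
        FROM inserted;
    END
    ELSE
    BEGIN
        -- Bulk change: a single marker row instead of @RowCount rows.
        INSERT INTO NST.dbo.Audit_Trail (Audit_Trail_User, Audit_Trail_Action)
        VALUES (CONCAT('7090-LOCAL-', UPPER(SUSER_SNAME())),
                CONCAT('BULK insert of ', @RowCount, ' rows'));
    END
END
```

With this shape, each statement either logs per row or logs exactly one BULK record, so there is nothing to delete afterwards and no 60-minute window to manage.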
Also, as you can see, this would only work if at most one bulk operation occurs in the last 60 minutes.
Any ideas for handling bulk changes are welcome.
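On recording "the query used" in the BULK record: inside a trigger, `sys.dm_exec_input_buffer` (available from SQL Server 2016 SP2, and requiring VIEW SERVER STATE permission) returns the batch text of the current session, which is usually close enough for an audit marker. A hedged sketch, reusing the Audit_Trail_User and Audit_Trail_Action columns from the question and assuming Audit_Trail_Action is wide enough to hold the text:

```sql
-- Assumption: this runs inside the trigger body, so @@SPID is the session
-- that fired the trigger. Requires VIEW SERVER STATE (SQL Server 2016 SP2+).
DECLARE @QueryText NVARCHAR(MAX) =
    (SELECT event_info FROM sys.dm_exec_input_buffer(@@SPID, NULL));

INSERT INTO NST.dbo.Audit_Trail (Audit_Trail_User, Audit_Trail_Action)
VALUES (CONCAT('7090-LOCAL-', UPPER(SUSER_SNAME())),
        CONCAT('BULK: ', @QueryText));
```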
I didn't get it to work by logging the first 250 records. Instead I did the following:
Problems associated: