Tags: sql-server, t-sql, logging, database-trigger

How do I stop/change an SQL trigger after inserting a certain number of records?


I currently have a lot of triggers running on most of my tables. I'm using INSERT, UPDATE, and DELETE triggers on all of them, and they log into a separate table. However, the processing time of the software has increased because of this. It is barely noticeable for smaller changes, but for big changes it can go from 10-15 minutes to an hour.

I would like to change my triggers to stop inserting new log records after, say, 250 log records in one minute (a bulk action), delete the newly created logs, and create one record mentioning the bulk action and the query used. The problem is that I can't seem to get the trigger to stop once it has been activated.

I have already created the conditions needed for this:

CREATE TRIGGER AUDIT_LOGGING_INSERT_ACTUALISERINGSCOEFFICIENT ON ACTUALISERINGSCOEFFICIENT FOR INSERT AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @Group_ID INT = (SELECT COALESCE(MAX(Audit_Trail_Group_ID), 0) FROM NST.dbo.Audit_Trail) + 1;

    -- Count this user's log records from the last minute.
    -- Note: the original filter compared GETDATE() with itself and was always true;
    -- it should compare the log row's timestamp column (name assumed here).
    DECLARE @BulkCount INT = (SELECT COUNT(*)
                              FROM NST.dbo.Audit_Trail
                              WHERE Audit_Trail_User = CONCAT('7090-LOCAL-', UPPER(SUSER_SNAME()))
                                AND Audit_Trail_Date >= DATEADD(MINUTE, -1, GETDATE()));

    IF @BulkCount < 250
    BEGIN
        INSERT ...
    END
    ELSE
    BEGIN
        -- Check whether a BULK summary record was already written in the last hour.
        DECLARE @BulkRecordCount INT = (SELECT COUNT(*)
                                        FROM NST.dbo.Audit_Trail
                                        WHERE Audit_Trail_User = CONCAT('7090-LOCAL-', UPPER(SUSER_SNAME()))
                                          AND Audit_Trail_Date >= DATEADD(MINUTE, -60, GETDATE())
                                          AND Audit_Trail_Action LIKE '%BULK%');
        IF @BulkRecordCount = 0
        BEGIN
            INSERT ...
        END
    END
END

However, when I execute a query that changes 10,000-plus records, the trigger still inserts all 10,000 log rows, and when I execute it again right after, it inserts 10,000 BULK records. Is that because the trigger fires once for the whole statement, so the count check runs only once, before any of the new rows have been logged?
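Since a statement-level FOR INSERT trigger sees the whole batch at once in the `inserted` pseudo-table, one alternative is to branch on the size of the current batch rather than on previously logged rows. A minimal sketch (the `Audit_Trail` column names are assumptions for illustration):

```sql
-- Branch on the row count of the current statement, not on past log rows.
DECLARE @RowCnt INT = (SELECT COUNT(*) FROM inserted);

IF @RowCnt < 250
BEGIN
    -- Small change: log every affected row as usual.
    INSERT INTO NST.dbo.Audit_Trail (Audit_Trail_User, Audit_Trail_Action, Audit_Trail_Date)
    SELECT CONCAT('7090-LOCAL-', UPPER(SUSER_SNAME())), 'INSERT', GETDATE()
    FROM inserted;
END
ELSE
BEGIN
    -- Bulk change: write a single summary record for the whole statement.
    INSERT INTO NST.dbo.Audit_Trail (Audit_Trail_User, Audit_Trail_Action, Audit_Trail_Date)
    VALUES (CONCAT('7090-LOCAL-', UPPER(SUSER_SNAME())), 'BULK INSERT', GETDATE());
END
```

This avoids the race against the one-minute window entirely, because the decision is made per statement before anything is logged.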

Also, as you can see, this would only work if a single bulk operation occurs in the last 60 minutes.

Any ideas for handling bulk changes are welcome.


Solution

  • I didn't manage to get it working by logging only the first 250 records.

    Instead I did the following:

    1. Created a new table with 'Action' and 'User' columns
    2. I add a record every time a bulk action starts and delete it when it ends
    3. Changed the trigger so that, if a record is found for the user in the new table, it writes only one bulk record to the log table
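    Concretely, the marker-table approach might look like this (the table and column names here are assumptions, not the actual schema):

    ```sql
    -- Hypothetical marker table: one row per bulk action in progress.
    CREATE TABLE NST.dbo.Bulk_Action_In_Progress (
        [Action] NVARCHAR(255) NOT NULL,
        [User]   NVARCHAR(128) NOT NULL
    );

    -- Inside the trigger: if a bulk action is registered for this user,
    -- write a single BULK summary record instead of one row per change.
    IF EXISTS (SELECT 1
               FROM NST.dbo.Bulk_Action_In_Progress
               WHERE [User] = CONCAT('7090-LOCAL-', UPPER(SUSER_SNAME())))
    BEGIN
        INSERT INTO NST.dbo.Audit_Trail (Audit_Trail_User, Audit_Trail_Action, Audit_Trail_Date)
        VALUES (CONCAT('7090-LOCAL-', UPPER(SUSER_SNAME())), 'BULK UPDATE', GETDATE());
    END
    ELSE
    BEGIN
        INSERT ... -- normal per-row logging from the inserted pseudo-table
    END
    ```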

    Problems associated:

    1. The problem with this is that I also had to go through the biggest bulk functions manually and implement the add and delete.
    2. It is an extra point of failure: the marker record could get added, and then an exception could prevent it from being deleted again. -> Implemented a TRY...CATCH where needed.
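    The TRY...CATCH guard around a bulk operation could be sketched like this (the action name and marker table are illustrative assumptions):

    ```sql
    -- Register the bulk action, run it, and make sure the marker row
    -- is removed even if the bulk statement fails.
    DECLARE @User NVARCHAR(128) = CONCAT('7090-LOCAL-', UPPER(SUSER_SNAME()));

    INSERT INTO NST.dbo.Bulk_Action_In_Progress ([Action], [User])
    VALUES ('RECALCULATE_COEFFICIENTS', @User);  -- action name is illustrative

    BEGIN TRY
        UPDATE ACTUALISERINGSCOEFFICIENT SET ... ;  -- the actual bulk statement
    END TRY
    BEGIN CATCH
        -- Clean up the marker before re-raising, so a failed run
        -- doesn't leave the trigger permanently in "bulk" mode.
        DELETE FROM NST.dbo.Bulk_Action_In_Progress WHERE [User] = @User;
        THROW;
    END CATCH;

    DELETE FROM NST.dbo.Bulk_Action_In_Progress WHERE [User] = @User;
    ```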