I'm encountering a significant performance difference when using an IF NOT EXISTS subquery within a larger query. The subquery executes quickly when run independently, taking approximately 2 seconds. However, when embedded within the larger query, the overall execution time increases dramatically to 4-5 minutes.
I've ensured that appropriate indexes are defined on the relevant tables, including the temporal table nd.tblReqMatSum. Note that tmp.##mytable is a global temporary table with no indexes, but it is almost always very small. Despite this, the performance issue persists.
Here's the problematic query:
DECLARE @result int = 0;

IF NOT EXISTS (
    SELECT 1
    FROM tmp.##mytable t
    INNER JOIN nd.tblReqMatSum rms ON t.matnr = rms.matnr
    WHERE rms.isActive = 1
)
BEGIN
    SET @result = 1;
END
This is the execution plan:
You say:

"Note that tmp.##mytable is a global temporary table with no indexes, but it is almost always very small."
But it still has 1000 rows, and a naive nested loops join (scanning the whole temp table once per outer row) means 589796 * 1000 rows being read.
Put an index on that temp table.
CREATE CLUSTERED INDEX CS ON ##material268194 ([material number]);
UPDATE STATISTICS ##material268194 WITH FULLSCAN;
Preferably make this index unique if you can.
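If the temp table is created by your own code, you can also declare the index inline so it exists from the moment the table is populated. This is a sketch only: the column name matnr and the varchar(18) type are assumptions taken from the query above, so match them to the real definition of nd.tblReqMatSum.

```sql
-- Hypothetical sketch: create the global temp table with a primary key,
-- which gives you a unique clustered index on the join column for free.
-- Column name and type are assumptions; match them to nd.tblReqMatSum.
CREATE TABLE ##mytable
(
    matnr varchar(18) NOT NULL PRIMARY KEY CLUSTERED
);
```

A unique index additionally tells the optimizer there is at most one match per outer row, which tends to improve both the cardinality estimates and the choice of join operator.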
You also have a type mismatch between the two join columns: one is nvarchar, the other varchar. They should be the same; otherwise you get an implicit conversion, which skews cardinality estimates and therefore often affects plan shape.
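The simplest fix is usually to change the temp table's column to match the permanent table, rather than casting in the query. The following is a sketch assuming the temp table holds the nvarchar column and nd.tblReqMatSum.matnr is varchar(18); check the actual definitions before applying it.

```sql
-- Hypothetical sketch: align the temp table's column type with the
-- permanent table's so the join compares like for like, with no
-- implicit conversion. The varchar(18) length is an assumption.
ALTER TABLE ##mytable ALTER COLUMN matnr varchar(18) NOT NULL;
```

Fixing the temp-table side is preferable to wrapping the permanent table's column in a CAST, since an expression on that column would prevent efficient index seeks against nd.tblReqMatSum.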