At work we need to prepare SQL scripts for deployment. They are executed manually by database admins on SQL Server, using SSMS.
To simplify things, let's say we prepare those files by concatenating the atomic changes from all deployed tasks' scripts. So we've got tasks A, B, C, D to deploy, and each of them has its own .sql file. Ideally we would execute each file in a transaction, so that when an error occurs, none of the changes from that script are committed.
In theory, the script could look like this:
BEGIN TRANSACTION
UPDATE dbo.Test
SET SomeColumn = 12;
ALTER TABLE dbo.OtherTest ADD NewCol bit;
ALTER VIEW dbo.vSomeView AS
SELECT SomeCol
FROM dbo.SomeTbl;
COMMIT TRANSACTION
However, SQL Server has some limitations (for example, ALTER VIEW must be the first statement in a batch), so we separate the statements with GO. But then the transaction doesn't work as expected: when an error occurs, the statements before it are not committed, but everything after it runs outside the transaction and leaves the database in a dirty state.
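For illustration, the GO-separated version of the script above (same hypothetical table and view names):
BEGIN TRANSACTION;
GO
UPDATE dbo.Test
SET SomeColumn = 12;
GO
ALTER TABLE dbo.OtherTest ADD NewCol bit;
GO
-- ALTER VIEW has to start its own batch, hence the surrounding GO's
ALTER VIEW dbo.vSomeView AS
SELECT SomeCol
FROM dbo.SomeTbl;
GO
COMMIT TRANSACTION;
GO
If one of the batches fails and the transaction is rolled back, SSMS still sends the remaining batches, which execute outside any transaction and autocommit.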
We tried adding SET XACT_ABORT ON, BEGIN TRY-BEGIN CATCH, etc., but weren't able to achieve the main goal: the one-file script should either execute completely or be rolled back entirely.
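A sketch of the kind of variant we tried, with the same hypothetical objects as above:
SET XACT_ABORT ON;  -- roll back the whole transaction on any run-time error
GO
BEGIN TRANSACTION;
GO
UPDATE dbo.Test
SET SomeColumn = 12;
GO
ALTER TABLE dbo.OtherTest ADD NewCol bit;
GO
COMMIT TRANSACTION;
GO
SET XACT_ABORT ON does roll back the open transaction when a run-time error occurs, but it doesn't stop the client from sending the batches after the failing one; those still execute, now outside any transaction, and autocommit.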
Is it possible to do this in SQL Server at all?
As someone wrote in an already deleted comment, :on error exit in SQLCMD mode is a good solution for our case. We can instruct the db admins to turn on SQLCMD mode in SSMS (Query -> SQLCMD Mode) and put :on error exit at the top of the script, and then everything we need works fine:
- GO takes effect,
- transactions are handled well (when an error occurs, the whole script is stopped, not only the current batch separated by GO's),
- COMMIT on the main transaction is the last statement, so if the script is stopped before it, nothing gets committed to the database.
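Putting it together, a minimal sketch of the final deployment script, reusing the hypothetical objects from the question:
-- SQLCMD directive (requires SQLCMD mode): stop the whole script on the first error
:on error exit
BEGIN TRANSACTION;
GO
UPDATE dbo.Test
SET SomeColumn = 12;
GO
ALTER TABLE dbo.OtherTest ADD NewCol bit;
GO
ALTER VIEW dbo.vSomeView AS
SELECT SomeCol
FROM dbo.SomeTbl;
GO
-- reached only if every batch above succeeded
COMMIT TRANSACTION;
GO
If any batch fails, execution stops before the final COMMIT, so the open transaction is never committed, and SQL Server rolls it back when the connection is closed.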