I have a huge table (about 100 million rows) in SQL Server and want to change the datatype of a column from varchar to nvarchar, like this:
ALTER TABLE my_table
ALTER COLUMN comment NVARCHAR(250);
The problem is that my transaction log grows until the hard disk is full.
I have thought about doing the change in some kind of batched transaction, but ALTER COLUMN is a single statement, so I don't see how to split it up.
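The closest I have come is this sketch: add a new nullable column, copy the values over in small batches, then drop the old column and rename the new one. The batch size is arbitrary, and I'm not sure this is the right way to do it:

ALTER TABLE my_table ADD comment_new NVARCHAR(250) NULL;

DECLARE @rows INT = 1;
WHILE @rows > 0
BEGIN
    -- each UPDATE commits on its own, so the log space can be
    -- reused between batches (under the SIMPLE recovery model)
    UPDATE TOP (100000) my_table
    SET comment_new = comment
    WHERE comment_new IS NULL AND comment IS NOT NULL;
    SET @rows = @@ROWCOUNT;
END;

ALTER TABLE my_table DROP COLUMN comment;
EXEC sp_rename 'my_table.comment_new', 'comment', 'COLUMN';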
Alternatively, would it work to go through a new table that is created with the new datatype?
CREATE TABLE my_table_new (comment NVARCHAR(250));
INSERT INTO my_table_new
SELECT *
FROM my_table;
-- or
SELECT comment
INTO my_table_new
FROM my_table;
DROP TABLE my_table;
EXEC sp_rename 'my_table_new', 'my_table'; -- give the new table the old name
What is the best approach to do this?
Any idea how I can avoid the extreme growth of the transaction log file?
You might find it faster to copy the data out to a temporary table, truncate the original, alter the now-empty column, and then re-insert:
select t.*
into temp_my_table
from my_table t;

truncate table my_table; -- back it up first!

alter table my_table alter column comment nvarchar(max); -- no need to be cheap here

insert into my_table
select t.*
from temp_my_table t;
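The select ... into step is minimally logged under the SIMPLE or BULK_LOGGED recovery model, but the insert back is fully logged. If the log is still the limiting factor, you can move the rows back in batches so no single transaction gets large. A rough sketch, assuming the table really has just the one comment column (the batch size is arbitrary):

declare @rows int = 1;
while @rows > 0
begin
    -- move 100k rows per transaction; under SIMPLE recovery the
    -- log space is freed up again after each checkpoint
    delete top (100000) from temp_my_table
    output deleted.comment into my_table (comment);
    set @rows = @@rowcount;
end;

Note that output ... into cannot target a table with enabled triggers or foreign keys, which ties into the caveat below.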
One caveat: you may need extra handling if the table has insert triggers or an identity column.
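For example, something along these lines would handle both; the id column name is my assumption about your schema:

alter table my_table disable trigger all;
set identity_insert my_table on;

-- identity_insert requires an explicit column list
insert into my_table (id, comment)
select t.id, t.comment
from temp_my_table t;

set identity_insert my_table off;
alter table my_table enable trigger all;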