I have a table with ~20 million rows which currently occupies about 30 GB. There is an additional 30 GB free on this database.
I have one column that was imported as text but needs to be converted to jsonb. It's pretty simple: each value is an array with up to 100 elements.
Below is an example of converting just one of these strings:
select to_jsonb('[45,45,45,55]'::json);
And then on the entire column:
ALTER TABLE pointstable ALTER COLUMN aimjoined TYPE jsonb USING to_jsonb(aimjoined::json);
Running this progresses for about 30 minutes and then fails; I can watch disk usage climb to nearly 100% before the failure.
Is there another way to change this column to jsonb that won't overrun my disk space? Perhaps in batches or another method altogether?
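For example, would something like this batched approach work? This is only a sketch: it assumes the table has a primary key id, and the aimjoined_jsonb column name is made up.

ALTER TABLE pointstable ADD COLUMN aimjoined_jsonb jsonb;

-- Repeat until no rows are updated, vacuuming between batches so dead
-- row versions can be reused instead of growing the table:
UPDATE pointstable
SET aimjoined_jsonb = to_jsonb(aimjoined::json)
WHERE id IN (SELECT id FROM pointstable
             WHERE aimjoined_jsonb IS NULL LIMIT 100000);
VACUUM pointstable;

-- Then swap the columns:
ALTER TABLE pointstable DROP COLUMN aimjoined;
ALTER TABLE pointstable RENAME COLUMN aimjoined_jsonb TO aimjoined;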
An ALTER ... TYPE like this rewrites the entire table, so the old and new copies exist side by side on disk until it commits. Drop any indexes on the table before altering it. During the ALTER, new indexes have to be built while the old indexes are still around, so dropping them first saves space twice: the old indexes' space becomes available for reuse, and the new indexes don't have to be built as part of the ALTER command itself. (Once the ALTER command commits, the old copy of the table goes away and its space is available for building the new indexes.)
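A minimal sketch of that sequence, using a hypothetical index name (list your actual indexes with \d pointstable in psql):

-- Hypothetical index name; drop each index on the table this way.
DROP INDEX pointstable_aimjoined_idx;

ALTER TABLE pointstable ALTER COLUMN aimjoined TYPE jsonb
  USING to_jsonb(aimjoined::json);

-- Recreate the indexes after the ALTER commits, once the old copy of
-- the table has been freed; a GIN index is the usual choice for jsonb.
CREATE INDEX pointstable_aimjoined_idx ON pointstable USING gin (aimjoined);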