ruby-on-rails, postgresql, rails-activerecord, upsert, bulkupdate

Upsert_all with uniqueness index constraint


I'm facing a dilemma.

I've created a model with a wave_order column and a unique DB index on it together with wavable. wavable is a polymorphic association (but that doesn't really matter here).

# migration
create_table :invitation_waves do |t|
  t.string :name, null: false
  t.references :wavable, polymorphic: true, index: true
  t.integer :wave_order, null: false
  t.timestamps
end
add_index :invitation_waves, [:wavable_id, :wavable_type, :wave_order], unique: true
# The first time, it works
InvitationWave.upsert_all(
  [{id: 1, name: "Wave1", wave_order: 0, wavable_id: my_object.id, wavable_type: my_object.class.name},
   {id: 2, name: "Wave2", wave_order: 1, wavable_id: my_object.id, wavable_type: my_object.class.name}],
  unique_by: :id, update_only: :wave_order, record_timestamps: true
)

# The second time, it fails because of the db constraint.
InvitationWave.upsert_all(
  [{id: 1, name: "Wave1", wave_order: 1, wavable_id: my_object.id, wavable_type: my_object.class.name},
   {id: 2, name: "Wave2", wave_order: 2, wavable_id: my_object.id, wavable_type: my_object.class.name}],
  unique_by: :id, update_only: :wave_order, record_timestamps: true
)

# => ActiveRecord::RecordNotUnique:
       PG::UniqueViolation: ERROR:  duplicate key value violates unique constraint "idx_on_wavable_id_wavable_type_wave_order_39aa14a122"
       DETAIL:  Key (wavable_id, wavable_type, wave_order)=(1, MyObject, 1) already exists.

Is there any solution other than these three:

  1. Remove the uniqueness from the db index
  2. Update each row manually
  3. Run upsert_all twice (the first time with temporary but available wave_orders, the second time with the correct wave_orders); see the sketch below
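
For what it's worth, here is a rough sketch of what option 3 would look like; the negative temporary wave_orders are just an assumption to guarantee they don't collide with existing non-negative ones:

# Sketch of option 3: two passes of upsert_all
rows = [
  {id: 1, name: "Wave1", wave_order: 1, wavable_id: my_object.id, wavable_type: my_object.class.name},
  {id: 2, name: "Wave2", wave_order: 2, wavable_id: my_object.id, wavable_type: my_object.class.name}
]

# First pass: temporary wave_orders that cannot collide (negative values, assuming real ones are >= 0)
InvitationWave.upsert_all(
  rows.map { |row| row.merge(wave_order: -row[:wave_order] - 1) },
  unique_by: :id, update_only: :wave_order, record_timestamps: true
)

# Second pass: the real wave_orders
InvitationWave.upsert_all(rows, unique_by: :id, update_only: :wave_order, record_timestamps: true)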

Solution

  • What about adding DEFERRABLE INITIALLY IMMEDIATE to your uniqueness constraint?

    In this case PostgreSQL checks uniqueness at the end of each statement instead of row by row.

    def up
      create_table :invitation_waves do |t|
        t.string :name, null: false
        t.references :wavable, polymorphic: true, index: true
        t.integer :wave_order, null: false
        t.timestamps
      end
    
      execute <<~SQL.squish
        ALTER TABLE invitation_waves
        ADD CONSTRAINT idx_on_wavable_id_wavable_type_wave_order
        UNIQUE (wavable_id, wavable_type, wave_order)
        DEFERRABLE INITIALLY IMMEDIATE
      SQL
    end
    
    def down
      drop_table :invitation_waves
    end
    

    In newer versions of Rails (7.1+) you can use the unique_constraint migration method instead:

    create_table :invitation_waves do |t|
      t.string :name, null: false
      t.references :wavable, polymorphic: true, index: true
      t.integer :wave_order, null: false
      t.timestamps
      t.unique_constraint %i[wavable_id wavable_type wave_order], deferrable: :immediate
    end
    

    or add_unique_constraint

    add_unique_constraint :invitation_waves, %i[wavable_id wavable_type wave_order], deferrable: :immediate
    

    No matter how you add this constraint, it will work like this:

    # First query (populate records)
    InvitationWave.upsert_all(
      [
        {id: 1, name: "Wave1", wave_order: 0, wavable_id: 1, wavable_type: "MyObject"},
        {id: 2, name: "Wave2", wave_order: 1, wavable_id: 1, wavable_type: "MyObject"}
      ],
      unique_by: :id, update_only: :wave_order, record_timestamps: true
    )
    
    # Second query (update records) that failed without DEFERRABLE, but now succeeds
    InvitationWave.upsert_all(
      [
        {id: 1, name: "Wave1", wave_order: 1, wavable_id: 1, wavable_type: "MyObject"},
        {id: 2, name: "Wave2", wave_order: 2, wavable_id: 1, wavable_type: "MyObject"}
      ],
      unique_by: :id, update_only: :wave_order, record_timestamps: true
    )
    

    Both queries succeed because the constraint is checked at the end of each statement rather than after every row.

    But when you try to execute a third query,

    # Will fail, because another record already has the same wave_order
    InvitationWave.find(2).update!(wave_order: 1)
    

    it will fail because of the UNIQUE constraint: even with DEFERRABLE INITIALLY IMMEDIATE, the check still runs at the end of that single-row statement, and record 1 still has wave_order 1 at that point.
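
    Because the constraint is DEFERRABLE, you could also postpone the check until COMMIT inside an explicit transaction with PostgreSQL's SET CONSTRAINTS. A minimal sketch, assuming the constraint name idx_on_wavable_id_wavable_type_wave_order from the migration above (the name will differ if it was added via unique_constraint without an explicit name):

    InvitationWave.transaction do
      # Defer the uniqueness check to COMMIT for this transaction only
      InvitationWave.connection.execute(
        "SET CONSTRAINTS idx_on_wavable_id_wavable_type_wave_order DEFERRED"
      )
      # Swap the wave_orders; the temporary duplicate inside the transaction is tolerated
      InvitationWave.find(2).update!(wave_order: 1)
      InvitationWave.find(1).update!(wave_order: 2)
    end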