Tags: postgresql, bucardo

Postgres Sequence out of sync


I'm running a multi-master setup with Bucardo and Postgres.

I'm finding that some of my table sequences are getting out of sync with each other, particularly the auto-incremented id columns.

example:

db1 - distributors

INSERT INTO distributors (did, dname) VALUES (DEFAULT, 'XYZ Widgets');

The id of the new row is 1

db2 - distributors

INSERT INTO distributors (did, dname) VALUES (DEFAULT, 'XYZ Widgets');

The id of the new row is 1

The id of the new row on db2 should be 2, because Bucardo has replicated the row from db1, but db2's auto-increment is still based on:

nextval('distributors_did_seq'::regclass)

And if we check "distributors_did_seq" on db2, we see its last value is still 0, because replication never advanced the sequence.
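
You can see the mismatch by comparing the replicated data with the sequence state on db2. A minimal sketch, assuming the sequence behind did is named distributors_did_seq:

    -- On db2: the replicated rows are present...
    SELECT MAX(did) FROM distributors;
    -- ...but the sequence backing the default was never advanced by replication
    SELECT last_value, is_called FROM distributors_did_seq;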

phew... Make sense?

Anyway, can I do any of the following?

  1. Replicate the sequences themselves with Bucardo, so each DB's sequence stays in sync?

  2. Change the default auto-increment expression above to take the maximum existing id in the table into account? (Sketched just after this list.)
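
For option 2, a minimal sketch of what that could look like, assuming a hypothetical helper function (the name next_distributor_id is my own, and a MAX()-based default is not safe under concurrent inserts):

    -- Hypothetical helper: derive the next id from the data itself.
    -- Not concurrency-safe: two simultaneous inserts can compute the same id.
    CREATE OR REPLACE FUNCTION next_distributor_id() RETURNS integer AS $$
        SELECT COALESCE(MAX(did), 0) + 1 FROM distributors;
    $$ LANGUAGE sql;

    -- Replace the serial column's nextval() default with the helper
    ALTER TABLE distributors
        ALTER COLUMN did SET DEFAULT next_distributor_id();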

If you have any better ideas, please feel free to throw them in. If you have questions, just ask. Thanks for any help.


Solution

  • You are going to have to change your id generation method, because there is no Bucardo solution, according to this entry in the FAQ:

    Can Bucardo replicate DDL?

    No, Bucardo relies on triggers, and Postgres does not yet provide DDL triggers or triggers on its system tables.

    Since Bucardo uses triggers, it cannot "see" sequence changes, only the data in the tables it replicates. Sequences do not support triggers, but you can update them manually. I suppose you could run something like the code below before the INSERT, but there might still be issues.

    SELECT setval('distributors_did_seq', (SELECT MAX(did) FROM distributors));
    

    See this question for more information.

    I am not fully up on all the issues involved, but you could perform the maximum calculation manually and do the insert in a retry loop. I doubt it will work if you are actually doing inserts on both DBs and letting Bucardo replicate, but if you can guarantee that only one DB accepts writes at a time, then you could try something like an UPSERT retry loop. See this post for more info. The "guts" of the loop might look like this:

    INSERT INTO distributors (did, dname)
        VALUES ((SELECT MAX(did) + 1 FROM distributors), 'XYZ Widgets');
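
    To flesh that out, here is a minimal sketch of the retry loop as an anonymous PL/pgSQL block. It assumes distributors.did has a PRIMARY KEY or UNIQUE constraint, so a colliding id raises unique_violation; treat it as a sketch rather than a fix for the replication race described above.

    DO $$
    BEGIN
        LOOP
            BEGIN
                INSERT INTO distributors (did, dname)
                    VALUES ((SELECT COALESCE(MAX(did), 0) + 1 FROM distributors),
                            'XYZ Widgets');
                EXIT;  -- insert succeeded, leave the loop
            EXCEPTION WHEN unique_violation THEN
                NULL;  -- another insert took that id; recompute MAX(did) and retry
            END;
        END LOOP;
    END
    $$;

    The COALESCE handles an empty table, and the inner BEGIN/EXCEPTION block acts as a savepoint, so a failed attempt is rolled back and the loop tries again with a fresh MAX(did).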