In a multi-database environment, how can I move all the data from one database to another?
My settings look like this:
DATABASES = {
    'default': {},
    'users': {
        'NAME': 'user_data',
        'ENGINE': 'django.db.backends.sqlite3',
        'USER': 'user',
        'PASSWORD': 'superS3cret'
    },
    'customers': {
        'NAME': 'customer_data',
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'USER': 'cust',
        'PASSWORD': 'veryPriv@ate'
    }
}
Is there a simple solution or app to do this? A one-way migration would be enough, but since there are dozens of models and objects with complex foreign key and many-to-many relations, iterating over the models and saving each object into the other database does not solve my problem.
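For reference, the kind of per-object copying I mean is roughly the following sketch; the model handling is simplified and the function name is purely illustrative, and it glosses over insertion order, many-to-many rows and signal side effects, which is exactly where it breaks down:
# Hypothetical sketch of the naive per-object copy, not a working solution.
# It ignores insertion order, many-to-many rows and signal side effects.
from django.apps import apps

def copy_all(source_alias, target_alias):
    for model in apps.get_models():
        for obj in model.objects.using(source_alias).all():
            # Re-saves the row with the same primary key into the target database
            obj.save(using=target_alias)

copy_all('users', 'customers')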
loaddata fails too, because object creation uses signals to create other objects, and this messes up unique keys.
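(For what it is worth, loaddata passes raw=True to post_save while deserializing a fixture, so a handler could in principle opt out of fixture loads. A minimal sketch with a hypothetical app, model and handler:)
# Sketch only: a post_save handler that skips fixture loading by checking
# the raw flag that loaddata passes along. App and model names are made up.
from django.db.models.signals import post_save
from django.dispatch import receiver
from myapp.models import Profile  # hypothetical model

@receiver(post_save, sender=Profile)
def create_related_objects(sender, instance, created, raw=False, **kwargs):
    if raw:
        # loaddata is deserializing a fixture; do not create extra objects
        return
    if created:
        pass  # normal signal logic that creates the related objects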
Even using SQL dumps is not trivial, since the backends are different. I have tried this, but I could not manage to produce dumps from my sqlite3 database that psql accepts without messing up the foreign keys.
So I need a solution that loads data from one database into another, does not send signals, and handles foreign keys as well. But I could not find anything that does this.
I finally managed to solve this.
I created the tables using
manage.py syncdb --all
manage.py migrate --fake
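The dumps loaded below can be produced with dumpdata against the source database alias; a sketch, where the alias, format and file name are illustrative rather than taken from my actual setup:
# Sketch only: dump everything from the 'users' alias into a JSON fixture.
from django.core.management import call_command

with open('users_dump.json', 'w') as fixture:
    call_command('dumpdata', database='users', format='json', stdout=fixture)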
Loading the dumps with loaddata did not work, due to integrity problems (broken foreign key constraints, for example).
So I had to alter my tables: I removed the broken constraints, and then I could run loaddata without any problems.
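Dropping a constraint before the load can be done from the psql shell or through Django's database connection; a sketch, with purely illustrative table and constraint names:
# Sketch only: drop an offending foreign key constraint on the target
# database before running loaddata. Names here are illustrative.
from django.db import connections

cursor = connections['customers'].cursor()
cursor.execute(
    "ALTER TABLE customer_order DROP CONSTRAINT customer_order_account_id_fkey;"
)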
However, my database was of course still broken, since I had dropped a couple of constraints. Therefore, I simply had to alter my tables again, using the psql shell.
pgAdmin came in handy, since it shows the relevant queries in a user-friendly interface, separately for each constraint.
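Re-adding a constraint afterwards is the mirror image of dropping it. Again, the table, column and constraint names below are illustrative; the real statements are the ones pgAdmin shows for each constraint:
# Sketch only: restore a previously dropped foreign key constraint after
# the data has been loaded. Names here are illustrative.
from django.db import connections

cursor = connections['customers'].cursor()
cursor.execute(
    "ALTER TABLE customer_order "
    "ADD CONSTRAINT customer_order_account_id_fkey "
    "FOREIGN KEY (account_id) REFERENCES customer_account (id);"
)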