python · sql · django · migration

How to programmatically generate the CREATE TABLE SQL statement for a given model in Django?


I need to programmatically generate the CREATE TABLE statement for a given unmanaged model (managed = False) in my Django app.

Since I'm working on a legacy database, I don't want to create a migration and use sqlmigrate.

The ./manage.py sql command was useful for this purpose, but it was removed in Django 1.8.

Do you know about any alternatives?


Solution

  • As suggested, I'm posting a complete answer for the case that the question seems to imply.

    Suppose you have an external DB table that you decided to access as a Django model, and have therefore described it as an unmanaged model (Meta: managed = False). Later you need to be able to create it in your code, e.g. for some tests using your local DB. Obviously, Django doesn't make migrations for unmanaged models and therefore won't create the table in your test DB. This can be solved with Django's own APIs, without resorting to raw SQL: SchemaEditor. See a more complete example below, but as a short answer you would use it like this:

       from django.db import connections
    
       with connections['db_to_create_a_table_in'].schema_editor() as schema_editor:
            schema_editor.create_model(YourUnmanagedModelClass)
    

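    If what you actually need is the CREATE TABLE statement as text (what ./manage.py sql used to print) rather than the table itself, the same schema editor can collect the SQL instead of executing it; this is essentially what sqlmigrate does under the hood. A minimal sketch, where the connection alias and model class are placeholders for your own:

       from django.db import connections

       connection = connections['db_to_create_a_table_in']
       # With collect_sql=True the statements are recorded in `collected_sql`
       # instead of being executed; atomic=False avoids opening a transaction.
       with connection.schema_editor(collect_sql=True, atomic=False) as schema_editor:
           schema_editor.create_model(YourUnmanagedModelClass)

       # A list of SQL strings, including any deferred index/constraint statements
       print('\n'.join(schema_editor.collected_sql))
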
    A practical example:

    # your_app/models/your_model.py
    
    from django.db import models
    
    class IntegrationView(models.Model):
        """A read-only model to access a view in some external DB."""
    
        class Meta:
            managed = False
            db_table = 'integration_view'
    
        name = models.CharField(
            db_column='object_name',
            max_length=255,
            primary_key=True,
            verbose_name='Object Name',
        )
        some_value = models.CharField(
            db_column='some_object_value',
            max_length=255,
            blank=True,
            null=True,
            verbose_name='Some Object Value',
        )
    
        # Depending on the situation, it might be a good idea to redefine
        # some methods as a no-op as a safety net.
        # Note that it's not completely safe this way, but it might help with
        # some silly mistakes in user code.
    
        def save(self, *args, **kwargs):
            """Preventing data modification."""
            pass
    
        def delete(self, *args, **kwargs):
            """Preventing data deletion."""
            pass
    

    Now, suppose you need to be able to create this model via Django, e.g. for some tests.

    # your_app/tests/some_test.py
    
    # This allows us to access the `SchemaEditor` for the DB
    from django.db import connections
    from django.test import TestCase
    from your_app.models.your_model import IntegrationView
    
    class SomeLogicTestCase(TestCase):
        """Tests some logic, that uses `IntegrationView`."""
    
        # Since `IntegrationView` is assumed to be read-only for the case being
        # described, it's a good idea to put the setup logic in the class-level
        # setup fixture, which will run only once for the whole test case.
        @classmethod
        def setUpClass(cls):
            """Prepares `IntegrationView` mock data for the test case."""
    
            # This is the actual part that will create the table in the DB
            # for the unmanaged model (any model in fact, but managed models will
            # have their tables created already by the Django testing framework).
            # Note: here we're able to choose which DB, defined in your settings,
            # will be used to create the table.
    
            with connections['external_db'].schema_editor() as schema_editor:
                schema_editor.create_model(IntegrationView)
    
            # That's all you need: after these statements execute, a DB table
            # for `IntegrationView` will be created in the DB defined as
            # `external_db`.
    
            # Now suppose we need to add some mock data...
            # Again, if we consider the table to be read-only, the data can be
            # defined here; otherwise it's better to do it in the `setUp()` method.

            # Remember, `IntegrationView.save()` is overridden as a no-op, so plain
            # calls to `IntegrationView.save()` or `IntegrationView.objects.create()`
            # won't do anything, and we need to "Improvise. Adapt. Overcome."
    
            # One way is to use the `save()` method of the base class,
            # but provide the instance of our class
            integration_view = IntegrationView(
                name='Biggus Dickus',
                some_value='Something really important.',
            )
            super(IntegrationView, integration_view).save(using='external_db')
    
            # Another option is to use `bulk_create()`, which doesn't call
            # `save()` internally, and in fact is a better solution
            # if we're creating many records.
    
            IntegrationView.objects.using('external_db').bulk_create([
                IntegrationView(
                    name='Sillius Soddus',
                    some_value='Something important',
                ),
                IntegrationView(
                    name='Naughtius Maximus',
                    some_value='Whatever',
                ),
            ])
    
        # Don't forget to clean up afterwards
        @classmethod
        def tearDownClass(cls):
            with connections['external_db'].schema_editor() as schema_editor:
                schema_editor.delete_model(IntegrationView)
            super().tearDownClass()
    
        def test_some_logic_using_data_from_integration_view(self):
            self.assertTrue(IntegrationView.objects.using('external_db').filter(
                name='Biggus Dickus',
            ))
    

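    A side note on newer Django versions: on Django 2.2+ a test case is only allowed to query the databases it declares via the `databases` attribute (older versions used `multi_db = True` for multi-database tests), so depending on your version you may also need to list the second DB on the test case class. A sketch of what that looks like:

    class SomeLogicTestCase(TestCase):
        # Allow this test case to run queries against both DBs
        databases = {'default', 'external_db'}

        # ... the rest of the test case as above
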
    To make the example more complete: since we're using multiple DBs (default and external_db), Django will try to run migrations on both of them for the tests, and as of now there's no option in the DB settings to prevent this, so we have to use a custom DB router for testing.

    # your_app/tests/base.py
    
    class PreventMigrationsDBRouter:
        """DB router to prevent migrations for specific DBs during tests."""
        _NO_MIGRATION_DBS = {'external_db', }
    
        def allow_migrate(self, db, app_label, model_name=None, **hints):
            """Actually disallows migrations for specific DBs."""
            return db not in self._NO_MIGRATION_DBS
    

    And a test settings file example for the described case:

    # settings/test.py
    
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.oracle',
            'NAME': 'db_name',
            'USER': 'username',
            'HOST': 'localhost',
            'PASSWORD': 'password',
            'PORT': '1521',
        },
        # For production we would have settings here to connect to the external DB,
        # but for testing purposes we can get by with an SQLite DB
        'external_db': {
            'ENGINE': 'django.db.backends.sqlite3',
        },
    }
    
    # It's not necessary to use the router in the production config: if no DB
    # is explicitly specified for an action, Django will use the `default` DB
    DATABASE_ROUTERS = ['your_app.tests.base.PreventMigrationsDBRouter', ]
    

    Hope this detailed, newcomer-friendly example helps someone and saves them some time.