I have something like this:
from django.core.exceptions import ObjectDoesNotExist
from django.db.models import CASCADE, ForeignKey, Model

class Base(Model):
    ...

    def downcast(self):
        # Try each known subclass accessor in turn; fall back to
        # self if no child row exists for this base row.
        try:
            return self.childa
        except ObjectDoesNotExist:
            pass
        try:
            return self.childb
        except ObjectDoesNotExist:
            pass
        return self

class ChildA(Base):
    ...

class ChildB(Base):
    ...

class Foo(Model):
    child = ForeignKey(Base, on_delete=CASCADE)
Whenever I have a Foo object, child is always an instance of Base; my understanding is that's just how Django's multi-table inheritance works. For now I have added a downcast() method to Base (see above), and I don't mind hardcoding the possible derived types.
What I would like is to somehow centralize that downcast in Foo so it happens automatically. I added this multi-table inheritance to existing code, and I keep finding places that really need the downcast instance, so I have to downcast manually at each call site, as in the example below.
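For example, call sites keep turning into this (the lookup and the subclass method are made up for illustration):

foo = Foo.objects.get(pk=1)
child = foo.child.downcast()  # repeated everywhere
if isinstance(child, ChildA):
    child.do_child_a_things()  # hypothetical subclass method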
I was using the django-polymorphic package, but it gave me side effects I don't know how to deal with, nor want to (for example, I couldn't delete rows; I got an error about opts.pk being None deep in the queryset code).
So I've wondered: would putting something in Foo's __init__() (after calling the base class __init__) be OK? Are there side effects I'm not thinking of? It seems like this could be a problem when creating new instances from scratch.
def __init__(self, *args, **kwargs):
    super(Foo, self).__init__(*args, **kwargs)
    # Breaks for a bare Foo(): self.child raises DoesNotExist
    # before anything has been assigned to it.
    self.child = self.child.downcast()
Should I just rename child?
class Foo(Model):
    child_poly = ForeignKey(Base, on_delete=CASCADE)  # was: child

    @property
    def child(self):
        return self.child_poly.downcast()
This could be a problem when creating a Foo from scratch: I can't write Foo(child=c) any more. A setter might get that back, as sketched below.
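A minimal sketch of that idea; Django's Model.__init__ hands unconsumed keyword arguments to property setters, so I think Foo(child=c) would route through the setter, but I haven't tested this:

class Foo(Model):
    child_poly = ForeignKey(Base, on_delete=CASCADE)

    @property
    def child(self):
        return self.child_poly.downcast()

    @child.setter
    def child(self, value):
        # Write through to the real field so both foo.child = c
        # and Foo(child=c) assign child_poly.
        self.child_poly = value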
Is there a better approach? I'm not looking for a generic polymorphic solution/mixin; not after debugging Django internals and finding that removing django-polymorphic fixed the deletion issue.
Update: In the end, I went back to django-polymorphic and haven't hit the deletion issue I was having before again.
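For reference, the polymorphic version of the models looks roughly like this. This is a minimal sketch: recent django-polymorphic releases export the base class from polymorphic.models (older releases exported it from the top-level polymorphic package), and as I understand the docs, foreign keys to a polymorphic model come back as subclass instances automatically, so downcast() goes away:

from django.db.models import CASCADE, ForeignKey, Model
from polymorphic.models import PolymorphicModel

class Base(PolymorphicModel):
    ...  # no downcast() needed

class ChildA(Base):
    ...

class ChildB(Base):
    ...

class Foo(Model):
    child = ForeignKey(Base, on_delete=CASCADE)

# foo.child should now come back as a ChildA/ChildB instance.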