python, python-2.7, caching, sqlalchemy, chameleon

Python (Pyramid framework) is persisting data between requests and I can't figure out why


I'm getting the following error on the second refresh of a page:

DetachedInstanceError: Instance <MetadataRef at 0x107b2a0d0> is not bound to a Session; attribute refresh operation cannot proceed

 - Expression: "result.meta_refs(visible_search_only=True)"
 - Filename:   ... ects/WebApps/PYPanel/pypanel/templates/generic/search.pt
 - Location:   (line 45: col 38)
 - Source:     ... meta_ref result.meta_refs(visible_search_only=True)" tal:omi ...
                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 - Arguments:  repeat: {...} (0)
               renderer_name: templates/home.pt
               models: <list - at 0x1069e4d88>
               results: <list - at 0x107c30d40>
               renderer_info: <RendererHelper - at 0x1069b5650>
               active_models: <list - at 0x107b69050>
               query: 
               pagination: <NoneType - at 0x104dd5538>
               req: <Request - at 0x107b4e690>
               title: <NoneType - at 0x104dd5538>
               generic: <NoneType - at 0x104dd5538>
               request: <Request - at 0x107b4e690>
               context: <RootFactory - at 0x107b12090>
               page: 1
               view: <Page - at 0x107b128d0>

The issue seems to be cached data being shared between requests. The thing is, the cache is only supposed to be local to a single request (i.e. everything should be re-queried for the next request).
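The mechanism is easy to reproduce outside Pyramid: a class attribute belongs to the process, not to a request, so anything stored on it survives into every later request served by the same pserve worker. A minimal sketch (`handle_request` is a hypothetical stand-in for a view callable):

```python
class Base(object):
    _meta_refs = None  # class attribute: one copy per process, NOT per request


def handle_request():
    # stand-in for a view: populate the "cache" on first use
    if Base._meta_refs is None:
        Base._meta_refs = {"populated": True}
    return Base._meta_refs


first = handle_request()
second = handle_request()  # a "second refresh" in the same process
print(first is second)  # True -- the cache survived into the next request
```

With SQLAlchemy objects this is worse than a stale cache: once the session that loaded them is closed at the end of the request, the cached instances are detached, and touching an expired attribute raises `DetachedInstanceError`.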

The relevant section of the template is:

        <div tal:repeat="meta_ref result.meta_refs(visible_search_only=True)" tal:omit-tag="True">
            <div tal:define="meta result.meta(meta_ref.key, None)" tal:condition="meta is not None">
                <div>${meta_ref.name} = ${meta}</div>
            </div>
        </div>

My DBSession is only declared once, in models.py (if that makes a difference):

DBSession = scoped_session(sessionmaker(extension=ZopeTransactionExtension()))

Disabling the caching fixes it, so I just need to stop the cache from surviving between requests -- but I don't know how to do that.

This is my meta_refs function:

# defined on Base; inspect is sqlalchemy.inspect, DBAPIError is sqlalchemy.exc.DBAPIError
def meta_refs(self, visible_only=False, visible_search_only=False):
    model = self.__class__.__name__
    if Base._meta_refs is None:
        Base._meta_refs = {}
        try:
            for result in DBSession.query(MetadataRef):
                if result.model not in Base._meta_refs:
                    Base._meta_refs[result.model] = []
                Base._meta_refs[result.model].append(result)
        except DBAPIError:
            pass
    if model not in Base._meta_refs:
        return []
    results = []
    for result in Base._meta_refs[model]:
        #@TODO: Remove temporary workaround
        if inspect(result).detached:
            Base._meta_refs = None
            return self.meta_refs(visible_only, visible_search_only)
        #END of workaround
        if visible_only and result.visible is False:
            continue
        if visible_search_only and result.visible_search is False:
            continue
        results.append(result)
    return results

It's also worth noting that the meta() function caches in the same way but doesn't have this issue -- the key difference is likely that it caches a dict of plain strings instead of ORM objects.

I'm using pserve to serve it while I'm developing it (also if that makes a difference)

The temporary workaround in my code, using sqlalchemy.inspect, does work, but I really want the data not to persist at all (i.e. Base._meta_refs should be None the first time I access it, 100% of the time).

Anyone have any ideas? If this data is being cached between requests, I'm sure other stuff is as well, and that leaves too much room for unexpected behavior.


Solution

  • Assuming Base is a class, storing MetadataRef instances on its _meta_refs attribute effectively persists them between requests: class attributes live for the lifetime of the server process, not the request.

    If the SQLAlchemy Session's identity map (which in many cases already works like a cache) is not enough, you could store those objects on the request object instead, and then you know they will only persist for the lifetime of that request.
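    A per-request cache could look like the following sketch, where `Request` and `query_meta_refs` are hypothetical stand-ins for `pyramid.request.Request` and the real `DBSession.query(MetadataRef)`:

```python
class Request(object):
    """Stand-in for pyramid.request.Request."""


def query_meta_refs():
    # stand-in for DBSession.query(MetadataRef); pretend this hits the DB
    return [("Article", "author"), ("Article", "tags"), ("Page", "slug")]


def get_meta_refs(request, model):
    # cache the rows on the request object so they are discarded with it
    cache = getattr(request, '_meta_refs_cache', None)
    if cache is None:
        cache = {}
        for row_model, ref in query_meta_refs():
            cache.setdefault(row_model, []).append(ref)
        request._meta_refs_cache = cache
    return cache.get(model, [])


req = Request()
print(get_meta_refs(req, "Article"))  # queried once, then cached on req
print(get_meta_refs(req, "Page"))     # served from the same request cache
```

    Because the cache hangs off the request, the ORM objects in it are never older than the session that loaded them, so they cannot become detached mid-render.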

    And I'd simplify the meta_refs method as follows:

    @classmethod
    def meta_refs(cls, visible_only=False, visible_search_only=False):
        q = DBSession.query(MetadataRef).filter(MetadataRef.model==cls.__name__)
        if visible_only:
            q = q.filter(MetadataRef.visible==True)
        if visible_search_only:
            q = q.filter(MetadataRef.visible_search==True)
    
        # It might be worth returning q rather than q.all()
        return q.all()
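    One convenient property of the classmethod version: it remains callable on instances, so the template expression `result.meta_refs(visible_search_only=True)` keeps working unchanged. A trivial sketch (the real query is replaced by a dummy return value):

```python
class Base(object):
    @classmethod
    def meta_refs(cls, visible_only=False, visible_search_only=False):
        # the real version would build and run a DBSession query here
        return "query for %s" % cls.__name__


class Article(Base):
    pass


print(Article.meta_refs())                            # called on the class
print(Article().meta_refs(visible_search_only=True))  # called on an instance
```

    Re-running the query on every call may look wasteful, but within one request the Session's identity map means the same MetadataRef rows are reused rather than rebuilt.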