In my project, I use Django as a frontend only. All of the backend logic lives in several AWS Lambda Python functions coordinated by a state machine. The database is an SQLite file stored in an S3 bucket, accessed through django_s3_storage and django_s3_sqlite. My backend functions add, remove, and update model instances in that SQLite database, and I'd like those changes to be reflected by the Django frontend as soon as possible. However, that does not happen: I only see the outdated in-memory cache.
I have tried several things.
My initial view:
    from django.views.generic import ListView

    class AllPortfolioRowsView(ListView):
        model = PortfolioRow
        template_name = "portfolio_all_rows.html"

        def get_context_data(self, **kwargs):
            context = super().get_context_data(**kwargs)
            # do something else
            context['object_list'] = PortfolioRow.objects.all().order_by('row_type')
            return context
I have read the advice to move the work into the get method; supposedly, the database query is then executed every time the view is called.
Variant #2:
    from django.http import HttpResponse
    from django.template import loader
    from django.views import View

    class AllPortfolioRowsView(View):
        def get(self, request):
            template = loader.get_template('portfolio_all_rows.html')
            object_list = PortfolioRow.objects.all().order_by('row_type')
            # do something else
            context = {
                'object_list': object_list,
                # other stuff
            }
            return HttpResponse(template.render(context, request))
The view worked, but it did not help: the data was still stale.
Then I tried to force querying the database, as recommended by the Django documentation.
Variant #3:
    class AllPortfolioRowsView(View):
        def get(self, request):
            template = loader.get_template('portfolio_all_rows.html')
            object_list = PortfolioRow.objects.all().order_by('row_type')
            counter = 0
            for elem in object_list:  # iterate over the queryset
                counter += 1
                print(counter, ' - ', elem)
            _ = list(object_list)  # create a list to force re-evaluation
            # do something else
            context = {
                'object_list': object_list,
                # other stuff
            }
            return HttpResponse(template.render(context, request))
It didn't help either.
Only disabling the periodic keep_warm event in zappa_settings.json helped. After that, if there are no calls, the Lambda container is shut down within 10-15 minutes, and the obsolete cache disappears with it. On the next call, the system has no choice but to re-query the database and display the recent data.
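For reference, this is roughly what that setting looks like in the Zappa configuration file (a minimal sketch; the stage name and Django settings module below are placeholders for your own values):

```json
{
    "production": {
        "django_settings": "myproject.settings",
        "s3_bucket": "my-zappa-deploy-bucket",
        "keep_warm": false
    }
}
```

With `keep_warm` disabled, Zappa stops pinging the function every few minutes, so AWS is free to recycle idle containers on its usual schedule.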
Is there a better solution?
Your Django Lambda container instances download the SQLite DB from S3 on startup and then cache the file locally until the container is shut down by AWS. This is done automatically by the django_s3_sqlite package.
Since you are writing to the database from a separate backend system, the Django Lambda container instances are not aware of the changes and will not re-download the SQLite file from S3. The variants you tried all operate at the level of Django's queryset caching, which has no effect on this file-level cache.
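As a stopgap short of migrating, you could check whether the S3 copy of the database is newer than the locally cached file and re-download it when it is. This is only a sketch, not part of django_s3_sqlite's API: the bucket, key, and path are placeholders, and it ignores concurrency concerns (a write landing mid-download, for example):

```python
import os

def is_local_copy_stale(local_mtime: float, s3_last_modified: float) -> bool:
    """Return True when the S3 object is newer than the cached local file."""
    return s3_last_modified > local_mtime

def refresh_db_if_stale(bucket: str, key: str, local_path: str) -> bool:
    """Re-download the SQLite file from S3 if the remote copy is newer.

    Returns True when a download happened. Assumes boto3 credentials
    are available in the Lambda execution environment.
    """
    import boto3  # imported lazily so the pure helper above stays dependency-free
    s3 = boto3.client("s3")
    head = s3.head_object(Bucket=bucket, Key=key)
    remote_ts = head["LastModified"].timestamp()
    local_ts = os.path.getmtime(local_path) if os.path.exists(local_path) else 0.0
    if is_local_copy_stale(local_ts, remote_ts):
        s3.download_file(bucket, key, local_path)
        return True
    return False
```

You would call `refresh_db_if_stale(...)` at the top of each view (or in a middleware), accepting the extra `HEAD` request per page load. Even so, this keeps all the other downsides of SQLite-on-S3, which is why a centralized database remains the better fix.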
If you want the Django app to reflect database changes more promptly, you will need to migrate to a centralized database such as PostgreSQL on RDS or DynamoDB, which both the Lambda backend functions and the Django frontend query directly.
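For the PostgreSQL-on-RDS route, the Django side is a standard `DATABASES` entry pointing at the RDS endpoint. A minimal sketch (the host, database name, and credential environment variables below are placeholders for your own setup):

```python
# settings.py -- pointing Django at a PostgreSQL instance on RDS.
# Host, database name, and credentials are illustrative placeholders;
# in practice, supply them via environment variables or Secrets Manager.
import os

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("DB_NAME", "portfolio"),
        "USER": os.environ.get("DB_USER", "django"),
        "PASSWORD": os.environ.get("DB_PASSWORD", ""),
        "HOST": os.environ.get("DB_HOST", "mydb.example.us-east-1.rds.amazonaws.com"),
        "PORT": os.environ.get("DB_PORT", "5432"),
    }
}
```

The backend Lambda functions would write to the same instance (via psycopg2 or an ORM), so every Django request sees the latest committed data with no file-level cache in between.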