I'm hoping to avoid any use of Celery at the moment. In Starlette's docs they give two ways to add background tasks:
Via Graphene: https://www.starlette.io/graphql/
class Query(graphene.ObjectType):
    user_agent = graphene.String()

    def resolve_user_agent(self, info):
        """
        Return the User-Agent of the incoming request.
        """
        request = info.context["request"]
        user_agent = request.headers.get("User-Agent", "<unknown>")
        background = info.context["background"]
        background.add_task(log_user_agent, user_agent=user_agent)
        return user_agent
Via a JSON response: https://www.starlette.io/background/
async def signup(request):
    data = await request.json()
    username = data['username']
    email = data['email']
    task = BackgroundTask(send_welcome_email, to_address=email)
    message = {'status': 'Signup successful'}
    return JSONResponse(message, background=task)
Does anyone know of a way to add tasks to Starlette's background queue when using Ariadne? I am unable to return a JSONResponse from my resolver, and I do not have access to an info.context["background"]. The only thing attached to my context is my request object.
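One workaround I am considering (an untested sketch, so treat the details as assumptions): instead of mounting ariadne.asgi.GraphQL directly, handle the GraphQL POST in a plain Starlette route using Ariadne's graphql helper. That way my own code builds the response, so a BackgroundTasks object can both be exposed in the context and attached to the JSONResponse. The schema import and handler name below are placeholders for my own application code:

from ariadne import graphql
from starlette.background import BackgroundTasks
from starlette.responses import JSONResponse

from myapp.schema import schema  # placeholder: your executable Ariadne schema


async def graphql_endpoint(request):
    data = await request.json()
    background = BackgroundTasks()
    # Expose the tasks collection in the context so resolvers can call
    # info.context["background"].add_task(...).
    success, result = await graphql(
        schema,
        data,
        context_value={"request": request, "background": background},
    )
    # Because the response is built here, the collected tasks can be attached
    # to it and Starlette will run them after the response has been sent.
    return JSONResponse(result, status_code=200 if success else 400, background=background)

The handler would then be registered as an ordinary route, e.g. Route("/graphql", graphql_endpoint, methods=["POST"]).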
The answer marked as the solution above does not work for me, because BackgroundTask does not behave properly when you use a middleware that subclasses BaseHTTPMiddleware; see:
https://github.com/encode/starlette/issues/919
In my case the task is not run in the background; it is awaited until it completes. I am also not using Ariadne, but the following should still let you do the job and run a task in the background.
Edit: This worked for me.
from concurrent.futures import ProcessPoolExecutor

# bg_process_funcs, the export_* variables and logger come from the
# surrounding application code.
executor = ProcessPoolExecutor()
executor.submit(
    bg_process_funcs,
    export_file_format,
    export_headers,
    data,
    alert_type,
    floor_subtitle,
    date_subtitle,
    pref_datetime,
    pref_timezone,
    export_file_name,
    export_limit,
)
# shutdown() waits for the submitted call to finish by default;
# pass wait=False if the caller should not block on it.
executor.shutdown()
logger.info("Process Pool Shutdown")
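Note that this creates and tears down a process pool on every call. If jobs are submitted from several handlers, creating a single long-lived executor at application startup and shutting it down on exit may be a better fit.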