I have a resource-intensive async method that I want to run as a background task. Example code for it looks like this:

@staticmethod
async def trigger_task(id: str, run_e2e: bool = False):
    try:
        add_status_for_task(id)
        result1, result2 = await task(id)
        update_status_for_task(id, result1, result2)
    except Exception:
        update_status_for_task(id, 'FAIL')


@router.post("/task")
async def trigger_task(background_tasks: BackgroundTasks):
    background_tasks.add_task(EventsTrigger.trigger_task)
    return {'msg': 'Task submitted!'}

When I trigger this endpoint, I expect an instant response: {'msg': 'Task submitted!'}. Instead, the API response is delayed until the task completes. I am following this documentation from FastAPI.

fastapi: v0.70.0, python: v3.8.10

I believe the issue is similar to what is described here. Request help in making this a non-blocking call.

udit
  • Does this answer your question? [fastapi asynchronous background tasks blocks other requests?](https://stackoverflow.com/questions/67599119/fastapi-asynchronous-background-tasks-blocks-other-requests) – mihi Dec 02 '21 at 11:15

2 Answers

0

How are you running your app?

According to the uvicorn docs, it runs with 1 worker by default, which means only one process serves requests at a time. Try configuring uvicorn to run with more workers: https://www.uvicorn.org/deployment/

$ uvicorn example:app --port 5000 --workers THE_AMOUNT_OF_WORKERS
or
uvicorn.run("example:app", host="127.0.0.1", port=5000, workers=THE_AMOUNT_OF_WORKERS)
  • This does not seem to be a correct approach because, with this, if I run 4 workers and have 20 users, then only 4 of them will be able to run their task at a time, since this is a blocking call. – udit Nov 29 '21 at 04:27
  • It should handle it if you are using async/await in Python. Check that the work you are doing actually supports async. I would recommend looking at Celery; my team uses it to run async tasks for a large number of users. Depending on your number of users and on how long the task takes to complete, I would recommend scaling your service. – Yair Siman Tov Dec 01 '21 at 15:23
-1

What I have learned from the GitHub issues:

  • You can't use async def for task functions (the ones that will run in the background).
  • Since the background process can't access the coroutine, your async/await will not work there.
  • You can still try it without async/await. If that also doesn't work, then you should go for an alternative.

Alternative Background Solution

  • Celery is a production-ready task queue. So you can easily configure it and run background tasks using your_task_function.delay(*args, **kwargs).
  • Note that Celery also doesn't support async in background tasks, so whatever you write to run in the background must be sync code.

Good Luck :)