Tags: python, python-requests, python-asyncio, aiohttp

Python - Run multiple async functions simultaneously


I'm essentially making a pinger that has a 2D list of key/webhook pairs; after pinging a key, it sends the response to the corresponding webhook.

the 2d list goes as follows:

some_list = [["key1", "webhook1"], ["key2", "webhook2"]]

My program is essentially a loop, and I'm not too sure how I can rotate through the some_list data in the function.

Here's a little demo of what my script looks like:

async def do_ping(some_pair):
    async with aiohttp.ClientSession() as s:
        tasks = await gen_tasks(s, some_pair)
        results = await asyncio.gather(*tasks)
        sleep(10)
        await do_ping(some_pair)

I've tried:

async def main(): 
    for entry in some_list: 
        asyncio.run(do_ping(entry))

but because the do_ping function is a self-calling loop, it just runs the first entry over and over again and never gets to the ones after it. I'm hoping to find a solution to this, whether that's threading or something similar. And if you have a better way of structuring the some_list values (which I assume would be a dictionary), feel free to drop that feedback as well.


Solution

  • You made your method recursive (`await do_ping(some_pair)`), so it never returns and the loop in `main` can never continue to the next entry. I would restructure the application like this:

    async def do_ping(some_pair):
        async with aiohttp.ClientSession() as s:
            while True:
                tasks = await gen_tasks(s, some_pair)
                results = await asyncio.gather(*tasks)
                await asyncio.sleep(10)
    
    
    async def main(): 
        tasks = [do_ping(entry) for entry in some_list]
        await asyncio.gather(*tasks)
    
    
    if __name__ == "__main__":
        asyncio.run(main())
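    These snippets rely on `gen_tasks`, which the question doesn't show. A minimal sketch, assuming each pair is `[key, webhook]` and that "pinging" means a GET with the key followed by a POST of the response body to the webhook (the `PING_URL` endpoint and `ping_and_forward` helper are hypothetical, not from the question):

```python
import asyncio
import aiohttp

PING_URL = "https://example.com/ping"  # hypothetical endpoint, not from the question

async def ping_and_forward(session, key, webhook):
    # Ping with the key, then POST the response body to the webhook.
    async with session.get(PING_URL, params={"key": key}) as resp:
        body = await resp.text()
    async with session.post(webhook, data=body) as hook_resp:
        return hook_resp.status

async def gen_tasks(session, some_pair):
    # One coroutine per pair; asyncio.gather wraps them in tasks itself.
    key, webhook = some_pair
    return [ping_and_forward(session, key, webhook)]
```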
    

    Alternatively, you could move the repeat-and-sleep logic into `main`:

    async def do_ping(some_pair):
        async with aiohttp.ClientSession() as s:
            tasks = await gen_tasks(s, some_pair)
            results = await asyncio.gather(*tasks)
    
    
    async def main(): 
        while True:
            tasks = [do_ping(entry) for entry in some_list]
            await asyncio.gather(*tasks)
            await asyncio.sleep(10)
    
    
    if __name__ == "__main__":
        asyncio.run(main())
    

    You could also start the tasks before sleeping and gather them afterwards. That makes the pings start at consistent 10-second intervals, instead of 10 seconds plus however long it takes to gather the results:

    async def main(): 
        while True:
            tasks = [
                asyncio.create_task(do_ping(entry))
                for entry in some_list
            ]
            await asyncio.sleep(10)
            await asyncio.wait(tasks)
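    Even this pattern drifts slightly, since `create_task` and `wait` themselves take time each round. If exact intervals matter, one option (a sketch of mine, not part of the original answer) is to schedule against the event-loop clock:

```python
import asyncio

async def run_every(interval, coro_factory):
    # Sleep until the next tick on the loop clock, so the interval
    # doesn't drift as the work itself takes time.
    loop = asyncio.get_running_loop()
    next_tick = loop.time()
    while True:
        await coro_factory()
        next_tick += interval
        await asyncio.sleep(max(0, next_tick - loop.time()))
```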
    

    EDIT: As pointed out by creolo, you should only create a single `ClientSession` object. From the aiohttp client reference (https://docs.aiohttp.org/en/stable/client_reference.html):

    Session encapsulates a connection pool (connector instance) and supports keepalives by default. Unless you are connecting to a large, unknown number of different servers over the lifetime of your application, it is suggested you use a single session for the lifetime of your application to benefit from connection pooling.

    async def do_ping(session, some_pair):
        tasks = await gen_tasks(session, some_pair)
        results = await asyncio.gather(*tasks)
    
    async def main(): 
        async with aiohttp.ClientSession() as session:
            while True:
                tasks = [
                    asyncio.create_task(do_ping(session, entry))
                    for entry in some_list
                ]
                await asyncio.sleep(10)
                await asyncio.wait(tasks)
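    On the data-structure question: if the keys are unique, a plain dict mapping key to webhook is a natural fit, and `items()` yields the same pairs the 2D list did:

```python
some_map = {"key1": "webhook1", "key2": "webhook2"}

# dict.items() yields (key, webhook) tuples, so the comprehension in main
# would iterate over some_map.items() instead of some_list.
pairs = list(some_map.items())
```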