I am writing a Telegram bot that generates images with this Stable Diffusion model: https://replicate.com/mbentley124/openjourney-img2img
I am using the replicate module. However, the code snippet below blocks the bot whenever someone sends a request.
async def txt2img(message, prompt, negative, stepMode, width, height):
    print('Generating image for ' + str(message.chat.id))
    output = replicate.run(
        "tstramer/midjourney-diffusion:436b051ebd8f68d23e83d22de5e198e0995357afef113768c20f0b6fcef23c8b",
        input={"prompt": "mdjrny-v4 " + prompt, "negative_prompt": negative,
               "num_inference_steps": stepMode*10, "width": width, "height": height}
    )
    print(output[0] + '\n')
    return await message.answer_photo(output[0])
Please help me make this code properly asynchronous; ideally, I would also like to add a queue.
Your txt2img coroutine is not actually asynchronous, because replicate.run is a blocking function: it stalls the whole event loop until the image is ready, so the bot cannot handle anything else in the meantime. The fix is to hand the blocking call off to a worker thread through the event loop's run_in_executor (which uses a concurrent.futures thread pool under the hood), as in my example below.
import asyncio

async def txt2img(message, prompt, negative, stepMode, width, height):
    print('Generating image for ' + str(message.chat.id))

    def blocking_code():
        return replicate.run(
            "tstramer/midjourney-diffusion:436b051ebd8f68d23e83d22de5e198e0995357afef113768c20f0b6fcef23c8b",
            input={"prompt": "mdjrny-v4 " + prompt, "negative_prompt": negative,
                   "num_inference_steps": stepMode*10, "width": width, "height": height}
        )

    loop = asyncio.get_running_loop()
    # Passing None runs blocking_code in the loop's default thread-pool
    # executor, so the event loop stays free to handle other updates.
    output = await loop.run_in_executor(None, blocking_code)
    print(output[0] + '\n')
    return await message.answer_photo(output[0])

Note that your original version created a ThreadPoolExecutor, submitted the job to it, and then awaited future.result through run_in_executor; that double dispatch still ties up a second thread just to wait. Awaiting loop.run_in_executor(None, blocking_code) directly is enough.
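You also asked for a queue. Here is a minimal sketch using asyncio.Queue with a single worker task, so generation requests are processed one at a time instead of all hammering Replicate at once. The worker and submit helpers and the stand-in lambda are my own illustration, not part of replicate or your bot framework; in the bot you would pass your real blocking_code function to submit and then call message.answer_photo on the result.

```python
import asyncio

async def worker(queue: asyncio.Queue) -> None:
    """Pull jobs off the queue and run them one at a time."""
    while True:
        job = await queue.get()
        try:
            # Offload the blocking call (e.g. replicate.run) to a thread
            # so the event loop keeps handling other Telegram updates.
            result = await asyncio.to_thread(job["func"], *job["args"])
            job["future"].set_result(result)
        except Exception as exc:
            job["future"].set_exception(exc)
        finally:
            queue.task_done()

async def submit(queue: asyncio.Queue, func, *args):
    """Enqueue a blocking call and wait for its result."""
    future = asyncio.get_running_loop().create_future()
    await queue.put({"func": func, "args": args, "future": future})
    return await future

async def main() -> str:
    queue: asyncio.Queue = asyncio.Queue()
    worker_task = asyncio.create_task(worker(queue))
    # Stand-in for the real blocking generation call.
    result = await submit(queue, lambda p: "image-for-" + p, "castle")
    worker_task.cancel()
    return result

result = asyncio.run(main())
print(result)  # image-for-castle
```

Start the worker task once at bot startup; each handler then just awaits submit(queue, blocking_code), and requests are served in arrival order. (asyncio.to_thread needs Python 3.9+; on older versions use loop.run_in_executor(None, ...) as above.)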