python, chatbot, ms-autogen

Streaming function in Microsoft Autogen


I want to know how to use the streaming feature in AutoGen.

The code below uses AutoGen and is intended to enable streaming for the agent.

import random

# config_list, user_proxy, and manager are defined earlier
# (e.g. config_list loaded with autogen.config_list_from_json).
llm_config = {
    "config_list": config_list,
    "timeout": 120,
    "seed": random.randint(1, 100),
    "stream": True,  # expected to enable token-by-token output
}

user_proxy.initiate_chat(
    manager,
    message="hello, the weather is nice today",
    llm_config=llm_config,
)

The streaming behavior I'm familiar with is printing each token as it is generated, before the full sentence is complete. However, even when I use the "stream": True option, the entire response is printed to the terminal at once.

I'm not sure whether I'm misunderstanding the streaming feature or simply not using it correctly. Could you please help? Thank you :)


Solution

  • Currently, AutoGen does not provide a direct solution for the streaming issue you've described. This is evidenced by the unresolved open issues on GitHub related to streaming (https://github.com/microsoft/autogen/issues?q=is%3Aissue+is%3Aopen+streaming).

    However, a workaround exists that involves using "monkey patching" to modify the behavior of the function responsible for printing received messages. This method allows you to intercept and alter the output behavior to achieve the streaming effect you're looking for.

    The monkey patching technique can be implemented by overwriting the _print_received_message method in the library. This approach is detailed in https://youtu.be/iNPCB6b5gvA, which provides step-by-step instructions on how to apply this method effectively.
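
    For reference, here is a minimal sketch of that workaround. It assumes the agents inherit from autogen.ConversableAgent and that the private method is named _print_received_message(self, message, sender); the method name comes from the answer above, but the exact signature is version-dependent, so check the source of the AutoGen release you have installed. Also note that because the full reply has already been received by the time this method runs, the patch can only simulate streaming by emitting the text incrementally (a typewriter effect) rather than printing tokens as the model generates them.

        import sys
        import time

        from autogen import ConversableAgent

        # Keep a reference to the original implementation so it can be
        # restored or reused later if needed.
        _original_print = ConversableAgent._print_received_message


        def _streaming_print_received_message(self, message, sender, *args, **kwargs):
            """Print a received message character by character to mimic streaming.

            Extra positional/keyword arguments are accepted because the private
            method's signature varies between AutoGen versions.
            """
            # AutoGen messages can be plain strings or dicts with a "content" key.
            content = message if isinstance(message, str) else (message.get("content") or "")
            print(f"{sender.name} (to {self.name}):\n", flush=True)
            for ch in content:
                sys.stdout.write(ch)
                sys.stdout.flush()
                time.sleep(0.01)  # small delay so the effect is visible
            print("\n" + "-" * 80, flush=True)


        # Apply the patch before starting any chat so every subsequently
        # received message is printed through the patched method.
        ConversableAgent._print_received_message = _streaming_print_received_message

    With the patch applied before user_proxy.initiate_chat(...) is called, the replies in your example above should appear incrementally in the terminal instead of all at once.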