Starting with AutoGen — Automated multi-agent

Ravindra Elicherla
Published in Generative AI · 4 min read · Nov 5, 2023


“Let the agents speak to each other and solve problems without human interaction.”

What is AutoGen?

AutoGen was announced by Microsoft on September 25th, 2023. While the use cases are still at a primitive level, Doug Burger, Technical Fellow at Microsoft, is enthusiastic: “AutoGen is poised to fundamentally transform and extend what large language models are capable of. This is one of the most exciting developments I have seen in AI recently.”

AutoGen is a framework that enables the development of LLM applications using multiple agents that can converse with each other to solve tasks. AutoGen agents are customizable, conversable, and seamlessly allow human participation. They can operate in various modes that employ combinations of LLMs, human inputs, and tools.

(Image source: https://github.com/microsoft/autogen)

This enables building next-gen LLM applications based on multi-agent conversations. What does this mean? We can have agent1 talk to agent2 and get the work done, much like two robots speaking to each other the way humans do and completing the work they were allocated. Imagine you have a maid and a cook at home: how much easier would life be if they talked to each other, finished the cooking, and still kept the kitchen tidy?

AutoGen agents have two defining characteristics:

  • Conversable: Agents in AutoGen are conversable, which means that any agent can send and receive messages from other agents to initiate or continue a conversation.
  • Customizable: Agents in AutoGen can be customized to integrate LLMs, humans, tools, or a combination of them (see the sketch just after this list).
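
One aspect of that customization is how much a human stays in the loop. As a small illustrative sketch (not from the original post, but using the same UserProxyAgent class that appears later), the human_input_mode setting accepts "ALWAYS", "NEVER", or "TERMINATE":

import autogen

# Illustrative only: the same proxy agent class, configured for different
# levels of human participation via human_input_mode.
always_ask = autogen.UserProxyAgent(
    name="human_in_the_loop",
    human_input_mode="ALWAYS",  # ask the human before every reply
)

fully_automated = autogen.UserProxyAgent(
    name="autonomous_proxy",
    human_input_mode="NEVER",   # never ask; rely on auto-replies and code execution
    code_execution_config={"work_dir": "coding", "use_docker": False},
)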

In this blog, let’s create a multi-agent bot. First, create a virtual environment:

python3 -m venv autogentest
source autogentest/bin/activate

You can also use Conda to create the virtual environment.

Let’s install pyautogen.

pip install pyautogen

This package enables next-gen LLM applications via a multi-agent conversation framework. pyautogen requires an OpenAI API key. (This post was written against the pyautogen 0.1.x series; later releases renamed some configuration keys, so you may need to pin the version or adjust the snippets below for newer versions.)

1. Import AutoGen

import autogen

2. Create a config list

config_list = [
    {
        "model": "gpt-3.5-turbo-16k",
        "api_key": "sk-asv4bePDDgxxxxxxxxxxxxxxxxxxxxx",  # your OpenAI API key
    }
]
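
Rather than hardcoding the key, you can also read it from an environment variable (a minimal sketch, assuming OPENAI_API_KEY is exported in your shell):

import os

# Alternative (illustrative): build the same config_list without hardcoding the key.
config_list = [
    {
        "model": "gpt-3.5-turbo-16k",
        "api_key": os.environ["OPENAI_API_KEY"],  # assumes the variable is set
    }
]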

3. llm configuration

llm_config = {
    "request_timeout": 600,
    "seed": 42,
    "config_list": config_list,
    "temperature": 0,
}

The seed is used for caching and reproducibility.

Temperature controls sampling: 0 is low on creativity and 1 is high. If we need definitive answers, 0 is a good choice.
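
For contrast, a more “creative” configuration might look like this (an illustrative sketch, not from the original post; only the temperature differs):

# Illustrative only: a higher-temperature configuration for open-ended tasks.
creative_llm_config = {
    "request_timeout": 600,
    "seed": 42,
    "config_list": config_list,
    "temperature": 0.9,  # more varied, creative output
}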

4. Define Assistant

assistant = autogen.AssistantAgent(
    name="Assistant",
    llm_config=llm_config,
    system_message="I am Assistant",
)
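
The system_message is a natural place to customize behaviour. For example (an illustrative variant, not part of the original script), a summarisation-focused assistant could be defined like this:

# Illustrative variant: a more task-specific system message.
summariser = autogen.AssistantAgent(
    name="Summariser",
    llm_config=llm_config,
    system_message="You are a helpful assistant that writes concise, factual news summaries.",
)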

5. Create the user proxy. This is the user agent and works as a proxy for the user.

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=10,
    is_termination_msg=lambda x: x.get("content", "").rstrip().endswith("TERMINATE"),
    llm_config=llm_config,
    code_execution_config={
        "work_dir": "coding",
        "use_docker": False,  # set to True or an image name like "python:3" to use Docker
    },
    system_message="Reply TERMINATE if the task has been solved to your full satisfaction. Or reply CONTINUE if the task is not solved yet.",
)

The plan is to have no human input at all; the conversation terminates once the proxy user replies “TERMINATE”. (Behind the scenes, the assistant typically responds with code that the user proxy executes in the coding working directory specified above.)
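
To see what the is_termination_msg check does, here is a tiny standalone illustration (the example messages are made up):

def is_termination(msg: dict) -> bool:
    # Same check as the lambda passed to UserProxyAgent above.
    return msg.get("content", "").rstrip().endswith("TERMINATE")

print(is_termination({"content": "Summary complete. TERMINATE"}))  # True
print(is_termination({"content": "CONTINUE"}))                     # False
print(is_termination({}))                                          # False (no "content" key)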

6. We are now done with the code. One final step is to ask the agent to do something useful. I asked the agent to summarise the ICC World Cup match that India won against South Africa a few minutes earlier (on 5th November).

user_proxy.initiate_chat(
    assistant,
    message="""Give me a summary of this news item https://timesofindia.indiatimes.com/sports/cricket/icc-world-cup/news/how-indias-juggernaut-continued-at-world-cup-with-rout-of-south-africa/articleshow/104991030.cms """,
)

7. Now run the program

python autogentest.py
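
For reference, here is how the snippets above fit together in a single autogentest.py (the same code assembled in one place, with the API key replaced by a placeholder):

# autogentest.py -- assembled from the snippets above (API key is a placeholder).
import autogen

config_list = [
    {
        "model": "gpt-3.5-turbo-16k",
        "api_key": "sk-...",  # replace with your OpenAI API key
    }
]

llm_config = {
    "request_timeout": 600,
    "seed": 42,
    "config_list": config_list,
    "temperature": 0,
}

assistant = autogen.AssistantAgent(
    name="Assistant",
    llm_config=llm_config,
    system_message="I am Assistant",
)

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=10,
    is_termination_msg=lambda x: x.get("content", "").rstrip().endswith("TERMINATE"),
    llm_config=llm_config,
    code_execution_config={"work_dir": "coding", "use_docker": False},
    system_message="Reply TERMINATE if the task has been solved to your full satisfaction. Or reply CONTINUE if the task is not solved yet.",
)

user_proxy.initiate_chat(
    assistant,
    message="""Give me a summary of this news item https://timesofindia.indiatimes.com/sports/cricket/icc-world-cup/news/how-indias-juggernaut-continued-at-world-cup-with-rout-of-south-africa/articleshow/104991030.cms """,
)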

The agent did fairly well. Below is the output.

Use case: This agent can be used to create news shorts for quick consumption of news.
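
Extending that idea, the same pair of agents could loop over several articles to build a batch of shorts (an illustrative sketch; the URLs below are placeholders):

# Illustrative only: summarise several articles in one run. Placeholder URLs.
news_urls = [
    "https://example.com/article-1",
    "https://example.com/article-2",
]

for url in news_urls:
    user_proxy.initiate_chat(
        assistant,
        message=f"Give me a short summary of this news item: {url}",
    )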

Ref:

  1. https://www.microsoft.com/en-us/research/blog/autogen-enabling-next-generation-large-language-model-applications/
  2. https://github.com/microsoft/autogen


