Revanth Reddy Tondapu

Running AI Agents Locally with OpenAI Swarm and Ollama LM Studio




Artificial Intelligence (AI) is transforming the way we tackle complex tasks, making it easier to automate repetitive processes and increase productivity. One of the latest innovations in this field is the OpenAI Swarm framework, which streamlines the creation and coordination of AI agents. These agents are independent AI systems that can work together to solve complex problems. In this blog post, we'll walk you through setting up and running OpenAI Swarm locally using Ollama LM Studio, ensuring your AI activities remain private and secure.


Understanding OpenAI Swarm

OpenAI Swarm is a powerful framework designed to make it easier to create and manage AI agents. These agents can independently or collaboratively perform tasks such as conducting research or editing documents. The framework quickly gained popularity due to its ease of use and robust capabilities, becoming a favorite among AI enthusiasts.
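
At its core, the framework revolves around two primitives: an Agent (a name, instructions, and optional tool functions) and a client whose run method drives the conversation. The snippet below is a minimal sketch of that API based on the Swarm repository's examples; it assumes the local Llama model configured in the setup steps further down (otherwise Swarm falls back to OpenAI's hosted models).

from swarm import Swarm, Agent

# Minimal sketch: one agent, one prompt, one reply.
client = Swarm()

assistant = Agent(
    name="Assistant",
    instructions="You are a concise, helpful assistant.",
    model="llama3.2",  # assumes the local model set up in the steps below
)

response = client.run(
    agent=assistant,
    messages=[{"role": "user", "content": "In one sentence, what is an AI agent?"}],
)
print(response.messages[-1]["content"])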


Setting Up OpenAI Swarm Locally

Running OpenAI Swarm locally on your computer allows you to maintain privacy and control over your AI processes. Here’s a comprehensive guide on how to set it up using Ollama LM Studio:


Step 1: Install Required Packages

Begin by installing the OpenAI Swarm package along with the DuckDuckGo search tool for internet queries. Open your terminal and execute the following command:

pip install git+https://github.com/openai/swarm.git duckduckgo-search
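
If the installation succeeded, both packages should import cleanly. Here is a quick optional sanity check (a minimal sketch, run from a Python shell):

# Optional sanity check: both packages should import without errors.
from swarm import Swarm, Agent
from duckduckgo_search import DDGS

print("swarm and duckduckgo-search are available.")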

Step 2: Configure the Llama Model

Download the Llama 3.2 model using Ollama, which will serve as the core model for your AI agents:

ollama pull llama3.2
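
Before wiring the model into Swarm, you can verify that Ollama is actually serving it. The snippet below is a sketch that queries Ollama's OpenAI-compatible endpoint directly, assuming the default address http://localhost:11434 and the openai Python package (installed alongside Swarm):

# Optional check: send one prompt to the local Ollama server via its
# OpenAI-compatible API. The API key is a placeholder; Ollama ignores it.
from openai import OpenAI

local_client = OpenAI(base_url="http://localhost:11434/v1", api_key="fake-key")
reply = local_client.chat.completions.create(
    model="llama3.2",
    messages=[{"role": "user", "content": "Reply with one short sentence."}],
)
print(reply.choices[0].message.content)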

Set environment variables so the OpenAI client used by Swarm points at your local Ollama server (the API key only needs to be a non-empty placeholder):

export OPENAI_API_KEY=fake-key
export OPENAI_MODEL_NAME=llama3.2
export OPENAI_BASE_URL=http://localhost:11434/v1
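
If you prefer not to rely on environment variables, Swarm's constructor also accepts a pre-configured client. The sketch below passes an OpenAI client pointed at the local Ollama endpoint; the variable names are illustrative:

# Alternative sketch: configure the local endpoint in code and hand the
# client to Swarm directly, instead of using environment variables.
from openai import OpenAI
from swarm import Swarm

ollama_client = OpenAI(base_url="http://localhost:11434/v1", api_key="fake-key")
client = Swarm(client=ollama_client)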

Step 3: Create AI Agents

With Python, you can define and execute AI agents. Here’s a basic setup for two agents: a news search agent and an editor agent.

  1. News Agent: This agent uses DuckDuckGo to fetch the latest news articles on a specified topic.

from duckduckgo_search import DDGS
from swarm import Swarm, Agent
from datetime import datetime

# Year-month string used to bias the search toward recent results.
current_date = datetime.now().strftime("%Y-%m")
client = Swarm()

def get_news_articles(topic):
    """Search DuckDuckGo for recent articles on the topic and return a formatted summary."""
    print(f"Running DuckDuckGo news search for {topic}...")
    ddg_api = DDGS()
    results = ddg_api.text(f"{topic} {current_date}", max_results=5)
    if results:
        news_results = "\n\n".join([f"Title: {result['title']}\nURL: {result['href']}\nDescription: {result['body']}" for result in results])
        return news_results
    else:
        return f"Could not find news results for {topic}."

# Agent that calls get_news_articles as a tool to fetch current headlines.
news_agent = Agent(
    name="News Assistant",
    instructions="You provide the latest news articles for a given topic using DuckDuckGo search.",
    functions=[get_news_articles],
    model="llama3.2"
)
  2. Editor Agent: This agent refines and formats the news articles for publication.

# Agent that polishes the raw search results into publish-ready copy.
editor_agent = Agent(
    name="Editor Assistant",
    instructions="Rewrite the provided content as a news article ready for publishing, with each news story in its own section.",
    model="llama3.2"
)

Step 4: Develop a Workflow

Integrate the agents into a workflow that automates the process of fetching and editing news articles:

def run_news_workflow(topic):
    print("Running news Agent workflow...")
    
    # Step 1: Fetch news
    news_response = client.run(
        agent=news_agent,
        messages=[{"role": "user", "content": f"Get me the news about {topic} on {current_date}"}],
    )
    
    raw_news = news_response.messages[-1]["content"]
    
    # Step 2: Pass news to editor for final review
    edited_news_response = client.run(
        agent=editor_agent,
        messages=[{"role": "user", "content": raw_news }],
    )
    
    return edited_news_response.messages[-1]["content"]

# Example of running the news workflow for a given topic
print(run_news_workflow("AI"))
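
The workflow above chains two client.run calls explicitly, which keeps the control flow easy to follow. Swarm also supports handoffs, where a tool function that returns an Agent transfers the conversation to that agent. The snippet below is a sketch of that pattern; the transfer_to_editor helper is illustrative and not part of the original setup, and whether the local model reliably triggers the handoff depends on its tool-calling ability:

# Sketch of Swarm's handoff pattern: a tool function that returns an Agent
# transfers control to that agent.
def transfer_to_editor():
    return editor_agent

news_agent_with_handoff = Agent(
    name="News Assistant",
    instructions="Fetch the latest news with get_news_articles, then hand off to the Editor Assistant.",
    functions=[get_news_articles, transfer_to_editor],
    model="llama3.2"
)

response = client.run(
    agent=news_agent_with_handoff,
    messages=[{"role": "user", "content": "Get and edit today's AI news."}],
)
print(response.messages[-1]["content"])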

Running the Workflow

To execute your AI agents and workflow, save the script above as app.py and run it from your terminal:

python app.py

This command will initiate the news search and editing process, providing you with a summarized news article based on your specified topic.


Conclusion

Using OpenAI Swarm locally with Ollama LM Studio allows you to harness advanced AI capabilities while maintaining control and privacy. This setup is particularly useful for tasks that require automation and intelligence, such as internet research and content creation. By following the steps outlined above, you can easily deploy AI agents on your local machine and explore the vast potential of AI in a secure environment.
