Revanth Reddy Tondapu

Creating a Chatbot Interface with Google Mesop: A Step-by-Step Guide

Updated: Jun 19


Google Mesop

Creating user interfaces for AI models can be daunting, especially for those who are not well-versed in frontend development. Fortunately, Google Mesop simplifies this process, allowing you to create sophisticated user interfaces with just a few lines of Python. In this blog post, we will walk through building a chatbot interface with Google Mesop and connecting it to several different AI backends: Groq, Ollama, Google Generative AI, and Llama3 via LangChain.


Setting Up the Environment

Before diving into the code, you need to set up your development environment. Follow these steps:

  • Create and Activate a Virtual Environment:

conda create -p ./venv python=3.10
conda activate ./venv
  • Add Required Packages to requirements.txt:

mesop
openai
google-generativeai
ollama
groq
llama-index
llama-index-llms-anthropic
langchain-community
python-dotenv
  • Install the Required Packages:

pip install -r requirements.txt
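
The examples in this post read API keys from a .env file using python-dotenv. Create a .env file in the project root with your keys; the variable names below match the ones the code reads later in this post, and the values shown are placeholders:

GROQ_API_KEY=your_groq_api_key
GEMINI_API_KEY=your_gemini_api_key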

Building the Chatbot Interface with Google Mesop

1. Using Groq

Let's start by creating a chatbot using Google Mesop and Groq. We'll begin by importing the necessary packages and setting up the basic structure.

import groq
import mesop as me
import mesop.labs as mel
from mesop import stateclass
from dotenv import load_dotenv
import os

# Load environment variables from .env file
load_dotenv()

# Initialize the Groq client
client = groq.Groq(api_key=os.environ["GROQ_API_KEY"])

@stateclass
class State:
    # Placeholder for per-user UI state (unused in this example)
    pass

@me.page(
    security_policy=me.SecurityPolicy(
        allowed_iframe_parents=["https://google.github.io"]
    ),
    path="/",
    title="Mesop Demo Chat",
)
def page():
    mel.chat(transform, title="Groq Chat", bot_user="Mesop Bot")

def transform(input: str, history: list[mel.ChatMessage]):
    # Construct the messages list with a system prompt and the chat history,
    # mapping Mesop's "bot" role to the API's "assistant" role
    messages = [{"role": "system", "content": "You are a helpful assistant."}]
    messages.extend(
        {"role": "assistant" if message.role == "bot" else "user", "content": message.content}
        for message in history
    )
    messages.append({"role": "user", "content": input})

    # Call the Groq API to generate a response
    stream = client.chat.completions.create(
        messages=messages,
        model="llama3-8b-8192",
        temperature=0.5,
        max_tokens=1024,
        top_p=1,
        stop=None,
        stream=True,
    )

    # Stream the response back to the user interface
    for chunk in stream:
        content = chunk.choices[0].delta.content
        if content:
            yield content


Running the Application

To run the application, execute the following command in your terminal:

mesop app.py

This will start a local server, and you can open the provided URL in your web browser to interact with the chatbot.


Integrating Other AI Models

2. Using Ollama

You can also integrate Ollama with Mesop. Here's how you can do it:

import ollama
import mesop as me
import mesop.labs as mel
from dotenv import load_dotenv

# Load environment variables from .env file
load_dotenv()

@me.page(
    security_policy=me.SecurityPolicy(
        allowed_iframe_parents=["https://google.github.io"]
    ),
    path="/",
    title="Mesop Demo Chat",
)
def page():
    mel.chat(transform, title="Ollama Chat", bot_user="Mesop Bot")

def transform(input: str, history: list[mel.ChatMessage]):
    # Rebuild the history, mapping Mesop's "bot" role to Ollama's "assistant" role
    messages = [
        {"role": "assistant" if message.role == "bot" else "user", "content": message.content}
        for message in history
    ]
    messages.append({"role": "user", "content": input})
    
    stream = ollama.chat(model='llama3', messages=messages, stream=True)
    
    for chunk in stream:
        content = chunk.get('message', {}).get('content', '')
        if content:
            yield content


Running the Application

To run the application, execute the following command in your terminal:

mesop app.py

This will start a local server, and you can open the provided URL in your web browser to interact with the chatbot.
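
Note that the ollama Python package talks to a locally running Ollama server, so make sure Ollama is installed and that the llama3 model has been pulled before starting the app:

ollama pull llama3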


3. Using Google Generative AI

To integrate Google Generative AI, follow this example:

import google.generativeai as genai
import mesop as me
import mesop.labs as mel
import os
from dotenv import load_dotenv

# Load environment variables from .env file
load_dotenv()

genai.configure(api_key=os.environ["GEMINI_API_KEY"])

@me.page(
    security_policy=me.SecurityPolicy(
        allowed_iframe_parents=["https://google.github.io"]
    ),
    path="/",
    title="Mesop Demo Chat",
)
def page():
    mel.chat(transform, title="Gemini Chat", bot_user="Mesop Bot")

generation_config = {
    "temperature": 0.5,
    "top_p": 0.90,
    "top_k": 65,
    "max_output_tokens": 8192,
    "response_mime_type": "text/plain",
}

model = genai.GenerativeModel(
    model_name="gemini-1.5-flash",
    generation_config=generation_config,
    system_instruction="You are a helpful assistant, you provide helpful answers."
)

def transform(input: str, history: list[mel.ChatMessage]):
    # Flatten the chat history and the new input into a single prompt string
    chat_history = "\n".join(message.content for message in history)
    full_input = f"{chat_history}\n{input}"
    response = model.generate_content(full_input, stream=True)
    for chunk in response:
        yield chunk.text
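
The transform above flattens the conversation into a single block of text, so Gemini cannot tell user turns from bot turns. If you want to preserve roles, the SDK's chat interface can be used instead. Here is a minimal sketch, assuming the same model object as above and that Mesop marks bot messages with the role "bot":

def transform(input: str, history: list[mel.ChatMessage]):
    # Rebuild the conversation with explicit roles; Gemini calls the bot role "model"
    gemini_history = [
        {"role": "model" if message.role == "bot" else "user", "parts": [message.content]}
        for message in history
    ]
    chat = model.start_chat(history=gemini_history)
    for chunk in chat.send_message(input, stream=True):
        yield chunk.text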

Running the Application

To run the application, execute the following command in your terminal:

mesop app.py

This will start a local server, and you can open the provided URL in your web browser to interact with the chatbot.


4. Using Llama3 via LangChain

To integrate Llama3 through LangChain's Ollama wrapper, follow this example:

import mesop as me
import mesop.labs as mel
from langchain_community.llms import Ollama
from mesop import stateclass
from dotenv import load_dotenv

# Load environment variables from .env file
load_dotenv()

# Initialize the Llama3 model
llm = Ollama(model="llama3")

@stateclass
class State:
    # Placeholder for per-user UI state (unused in this example)
    pass

@me.page(
    security_policy=me.SecurityPolicy(
        allowed_iframe_parents=["https://google.github.io"]
    ),
    path="/",
    title="Mesop Demo Chat",
)
def page():
    mel.chat(transform, title="LangChain Chat", bot_user="Mesop Bot")

def transform(input: str, history: list[mel.ChatMessage]):
    # Construct a plain-text prompt from the system instruction and the chat history
    messages = ["System: You are a helpful assistant."]
    messages.extend(
        f"Assistant: {message.content}" if message.role == "bot" else f"User: {message.content}"
        for message in history
    )
    messages.append(f"User: {input}")

    query = "\n".join(messages)
    # Query the Llama3 model and stream the response
    resp = llm.stream(query)
    
    for chunk in resp:
        if chunk:
            yield chunk

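
The string-based prompt above relies on the model parsing the "User:" and "Assistant:" prefixes itself. As an alternative, LangChain's chat wrapper accepts typed message objects that preserve roles explicitly. Here is a minimal sketch, assuming langchain-community's ChatOllama pointed at the same local llama3 model:

from langchain_community.chat_models import ChatOllama
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage

chat_llm = ChatOllama(model="llama3")

def transform(input: str, history: list[mel.ChatMessage]):
    # Typed messages keep speaker roles intact instead of joining everything into one string
    messages = [SystemMessage(content="You are a helpful assistant.")]
    for message in history:
        messages.append(
            AIMessage(content=message.content)
            if message.role == "bot"
            else HumanMessage(content=message.content)
        )
    messages.append(HumanMessage(content=input))
    for chunk in chat_llm.stream(messages):
        if chunk.content:
            yield chunk.content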

Running the Application

To run the application, execute the following command in your terminal:

mesop app.py

This will start a local server, and you can open the provided URL in your web browser to interact with the chatbot.


Conclusion

Creating a chatbot with Google Mesop is straightforward and requires minimal code. By leveraging AI models and Mesop's intuitive interface, you can build powerful applications quickly. Whether you're using Groq, Ollama, Google Generative AI, or any other model, the process remains largely the same, making it easy to switch between different AI solutions.

Feel free to experiment with different models and configurations to see what works best for your needs. I hope you found this guide helpful; if you have any questions or run into issues, feel free to leave a comment below. Happy coding!
