By Revanth Reddy Tondapu

Part 5: Building a Conversational Q&A Chatbot with Gemini Pro API


Conversational Q&A Chatbot with Gemini Pro

Introduction

In this guide, we’ll walk you through creating a conversational Q&A chatbot using the Gemini Pro API. Our chatbot will not only respond to user queries but also maintain a history of the conversation. This feature is particularly useful for applications where context is important, and it allows us to review past interactions seamlessly.


Setting Up Your Environment

Before we get started, ensure you have the following prerequisites:

  • Python Version: Python 3.9 or higher.

  • API Key: Make sure you have your Gemini Pro API key ready. You can generate one from Google AI Studio, the Gemini API developer site.


Step 1: Create a Virtual Environment

Let's start by setting up a virtual environment. This will help manage our dependencies and keep our project organized.

conda create -p venv python=3.10
conda activate venv/

Step 2: Install Required Packages

Next, create a requirements.txt file with the following content:

streamlit
google-generativeai
python-dotenv

Install the packages by running:

pip install -r requirements.txt

Writing the Code

  • Load Environment Variables:

from dotenv import load_dotenv
import os

load_dotenv()
GOOGLE_API_KEY = os.getenv('GOOGLE_API_KEY')
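
The load_dotenv() call reads the key from a .env file in your project root. A minimal example of that file (the value shown is just a placeholder):

GOOGLE_API_KEY=your_api_key_here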
  • Import Necessary Libraries:

import streamlit as st 
import google.generativeai as genai
  • Configure the Gemini Pro API:

genai.configure(api_key=GOOGLE_API_KEY)
  • Define Function to Get Responses from Gemini Pro:

chat_model = genai.GenerativeModel(model_name="gemini-pro")
chat = chat_model.start_chat(history=[])

def get_gemini_response(question):
    # Send the question to the ongoing chat session and stream the reply
    response = chat.send_message(question, stream=True)
    return response
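
If you want to sanity-check this function before wiring up the UI, here is a quick sketch you could run in a plain Python session (the sample question is arbitrary):

# Quick standalone test of the streaming response (illustrative only)
reply = get_gemini_response("What is generative AI?")
for chunk in reply:
    print(chunk.text, end="")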
  • Initialize Streamlit App:

st.set_page_config(page_title="Q&A Chatbot")
st.header("Conversational Q&A Chatbot")

if 'chat_history' not in st.session_state:
    st.session_state['chat_history'] = []

input_text = st.text_input("Ask a question:", key='input')
submit_button = st.button("Submit")
  • Handle User Input and Display Response:

if submit_button and input_text:
    response = get_gemini_response(input_text)
    st.session_state['chat_history'].append(('You', input_text))
    st.subheader("Response:")
    bot_reply = ""
    for chunk in response:
        # Display each streamed chunk as it arrives and collect the full reply
        st.write(chunk.text)
        bot_reply += chunk.text
    st.session_state['chat_history'].append(('Bot', bot_reply))

st.subheader("Chat History:")
for role, message in st.session_state['chat_history']:
    st.write(f"{role}: {message}")

Running the Application

To run your Streamlit app, open your terminal and execute:

streamlit run your_script_name.py

Replace your_script_name.py with the actual name of your Python script.


Testing the Chatbot

Once the app is running, open the Streamlit interface in your web browser. Type a greeting or a question, and observe how the chatbot responds and updates the chat history in real-time. For example:

  1. User: Hi
     Bot: Hello, how can I assist you today?

  2. User: What is generative AI?
     Bot: Generative AI, also known as...


Conclusion

Congratulations! You've successfully built a conversational Q&A chatbot using the Gemini Pro API. This chatbot not only provides real-time responses but also maintains a detailed history of the conversation. This project showcases the power and versatility of generative AI in creating interactive applications.

Next Steps

In future projects, you can expand this chatbot’s capabilities by integrating it with a database to store conversation histories permanently or by adding additional functionalities like document Q&A using advanced embedding techniques.
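
As a starting point for the database idea, here is a minimal sketch that appends each exchange to a local SQLite file (the table name and schema are illustrative assumptions):

import sqlite3

# Illustrative helper: persist one (role, message) pair to a local database
def save_message(role, message, db_path="chat_history.db"):
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS chat_history (role TEXT, message TEXT)")
    conn.execute("INSERT INTO chat_history VALUES (?, ?)", (role, message))
    conn.commit()
    conn.close()

You could call save_message alongside the st.session_state appends in the Streamlit handler above.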

Stay tuned for more tutorials and keep experimenting with AI technologies!

Happy coding!
