
Integrating Semantic Kernel with Your Python Application


Semantic Kernel

Hello everyone! In this blog post, we'll explore an exciting project called Semantic Kernel. This toolkit lets developers integrate large language models (LLMs) from providers like OpenAI, Azure OpenAI, and Hugging Face into their applications. Semantic Kernel is essentially a software development kit (SDK) that enables you to define plugins and chain them together with just a few lines of code, making it easier to build AI-powered applications.


What is Semantic Kernel?

Semantic Kernel is a toolkit designed to integrate LLMs with conventional programming languages like Python, C#, and Java. It provides a way to define plugins that can be orchestrated by AI, allowing you to create powerful applications with minimal effort. One of the standout features of Semantic Kernel is its ability to automatically generate and execute plans based on user goals.

In this post, we'll walk you through the steps to install Semantic Kernel on your local system and integrate it into a Python application using the OpenAI API. Let's get started!


Setting Up Your Environment

Step 1: Clone the Semantic Kernel Repository

First, clone the Semantic Kernel repository to your local system:

git clone https://github.com/microsoft/semantic-kernel
cd semantic-kernel

Step 2: Configure Your Environment

Next, create a .env file in your project directory to store your API keys. This file will hold your OpenAI or Azure OpenAI credentials.

touch .env

Open the .env file and add your API keys:

OPENAI_API_KEY=your_openai_api_key
AZURE_OPENAI_API_KEY=your_azure_openai_api_key

Replace your_openai_api_key and your_azure_openai_api_key with your actual API keys.
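As a quick sanity check before moving on, you can verify that the required keys are actually set. This is a minimal sketch (not part of the official setup); note that `os.getenv` only sees the `.env` values after they have been loaded with python-dotenv, as the script later in this post does:

```python
import os

# Keys this tutorial expects; add "AZURE_OPENAI_API_KEY" if you use Azure
REQUIRED_KEYS = ["OPENAI_API_KEY"]

def missing_keys(required=REQUIRED_KEYS):
    """Return the names of required environment variables that are unset or empty."""
    return [name for name in required if not os.getenv(name)]

missing = missing_keys()
if missing:
    print("Missing environment variables:", ", ".join(missing))
```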


Step 3: Install Dependencies

Create a virtual environment and install the necessary dependencies:

python -m venv venv
source venv/bin/activate  # On Windows use `venv\Scripts\activate`
pip install -r requirements.txt

Step 4: Install Semantic Kernel

Install Semantic Kernel using the following command:

pip install semantic-kernel
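To confirm the package installed correctly, a quick check like the following can help (purely illustrative, not a required step):

```python
import importlib.util

def is_installed(package: str) -> bool:
    """Return True if the given top-level package can be imported."""
    return importlib.util.find_spec(package) is not None

print(is_installed("semantic_kernel"))
```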

Writing the Python Code

Now that we have our environment set up, let's write the code to integrate Semantic Kernel with our Python application.

Step 5: Write the Code

Create a new Python file, hello_world.py, and add the following code:

import asyncio
import os

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.prompt_template import PromptTemplateConfig

# Load environment variables from the .env file
from dotenv import load_dotenv, find_dotenv
_ = load_dotenv(find_dotenv())

# Initialize the Semantic Kernel
kernel = Kernel()

# Register the OpenAI chat service using the API key from the `.env` file
service_id = "chat-gpt"
openai_api_key = os.getenv("OPENAI_API_KEY")
kernel.add_service(
    OpenAIChatCompletion(
        service_id=service_id,
        ai_model_id="gpt-3.5-turbo",
        api_key=openai_api_key,
    )
)

# Define the request settings
req_settings = kernel.get_prompt_execution_settings_from_service_id(service_id)
req_settings.max_tokens = 2000
req_settings.temperature = 0.7
req_settings.top_p = 0.8

# Define the prompt
prompt = """
1) A robot may not injure a human being or, through inaction,
allow a human being to come to harm.

2) A robot must obey orders given it by human beings except where
such orders would conflict with the First Law.

3) A robot must protect its own existence as long as protection
does not conflict with the First or Second Law.

Give me the TLDR in exactly 5 words.
"""

prompt_template_config = PromptTemplateConfig(
    template=prompt,
    name="tldr",
    template_format="semantic-kernel",
    execution_settings=req_settings,
)

# Add the function to the kernel
function = kernel.add_function(
    function_name="tldr_function",
    plugin_name="tldr_plugin",
    prompt_template_config=prompt_template_config,
)

# Run the prompt
# Note: functions are run asynchronously
async def main():
    result = await kernel.invoke(function)
    print(result)  # e.g. "Robots must not harm humans."

if __name__ == "__main__":
    asyncio.run(main())

Explanation of the Code

  1. Import Libraries: We start by importing the necessary libraries and modules.

  2. Load Environment Variables: We use dotenv to load our API keys from the .env file.

  3. Initialize Semantic Kernel: We create an instance of the Kernel class.

  4. Add OpenAI Service: We add the OpenAI service using the API key.

  5. Set Request Settings: We define the request settings, such as the maximum number of tokens, temperature, and top-p.

  6. Define the Prompt: We create a prompt template that the model will use to generate responses.

  7. Add Function: We add a function to the kernel that uses the prompt template.

  8. Run the Prompt: We run the prompt asynchronously and print the result.
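To make step 6 a little more concrete: the semantic-kernel template format also supports variables such as `{{$input}}` embedded in a prompt (our example happens to use a fixed prompt). The following plain-Python sketch is a toy stand-in for that substitution, not the library's actual renderer:

```python
import re

def render(template: str, variables: dict) -> str:
    """Toy stand-in for semantic-kernel's {{$var}} template substitution."""
    return re.sub(
        r"\{\{\$(\w+)\}\}",                          # match {{$name}}
        lambda m: str(variables.get(m.group(1), "")),  # replace with the variable's value
        template,
    )

print(render("Give me the TLDR of: {{$input}}", {"input": "the three laws"}))
# prints "Give me the TLDR of: the three laws"
```

With the real library, you would pass such variables to `kernel.invoke` as kernel arguments instead of rendering them yourself.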

Running the Code

Finally, run the Python script:

python hello_world.py

You should see the output generated by the model, which in this case is a summary in exactly five words.


Conclusion

In this post, we explored how to set up and integrate Semantic Kernel with a Python application using the OpenAI API. Semantic Kernel makes it incredibly easy to add powerful language models to your applications, allowing you to create AI-powered features with minimal effort.

Feel free to try out the code and let us know how it works for you. If you have any questions or run into any issues, drop a comment below. Happy coding!

I hope you found this guide helpful. If you enjoyed the content, consider subscribing to the blog and sharing it with your network. Thanks for reading!
