semantic-kernel


Name: semantic-kernel
Version: 1.35.0
Summary: Semantic Kernel Python SDK
Upload time: 2025-07-16 00:33:47
Home page: none
Maintainer: none
Docs URL: none
Author: none
Requires Python: >=3.10
License: none
Keywords: none
Requirements: no requirements were recorded

# Get Started with Semantic Kernel Python

Highlights
- Flexible Agent Framework: build, orchestrate, and deploy AI agents and multi-agent systems
- Multi-Agent Systems: Model workflows and collaboration between AI specialists
- Plugin Ecosystem: Extend with Python, OpenAPI, Model Context Protocol (MCP), and more
- LLM Support: OpenAI, Azure OpenAI, Hugging Face, Mistral, Vertex AI, ONNX, Ollama, NVIDIA NIM, and others
- Vector DB Support: Azure AI Search, Elasticsearch, Chroma, and more
- Process Framework: Build structured business processes with workflow modeling
- Multimodal: Text, vision, audio

## Quick Install

```bash
pip install --upgrade semantic-kernel
# Optional: Add integrations
pip install --upgrade semantic-kernel[hugging_face]
pip install --upgrade semantic-kernel[all]
```

Supported Platforms:
- Python: 3.10+
- OS: Windows, macOS, Linux

## 1. Set Up API Keys

Set as environment variables, or create a .env file at your project root:

```bash
OPENAI_API_KEY=sk-...
OPENAI_CHAT_MODEL_ID=...
...
AZURE_OPENAI_API_KEY=...
AZURE_OPENAI_ENDPOINT=...
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME=...
...
```
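
If you use a `.env` file, one straightforward option is to load it into the process environment before constructing any services. A minimal sketch, assuming the third-party `python-dotenv` package is installed (it is not a semantic-kernel dependency):

```python
import os

# Assumption: python-dotenv is installed (pip install python-dotenv).
from dotenv import load_dotenv

load_dotenv()  # copies key/value pairs from .env into os.environ

# The SK service constructors can now read the values as ordinary env vars.
print("OPENAI_API_KEY set:", "OPENAI_API_KEY" in os.environ)
```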

You can also override environment variables by explicitly passing configuration parameters to the AI service constructor:

```python
chat_service = AzureChatCompletion(
    api_key=...,
    endpoint=...,
    deployment_name=...,
    api_version=...,
)
```

See the [setup guide](https://github.com/microsoft/semantic-kernel/tree/main/python/samples/concepts/setup) for more information.

## 2. Use the Kernel for Prompt Engineering

Create prompt functions and invoke them via the `Kernel`:

```python
import asyncio
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.functions import KernelArguments

kernel = Kernel()
kernel.add_service(OpenAIChatCompletion())

prompt = """
1) A robot may not injure a human being...
2) A robot must obey orders given it by human beings...
3) A robot must protect its own existence...

Give me the TLDR in exactly {{$num_words}} words."""


async def main():
    result = await kernel.invoke_prompt(prompt, arguments=KernelArguments(num_words=5))
    print(result)


asyncio.run(main())
# Output: Protect humans, obey, self-preserve, prioritized.
```
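
The same prompt can also be registered on the kernel as a named, reusable function. A minimal sketch continuing the example above (the plugin and function names are illustrative, and this assumes the prompt-registration form of `kernel.add_function`):

```python
# Continuing the example above: register the prompt as a reusable function.
tldr_function = kernel.add_function(
    plugin_name="summarize_plugin",  # illustrative name
    function_name="tldr",            # illustrative name
    prompt=prompt,
)


async def run_tldr():
    # Registered functions are invoked through the kernel with arguments.
    result = await kernel.invoke(tldr_function, KernelArguments(num_words=5))
    print(result)


asyncio.run(run_tldr())
```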

## 3. Directly Use AI Services (No Kernel Required)

You can use the AI service classes directly for advanced workflows:

```python
import asyncio

from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion, OpenAIChatPromptExecutionSettings
from semantic_kernel.contents import ChatHistory


async def main():
    service = OpenAIChatCompletion()
    settings = OpenAIChatPromptExecutionSettings()

    chat_history = ChatHistory(system_message="You are a helpful assistant.")
    chat_history.add_user_message("Write a haiku about Semantic Kernel.")
    response = await service.get_chat_message_content(chat_history=chat_history, settings=settings)
    print(response.content)

    """
    Output:

    Thoughts weave through context,  
    Semantic threads interlace—  
    Kernel sparks meaning.
    """


asyncio.run(main())
```
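
For incremental output, the chat services also expose a streaming variant. A minimal sketch, assuming `get_streaming_chat_message_content` is available on the service and yields partial message chunks:

```python
import asyncio

from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion, OpenAIChatPromptExecutionSettings
from semantic_kernel.contents import ChatHistory


async def stream():
    service = OpenAIChatCompletion()
    settings = OpenAIChatPromptExecutionSettings()

    chat_history = ChatHistory(system_message="You are a helpful assistant.")
    chat_history.add_user_message("Write a haiku about Semantic Kernel.")

    # Assumption: the streaming counterpart yields partial chunks as they arrive.
    async for chunk in service.get_streaming_chat_message_content(
        chat_history=chat_history, settings=settings
    ):
        print(str(chunk), end="", flush=True)
    print()


asyncio.run(stream())
```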

## 4. Build an Agent with Plugins and Tools

Enhance your agent with custom tools (Python functions registered as plugins) and structured output (Pydantic models):

```python
import asyncio
from typing import Annotated
from pydantic import BaseModel
from semantic_kernel.agents import ChatCompletionAgent
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion, OpenAIChatPromptExecutionSettings
from semantic_kernel.functions import kernel_function, KernelArguments

class MenuPlugin:
    @kernel_function(description="Provides a list of specials from the menu.")
    def get_specials(self) -> Annotated[str, "Returns the specials from the menu."]:
        return """
        Special Soup: Clam Chowder
        Special Salad: Cobb Salad
        Special Drink: Chai Tea
        """

    @kernel_function(description="Provides the price of the requested menu item.")
    def get_item_price(
        self, menu_item: Annotated[str, "The name of the menu item."]
    ) -> Annotated[str, "Returns the price of the menu item."]:
        return "$9.99"

class MenuItem(BaseModel):
    # Used for structured outputs
    price: float
    name: str

async def main():
    # Configure structured outputs format
    settings = OpenAIChatPromptExecutionSettings()
    settings.response_format = MenuItem

    # Create agent with plugin and settings
    agent = ChatCompletionAgent(
        service=AzureChatCompletion(),
        name="SK-Assistant",
        instructions="You are a helpful assistant.",
        plugins=[MenuPlugin()],
        arguments=KernelArguments(settings)
    )

    response = await agent.get_response("What is the price of the soup special?")
    print(response.content)

    # Output:
    # The price of the Clam Chowder, which is the soup special, is $9.99.

asyncio.run(main()) 
```
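
Because `response_format` is set to the `MenuItem` model, you can also ask the agent for a structured answer and validate the reply with Pydantic. A hedged sketch continuing the example above, assuming the model returns JSON conforming to the schema:

```python
    # Continuing inside main(): request a structured answer and validate it.
    structured = await agent.get_response("Return the soup special as a menu item.")
    item = MenuItem.model_validate_json(str(structured.content))
    print(item.name, item.price)
```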

You can explore additional getting started agent samples [here](https://github.com/microsoft/semantic-kernel/tree/main/python/samples/getting_started_with_agents).

## 5. Multi-Agent Orchestration

Coordinate a group of agents to iteratively solve a problem or refine content together:

```python
import asyncio
from semantic_kernel.agents import ChatCompletionAgent, GroupChatOrchestration, RoundRobinGroupChatManager
from semantic_kernel.agents.runtime import InProcessRuntime
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion

def get_agents():
    return [
        ChatCompletionAgent(
            name="Writer",
            instructions="You are a creative content writer. Generate and refine slogans based on feedback.",
            service=AzureChatCompletion(),
        ),
        ChatCompletionAgent(
            name="Reviewer",
            instructions="You are a critical reviewer. Provide detailed feedback on proposed slogans.",
            service=AzureChatCompletion(),
        ),
    ]

async def main():
    agents = get_agents()
    group_chat = GroupChatOrchestration(
        members=agents,
        manager=RoundRobinGroupChatManager(max_rounds=5),
    )
    runtime = InProcessRuntime()
    runtime.start()
    result = await group_chat.invoke(
        task="Create a slogan for a new electric SUV that is affordable and fun to drive.",
        runtime=runtime,
    )
    value = await result.get()
    print(f"Final Slogan: {value}")

    # Example Output:
    # Final Slogan: "Feel the Charge: Adventure Meets Affordability in Your New Electric SUV!"

    await runtime.stop_when_idle()

if __name__ == "__main__":
    asyncio.run(main())
```

For orchestration-focused examples, see [these orchestration samples](https://github.com/microsoft/semantic-kernel/tree/main/python/samples/getting_started_with_agents/multi_agent_orchestration).

## More Examples & Notebooks

- [Getting Started with Agents](https://github.com/microsoft/semantic-kernel/tree/main/python/samples/getting_started_with_agents): Practical agent orchestration and tool use  
- [Getting Started with Processes](https://github.com/microsoft/semantic-kernel/tree/main/python/samples/getting_started_with_processes): Modeling structured workflows with the Process framework  
- [Concept Samples](https://github.com/microsoft/semantic-kernel/tree/main/python/samples/concepts): Advanced scenarios, integrations, and SK patterns  
- [Getting Started Notebooks](https://github.com/microsoft/semantic-kernel/tree/main/python/samples/getting_started): Interactive Python notebooks for rapid experimentation  

## Semantic Kernel Documentation

- [Getting Started with Semantic Kernel Python](https://learn.microsoft.com/en-us/semantic-kernel/get-started/quick-start-guide?pivots=programming-language-python)  
- [Agent Framework Guide](https://learn.microsoft.com/en-us/semantic-kernel/frameworks/agent/?pivots=programming-language-python)  
- [Process Framework Guide](https://learn.microsoft.com/en-us/semantic-kernel/frameworks/process/process-framework)

            
