# Common AI Core
A flexible Python framework for building AI chat applications with support for multiple LLM providers.
## Features
- 🤖 Support for multiple LLM providers:
  - OpenAI (GPT-4o-mini, GPT-4o, GPT-3.5) - included by default
  - Anthropic (Claude) - optional
  - Google Gemini - optional
  - DeepSeek - optional
  - Llama (local models) - optional
- 💾 Flexible memory management:
  - Token-based memory limits
  - Prompt-based memory limits
  - System prompt preservation
- 🔄 Multiple chat modes:
  - Streaming responses
  - Completion responses
- 📊 Token counting and cost estimation
- 🎨 Pretty-printed chat history
- 🔍 Content parsing utilities:
  - JSON structure extraction from LLM outputs
  - Python code parsing
## Installation
```bash
# Base installation (core framework, OpenAI provider ready to use)
pip install common-ai-core

# Add support for Anthropic's Claude (requires the anthropic package)
pip install "common-ai-core[anthropic]"

# Add support for Google's Gemini (requires the google-generativeai package)
pip install "common-ai-core[gemini]"

# Add support for DeepSeek (uses the OpenAI client, no extra package needed)
pip install "common-ai-core[deepseek]"

# Install with all cloud providers (OpenAI, Anthropic, Gemini, DeepSeek)
pip install "common-ai-core[all-cloud]"

# Add support for local Llama models (requires the llama-cpp-python package)
pip install "common-ai-core[llama]"

# Install with all providers, including Llama
pip install "common-ai-core[all]"

# Development installation (includes testing tools)
pip install "common-ai-core[dev]"
```
## Quick Start
```python
from common_ai_core import ProviderBuilder, ProviderType, SystemTokenLimitedMemory, CompletionChat

# Create a provider (using OpenAI by default)
provider = ProviderBuilder(ProviderType.OPENAI).build()

# Create memory with a system prompt
memory = SystemTokenLimitedMemory.from_provider(
    provider=provider,
    system_prompt="You are a helpful assistant.",
    max_tokens=1000
)

# Create the chat interface
chatbot = CompletionChat(provider, memory)

# Chat!
response = chatbot.chat("Tell me about Python!")
print(response)
```
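The example above uses the completion mode. The framework also advertises streaming responses; below is a minimal sketch of what streaming usage might look like, assuming a `StreamingChat` class that takes the same constructor arguments as `CompletionChat` and yields response chunks. The class name and iteration protocol are assumptions, not confirmed API:

```python
from common_ai_core import StreamingChat  # assumed export, mirroring CompletionChat

# Reuses the provider and memory created in the Quick Start above.
chatbot = StreamingChat(provider, memory)

# Assumption: in streaming mode, chat() yields chunks instead of one string.
for chunk in chatbot.chat("Tell me about Python!"):
    print(chunk, end="", flush=True)
```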
## Memory Types
- `TokenLimitedMemory`: Limits conversation by token count
- `PromptLimitedMemory`: Limits conversation by number of exchanges
- `SystemTokenLimitedMemory`: Token-limited with preserved system prompt
- `SystemPromptLimitedMemory`: Prompt-limited with preserved system prompt
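Each type follows the construction pattern shown in the Quick Start. A minimal sketch, assuming the other memory classes are also exported from the package root and expose the same `from_provider` factory as `SystemTokenLimitedMemory` (the `max_prompts` parameter name is an assumption, not confirmed API):

```python
from common_ai_core import (
    ProviderBuilder,
    ProviderType,
    TokenLimitedMemory,
    PromptLimitedMemory,
)

provider = ProviderBuilder(ProviderType.OPENAI).build()

# Limit the conversation history by token count.
token_memory = TokenLimitedMemory.from_provider(provider=provider, max_tokens=2000)

# Limit the conversation history by number of exchanges.
# Assumption: the limit parameter is named max_prompts.
prompt_memory = PromptLimitedMemory.from_provider(provider=provider, max_prompts=10)
```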
## Providers
- **OpenAI** (included by default)
  - Supports GPT-4o-mini (default), GPT-4o, and GPT-3.5 models
  - Includes token counting
  - Streaming support
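  As in the Quick Start, the provider is created with the builder:
  ```python
  provider = ProviderBuilder(ProviderType.OPENAI).build()
  ```
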
- **Anthropic** (optional)
  - Supports Claude models
  - Install with: `pip install "common-ai-core[anthropic]"`
  ```python
  provider = ProviderBuilder(ProviderType.ANTHROPIC).build()
  ```

- **Llama** (optional)
  - Supports local models
  - Install with: `pip install "common-ai-core[llama]"`
  ```python
  provider = (ProviderBuilder(ProviderType.LLAMA)
              .set_model_path("path/to/model.gguf")
              .build())
  ```

- **DeepSeek** (optional)
  - Supports DeepSeek models, including reasoning models
  - Install with: `pip install "common-ai-core[deepseek]"`
  ```python
  provider = ProviderBuilder(ProviderType.DEEPSEEK).build()
  ```

- **Gemini** (optional)
  - Supports Google's Gemini models
  - Install with: `pip install "common-ai-core[gemini]"`
  ```python
  provider = ProviderBuilder(ProviderType.GEMINI).build()
  ```
## Error Handling
Common AI Core provides clear error messages when optional dependencies are missing:
```python
from common_ai_core import ProviderBuilder, ProviderType

try:
    # This will work if you have openai installed
    provider = ProviderBuilder(ProviderType.OPENAI).build()
    print("OpenAI provider created successfully!")
except Exception as e:
    print(f"Error: {e}")

try:
    # This will fail with a clear message if anthropic is not installed
    provider = ProviderBuilder(ProviderType.ANTHROPIC).build()
except Exception as e:
    print(f"Error: {e}")
    # Output: Error: Anthropic package not installed: No module named 'anthropic'
    # Solution: pip install "common-ai-core[anthropic]"
```
## Parsers
Common AI Core includes utilities for parsing and extracting structured content from LLM outputs:
### JSON Parser
Extract valid JSON structures from LLM text outputs:
```python
from common_ai_core.parsers.json_parser import JsonParser

# Extract JSON from LLM output
llm_output = """This is some text with embedded JSON:
{"key": "value", "nested": {"data": 123}}
and more text after."""

parser = JsonParser(llm_output)
json_objects = parser.extract_json_structures()

# Process the extracted JSON objects
for json_obj in json_objects:
    print(json_obj)  # {'key': 'value', 'nested': {'data': 123}}
```
The JSON parser can extract JSON objects even when they're embedded in markdown code blocks or surrounded by other text.
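For the markdown case, a short example using the same API; the input string is assembled programmatically only to avoid nesting backtick fences inside this README:

```python
from common_ai_core.parsers.json_parser import JsonParser

# Build an LLM-style reply that wraps JSON in a markdown code fence.
fence = "`" * 3
llm_output = (
    "Here is the result:\n"
    f"{fence}json\n"
    '{"status": "ok", "items": [1, 2, 3]}\n'
    f"{fence}\n"
    "Let me know if you need anything else."
)

for json_obj in JsonParser(llm_output).extract_json_structures():
    print(json_obj)  # {'status': 'ok', 'items': [1, 2, 3]}
```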
## Development
```bash
# Clone the repository
git clone https://github.com/commonai/common-ai-core.git
cd common-ai-core

# Install development dependencies
pip install -e ".[dev]"

# Run tests
pytest
```
## License
This project is licensed under the MIT License - see the LICENSE file for details.