# 🤖 cnoe-agent-utils
[PyPI](https://pypi.org/project/cnoe-agent-utils/)
[PyPI Build](https://github.com/cnoe-io/cnoe-agent-utils/actions/workflows/pypi.yml)
[Conventional Commits](https://github.com/cnoe-io/cnoe-agent-utils/actions/workflows/conventional_commits.yml)
**cnoe-agent-utils** is an open-source Python library providing utility functions and abstractions for building agent-based systems, including LLM (Large Language Model) factories and integrations.
---
## ✨ Features
- 🏭 **LLM Factory** for easy model instantiation across:
- ☁️ AWS
- ☁️ Azure
- ☁️ GCP Vertex
- 🤖 Google Gemini
- 🤖 Anthropic Claude
- 🤖 OpenAI
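The factory pattern behind this feature can be sketched in plain Python. The names below (`LLMFactory`, `register`, `get_llm`, the `LLM_PROVIDER` variable) are illustrative assumptions, not the library's actual API; they only show the idea of mapping a provider name to a provider-specific builder:

```python
import os

class LLMFactory:
    """Illustrative sketch: map a provider name to a client builder."""

    _builders = {}

    @classmethod
    def register(cls, provider):
        def decorator(builder):
            cls._builders[provider] = builder
            return builder
        return decorator

    @classmethod
    def get_llm(cls, provider=None):
        # Fall back to an environment variable when no provider is given.
        provider = provider or os.environ.get("LLM_PROVIDER", "openai")
        try:
            return cls._builders[provider]()
        except KeyError:
            raise ValueError(f"Unknown provider: {provider!r}")

@LLMFactory.register("openai")
def _build_openai():
    # A real builder would return a configured client; a dict stands in here.
    return {"provider": "openai",
            "model": os.environ.get("OPENAI_MODEL_NAME", "gpt-4.1")}

llm = LLMFactory.get_llm("openai")
```

Centralizing construction this way keeps provider-specific configuration (credentials, model names) out of agent code.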
---
## 🚀 Getting Started
### 🛡️ Create and Activate a Virtual Environment
It is recommended to use a virtual environment to manage dependencies:
```bash
python3 -m venv .venv
source .venv/bin/activate
```
### ⚡ Prerequisite: Install `uv`
Before running the examples, install [`uv`](https://github.com/astral-sh/uv):
```bash
pip install uv
```
### 📦 Installation
```bash
pip install cnoe-agent-utils
```
Or, if you are developing locally:
```bash
git clone https://github.com/cnoe-io/cnoe-agent-utils.git
cd cnoe-agent-utils
poetry install
poetry build
```
---
## 🧑‍💻 Usage
To test integration with different LLM providers, configure the required environment variables for each provider as shown below. Then, run the corresponding example script using `uv`.
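Each example fails at runtime if its provider's variables are unset, so a small pre-flight check can catch that early. The helper below is illustrative and not part of the library:

```python
import os

def check_env(required):
    """Return the names of required environment variables that are unset."""
    return [name for name in required if not os.environ.get(name)]

# e.g. before running examples/test_anthropic.py:
missing = check_env(["ANTHROPIC_API_KEY", "ANTHROPIC_MODEL_NAME"])
print("missing:", missing)
```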
---
### 🤖 Anthropic
Set the following environment variables:
```bash
export ANTHROPIC_API_KEY=<your_anthropic_api_key>
export ANTHROPIC_MODEL_NAME=<model_name>
```
Run the example:
```bash
uv run examples/test_anthropic.py
```
---
### ☁️ AWS Bedrock (Anthropic Claude)
Set the following environment variables:
```bash
export AWS_PROFILE=<your_aws_profile>
export AWS_REGION=<your_aws_region>
export AWS_BEDROCK_MODEL_ID="us.anthropic.claude-3-7-sonnet-20250219-v1:0"
export AWS_BEDROCK_PROVIDER="anthropic"
```
Run the example:
```bash
uv run examples/test_aws_bedrock_claude.py
```
---
### ☁️ Azure OpenAI
Set the following environment variables:
```bash
export AZURE_OPENAI_API_KEY=<your_azure_openai_api_key>
export AZURE_OPENAI_API_VERSION=<api_version>
export AZURE_OPENAI_DEPLOYMENT=gpt-4.1
export AZURE_OPENAI_ENDPOINT=<your_azure_openai_endpoint>
```
Run the example:
```bash
uv run examples/test_azure_openai.py
```
---
### 🤖 OpenAI
Set the following environment variables:
```bash
export OPENAI_API_KEY=<your_openai_api_key>
export OPENAI_ENDPOINT=https://api.openai.com/v1
export OPENAI_MODEL_NAME=gpt-4.1
```
Run the example:
```bash
uv run examples/test_openai.py
```
---
### 🤖 Google Gemini
Set the following environment variable:
```bash
export GOOGLE_API_KEY=<your_google_api_key>
```
Run the example:
```bash
uv run examples/test_google_gemini.py
```
---
### ☁️ GCP Vertex AI
Set the following environment variables:
```bash
export GOOGLE_APPLICATION_CREDENTIALS=~/.config/gcp.json
export VERTEXAI_MODEL_NAME="gemini-2.0-flash-001"
```
Run the example:
```bash
uv run examples/test_gcp_vertexai.py
```
These examples demonstrate how to use the LLM Factory and the other utilities provided by the library.
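Every provider section above follows the same pattern: set environment variables, then run an example script with `uv`. That mapping can be captured as a lookup table (a sketch for reference; the variable and script names come from the sections above, but the table itself is not part of the library):

```python
# Provider -> (required environment variables, example script), as documented above.
PROVIDERS = {
    "anthropic": (["ANTHROPIC_API_KEY", "ANTHROPIC_MODEL_NAME"],
                  "examples/test_anthropic.py"),
    "aws-bedrock": (["AWS_PROFILE", "AWS_REGION",
                     "AWS_BEDROCK_MODEL_ID", "AWS_BEDROCK_PROVIDER"],
                    "examples/test_aws_bedrock_claude.py"),
    "azure-openai": (["AZURE_OPENAI_API_KEY", "AZURE_OPENAI_API_VERSION",
                      "AZURE_OPENAI_DEPLOYMENT", "AZURE_OPENAI_ENDPOINT"],
                     "examples/test_azure_openai.py"),
    "openai": (["OPENAI_API_KEY", "OPENAI_ENDPOINT", "OPENAI_MODEL_NAME"],
               "examples/test_openai.py"),
    "google-gemini": (["GOOGLE_API_KEY"], "examples/test_google_gemini.py"),
    "gcp-vertexai": (["GOOGLE_APPLICATION_CREDENTIALS", "VERTEXAI_MODEL_NAME"],
                     "examples/test_gcp_vertexai.py"),
}

def example_command(provider):
    """Build the `uv run` command line for a provider's example script."""
    _, script = PROVIDERS[provider]
    return f"uv run {script}"
```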
---
## 📜 License
Apache 2.0 (see [LICENSE](./LICENSE))
---
## 👥 Maintainers
See [MAINTAINERS.md](MAINTAINERS.md)
Contributions are welcome via pull requests or issues!