# Cogency
[PyPI](https://badge.fury.io/py/cogency)
[License: Apache 2.0](https://opensource.org/licenses/Apache-2.0)
[Python 3.10+](https://www.python.org/downloads/)
**Context-driven agents that work out of the box.**
```python
from cogency import Agent
agent = Agent()
result = await agent("Search for Python best practices and summarize")
```
**Zero ceremony. Maximum capability.**
- **🌐 Web-enabled** - Search and scrape with zero configuration
- **🔌 Multi-provider** - OpenAI, Anthropic, Gemini support
- **🛠️ Tool orchestration** - Files, shell, web tools auto-compose
- **🧠 Context injection** - Automatic assembly of relevant information
- **⚡️ Streaming** - Event-coordinated ReAct reasoning
## Quick Start
```python
import asyncio
from cogency import Agent
async def main():
    agent = Agent()
    response = await agent("What are the benefits of async/await in Python?")
    print(response)

# Run with: python your_script.py
asyncio.run(main())
```
## Installation
```bash
pip install cogency
```
Set your API key:
```bash
export OPENAI_API_KEY="your-api-key-here"
```
## Examples
### Basic Agent
```python
from cogency import Agent
agent = Agent()
response = await agent("Explain quantum computing in simple terms")
```
### Agent with Tools
```python
from cogency import Agent, BASIC_TOOLS
from cogency.tools import Search, Scrape
# Web-enabled agent
agent = Agent(tools=[Search(), Scrape()])
result = await agent("Search for Python best practices and summarize key points")
# All basic tools (Files, Shell)
agent = Agent(tools=BASIC_TOOLS)
result = await agent("Create a Python script that calculates factorial of 10")
```
### User-Specific Context
```python
from cogency import Agent, profile
# Set user preferences (optional)
profile("alice",
name="Alice Johnson",
preferences=["Python", "Machine Learning"],
context="Senior data scientist working on NLP projects")
agent = Agent()
response = await agent("Recommend a good ML library for text processing", user_id="alice")
```
### Custom Knowledge Base
```python
from cogency.storage import add_document
# Add documents to knowledge base (optional)
add_document("python_guide", "Python is a high-level programming language...")
add_document("ml_basics", "Machine learning is a subset of artificial intelligence...")
# Agent automatically searches relevant documents for context
agent = Agent()
response = await agent("What's the difference between Python and machine learning?")
```
## Architecture
Context-driven agents work by injecting relevant information before each query:
```python
async def agent_call(query: str, user_id: str = "default") -> str:
    ctx = context(query, user_id)  # Assembles relevant context
    prompt = f"{ctx}\n\nQuery: {query}"
    return await llm.generate(prompt)
```
Context sources include the following; a rough sketch of how they might be combined appears after the list:
- **System**: Base instructions
- **Conversation**: Recent message history
- **Knowledge**: Semantic search results
- **Memory**: User profile and preferences
- **Working**: Tool execution history
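
To make the assembly concrete, here is a minimal sketch of how these five sources could be stitched into the `ctx` string used above. The helper names (`system_prompt`, `recent_messages`, `search_knowledge`, `load_profile`, `tool_history`) are illustrative stubs, not cogency's actual internals:

```python
# Illustrative stubs standing in for cogency's real context sources.
def system_prompt() -> str:
    return "You are a helpful assistant."

def recent_messages(user_id: str) -> str:
    return ""  # e.g. the last few conversation turns

def search_knowledge(query: str) -> str:
    return ""  # e.g. top-k semantic search hits over added documents

def load_profile(user_id: str) -> str:
    return ""  # e.g. name, preferences, context set via profile()

def tool_history(user_id: str) -> str:
    return ""  # e.g. recent tool calls and their results

def context(query: str, user_id: str = "default") -> str:
    sections = [
        system_prompt(),           # System
        recent_messages(user_id),  # Conversation
        search_knowledge(query),   # Knowledge
        load_profile(user_id),     # Memory
        tool_history(user_id),     # Working
    ]
    # Empty sections are skipped, so a source that returns nothing simply disappears
    return "\n\n".join(s for s in sections if s)
```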
## Design Principles
- **Zero writes** during reasoning - no database operations in the hot path
- **Pure functions** for context assembly - deterministic and testable
- **Read-only** context sources - graceful degradation on failures (see the sketch after this list)
- **Optional persistence** - conversation history saved asynchronously
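
As a rough illustration of the last two principles (not cogency's actual code), a read-only source wrapper and an asynchronous persistence hook might look like this:

```python
import asyncio

def read_only(fetch, default: str = "") -> str:
    """Wrap a context lookup so a failing source degrades to an empty section."""
    try:
        return fetch()
    except Exception:
        return default  # missing context, not a crashed agent

def persist_async(save_coro) -> None:
    """Schedule a conversation-history write off the reasoning hot path."""
    # Fire-and-forget: the agent never blocks on the database during reasoning.
    asyncio.create_task(save_coro)
```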
## API Reference
### Agent
Simple conversational agent with context injection.
```python
agent = Agent()
# agent(query: str, user_id: str = "default") -> str
response = await agent(query, user_id="default")
```
### Streaming
```python
from cogency import Agent
agent = Agent()
async for event in agent.stream("Complex research task requiring multiple steps"):
    if event["type"] == "reasoning":
        print(f"Thinking: {event['content'][:100]}...")
    elif event["type"] == "complete":
        print(f"Final: {event['answer']}")
```
### Agent with Multiple Providers
```python
from cogency import Agent
from cogency.lib.providers import Anthropic, Nomic
# Provider agnostic: Mix any LLM with any Embedder
agent = Agent(llm=Anthropic(), embedder=Nomic())
result = await agent("Compare Python and Rust for systems programming")
```
### Context Functions
```python
from cogency import profile
from cogency.storage import add_document
# User profiles
profile(user_id, name=None, preferences=None, context=None)
# Knowledge base
add_document(doc_id: str, content: str, metadata: dict = None)
```
## Testing
```bash
# Install dev dependencies
poetry install
# Run tests
pytest tests/
```
## Documentation
See `docs/blueprint.md` for complete technical specification.
**That's it.** No configuration, no setup, just working agents.
*v2.1.0: Web capabilities, multi-provider support, Result types - zero ceremony preserved.*