idun-agent-engine

Name: idun-agent-engine
Version: 0.1.0
Summary: Python SDK and runtime to serve AI agents with FastAPI, LangGraph, and observability.
Author: Geoffrey HARRAZI
License: MIT
Requires Python: >=3.13, <3.14
Keywords: agents, langgraph, fastapi, sdk, llm, observability
Uploaded: 2025-08-14 13:28:41
# Idun Agent Engine - User Guide

The Idun Agent Engine provides a simple, powerful way to turn your conversational AI agents into production-ready web services. With just a few lines of code, you can expose your LangGraph agents (with CrewAI and other frameworks planned) through a FastAPI server with built-in features like streaming, persistence, and monitoring.

## 🚀 Quick Start

### Installation
```bash
pip install idun-agent-engine
```

### Basic Usage

```python
from idun_agent_engine import create_app, run_server

# Create your FastAPI app with your agent
app = create_app(config_path="config.yaml")

# Run the server
run_server(app, port=8000)
```

That's it! Your agent is now running at `http://localhost:8000` with full API documentation at `http://localhost:8000/docs`.

## 📋 Configuration

### Option 1: YAML Configuration File

Create a `config.yaml` file:

```yaml
engine:
  api:
    port: 8000
  telemetry:
    provider: "langfuse"

agent:
  type: "langgraph"
  config:
    name: "My Awesome Agent"
    graph_definition: "my_agent.py:graph"
    checkpointer:
      type: "sqlite"
      db_url: "sqlite:///agent.db"
```

### Option 2: Programmatic Configuration

```python
from idun_agent_engine import ConfigBuilder, create_app, run_server

config = (ConfigBuilder()
          .with_api_port(8080)
          .with_langgraph_agent(
              name="My Agent",
              graph_definition="my_agent.py:graph",
              sqlite_checkpointer="agent.db")
          .build())

app = create_app(config_dict=config)
run_server(app)
```

## 🤖 Supported Agent Types

### LangGraph Agents

```python
# Your LangGraph agent file (my_agent.py)
from langgraph.graph import StateGraph, END
from typing import TypedDict

class AgentState(TypedDict):
    messages: list

def my_node(state):
    # Your agent logic here
    return {"messages": [("ai", "Hello from LangGraph!")]}

graph = StateGraph(AgentState)
graph.add_node("agent", my_node)
graph.set_entry_point("agent")
graph.add_edge("agent", END)
```
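Before pointing the engine at this file, you can sanity-check the graph directly with LangGraph's own API. A minimal sketch (the initial state below is just an illustrative example):

```python
# Compile the builder into a runnable graph and invoke it once locally.
compiled = graph.compile()
result = compiled.invoke({"messages": [("user", "Hi!")]})
print(result["messages"])  # [("ai", "Hello from LangGraph!")]
```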

### Future Agent Types
- CrewAI agents (coming soon)
- AutoGen agents (coming soon)
- Custom agent implementations

## 🌐 API Endpoints

Once your server is running, you get these endpoints automatically:

### POST `/agent/invoke`
Send a single message and get a complete response:

```bash
curl -X POST "http://localhost:8000/agent/invoke" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "Hello, how are you?",
    "session_id": "user-123"
  }'
```

### POST `/agent/stream`
Stream responses in real-time:

```bash
curl -X POST "http://localhost:8000/agent/stream" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "Tell me a story",
    "session_id": "user-123"
  }'
```

### GET `/health`
Health check for monitoring:

```bash
curl "http://localhost:8000/health"
```
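If you prefer to call these endpoints from Python instead of curl, a minimal client using `requests` could look like the sketch below. The exact shape of the JSON response and of each streamed chunk is not documented here, so the printing logic is only an assumption.

```python
import requests

BASE_URL = "http://localhost:8000"

# Single-shot invocation: send a query and read the full response.
resp = requests.post(
    f"{BASE_URL}/agent/invoke",
    json={"query": "Hello, how are you?", "session_id": "user-123"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())

# Streaming invocation: read the body incrementally as the agent produces it.
with requests.post(
    f"{BASE_URL}/agent/stream",
    json={"query": "Tell me a story", "session_id": "user-123"},
    stream=True,
    timeout=60,
) as stream:
    stream.raise_for_status()
    for line in stream.iter_lines(decode_unicode=True):
        if line:
            print(line)
```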

## 🔧 Advanced Usage

### Development Mode
```python
# Enable auto-reload for development
run_server(app, reload=True)
```

### Production Deployment
```python
# Run with multiple workers for production
run_server(app, workers=4, host="0.0.0.0", port=8000)
```

### One-Line Server
```python
from idun_agent_engine.core.server_runner import run_server_from_config

# Create and run server in one call
run_server_from_config("config.yaml", port=8080, reload=True)
```

### Custom FastAPI Configuration
```python
from idun_agent_engine import create_app
from fastapi.middleware.cors import CORSMiddleware

app = create_app("config.yaml")

# Add custom middleware
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_methods=["*"],
    allow_headers=["*"],
)

# Add custom routes
@app.get("/custom")
def custom_endpoint():
    return {"message": "Custom endpoint"}
```

## 🛠️ Configuration Reference

### Engine Configuration
```yaml
engine:
  api:
    port: 8000                    # Server port
  telemetry:
    provider: "langfuse"          # Telemetry provider
```
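When the provider is `"langfuse"`, the Langfuse SDK normally reads its credentials from environment variables. Assuming the engine relies on that standard mechanism (not confirmed here), you would export something like:

```bash
export LANGFUSE_PUBLIC_KEY="pk-..."                 # your Langfuse public key
export LANGFUSE_SECRET_KEY="sk-..."                 # your Langfuse secret key
export LANGFUSE_HOST="https://cloud.langfuse.com"   # or your self-hosted URL
```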

### LangGraph Agent Configuration
```yaml
agent:
  type: "langgraph"
  config:
    name: "Agent Name"            # Human-readable name
    graph_definition: "path.py:graph"  # Path to your graph
    checkpointer:                 # Optional persistence
      type: "sqlite"
      db_url: "sqlite:///agent.db"
    store:                        # Optional store (future)
      type: "memory"
```

## 📚 Examples

Check out the `examples/` directory for complete working examples:

- **Basic LangGraph Agent**: Simple question-answering agent
- **ConfigBuilder Usage**: Programmatic configuration
- **Custom Middleware**: Adding authentication and CORS
- **Production Setup**: Multi-worker deployment configuration

## 🔍 Validation and Debugging

```python
from idun_agent_engine.utils.validation import validate_config_dict, diagnose_setup

# Validate your configuration
config = {...}
errors = validate_config_dict(config)
if errors:
    print("Configuration errors:", errors)

# Diagnose your setup
diagnosis = diagnose_setup()
print("System diagnosis:", diagnosis)
```

## 🚀 Deployment

### Docker
```dockerfile
FROM python:3.13-slim

COPY requirements.txt .
RUN pip install -r requirements.txt

COPY . .

CMD ["python", "-m", "idun_agent_engine", "run", "config.yaml"]
```
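To build and run the image locally (assuming the Dockerfile sits next to your `config.yaml` and agent code; the image name `idun-agent` is just a placeholder):

```bash
docker build -t idun-agent .
docker run -p 8000:8000 idun-agent
```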

### Cloud Platforms
- Heroku: `Procfile` with `web: python main.py`, where `main.py` creates the app and calls `run_server`
- Railway: Deploy with one click
- AWS Lambda: Use with the Mangum adapter (see the sketch below)
- Google Cloud Run: Deploy the Docker container above
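
For the AWS Lambda option, the usual pattern is to wrap the ASGI app with Mangum. A minimal sketch, assuming `mangum` is installed and `create_app` behaves the same inside Lambda; the module name `lambda_handler.py` is just an example:

```python
# lambda_handler.py
from mangum import Mangum

from idun_agent_engine import create_app

app = create_app(config_path="config.yaml")

# Lambda invokes `handler`; Mangum translates API Gateway events into ASGI calls.
handler = Mangum(app)
```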

## 🤝 Contributing

The Idun Agent Engine is designed to be extensible. To add support for new agent frameworks:

1. Implement the `BaseAgent` interface
2. Add configuration models for your agent type
3. Register your agent in the factory
4. Submit a pull request!
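
As a rough illustration of steps 1-3, a new adapter might be shaped like the sketch below. The import path, class name, and method names are assumptions for illustration only; check the engine source for the real `BaseAgent` interface before implementing.

```python
# Hypothetical sketch: the names below are NOT the engine's confirmed API.
from typing import Any, AsyncIterator

from idun_agent_engine import BaseAgent  # assumed export location


class MyFrameworkAgent(BaseAgent):
    """Adapter exposing another agent framework through the engine."""

    def __init__(self, config: dict[str, Any]) -> None:
        self.config = config

    async def invoke(self, query: str, session_id: str) -> str:
        # Run the underlying framework and return the final answer.
        raise NotImplementedError

    async def stream(self, query: str, session_id: str) -> AsyncIterator[str]:
        # Yield incremental chunks as the framework produces them.
        raise NotImplementedError
```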

## 📖 Documentation

- [Full API Documentation](https://docs.idun-agent-engine.com)
- [Agent Framework Guide](https://docs.idun-agent-engine.com/frameworks)
- [Deployment Guide](https://docs.idun-agent-engine.com/deployment)
- [Contributing Guide](https://docs.idun-agent-engine.com/contributing)

## 📄 License

MIT License - see LICENSE file for details.

---

### Release & Publishing

This package is built with Poetry. To publish a new release to PyPI:

1. Update version in `pyproject.toml`.
2. Commit and tag with the pattern `idun-agent-engine-vX.Y.Z`.
3. Push the tag to GitHub. The `Publish idun-agent-engine` workflow builds and publishes the package to PyPI using the `PYPI_API_TOKEN` secret.
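
For example, for steps 2 and 3 (with `X.Y.Z` replaced by the actual version number):

```bash
git tag idun-agent-engine-vX.Y.Z
git push origin idun-agent-engine-vX.Y.Z
```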

Manual build (optional):

```bash
cd libs/idun_agent_engine
poetry build
```


            
