langbase

Name: langbase
Version: 0.0.3
Summary: Python SDK for the Langbase API
Upload time: 2025-08-07 00:37:19
Requires Python: >=3.7
License: Apache-2.0
Keywords: ai, langbase, agent, memory, rag, mcp, pipes, workflow, llms
Requirements: no requirements were recorded.
# Langbase Python SDK

[![PyPI version](https://badge.fury.io/py/langbase.svg)](https://badge.fury.io/py/langbase)
[![Python 3.7+](https://img.shields.io/badge/python-3.7+-blue.svg)](https://www.python.org/downloads/)
[![License: Apache 2.0](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)

The official Python SDK for [Langbase](https://langbase.com) - Build declarative and composable AI-powered LLM products with ease.

## Documentation

Check the [Langbase SDK documentation](https://langbase.com/docs/sdk) for more details.

The examples below are for reference only; see the docs for the latest information.

## Features

- **Simple and intuitive API** - Get started in minutes
- **Streaming support** - Real-time text generation with typed events
- **Type safety** - Full type hints for better IDE support
- **Minimal dependencies** - Only what you need
- **Python 3.7+** - Support for modern Python versions

## Installation

Install the Langbase SDK:

```bash
pip install langbase
```

Install `python-dotenv` to load environment variables from a `.env` file:

```bash
pip install python-dotenv
```

## Quick Start

### 1. Set up your API key

Create a `.env` file and add your [Langbase API Key](https://langbase.com/docs/api-reference/api-keys).

```bash
LANGBASE_API_KEY="your-api-key"
LLM_API_KEY="your-llm-api-key"
```

---

### 2. Initialize the client

```python
from langbase import Langbase
import os
from dotenv import load_dotenv

load_dotenv()

# Get API keys from environment variables
langbase_api_key = os.getenv("LANGBASE_API_KEY")
llm_api_key = os.getenv("LLM_API_KEY")

# Initialize the client
langbase = Langbase(api_key=langbase_api_key)
```

### 3. Generate text

```python
# Simple generation
response = langbase.agent.run(
    input=[{"role": "user", "content": "Tell me about AI"}],
    model="openai:gpt-4.1-mini",
    api_key=llm_api_key,
)

print(response["output"])
```
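
The `input` list follows the standard chat-message format, so you can continue a conversation by replaying earlier turns. A minimal sketch, assuming `response["output"]` holds the assistant's reply as in the example above:

```python
# Continue the conversation by appending prior turns
history = [
    {"role": "user", "content": "Tell me about AI"},
    {"role": "assistant", "content": response["output"]},
    {"role": "user", "content": "Summarize that in one sentence."},
]

follow_up = langbase.agent.run(
    input=history,
    model="openai:gpt-4.1-mini",
    api_key=llm_api_key,
)

print(follow_up["output"])
```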

---

### 4. Stream text (Simple)

```python
from langbase import get_runner

# Stream text as it's generated
response = langbase.agent.run(
    input=[{"role": "user", "content": "Tell me about AI"}],
    model="openai:gpt-4.1-mini",
    api_key=llm_api_key,
    stream=True,
)

runner = get_runner(response)

for content in runner.text_generator():
    print(content, end="", flush=True)
```

### 5. Stream with typed events (Advanced)

```python
from langbase import StreamEventType, get_typed_runner

response = langbase.agent.run(
    input=[{"role": "user", "content": "What is an AI Engineer?"}],
    model="openai:gpt-4.1-mini",
    api_key=llm_api_key,
    stream=True,
)

# Create typed stream processor
runner = get_typed_runner(response)

# Register event handlers
runner.on(
    StreamEventType.CONNECT,
    lambda event: print(f"✓ Connected! Thread ID: {event['threadId']}\n"),
)

runner.on(
    StreamEventType.CONTENT,
    lambda event: print(event["content"], end="", flush=True),
)

runner.on(
    StreamEventType.TOOL_CALL,
    lambda event: print(
        f"\n🔧 Tool call: {event['toolCall']['function']['name']}"
    ),
)

runner.on(
    StreamEventType.COMPLETION,
    lambda event: print(f"\n\n✓ Completed! Reason: {event['reason']}"),
)

runner.on(
    StreamEventType.ERROR,
    lambda event: print(f"\n❌ Error: {event['message']}"),
)

runner.on(
    StreamEventType.END,
    lambda event: print(f"⏱️  Total duration: {event['duration']:.2f}s"),
)

# Process the stream
runner.process()
```

## Core Features

### Pipes - AI Pipeline Execution

```python
# List all pipes
pipes = langbase.pipes.list()

# Run a pipe
response = langbase.pipes.run(
    name="ai-agent",
    messages=[{"role": "user", "content": "Hello!"}],
    variables={"style": "friendly"},  # Optional variables
    stream=True,  # Enable streaming
)
```
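
With `stream=True`, the call returns a stream rather than a plain response object. A minimal sketch of consuming it, assuming pipe streams can be handled with `get_runner` the same way as the agent stream in the Quick Start:

```python
from langbase import get_runner

stream_response = langbase.pipes.run(
    name="ai-agent",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
)

# Iterate over generated text as it arrives
runner = get_runner(stream_response)
for content in runner.text_generator():
    print(content, end="", flush=True)
```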

### Memory - Persistent Context Storage

```python
# Create a memory
memory = langbase.memories.create(
    name="product-docs",
    description="Product documentation",
)

# Upload documents
langbase.memories.documents.upload(
    memory_name="product-docs",
    document_name="guide.pdf",
    document=open("guide.pdf", "rb"),
    content_type="application/pdf",
)

# Retrieve relevant context
results = langbase.memories.retrieve(
    query="How do I get started?",
    memory=[{"name": "product-docs"}],
    top_k=3,
)
```
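
Retrieved chunks can be passed straight to an agent call for a simple RAG flow. A sketch, assuming each item in `results` exposes its chunk text under a `"text"` key (check the docs for the exact response shape):

```python
# Build a context block from the retrieved chunks ("text" key is assumed)
context = "\n\n".join(item["text"] for item in results)

answer = langbase.agent.run(
    input=[
        {
            "role": "user",
            "content": (
                f"Answer using only this context:\n{context}\n\n"
                "Question: How do I get started?"
            ),
        }
    ],
    model="openai:gpt-4.1-mini",
    api_key=llm_api_key,
)

print(answer["output"])
```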

### Agent - LLM Agent Execution

```python
# Run an agent with tools
response = langbase.agent.run(
    model="openai:gpt-4",
    messages=[{"role": "user", "content": "Search for AI news"}],
    tools=[{"type": "function", "function": {...}}],
    tool_choice="auto",
    api_key="your-llm-api-key",
    stream=True,
)
```
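
The elided `function` object above generally follows the OpenAI function-calling schema: a name, a description, and JSON Schema `parameters`. A sketch with a hypothetical `search_news` tool; confirm the exact fields Langbase expects in the docs:

```python
# Hypothetical tool definition in the OpenAI function-calling format
tools = [
    {
        "type": "function",
        "function": {
            "name": "search_news",  # hypothetical function name
            "description": "Search recent news articles by keyword",
            "parameters": {
                "type": "object",
                "properties": {
                    "query": {
                        "type": "string",
                        "description": "Search terms, e.g. 'AI news'",
                    }
                },
                "required": ["query"],
            },
        },
    }
]
```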

### Tools - Built-in Utilities

```python
# Chunk text for processing
chunks = langbase.chunker(
    content="Long text to split...",
    chunk_max_length=1024,
    chunk_overlap=256,
)

# Generate embeddings
embeddings = langbase.embed(
    chunks=["Text 1", "Text 2"],
    embedding_model="openai:text-embedding-3-small",
)

# Parse documents
content = langbase.parser(
    document=open("document.pdf", "rb"),
    document_name="document.pdf",
    content_type="application/pdf",
)
```
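
These utilities compose into a simple ingestion pipeline: parse a document, chunk the extracted text, then embed the chunks. A sketch under two assumptions (flagged in the comments): the parser result exposes its text as `parsed["content"]`, and the chunker returns a list of strings that `embed` accepts directly:

```python
# Parse a PDF, split the extracted text, and embed the resulting chunks
with open("document.pdf", "rb") as f:
    parsed = langbase.parser(
        document=f,
        document_name="document.pdf",
        content_type="application/pdf",
    )

chunks = langbase.chunker(
    content=parsed["content"],  # assumed response field; check the docs
    chunk_max_length=1024,
    chunk_overlap=256,
)

embeddings = langbase.embed(
    chunks=chunks,  # assumes the chunker returns a list of strings
    embedding_model="openai:text-embedding-3-small",
)

print(f"Embedded {len(chunks)} chunks")
```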

## Examples

Explore the [examples](https://github.com/LangbaseInc/langbase-python-sdk/tree/main/examples) directory for complete working examples:

- [Generate text](https://github.com/LangbaseInc/langbase-python-sdk/tree/main/examples/agent/agent.run.py)
- [Stream text](https://github.com/LangbaseInc/langbase-python-sdk/blob/main/examples/agent/agent.run.stream.py)
- [Work with memory](https://github.com/LangbaseInc/langbase-python-sdk/tree/main/examples/memory/)
- [Agent with tools](https://github.com/LangbaseInc/langbase-python-sdk/blob/main/examples/agent/agent.run.tool.py)
- [Document processing](https://github.com/LangbaseInc/langbase-python-sdk/tree/main/examples/parser/)
- [Workflow automation](https://github.com/LangbaseInc/langbase-python-sdk/tree/main/examples/workflow/)

## SDK Reference

For detailed SDK documentation, visit [langbase.com/docs/sdk](https://langbase.com/docs/sdk).

## Contributing

We welcome contributions! Please see our [Contributing Guide](https://github.com/LangbaseInc/langbase-python-sdk/tree/main/CONTRIBUTING.md) for details.

## Support

- [Documentation](https://langbase.com/docs)
- [Discord Community](https://langbase.com/discord)
- [Issue Tracker](https://github.com/LangbaseInc/langbase-python-sdk/issues)

## License

This project is licensed under Apache-2.0. See the [LICENSE](https://github.com/LangbaseInc/langbase-python-sdk/blob/main/LICENCE) file for details.

            
