planai

Name: planai
Version: 0.1.5
Home page: https://github.com/provos/planai
Summary: A simple framework for coordinating classical compute and LLM-based tasks.
Upload time: 2024-09-16 05:34:27
Maintainer: None
Docs URL: None
Author: Niels Provos
Requires Python: <4.0,>=3.10
License: Apache-2.0
Keywords: ai, automation, workflow, llm
Requirements: No requirements were recorded.

# PlanAI

[![PyPI version](https://badge.fury.io/py/planai.svg)](https://badge.fury.io/py/planai)
[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)
[![Python Versions](https://img.shields.io/pypi/pyversions/planai.svg)](https://pypi.org/project/planai/)
[![Documentation Status](https://readthedocs.org/projects/planai/badge/?version=latest)](https://docs.getplanai.com/en/latest/?badge=latest)

**PlanAI** is a framework for automating complex tasks through a graph-based architecture. It combines traditional computation with LLM-based tasks in a single graph, enabling versatile and efficient workflow management.

## Table of Contents
- [Key Features](#key-features)
- [Requirements](#requirements)
- [Installation](#installation)
- [Usage](#usage)
- [Example: Textbook Q&A Generation](#example-textbook-qa-generation)
- [Monitoring Dashboard](#monitoring-dashboard)
- [Advanced Features](#advanced-features)
- [Documentation](#documentation)
- [Contributing](#contributing)
- [License](#license)

## Key Features

- **Graph-Based Architecture**: Construct dynamic workflows comprising interconnected TaskWorkers for highly customizable automation.
- **Hybrid TaskWorkers**: Combine conventional computations (e.g., API calls) with powerful LLM-driven operations, leveraging Retrieval-Augmented Generation (RAG) capabilities.
- **Type Safety with Pydantic**: Ensure data integrity and type consistency across workflows with Pydantic-validated input and output.
- **Intelligent Data Routing**: Utilize type-aware routing to efficiently manage data flow between nodes, adapting to multiple downstream consumers (see the sketch after this list).
- **Input Provenance Tracking**: Trace the lineage and origin of each Task as it flows through the workflow, enabling detailed analysis and debugging of complex processes.
- **Automatic Prompt Optimization**: Improve your LLM prompts using data and AI-driven optimization.
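
To make the type-safety and routing features concrete, here is a minimal sketch. The `Document`, `ShortSummary`, `NeedsReview`, and `Triage` names are illustrative and not part of the library; the pattern of declaring `output_types` and calling `publish_work` follows the Usage example below. A worker declares every Task type it may emit, and the graph delivers each published Task to whichever downstream worker consumes that type.

```python
from typing import List, Type

from planai import Task, TaskWorker

# Illustrative Task types; Tasks are Pydantic models, so field types
# are validated when instances are created.
class Document(Task):
    text: str

class ShortSummary(Task):
    summary: str

class NeedsReview(Task):
    reason: str

class Triage(TaskWorker):
    # Declaring both output types lets the graph route each published
    # Task to the downstream worker that accepts its type.
    output_types: List[Type[Task]] = [ShortSummary, NeedsReview]

    def consume_work(self, task: Document):
        if len(task.text) < 2000:
            self.publish_work(ShortSummary(summary=task.text[:200]))
        else:
            self.publish_work(NeedsReview(reason="document too long"))
```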

## Requirements

- Python 3.10+
- Poetry (for development)

## Installation

You can install PlanAI using pip:

```bash
pip install planai
```

For development, clone the repository and install dependencies:

```bash
git clone https://github.com/provos/planai.git
cd planai
poetry install
```

## Usage

PlanAI allows you to create complex, AI-enhanced workflows using a graph-based architecture. Here's a basic example:

```python
from typing import List, Type

from planai import Graph, Task, TaskWorker, LLMTaskWorker, llm_from_config

# Define the Pydantic-based Task types that flow through the graph
class RawData(Task):
    data: str

class ProcessedData(Task):
    data: str

class AnalysisResult(Task):
    analysis: str  # illustrative field for the LLM's structured output

# Define custom TaskWorkers
class CustomDataProcessor(TaskWorker):
    output_types: List[Type[Task]] = [ProcessedData]

    def consume_work(self, task: RawData):
        processed_data = self.process(task.data)
        self.publish_work(ProcessedData(data=processed_data))

    def process(self, data: str) -> str:
        # Placeholder for conventional (non-LLM) processing logic
        return data.strip()

# Define an LLM-powered task
class AIAnalyzer(LLMTaskWorker):
    prompt: str = "Analyze the provided data and derive insights"
    output_types: List[Type[Task]] = [AnalysisResult]

    def consume_work(self, task: ProcessedData):
        super().consume_work(task)

# Create and run the workflow
graph = Graph(name="Data Analysis Workflow")
data_processor = CustomDataProcessor()
ai_analyzer = AIAnalyzer(
    llm=llm_from_config(provider="openai", model_name="gpt-4"))

graph.add_workers(data_processor, ai_analyzer)
graph.set_dependency(data_processor, ai_analyzer)

initial_data = RawData(data="Some raw data")
graph.run(initial_tasks=[(data_processor, initial_data)])
```

## Example: Textbook Q&A Generation

PlanAI has been used to create a system for generating high-quality question and answer pairs from textbook content. This example demonstrates PlanAI's capability to manage complex, multi-step workflows involving AI-powered text processing and content generation. The application processes textbook content through a series of steps including text cleaning, relevance filtering, question generation and evaluation, and answer generation and selection. For a detailed walkthrough of this example, including code and explanation, please see the [examples/textbook](examples/textbook) directory. The resulting dataset, generated from "World History Since 1500: An Open and Free Textbook," is available in our [World History 1500 Q&A repository](https://github.com/provos/world-history-1500-qa), showcasing the practical application of PlanAI in educational content processing and dataset creation.
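For a flavor of how such a multi-step pipeline is wired together, here is a heavily simplified, hypothetical sketch in the style of the Usage example above. The worker names, prompts, and Task fields below are illustrative and do not reflect the actual code in `examples/textbook`.

```python
from typing import List, Type

from planai import Graph, Task, LLMTaskWorker, llm_from_config

# Hypothetical Task types for two of the pipeline stages
class TextChunk(Task):
    text: str

class QuestionCandidate(Task):
    question: str

class QAPair(Task):
    question: str
    answer: str

# Hypothetical LLM-driven workers, following the LLMTaskWorker pattern above
class QuestionGenerator(LLMTaskWorker):
    prompt: str = "Generate an exam-style question about this textbook excerpt."
    output_types: List[Type[Task]] = [QuestionCandidate]

    def consume_work(self, task: TextChunk):
        super().consume_work(task)

class AnswerGenerator(LLMTaskWorker):
    prompt: str = "Answer the question faithfully, based on the excerpt."
    output_types: List[Type[Task]] = [QAPair]

    def consume_work(self, task: QuestionCandidate):
        super().consume_work(task)

# Wire the stages into a graph, exactly as in the Usage example
llm = llm_from_config(provider="openai", model_name="gpt-4")
graph = Graph(name="Textbook Q&A")
questions = QuestionGenerator(llm=llm)
answers = AnswerGenerator(llm=llm)
graph.add_workers(questions, answers)
graph.set_dependency(questions, answers)

graph.run(initial_tasks=[(questions, TextChunk(text="An excerpt from the textbook"))])
```

The real pipeline adds cleaning, relevance filtering, and evaluation workers between these stages, as described above.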

## Monitoring Dashboard

PlanAI includes a built-in web-based monitoring dashboard that provides real-time insights into your graph execution. This feature can be enabled by setting `run_dashboard=True` when calling the `graph.run()` method.

Key features of the monitoring dashboard:

- **Real-time Updates**: The dashboard uses server-sent events (SSE) to provide live updates on task statuses without requiring page refreshes.
- **Task Categories**: Tasks are organized into three categories: Queued, Active, and Completed, allowing for easy tracking of workflow progress.
- **Detailed Task Information**: Each task displays its ID, type, and assigned worker. Users can click on a task to view additional details such as provenance and input provenance.

To enable the dashboard:

```python
graph.run(initial_tasks, run_dashboard=True)
```

When enabled, the dashboard will be accessible at `http://localhost:5000` by default. The application will continue running until manually terminated, allowing for ongoing monitoring of long-running workflows.

Note: Enabling the dashboard will block the main thread, so it's recommended for development and debugging purposes. For production use, consider implementing a separate monitoring solution.

## Advanced Features

PlanAI supports advanced features like:

- Caching results with `CachedTaskWorker`
- Joining multiple task results with `JoinedTaskWorker`
- Integrating with various LLM providers (OpenAI, Ollama, etc.)
- **Automatic Prompt Optimization**: Improve your LLMTaskWorker prompts using AI-driven optimization. [Learn more](PROMPT_OPTIMIZATION.md)


For more detailed examples and advanced usage, please refer to the `examples/` directory in the repository.
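
As a rough illustration of the caching and joining workers listed above, the sketch below subclasses `CachedTaskWorker` and `JoinedTaskWorker` in the same style as the Usage example. The Task types are made up, it assumes both classes are importable from the top-level `planai` package, and the join configuration (a `join_type` attribute plus a `consume_work_joined` hook) is an assumption about the interface rather than a documented contract; check the documentation and `examples/` for the exact API.

```python
from typing import List, Type

from planai import Task, TaskWorker, CachedTaskWorker, JoinedTaskWorker

class Query(Task):
    query: str

class SearchResult(Task):
    url: str

class CombinedResults(Task):
    urls: List[str]

# CachedTaskWorker: repeated identical inputs can reuse previously
# published results instead of redoing expensive work.
class CachedSearch(CachedTaskWorker):
    output_types: List[Type[Task]] = [SearchResult]

    def consume_work(self, task: Query):
        # Imagine an expensive API call here; its result gets cached.
        self.publish_work(SearchResult(url=f"https://example.com/search?q={task.query}"))

# JoinedTaskWorker: collects a group of related results and consumes
# them together once they are all available.
class MergeResults(JoinedTaskWorker):
    # Assumption: the join is keyed on the upstream worker type.
    join_type: Type[TaskWorker] = CachedSearch
    output_types: List[Type[Task]] = [CombinedResults]

    def consume_work_joined(self, tasks: List[SearchResult]):
        self.publish_work(CombinedResults(urls=[task.url for task in tasks]))
```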

## Documentation

Full documentation for PlanAI is available at [https://docs.getplanai.com/](https://docs.getplanai.com/).

## Contributing

We welcome contributions to PlanAI! Please see our [Contributing Guide](CONTRIBUTING.md) for more details on how to get started.

## License

This project is licensed under the Apache License 2.0 - see the [LICENSE](LICENSE) file for details.

For any questions or support, please open an issue on our [GitHub issue tracker](https://github.com/provos/planai/issues).
            
