| Name | black-langcube |
| Version | 0.1.1 |
| download | |
| home_page | None |
| Summary | A LangGraph-based extension framework for complex workflow applications, enabling the integration of various AI models and tools into a cohesive system. |
| upload_time | 2025-09-09 07:15:21 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.9 |
| license | None |
| keywords | langgraph, workflow, llm, langchain, ai |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# Black LangCube
A LangGraph-based extension framework designed to facilitate the development of complex applications by providing a structured way to define and manage workflows.
## 🚀 Features
- **BaseGraph Framework**: Foundational interface for constructing, compiling, and executing stateful workflow graphs
- **Data Structures**: Pydantic models for scientific article metadata, search strategies, outlines, and more
- **LLM Nodes**: Pre-built nodes for common language model operations
- **Helper Utilities**: Token counting, result processing, file management, and workflow utilities
- **Subgraph System**: Modular subworkflows for translation, output generation, and specialized tasks
- **Extensible Architecture**: Easy to extend with custom nodes and workflows
## 📦 Installation
### From PyPI:
```bash
pip install black_langcube
```
### Development Installation:
```bash
git clone https://github.com/cerna-kostka/black-langcube.git
cd black-langcube
pip install -e .
```
### With optional dependencies:
```bash
pip install black_langcube[dev,examples]
```
## 🏗️ Core Components
### BaseGraph
The foundation for building stateful workflow graphs using LangGraph:
```python
from black_langcube.graf.graph_base import BaseGraph, GraphState
from langgraph.graph import START, END

class MyCustomGraph(BaseGraph):
    def __init__(self, user_message, folder_name, language):
        super().__init__(MyGraphState, user_message, folder_name, language)
        self.build_graph()

    def build_graph(self):
        # Add nodes and edges to your workflow
        self.add_node("my_node", my_node_function)
        self.add_edge(START, "my_node")
        self.add_edge("my_node", END)

    @property
    def workflow_name(self):
        return "my_custom_graph"
```
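The snippet above references `MyGraphState` and `my_node_function`, which you supply yourself. A minimal sketch of what they might look like, assuming the state follows the usual LangGraph `TypedDict` pattern (the field and function names here are hypothetical, not part of the library):

```python
from typing import TypedDict

# Hypothetical state schema -- the field names are illustrative only.
class MyGraphState(TypedDict, total=False):
    user_message: str
    result: str

def my_node_function(state: MyGraphState) -> dict:
    # A node receives the current state and returns a partial state update.
    return {"result": f"Handled: {state.get('user_message', '')}"}
```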
### LLMNode
A base class for defining nodes that interact with language models:
```python
from black_langcube.llm_modules.LLMNodes.LLMNode import LLMNode

class MyCustomNode(LLMNode):
    def generate_messages(self):
        return [
            ("system", "You are a helpful assistant"),
            ("human", self.state.get("user_input", ""))
        ]

    def execute(self, extra_input=None):
        result, tokens = self.run_chain(extra_input)
        return {"output": result, "tokens": tokens}
```
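How a node like this gets wired into a graph depends on the `LLMNode` constructor; the sketch below is an assumption about that signature (check the base class in your installed version), showing one way a graph node function could delegate to it:

```python
def my_custom_node_step(state):
    # Assumption: the node is constructed from the current graph state and
    # execute() returns a partial state update that the graph merges back in.
    node = MyCustomNode(state)
    return node.execute()
```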
### Data Structures
Pydantic models for structured data handling:
```python
from black_langcube.data_structures.data_structures import Article, Strategies, Outline
# Use pre-defined data structures
article = Article(topic="AI Research", language="English")
strategies = Strategies(strategy1="Search academic papers", strategy2="Analyze trends")
```
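You can also define your own Pydantic models alongside the bundled ones when a workflow needs additional structured fields. A minimal sketch (the `ReviewNote` model is hypothetical and not shipped with the library):

```python
from pydantic import BaseModel, Field

class ReviewNote(BaseModel):
    """Hypothetical custom structure for reviewer feedback."""
    section: str = Field(description="Part of the article the note refers to")
    comment: str = Field(description="The reviewer's remark")
    severity: int = Field(default=1, ge=1, le=5)

note = ReviewNote(section="Introduction", comment="Clarify the research question")
```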
## 📚 Architecture
The library is organized into several key modules:
- **`graf/`**: Core graph classes and workflow definitions
- **`data_structures/`**: Pydantic models for data validation
- **`llm_modules/`**: Language model integration and node definitions
- **`helper_modules/`**: Utility functions and helper classes
- **`messages/`**: Message formatting and composition utilities
- **`prompts/`**: Prompt templates and configurations
- **`format_instructions/`**: Output formatting utilities
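The import paths used throughout this README map directly onto those modules; collected here for quick reference:

```python
# Import paths from the examples in this README, grouped by module
from black_langcube.graf.graph_base import BaseGraph, GraphState                          # graf/
from black_langcube.graf.subgrafs.translator_en_subgraf import TranslatorEnSubgraf        # graf/subgrafs/
from black_langcube.data_structures.data_structures import Article, Strategies, Outline   # data_structures/
from black_langcube.llm_modules.LLMNodes.LLMNode import LLMNode                           # llm_modules/
```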
## 🛠️ Usage Examples
### Basic Workflow
```python
from black_langcube.graf.graph_base import BaseGraph, GraphState
from langgraph.graph import START, END

class SimpleWorkflow(BaseGraph):
    def __init__(self, message, folder, language):
        super().__init__(GraphState, message, folder, language)
        self.build_graph()

    def build_graph(self):
        def process_message(state):
            return {"result": f"Processed: {state['messages'][-1].content}"}

        self.add_node("process", process_message)
        self.add_edge(START, "process")
        self.add_edge("process", END)

    @property
    def workflow_name(self):
        return "simple_workflow"

# Usage
workflow = SimpleWorkflow("Hello, world!", "output", "English")
result = workflow.run()
```
### Using Subgraphs
```python
from black_langcube.graf.subgrafs.translator_en_subgraf import TranslatorEnSubgraf

# Translation subgraph; `config` is your application's configuration object,
# constructed elsewhere before this snippet runs.
translator = TranslatorEnSubgraf(config, subfolder="translations")
result = translator.run(extra_input={
    "translation_input": "Bonjour le monde",
    "language": "French"
})
```
## 🔧 Configuration
The library uses environment variables for configuration. Create a `.env` file:
```env
OPENAI_API_KEY=your_openai_api_key_here
# optional: LangChain configuration
LANGCHAIN_API_KEY=your_langchain_api_key_here
LANGCHAIN_TRACING_V2=true
```
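The library reads these values from the process environment. If you keep them in a `.env` file as shown, one common way to load it at startup is python-dotenv (an assumption about your application setup; it is not a documented dependency of this package):

```python
import os
from dotenv import load_dotenv  # pip install python-dotenv (assumed, optional)

load_dotenv()  # copies the .env entries into os.environ for the current process

assert os.getenv("OPENAI_API_KEY"), "Set OPENAI_API_KEY before running workflows"
```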
## 📖 Examples
See the `examples/` directory for complete working examples:
- **Basic Graph**: Simple workflow with custom nodes
- **Translation Pipeline**: Multi-language processing workflow
- **Scientific Article Processing**: Complex multi-step analysis pipeline
- **Custom Data Structures**: Extending the framework with your own models
## 🧪 Development
### Setting up development environment:
```bash
git clone https://github.com/cerna-kostka/black-langcube.git
cd black-langcube
pip install -e .[dev]
```
### Running tests:
```bash
pytest
```
### Code formatting:
```bash
black .
isort .
```
## 📋 Requirements
- Python 3.9+
- LangChain >= 0.3.24
- LangGraph >= 0.3.7
- Pydantic >= 2.0.0
- OpenAI API access
## 🤝 Contributing
This is a work in progress and contributions are welcome! Please feel free to:
1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests if applicable
5. Submit a pull request
## 📄 License
MIT License
## ⚠️ Note
This library is intended to be used within a larger application context. The code is provided as-is and is actively being improved. Take it with a grain of salt and feel free to contribute improvements!
## 🔗 Links
- [LangGraph Documentation](https://langchain-ai.github.io/langgraph/)
- [LangChain Documentation](https://python.langchain.com/)
- [Examples and Tutorials](./src/black_langcube/examples/)
## Raw data

```json
{
"_id": null,
"home_page": null,
"name": "black-langcube",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.9",
"maintainer_email": null,
"keywords": "langgraph, workflow, llm, langchain, ai",
"author": null,
"author_email": "Martin Vanek <martin.vanek@cerna-kostka.cz>",
"download_url": "https://files.pythonhosted.org/packages/0b/c8/fc1a8138fc75199a564a149fdb62dde3c05624b51cdfb6f96872735e4b4e/black_langcube-0.1.1.tar.gz",
"platform": null,
"description": "# Black LangCube\n\nA LangGraph-based extension framework designed to facilitate the development of complex applications by providing a structured way to define and manage workflows.\n\n## \ud83d\ude80 Features\n\n- **BaseGraph Framework**: Foundational interface for constructing, compiling, and executing stateful workflow graphs\n- **Data Structures**: Pydantic models for scientific article metadata, search strategies, outlines, and more\n- **LLM Nodes**: Pre-built nodes for common language model operations\n- **Helper Utilities**: Token counting, result processing, file management, and workflow utilities\n- **Subgraph System**: Modular subworkflows for translation, output generation, and specialized tasks\n- **Extensible Architecture**: Easy to extend with custom nodes and workflows\n\n## \ud83d\udce6 Installation\n\n### From PyPI (when published):\n```bash\npip install black_langcube\n```\n\n### Development Installation:\n```bash\ngit clone https://github.com/cerna-kostka/black-langcube.git\ncd black-langcube\npip install -e .\n```\n\n### With optional dependencies:\n```bash\npip install black_langcube[dev,examples]\n```\n\n## \ud83c\udfd7\ufe0f Core Components\n\n### BaseGraph\nThe foundation for building stateful workflow graphs using LangGraph:\n\n```python\nfrom black_langcube.graf.graph_base import BaseGraph, GraphState\n\nclass MyCustomGraph(BaseGraph):\n def __init__(self, user_message, folder_name, language):\n super().__init__(MyGraphState, user_message, folder_name, language)\n self.build_graph()\n \n def build_graph(self):\n # Add nodes and edges to your workflow\n self.add_node(\"my_node\", my_node_function)\n self.add_edge(START, \"my_node\")\n self.add_edge(\"my_node\", END)\n \n @property\n def workflow_name(self):\n return \"my_custom_graph\"\n```\n\n### LLMNode\nA base class for defining nodes that interact with language models:\n\n```python\nfrom black_langcube.llm_modules.LLMNodes.LLMNode import LLMNode\n\nclass MyCustomNode(LLMNode):\n def generate_messages(self):\n return [\n (\"system\", \"You are a helpful assistant\"),\n (\"human\", self.state.get(\"user_input\", \"\"))\n ]\n\n def execute(self, extra_input=None):\n result, tokens = self.run_chain(extra_input)\n return {\"output\": result, \"tokens\": tokens}\n```\n\n### Data Structures\nPydantic models for structured data handling:\n\n```python\nfrom black_langcube.data_structures.data_structures import Article, Strategies, Outline\n\n# Use pre-defined data structures\narticle = Article(topic=\"AI Research\", language=\"English\")\nstrategies = Strategies(strategy1=\"Search academic papers\", strategy2=\"Analyze trends\")\n```\n\n### LLM Nodes\nPre-built nodes for language model operations:\n\n```python\nfrom black_langcube.llm_modules.LLMNodes.LLMNode import LLMNode\n\nclass MyCustomNode(LLMNode):\n def generate_messages(self):\n return [\n (\"system\", \"You are a helpful assistant\"),\n (\"human\", self.state.get(\"user_input\", \"\"))\n ]\n \n def execute(self, extra_input=None):\n result, tokens = self.run_chain(extra_input)\n return {\"output\": result, \"tokens\": tokens}\n```\n\n## \ud83d\udcda Architecture\n\nThe library is organized into several key modules:\n\n- **`graf/`**: Core graph classes and workflow definitions\n- **`data_structures/`**: Pydantic models for data validation\n- **`llm_modules/`**: Language model integration and node definitions\n- **`helper_modules/`**: Utility functions and helper classes\n- **`messages/`**: Message formatting and composition utilities\n- 
**`prompts/`**: Prompt templates and configurations\n- **`format_instructions/`**: Output formatting utilities\n\n## \ud83d\udee0\ufe0f Usage Examples\n\n### Basic Workflow\n\n```python\nfrom black_langcube.graf.graph_base import BaseGraph, GraphState\nfrom langgraph.graph import START, END\n\nclass SimpleWorkflow(BaseGraph):\n def __init__(self, message, folder, language):\n super().__init__(GraphState, message, folder, language)\n self.build_graph()\n \n def build_graph(self):\n def process_message(state):\n return {\"result\": f\"Processed: {state['messages'][-1].content}\"}\n \n self.add_node(\"process\", process_message)\n self.add_edge(START, \"process\")\n self.add_edge(\"process\", END)\n \n @property\n def workflow_name(self):\n return \"simple_workflow\"\n\n# Usage\nworkflow = SimpleWorkflow(\"Hello, world!\", \"output\", \"English\")\nresult = workflow.run()\n```\n\n### Using Subgraphs\n\n```python\nfrom black_langcube.graf.subgrafs.translator_en_subgraf import TranslatorEnSubgraf\n\n# Translation subgraph\ntranslator = TranslatorEnSubgraf(config, subfolder=\"translations\")\nresult = translator.run(extra_input={\n \"translation_input\": \"Bonjour le monde\",\n \"language\": \"French\"\n})\n```\n\n## \ud83d\udd27 Configuration\n\nThe library uses environment variables for configuration. Create a `.env` file:\n\n```env\nOPENAI_API_KEY=your_openai_api_key_here\n\n# optional: LangChain configuration\nLANGCHAIN_API_KEY=your_langchain_api_key_here\nLANGCHAIN_TRACING_V2=true\n```\n\n## \ud83d\udcd6 Examples\n\nSee the `examples/` directory for complete working examples:\n\n- **Basic Graph**: Simple workflow with custom nodes\n- **Translation Pipeline**: Multi-language processing workflow\n- **Scientific Article Processing**: Complex multi-step analysis pipeline\n- **Custom Data Structures**: Extending the framework with your own models\n\n## \ud83e\uddea Development\n\n### Setting up development environment:\n\n```bash\ngit clone https://github.com/cerna-kostka/black-langcube.git\ncd black-langcube\npip install -e .[dev]\n```\n\n### Running tests:\n\n```bash\npytest\n```\n\n### Code formatting:\n\n```bash\nblack .\nisort .\n```\n\n## \ud83d\udccb Requirements\n\n- Python 3.9+\n- LangChain >= 0.3.24\n- LangGraph >= 0.3.7\n- Pydantic >= 2.0.0\n- OpenAI API access\n\n## \ud83e\udd1d Contributing\n\nThis is a work in progress and contributions are welcome! Please feel free to:\n\n1. Fork the repository\n2. Create a feature branch\n3. Make your changes\n4. Add tests if applicable\n5. Submit a pull request\n\n## \ud83d\udcc4 License\n\nMIT License (MIT) \n\n## \u26a0\ufe0f Note\n\nThis library is intended to be used within a larger application context. The code is provided as-is and is actively being improved. Take it with a grain of salt and feel free to contribute improvements!\n\n## \ud83d\udd17 Links\n\n- [LangGraph Documentation](https://langchain-ai.github.io/langgraph/)\n- [LangChain Documentation](https://python.langchain.com/)\n- [Examples and Tutorials](./src/black_langcube/examples/)\n",
"bugtrack_url": null,
"license": null,
"summary": "A LangGraph-based extension framework for complex workflow applications, enabling the integration of various AI models and tools into a cohesive system.",
"version": "0.1.1",
"project_urls": {
"Documentation": "https://github.com/cerna-kostka/black-langcube#readme",
"Homepage": "https://github.com/cerna-kostka/black-langcube",
"Issues": "https://github.com/cerna-kostka/black-langcube/issues",
"Repository": "https://github.com/cerna-kostka/black-langcube"
},
"split_keywords": [
"langgraph",
" workflow",
" llm",
" langchain",
" ai"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "377dbf36c65fa1487666554fda7182d16db77c0be1d1d4d99835e1d14313a012",
"md5": "d526fadd6d4ea8d292b67f83448dd880",
"sha256": "2d47327be5510de5bb1a5f7e548d8733ae6377062c920a53e4a407de888dd9f7"
},
"downloads": -1,
"filename": "black_langcube-0.1.1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "d526fadd6d4ea8d292b67f83448dd880",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.9",
"size": 65601,
"upload_time": "2025-09-09T07:15:20",
"upload_time_iso_8601": "2025-09-09T07:15:20.665596Z",
"url": "https://files.pythonhosted.org/packages/37/7d/bf36c65fa1487666554fda7182d16db77c0be1d1d4d99835e1d14313a012/black_langcube-0.1.1-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "0bc8fc1a8138fc75199a564a149fdb62dde3c05624b51cdfb6f96872735e4b4e",
"md5": "3dd2048b41ee325ca958ee91292d76a2",
"sha256": "5df6219d3326f3f461b55a563a440a541cb6f21807847763706527dcae56461c"
},
"downloads": -1,
"filename": "black_langcube-0.1.1.tar.gz",
"has_sig": false,
"md5_digest": "3dd2048b41ee325ca958ee91292d76a2",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.9",
"size": 46156,
"upload_time": "2025-09-09T07:15:21",
"upload_time_iso_8601": "2025-09-09T07:15:21.971644Z",
"url": "https://files.pythonhosted.org/packages/0b/c8/fc1a8138fc75199a564a149fdb62dde3c05624b51cdfb6f96872735e4b4e/black_langcube-0.1.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-09-09 07:15:21",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "cerna-kostka",
"github_project": "black-langcube#readme",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "black-langcube"
}
```