langviz-studio

- Name: langviz-studio
- Version: 0.1.2
- Summary: LangViz-Studio is an observability toolkit for LangGraph workflows.
- Upload time: 2025-01-18 13:40:36
- Author: tubone
- Requires Python: >=3.9.0, <4.0
- License: MIT
- Keywords: langchain, observability, langgraph, graphviz

# LangViz-Studio

LangViz-Studio is an observability toolkit for [LangGraph](https://pypi.org/project/langgraph/) workflows. 

<img src="https://github.com/tubone24/LangViz-studio/blob/main/images/logo.png?raw=true" width="30%" alt="logo"/>

It enables you to track the execution states, node transitions, and conditional edges of your state-based flows or agent pipelines in real time, sending all data asynchronously to a customizable endpoint (e.g., a Next.js server) for visualization.

## Key Features

- GraphObservability
  - Automatically records and sends node start/end events, edge transitions, and conditional branch usage. 
  - Maintains a unique graphId (UUID by default) and an optional friendly graph_name. 
  - Sends data to a configured endpoint (http://localhost:3000 by default).

- ObservabilityStateGraph
  - A subclass of langgraph.graph.StateGraph that automatically wraps node functions and edge definitions so all observability data is captured with no extra code. 
  - Just use add_node(...), add_edge(...), add_conditional_edges(...) as normal, and your graph’s runtime states will be sent in real time for visualization.

- Asynchronous and Non-blocking
  - Uses an internal background thread running an asyncio event loop to POST updates via httpx.AsyncClient, ensuring minimal impact on your main workflow thread.

## Installation

```bash
pip install langviz-studio
```

(Also ensure that langgraph and, where applicable, related dependencies such as langchain_core are installed.)

## Usage

Below is a minimal example showing how to use LangViz-Studio with [LangGraph](https://pypi.org/project/langgraph/).
In this example, we define two simple asynchronous node functions (run_start and run_end) and connect them in a StateGraph. Observability logs are sent automatically.

```python
import asyncio
from langviz_studio.observability import GraphObservability, ObservabilityStateGraph

async def run_start(state, config=None):
    print("=== [start] node invoked ===")
    state["msg"] = "Hello from start!"
    return state

async def run_end(state, config=None):
    print("=== [end] node invoked ===")
    state["result"] = "Done"
    return state

async def main():
    # 1. Set up observability
    obs = GraphObservability(
        graph_name="MyExampleGraph"
    )

    # 2. Use ObservabilityStateGraph instead of StateGraph
    workflow = ObservabilityStateGraph(obs, state_type=dict)

    # 3. Nodes are added in the same manner as StateGraph
    workflow.add_node("start", run_start)
    workflow.add_node("end", run_end)
    # Edges / conditional edges are also added in the same manner as StateGraph
    # (a conditional-edges sketch follows this example)
    workflow.add_edge("start", "end")

    workflow.set_entry_point("start")

    # 4. Compile the graph
    compiled = workflow.compile()

    initial_state = {}
    # 5. Run the workflow
    result = await compiled.ainvoke(initial_state)
    print("=== Workflow finished ===")
    print("Final State:", result)

if __name__ == "__main__":
    asyncio.run(main())
```
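
The feature list above also mentions conditional edges. The sketch below extends the example: it is meant to replace the workflow.add_edge("start", "end") line inside main(), and run_check / route_after_check are illustrative names, not part of the package.

```python
async def run_check(state, config=None):
    print("=== [check] node invoked ===")
    return state

def route_after_check(state):
    # Illustrative routing function: pick the next node name from the state.
    return "end" if state.get("msg") else "start"

# Route start -> check, then branch conditionally. Conditional edges are
# declared exactly as on a plain StateGraph; the recorded observability data
# includes which branch was taken.
workflow.add_node("check", run_check)
workflow.add_edge("start", "check")
workflow.add_conditional_edges(
    "check",
    route_after_check,                  # decides the branch at runtime
    {"start": "start", "end": "end"},   # maps return values to node names
)
```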

## Server-Side Visualization

By default, GraphObservability calls two endpoints on your server:

- POST /api/graph/start — triggered once when the first node starts, to initialize a new graph record.
- POST /api/graph/ingest — triggered after every node start/end or edge creation, sending updates.
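
To try things out locally without a Next.js app, a minimal receiver can be sketched in Python. The snippet below uses FastAPI as a stand-in server and simply logs whatever JSON arrives; the payload schema is not documented here, so no field names are assumed.

```python
# minimal_receiver.py - a stand-in visualization server for local testing.
# Run with: uvicorn minimal_receiver:app --port 3000
from fastapi import FastAPI, Request

app = FastAPI()

@app.post("/api/graph/start")
async def graph_start(request: Request):
    # Called once when the first node starts; just log the raw payload.
    payload = await request.json()
    print("graph start:", payload)
    return {"ok": True}

@app.post("/api/graph/ingest")
async def graph_ingest(request: Request):
    # Called after every node start/end or edge creation.
    payload = await request.json()
    print("graph update:", payload)
    return {"ok": True}
```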

## How it Works Internally
1. A background thread holds an asyncio event loop (via _AsyncLoopThread), so we can await httpx.AsyncClient.post(...) without blocking the main process or requiring the user to manage async.

2. Each node start/end or edge creation calls a short, synchronous method (_post_to_nextjs) that enqueues a coroutine on the background event loop to perform an HTTP POST.

3. This ensures minimal overhead and “fire-and-forget” updates to your server.
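
For readers unfamiliar with the pattern, here is a simplified, self-contained illustration of steps 1-2: a daemon thread runs its own event loop, and a synchronous caller schedules httpx POSTs onto it with asyncio.run_coroutine_threadsafe. This is a sketch of the technique, not the package's actual _AsyncLoopThread or _post_to_nextjs code, and the payload is a placeholder.

```python
import asyncio
import threading

import httpx


class BackgroundLoop:
    """Simplified illustration of the background-event-loop pattern."""

    def __init__(self) -> None:
        self._loop = asyncio.new_event_loop()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def _run(self) -> None:
        asyncio.set_event_loop(self._loop)
        self._loop.run_forever()

    def submit(self, coro) -> None:
        # Fire-and-forget: schedule the coroutine on the background loop
        # from a synchronous caller, without blocking it.
        asyncio.run_coroutine_threadsafe(coro, self._loop)


async def post_update(url: str, payload: dict) -> None:
    async with httpx.AsyncClient() as client:
        await client.post(url, json=payload)


loop = BackgroundLoop()
# A synchronous call site (e.g. a wrapped node function) enqueues an HTTP POST;
# in a real workflow the main thread keeps running, so the POST completes in
# the background.
loop.submit(post_update("http://localhost:3000/api/graph/ingest", {"event": "node_start"}))
```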

## License
MIT License. See [LICENSE](./LICENSE) for details.

## Contributing
Pull requests and issues are welcome! If you have suggestions or find bugs, please open an issue or submit a PR on the GitHub repository.
            
