struct-strm


Name: struct-strm
Version: 0.0.3
Summary: Stream partial JSON generated by LLMs into valid JSON responses
Upload time: 2025-07-15 01:11:18
Requires Python: >=3.8
Keywords: htmx, json, llm, streaming, web components
            
<div align="center">

<img src="https://raw.githubusercontent.com/PrestonBlackburn/structured_streamer/refs/heads/gh-pages/img/logo_bg_wide.png" alt="Struct Strm Logo" width="750" role="img">

[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://github.com/PrestonBlackburn/structured_streamer/blob/main/LICENSE)
[![Codestyle Black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
![Build Status](https://github.com/PrestonBlackburn/structured_streamer/actions/workflows/build_and_publish.yaml/badge.svg?branch=main)
![Coverage](https://PrestonBlackburn.github.io/structured_streamer/assets/coverage.svg)

</div>

# Structured Streamer

**struct_strm** (structured streamer) is a Python package that makes it easy to stream partial JSON generated by LLMs into valid JSON responses. This enables partial rendering of UI components without waiting for the full response, drastically reducing the time to the first word on the user's screen.

## Why Use Structured Streamer?

JSON is the standard format for structured responses from LLMs. In the early days of LLM structured generation, the JSON response could only be validated after the whole response had been returned. Modern approaches use constrained decoding to ensure that only valid JSON is generated, eliminating the need for post-generation validation and allowing the response to be used immediately. However, the streamed JSON response is incomplete, so it can't be parsed using traditional methods. This library aims to make it easier to handle this partially generated JSON and provide a better end-user experience.
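To see why the incomplete stream is a problem, note that `json.loads` rejects any truncated prefix outright. A minimal sketch of the repair idea (not the library's actual algorithm) is to terminate a dangling string and close any still-open containers:

```python
import json

# A truncated LLM stream: the standard parser rejects it outright.
partial = '{"form_fields": [{"field_name": "email", "field_placeholder": "Enter your em'

try:
    json.loads(partial)
except json.JSONDecodeError:
    print("standard parser fails on partial JSON")

def complete_partial_json(fragment: str) -> str:
    """Naively repair a truncated JSON fragment: terminate a dangling
    string, then close any unbalanced arrays and objects."""
    repaired = fragment
    # An odd number of unescaped quotes means a string was cut mid-token.
    if repaired.replace('\\"', "").count('"') % 2:
        repaired += '"'
    # Walk the text, tracking which containers are still open.
    closers, in_string = [], False
    for i, ch in enumerate(repaired):
        if ch == '"' and (i == 0 or repaired[i - 1] != "\\"):
            in_string = not in_string
        elif not in_string:
            if ch == "{":
                closers.append("}")
            elif ch == "[":
                closers.append("]")
            elif ch in "}]":
                closers.pop()
    # Trim a trailing comma, then close everything that is still open.
    return repaired.rstrip().rstrip(",") + "".join(reversed(closers))

print(json.loads(complete_partial_json(partial)))
```

This toy version ignores cases like a key cut off before its value; handling those edge cases robustly is exactly the kind of work the library takes off your hands.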

<br/>

You can learn more about constrained decoding and context-free grammars here: [XGrammar - Achieving Efficient, Flexible, and Portable Structured Generation with XGrammar](https://blog.mlc.ai/2024/11/22/achieving-efficient-flexible-portable-structured-generation-with-xgrammar)

<br/>



## Main Features

The primary feature is wrapping LLM outputs to produce valid, incrementally updated JSON from partial, invalid JSON based on user-provided structures. Effectively, this acts as a wrapper around your LLM calls. Since the library is primarily intended for use in web servers, it is async first, and it is expected to be used in async workflows.

The library also provides simple HTML templates that serve as examples of how to integrate the streams into your own components.

Due to the nature of partial JSON streaming, there are "wrong" ways to stream responses that are not effective for partial rendering in the UI. The library therefore also provides examples of tested approaches that give good results.

**High Level Flow**  
![High level flow](docs/img/high_level_flow.png)
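The async-first wrapper idea can be sketched as an async generator that accumulates chunks and yields a parsed snapshot whenever the buffer parses. Everything here is illustrative (the function names, the fake stream, and the parse-if-possible step are assumptions, not the library's API):

```python
import asyncio
import json
from typing import AsyncIterator

async def fake_llm_stream() -> AsyncIterator[str]:
    """Stand-in for an LLM token stream yielding raw JSON text chunks."""
    full = '{"form_fields": [{"field_name": "email"}]}'
    for i in range(0, len(full), 8):
        await asyncio.sleep(0)  # simulate awaiting the network
        yield full[i : i + 8]

async def stream_parsed_snapshots(
    chunks: AsyncIterator[str],
) -> AsyncIterator[dict]:
    """Accumulate chunks and yield a parsed snapshot whenever the buffer
    parses as-is. The library's repair step makes far more of the
    intermediate buffers usable than this simplification does."""
    buffer = ""
    async for chunk in chunks:
        buffer += chunk
        try:
            yield json.loads(buffer)
        except json.JSONDecodeError:
            continue  # buffer is still a partial fragment; keep reading

async def main() -> None:
    async for snapshot in stream_parsed_snapshots(fake_llm_stream()):
        print(snapshot)

asyncio.run(main())
```

In a web server, each yielded snapshot would typically be rendered and pushed to the client (e.g. via server-sent events) rather than printed.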



## Example Component
This is an example of a form component being incrementally rendered. By using a structured response from an LLM, in this case a form with field names and field placeholders, we can stream the form results directly to an HTML component. This drastically reduces the time to first token and the perceived time a user needs to wait. More advanced components are under development.

```python
from typing import List
from pydantic import BaseModel
from openai import AsyncOpenAI

client = AsyncOpenAI()

class DefaultFormItem(BaseModel):
    field_name: str
    field_placeholder: str

class DefaultFormStruct(BaseModel):
    form_fields: List[DefaultFormItem]

# A typical OpenAI structured response stream may look like:
...
async with client.beta.chat.completions.stream(
    model="gpt-4.1",
    messages=messages,
    response_format=DefaultFormStruct,
    temperature=0.0,
) as stream:
    async for event in stream:
        ...
# The resulting stream is used to incrementally build the component
# (shown below).
```

![Example Form Streaming](docs/img/form_struct_strm.gif)
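To make the rendering side concrete, here is a minimal sketch of turning a mid-stream snapshot into HTML inputs. The function name and the plain-dict snapshot shape are illustrative assumptions, not the library's templates; the point is that fields missing a value so far still render gracefully:

```python
def render_form_fields(form: dict) -> str:
    """Render whatever form fields have arrived so far as HTML inputs.
    Fields whose placeholder has not streamed in yet render without one."""
    rows = []
    for field in form.get("form_fields", []):
        name = field.get("field_name", "")
        placeholder = field.get("field_placeholder", "")
        rows.append(
            f'<label>{name}<input name="{name}" '
            f'placeholder="{placeholder}"></label>'
        )
    return "\n".join(rows)

# A mid-stream snapshot: the second field's placeholder hasn't arrived yet.
snapshot = {
    "form_fields": [
        {"field_name": "email", "field_placeholder": "you@example.com"},
        {"field_name": "subject"},
    ]
}
print(render_form_fields(snapshot))
```

Each new snapshot can simply re-render and swap in this fragment (e.g. with htmx), which is what makes the GIF above feel incremental.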

## Contributing

### Test
```bash
pytest
```

### Format
```bash
python -m black ./
```

### Docs
```bash
mkdocs serve
```

## Other

I started **struct_strm** to support another project I'm working on that provides an easy entry point for teachers to use LLM tools in their workflows. Check it out if you're interested: [Teachers PET](https://www.teacherspet.tech/)
            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "struct-strm",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": null,
    "keywords": "htmx, json, llm, streaming, web components",
    "author": null,
    "author_email": "Preston Blackburn <prestonblckbrn@gmail.com>",
    "download_url": "https://files.pythonhosted.org/packages/71/cb/9a192d4e9b5df704d1387027bf64daf0760ccbb1a96d5d64b44dc32fd704/struct_strm-0.0.3.tar.gz",
    "platform": null,
    "description": "(omitted: verbatim duplicate of the README above)",
    "bugtrack_url": null,
    "license": null,
    "summary": "Stream partial json generated by LLMs into valid json responses",
    "version": "0.0.3",
    "project_urls": {
        "Documentation": "https://prestonblackburn.github.io/structured_streamer/",
        "Homepage": "https://prestonblackburn.github.io/structured_streamer/",
        "Issues": "https://github.com/PrestonBlackburn/structured_streamer/issues",
        "Repository": "https://github.com/PrestonBlackburn/structured_streamer"
    },
    "split_keywords": [
        "htmx",
        " json",
        " llm",
        " streaming",
        " web components"
    ],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "17315896a61594749cdd2939df3bc892a3e9c0cf83098fe47f5ca1aab858a89e",
                "md5": "bd1b7e26ed8222c89f85c216b4478319",
                "sha256": "f8dbbb55143d44e56222bd2eaaa112a76a46b8b10ac0283aebdc7b13e5540558"
            },
            "downloads": -1,
            "filename": "struct_strm-0.0.3-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "bd1b7e26ed8222c89f85c216b4478319",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8",
            "size": 18041,
            "upload_time": "2025-07-15T01:11:17",
            "upload_time_iso_8601": "2025-07-15T01:11:17.782267Z",
            "url": "https://files.pythonhosted.org/packages/17/31/5896a61594749cdd2939df3bc892a3e9c0cf83098fe47f5ca1aab858a89e/struct_strm-0.0.3-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "71cb9a192d4e9b5df704d1387027bf64daf0760ccbb1a96d5d64b44dc32fd704",
                "md5": "338f1261f0ba3b2cce125c6fae7c92e2",
                "sha256": "3574ce6ed8de3a6da6be8ee1f3e118803f304e3c9a4598ac2a158e5f4ae62bf8"
            },
            "downloads": -1,
            "filename": "struct_strm-0.0.3.tar.gz",
            "has_sig": false,
            "md5_digest": "338f1261f0ba3b2cce125c6fae7c92e2",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8",
            "size": 11004,
            "upload_time": "2025-07-15T01:11:18",
            "upload_time_iso_8601": "2025-07-15T01:11:18.827359Z",
            "url": "https://files.pythonhosted.org/packages/71/cb/9a192d4e9b5df704d1387027bf64daf0760ccbb1a96d5d64b44dc32fd704/struct_strm-0.0.3.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-07-15 01:11:18",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "PrestonBlackburn",
    "github_project": "structured_streamer",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "struct-strm"
}
        