# Code generator for FastStream
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->
`faststream-gen` is a Python library that uses generative AI to
automatically generate
<a href="https://faststream.airt.ai" target="_blank">FastStream</a>
applications. Simply describe your application requirements, and
`faststream-gen` will generate a production-grade FastStream project
that is ready to deploy in no time.
![PyPI](https://img.shields.io/pypi/v/faststream-gen.png) ![PyPI -
Downloads](https://img.shields.io/pypi/dm/faststream-gen.png) ![PyPI -
Python
Version](https://img.shields.io/pypi/pyversions/faststream-gen.png)
![GitHub Workflow
Status](https://img.shields.io/github/actions/workflow/status/airtai/fastkafka-gen/test.yaml)
![GitHub](https://img.shields.io/github/license/airtai/fastkafka-gen.png)
------------------------------------------------------------------------
**Documentation**: https://faststream-gen.airt.ai
**Source Code**: https://github.com/airtai/faststream-gen
------------------------------------------------------------------------
## Getting Started
The code generator for
<a href="https://faststream.airt.ai" target="_blank">FastStream</a> is a
Python library that automates the process of creating FastStream
applications. It works by taking your application requirements and
swiftly turning them into a ready-to-deploy FastStream application.
The key features are:
- **Automatic FastStream project generation**: `faststream-gen` enables
  you to generate a complete FastStream application with minimal
  effort. Simply outline your application requirements, and the library
  will quickly transform them into a fully-fledged FastStream project.
- **Tested code**: `faststream-gen` provides dependable code through
rigorous testing, including pre-implemented integration tests,
ensuring stability and functionality, saving development time, and
preventing common bugs.
- **Script Templates**: Streamline the deployment of your FastStream
  application using faststream-gen’s built-in scripts, tailored for
  starting the local Kafka broker, subscribing to a Kafka topic, and
  shutting the broker down.
- **GitHub workflow files**: `faststream-gen` integrates seamlessly with
your version control and continuous integration pipeline through its
GitHub workflow files. These predefined configuration files are
optimized for FastStream projects, enabling smooth integration with
GitHub Actions. You can automate tasks such as code validation,
testing, and deployment, ensuring that your FastStream application
remains in top shape throughout its development lifecycle.
[![faststream-gen example](https://github.com/airtai/faststream-gen/blob/main/mkdocs/docs_overrides/images/tutorial-video.png?raw=true)](https://github.com/airtai/faststream-gen/assets/32619626/2731caa1-1108-4f4e-8507-9bf1168419e4 "faststream-gen")
### Quick start
The following quick start guide will walk you through installing and
configuring the `faststream-gen` library, demonstrating the creation of
a new FastStream project in seconds.
#### Install
`faststream-gen` is published as a Python package and can be installed
with pip:
``` shell
pip install faststream-gen
```
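If you prefer to keep the tool and its dependencies isolated, you can
optionally install it inside a Python virtual environment first (a
common, optional setup rather than a `faststream-gen` requirement):
``` shell
# optional: create and activate an isolated virtual environment first
python -m venv venv
source venv/bin/activate
pip install faststream-gen
```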
If the installation was successful, you should now have
**faststream-gen** installed on your system. Run the following command
from the terminal to see the full list of available options:
``` shell
faststream_gen --help
```
Usage: faststream_gen [OPTIONS] [DESCRIPTION]
Effortlessly create a new FastStream project based on the app description.
╭─ Arguments ──────────────────────────────────────────────────────────────────╮
│ description [DESCRIPTION] Summarize your FastStream application in a │
│ few sentences! │
│ │
│ Include details about messages, topics, │
│ servers, and a brief overview of the │
│ intended business logic. │
│ │
│ The simpler and more specific the app │
│ description is, the better the generated │
│ app will be. Please refer to the below │
│ example for inspiration: │
│ │
│ Create a FastStream application using │
│ localhost broker for testing and use the │
│ default port number. It should consume │
│ messages from the "input_data" topic, │
│ where each message is a JSON encoded │
│ object containing a single attribute: │
│ 'data'. For each consumed message, create │
│ a new message object and increment the │
│ value of the data attribute by 1. Finally, │
│ send the modified message to the │
│ 'output_data' topic. │
│ [default: None] │
╰──────────────────────────────────────────────────────────────────────────────╯
╭─ Options ────────────────────────────────────────────────────────────────────╮
│ --input_file -i TEXT The path to the file │
│ with the app │
│ desription. This path │
│ should be relative to │
│ the current working │
│ directory. │
│ If the app description │
│ is passed via both a │
│ --input_file and a │
│ command line argument, │
│ the description from │
│ the command line will │
│ be used to create the │
│ application. │
│ [default: None] │
│ --output_path -o TEXT The path to the output │
│ directory where the │
│ generated project files │
│ will be saved. This │
│ path should be relative │
│ to the current working │
│ directory. │
│ [default: .] │
│ --model -m [gpt-3.5-turbo-16k|gp The OpenAI model that │
│ t-4] will be used to create │
│ the FastStream project. │
│ For better results, we │
│ recommend using │
│ 'gpt-4'. │
│ [default: │
│ gpt-3.5-turbo-16k] │
│ --verbose -v Enable verbose logging │
│ by setting the logger │
│ level to INFO. │
│ --dev -d Save the complete logs │
│ generated by │
│ faststream-gen inside │
│ the output_path │
│ directory. │
│ --install-completion Install completion for │
│ the current shell. │
│ --show-completion Show completion for the │
│ current shell, to copy │
│ it or customize the │
│ installation. │
│ --help Show this message and │
│ exit. │
╰──────────────────────────────────────────────────────────────────────────────╯
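For example, the options above can be combined to read the app
description from a file and use the `gpt-4` model instead of the
default (the `description.txt` file name here is just illustrative):
``` shell
# description.txt is a hypothetical file containing your app description
faststream_gen --input_file description.txt --model gpt-4 --output_path ./my-awesome-project
```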
#### Generate new project
The **faststream-gen** library uses OpenAI models to generate
FastStream projects. In order to use the library, you’ll need to
<a href="https://beta.openai.com/account/api-keys" target="_blank">create
an API key for OpenAI</a>.
Once you have your API key, store it in the **OPENAI_API_KEY**
environment variable. This is a necessary step for the library to work.
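On Linux or macOS you can, for example, export it in the current shell
session (the value below is a placeholder, not a real key):
``` shell
# replace the placeholder with your own OpenAI API key
export OPENAI_API_KEY="your-api-key-here"
```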
We’re now ready to create a new FastStream application with the
`faststream-gen` library.
Simply run the following command to create a new FastStream application
in the `my-awesome-project` directory:
``` shell
faststream_gen "Create a FastStream application using localhost broker for testing and use the default port number. It should consume messages from the 'input_data' topic, where each message is a JSON encoded object containing a single attribute: 'data'. While consuming from the topic, increment the value of the data attribute by 1. Finally, send message to the 'output_data' topic." -o "./my-awesome-project"
```

    ✨ Generating a new FastStream application!
    ✔ Application description validated.
    ✔ FastStream app skeleton code generated.
    ✔ The app and the tests are generated.
    ✔ New FastStream project created.
    ✔ Integration tests were successfully completed.

    Tokens used: 9398
    Total Cost (USD): $0.02865
    ✨ All files were successfully generated!
Here’s a look at the directory hierarchy:

    my-awesome-project
    ├── .github
    │   └── workflows
    │       ├── deploy_docs.yml
    │       └── test.yml
    ├── .gitignore
    ├── LICENSE
    ├── README.md
    ├── app
    │   ├── __init__.py
    │   └── application.py
    ├── dev_requirements.txt
    ├── requirements.txt
    ├── scripts
    │   ├── services.yml
    │   ├── start_kafka_broker_locally.sh
    │   ├── stop_kafka_broker_locally.sh
    │   └── subscribe_to_kafka_broker_locally.sh
    └── tests
        └── test_application.py

    5 directories, 14 files
Let’s take a quick look at the generated application and test code.
`application.py`:

    from faststream import FastStream, Logger
    from faststream.kafka import KafkaBroker

    broker = KafkaBroker("localhost:9092")
    app = FastStream(broker)

    to_output_data = broker.publisher("output_data")


    @broker.subscriber("input_data")
    async def on_input_data(msg: dict, logger: Logger) -> None:
        logger.info(f"{msg=}")
        incremented_data = msg["data"] + 1
        await to_output_data.publish({"data": incremented_data})

`test_application.py`:

    import pytest

    from faststream import Context
    from faststream.kafka import TestKafkaBroker

    from app.application import broker, on_input_data


    @broker.subscriber("output_data")
    async def on_output_data(msg: dict, key: bytes = Context("message.raw_message.key")):
        pass


    @pytest.mark.asyncio
    async def test_data_was_incremented():
        async with TestKafkaBroker(broker):
            await broker.publish({"data": 1}, "input_data")
            on_input_data.mock.assert_called_with({"data": 1})
            on_output_data.mock.assert_called_with({"data": 2})
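The generated integration tests can be run locally with `pytest` from
inside the generated project directory. The sketch below assumes that
`dev_requirements.txt` pins the test tooling (such as `pytest` and
`pytest-asyncio`, which `test_application.py` relies on); since
`TestKafkaBroker` runs in in-memory test mode by default, no live Kafka
broker should be needed for this step:
``` sh
# install the app and development dependencies (file names from the tree above)
pip install -r requirements.txt -r dev_requirements.txt
# run the generated integration tests
pytest
```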
#### Start localhost Kafka broker
In order for `FastStream` applications to publish and consume messages,
a running Kafka broker is required. Along with the application and
tests, `faststream-gen` also generates a `scripts` directory. You can
start a local Kafka broker (inside a Docker container) by executing the
following commands:
``` sh
cd my-awesome-project
# make all shell scripts executable
chmod +x scripts/*.sh
# start local kafka broker
./scripts/start_kafka_broker_locally.sh
```
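Since the script runs the broker inside a Docker container, you can
optionally verify that it is up with a standard Docker command:
``` sh
# sanity check: a Kafka broker container should appear in this list
docker ps
```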
#### Start application
To start the FastStream application, run the following command:
``` sh
faststream run app.application:app
```
#### Stop application
To stop the FastStream application, press `Ctrl+C` in the terminal
where it is running. Then shut down the local Kafka broker with the
following command:
``` sh
./scripts/stop_kafka_broker_locally.sh
```
## Copyright
Copyright © 2023 onwards airt technologies ltd, Inc.
## License
This project is licensed under the terms of the
<a href="https://github.com/airtai/faststream-gen/blob/main/LICENSE" target="_blank">Apache
License 2.0</a>