fast-kafka-api

Name: fast-kafka-api
Version: 0.0.4
Home page: https://github.com/airtai/fast-kafka-api
Summary: Extension of FastAPI with Kafka event handlers
Upload time: 2023-01-09 18:09:26
Author: airt
Requires Python: >=3.8
License: Apache Software License 2.0
Keywords: nbdev, jupyter, notebook, python
FastKafkaAPI
================

<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->

<b>Effortless Kafka integration for your web services</b>

------------------------------------------------------------------------

![PyPI](https://img.shields.io/pypi/v/fast-kafka-api.png) ![PyPI -
Downloads](https://img.shields.io/pypi/dm/fast-kafka-api.png) ![PyPI -
Python
Version](https://img.shields.io/pypi/pyversions/fast-kafka-api.png)

![GitHub Workflow
Status](https://img.shields.io/github/actions/workflow/status/airtai/fast-kafka-api/test.yaml)
![CodeQL](https://github.com/airtai/fast-kafka-api//actions/workflows/codeql.yml/badge.svg)
![Dependency
Review](https://github.com/airtai/fast-kafka-api//actions/workflows/dependency-review.yml/badge.svg)

![GitHub](https://img.shields.io/github/license/airtai/fast-kafka-api.png)

------------------------------------------------------------------------

FastKafkaAPI is a powerful and easy-to-use Python library for building
asynchronous web services that interact with Kafka topics. Built on top
of FastAPI, Starlette, Pydantic, and AIOKafka, FastKafkaAPI simplifies
the process of writing producers and consumers for Kafka topics,
handling all the parsing, networking, and task scheduling automatically.
With FastKafkaAPI, you can quickly prototype and develop
high-performance Kafka-based services with minimal code, making it an
ideal choice for developers looking to streamline their workflow and
accelerate their projects.

## Install

To install FastKafkaAPI, run the following command:

``` sh
pip install fast-kafka-api
```

This installs the FastKafkaAPI package from the Python Package Index
(PyPI) using the `pip` package manager. When you run `pip install` with
the name of a package (in this case, “fast-kafka-api”), pip downloads
the package from PyPI, along with any dependencies the package requires,
and installs it on your system.

After running this command, you will be able to import and use the
FastKafkaAPI package in your Python code. For example, you might use it
to initialize a FastKafkaAPI application, as shown in the example
below, and use the `@consumes` and `@produces` decorators to define
Kafka consumers and producers in your application.

Installing FastKafkaAPI from PyPI using `pip` is the recommended way to
install the package, as it makes it easy to manage the package and its
dependencies. If you prefer, you can also install FastKafkaAPI from
source by cloning the repository and running `pip install .` in the
root directory of the project.

## How to use

Here is an example Python script using FastKafkaAPI that takes data from
an input Kafka topic, makes a prediction using a predictive model, and
outputs the prediction to an output Kafka topic.

### Messages

FastKafkaAPI uses Pydantic to parse input JSON-encoded data into Python
objects, making it easy to work with structured data in your Kafka-based
applications. Pydantic’s `BaseModel` class allows you to define messages
using a declarative syntax, making it easy to specify the fields and
types of your messages.

This example defines two message classes for use in a FastKafkaAPI
application: `InputData` and `Prediction`.

The `InputData` class is used to represent input data for a predictive
model. It has three fields: `user_id`, `feature_1`, and `feature_2`. The
`user_id` field is of type `NonNegativeInt`, a constrained integer type
that only allows non-negative values. The `feature_1` field is a list of
floating-point numbers and the `feature_2` field is a list of integers;
together they represent the input features for the predictive model.

The `Prediction` class is used to represent the output of the predictive
model. It has two fields: `user_id` and `score`. The `user_id` field is
of type `NonNegativeInt`, and the `score` field is a floating-point
number. The `score` field represents the prediction made by the model,
such as the probability of churn in the next 28 days.

These message classes will be used to parse and validate incoming data
in Kafka consumers and producers. Using these message classes in
combination with FastKafkaAPI makes it easy to work with structured data
in your Kafka-based applications.

``` python
from typing import List

from pydantic import BaseModel, Field, NonNegativeInt


class InputData(BaseModel):
    user_id: NonNegativeInt = Field(..., example=202020, description="ID of a user")
    feature_1: List[float] = Field(
        ...,
        example=[1.2, 2.3, 4.5, 6.7, 0.1],
        description="input feature 1",
    )
    feature_2: List[int] = Field(
        ...,
        example=[2, 4, 3, 1, 0],
        description="input feature 2",
    )


class Prediction(BaseModel):
    user_id: NonNegativeInt = Field(..., example=202020, description="ID of a user")
    score: float = Field(
        ...,
        example=0.4321,
        description="Prediction score (e.g. the probability of churn in the next 28 days)",
        ge=0.0,
        le=1.0,
    )
```

Defined this way, the message classes can parse and validate incoming
JSON-encoded data in a Kafka consumer and serialize outgoing messages in
a producer.
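To illustrate the validation these models provide, here is a minimal sketch, independent of Kafka and FastKafkaAPI, that parses a JSON payload into `InputData` and shows Pydantic rejecting an out-of-range value. The model definitions are restated in simplified form (without the `example` metadata) so the snippet is self-contained.

``` python
import json
from typing import List

from pydantic import BaseModel, Field, NonNegativeInt, ValidationError


class InputData(BaseModel):
    user_id: NonNegativeInt
    feature_1: List[float]
    feature_2: List[int]


class Prediction(BaseModel):
    user_id: NonNegativeInt
    score: float = Field(..., ge=0.0, le=1.0)


# A well-formed JSON payload parses into a typed Python object.
raw = '{"user_id": 202020, "feature_1": [1.2, 2.3], "feature_2": [2, 4]}'
msg = InputData(**json.loads(raw))
print(msg.user_id)

# A score outside [0.0, 1.0] is rejected by the ge/le constraints.
try:
    Prediction(user_id=202020, score=1.5)
except ValidationError as err:
    print("rejected:", err.errors()[0]["loc"])
```

In a running FastKafkaAPI service the same validation happens automatically when a consumed message is parsed into the handler's annotated type.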

### Application

This example shows how to initialize a FastKafkaAPI application. It
starts by reading two environment variables, `KAFKA_HOSTNAME` and
`KAFKA_PORT`, which specify the hostname and port of the Kafka broker.

Next, it defines a dictionary called `kafka_brokers`, which contains two
entries: “localhost” and “production”. Each entry specifies the URL,
port, and other details of a Kafka broker. This dictionary is used to
define the available Kafka brokers that can be used in the application.

The `kafka_config` dictionary specifies the configuration options for
the Kafka broker, such as the `bootstrap_servers` setting, which
specifies the hostname and port of the Kafka broker.

Finally, the `FastKafkaAPI` class is initialized with several arguments:
`title`, `contact`, `version`, `description`, and `kafka_brokers`, with
the entries of `kafka_config` unpacked and passed as additional keyword
arguments. These arguments configure the title, version, and description
of the application, as well as the available Kafka brokers and the Kafka
configuration options. The resulting
[`FastKafkaAPI`](https://airtai.github.io/fast-kafka-api/fastkafkaapi.html#fastkafkaapi)
object, which is stored in the `app` variable, represents the
initialized FastKafkaAPI application.

``` python
from os import environ

from fast_kafka_api.application import FastKafkaAPI

kafka_server_url = environ["KAFKA_HOSTNAME"]
kafka_server_port = environ["KAFKA_PORT"]

kafka_brokers = {
    "localhost": {
        "url": "kafka",
        "description": "local development kafka broker",
        "port": 9092,
    },
    "production": {
        "url": "kafka.acme.com",
        "description": "production kafka broker",
        "port": 9092,
        "protocol": "kafka-secure",
        "security": {"type": "plain"},
    },
}

kafka_config = {
    "bootstrap_servers": f"{kafka_server_url}:{kafka_server_port}",
}

app = FastKafkaAPI(
    title="FastKafkaAPI Example",
    contact={"name": "airt.ai", "url": "https://airt.ai", "email": "info@airt.ai"},
    version="0.0.1",
    description="A simple example on how to use FastKafkaAPI",
    kafka_brokers=kafka_brokers,
    **kafka_config,
)
```
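Note that `environ["KAFKA_HOSTNAME"]` raises a `KeyError` when the variable is not set. For local development it can be convenient to fall back to defaults via `environ.get`, as in the sketch below; the `"localhost"` and `"9092"` fallbacks are illustrative assumptions, not values prescribed by FastKafkaAPI.

``` python
from os import environ

# Fall back to local defaults when the variables are not exported;
# "localhost" and "9092" are illustrative defaults only.
kafka_server_url = environ.get("KAFKA_HOSTNAME", "localhost")
kafka_server_port = environ.get("KAFKA_PORT", "9092")

# Same bootstrap_servers string as in the example above.
kafka_config = {
    "bootstrap_servers": f"{kafka_server_url}:{kafka_server_port}",
}
print(kafka_config["bootstrap_servers"])
```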

### Function decorators

FastKafkaAPI provides convenient function decorators called `@consumes`
and `@produces` to allow you to delegate the actual processing of data
to user-defined functions. These decorators make it easy to specify the
processing logic for your Kafka consumers and producers, allowing you to
focus on the core business logic of your application without worrying
about the underlying Kafka integration.

This example shows how to use the `@consumes` and `@produces` decorators
in a FastKafkaAPI application.

The `@consumes` decorator is applied to the `on_input_data` function,
registering it to be called whenever a message is received on the
“input_data” Kafka topic. The `on_input_data` function takes a single
argument, `msg`, which is expected to be an instance of the `InputData`
message class.

Inside the `on_input_data` function, the `model.predict` function is
called with the `feature_1` and `feature_2` fields from the `msg`
argument. This function returns a prediction score, which is then passed
to the `to_predictions` function along with the `user_id` field from the
`msg` argument.

The `@produces` decorator is applied to the `to_predictions` function,
so that whenever it is called, its return value is produced as a message
to the “predictions” Kafka topic. The `to_predictions` function takes
two arguments, `user_id` and `score`; it creates a new `Prediction`
message with these values and returns it.

In summary, the `@consumes` decorator marks functions to be called when
a message is received on a Kafka topic, and the `@produces` decorator
marks functions whose return value is produced as a message to a Kafka
topic. Together they let you specify the processing logic of your
consumers and producers while the underlying Kafka integration is
handled for you.

``` python
@app.consumes(topic="input_data")
async def on_input_data(msg: InputData):
    print(f"msg={msg}")
    score = await model.predict(feature_1=msg.feature_1, feature_2=msg.feature_2)
    await to_predictions(user_id=msg.user_id, score=score)


@app.produces(topic="predictions")
async def to_predictions(user_id: int, score: float) -> Prediction:
    prediction = Prediction(user_id=user_id, score=score)
    print(f"prediction={prediction}")
    return prediction
```
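Note that `model` is not defined in the snippet above; it is assumed to come from elsewhere in the application. To make the consume-predict-produce flow concrete, here is a plain-Python sketch of the same pipeline with Kafka, FastKafkaAPI, and the real model replaced by stand-ins: `StubModel` and the in-memory `produced` list are assumptions for illustration only.

``` python
import asyncio

produced = []  # stands in for the "predictions" Kafka topic


class StubModel:
    """Hypothetical stand-in for the application's predictive model."""

    async def predict(self, feature_1, feature_2):
        # Toy scoring rule: mean of feature_1, clipped to [0, 1].
        score = sum(feature_1) / len(feature_1)
        return max(0.0, min(1.0, score))


model = StubModel()


async def to_predictions(user_id, score):
    # In the real application, @app.produces serializes the returned
    # Prediction message and publishes it to the "predictions" topic.
    produced.append({"user_id": user_id, "score": score})


async def on_input_data(msg):
    # Mirrors the decorated handler: predict, then forward the result.
    score = await model.predict(msg["feature_1"], msg["feature_2"])
    await to_predictions(user_id=msg["user_id"], score=score)


asyncio.run(
    on_input_data({"user_id": 202020, "feature_1": [0.2, 0.4], "feature_2": [1, 2]})
)
print(produced)
```

The real decorators add on top of this flow the parsing of raw Kafka messages into `InputData` instances and the serialization of `Prediction` return values back to the output topic.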

### Running the service

This example shows how to start the FastKafkaAPI service using the
uvicorn library. The `uvicorn.run` function is called with the `app`
argument (which represents the FastKafkaAPI application) and the `host`
and `port` arguments, which specify the hostname and port on which the
service should listen for incoming requests.

When the service is started, several log messages are printed to the
console, including information about the application startup, AsyncAPI
specification generation, and consumer loop status.

During the lifetime of the service, incoming Kafka messages will be
processed by the FastKafkaAPI application and the appropriate actions
will be taken based on the defined consumers and producers. For example,
if a message is received on the “input_data” Kafka topic, the
`on_input_data` function will be called to process it, and whenever the
`to_predictions` function is called, it will produce a message to the
“predictions” Kafka topic. The service will continue to run until it is
shut down, at which point the application shutdown process will be
initiated and the service will stop.

``` python
import uvicorn

uvicorn.run(app._fast_api_app, host="0.0.0.0", port=4000)
```

    INFO:     Started server process [21284]
    INFO:     Waiting for application startup.

    [INFO] fast_kafka_api._components.asyncapi: Old async specifications at '/work/fast-kafka-api/nbs/asyncapi/spec/asyncapi.yml' does not exist.
    [INFO] fast_kafka_api._components.asyncapi: New async specifications generated at: 'asyncapi/spec/asyncapi.yml'
    [INFO] fast_kafka_api._components.asyncapi: Async docs generated at 'asyncapi/docs'
    [INFO] fast_kafka_api._components.asyncapi: Output of '$ npx -y -p @asyncapi/generator ag asyncapi/spec/asyncapi.yml @asyncapi/html-template -o asyncapi/docs --force-write'

    Done! ✨
    Check out your shiny new generated files at /work/fast-kafka-api/nbs/asyncapi/docs.


    [INFO] fast_kafka_api._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting..
    [INFO] fast_kafka_api._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created.

    INFO:     Application startup complete.
    INFO:     Uvicorn running on http://0.0.0.0:4000 (Press CTRL+C to quit)

    [INFO] fast_kafka_api._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.
    [INFO] aiokafka.consumer.subscription_state: Updating subscribed topics to: frozenset({'input_data'})
    [INFO] aiokafka.consumer.consumer: Subscribed to topic(s): {'input_data'}
    [INFO] fast_kafka_api._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.
    [INFO] aiokafka.consumer.group_coordinator: Metadata for topic has changed from {} to {'input_data': 1}. 

    INFO:     Shutting down
    INFO:     Waiting for application shutdown.

    [INFO] fast_kafka_api._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.
    [INFO] fast_kafka_api._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.

    INFO:     Application shutdown complete.
    INFO:     Finished server process [21284]



            
