fabric-message-bus

Name: fabric-message-bus
Version: 1.6.2
Summary: Fabric Message Bus Schema
Author email: Komal Thareja <kthare10@renci.org>
Home page: https://fabric-testbed.net/
Sources: https://github.com/fabric-testbed/MessageBusSchema
Upload time: 2024-02-08 15:11:13
Requires Python: >=3.9
Keywords: kafka, fabric message bus schema, avro

[![PyPI](https://img.shields.io/pypi/v/fabric-message-bus?style=plastic)](https://pypi.org/project/fabric-message-bus/)

# Message Bus Schema and Message Definition
Basic framework for the Fabric Message Bus schema and messages for inter-actor communication.


## Overview
Fabric communication across the various actors in the Control and Measurement Framework is implemented using Apache Kafka.

Apache Kafka is a distributed system designed for streams. It is built to be fault-tolerant, high-throughput, horizontally scalable, and allows geographically distributed data streams and stream processing applications.

Kafka enables an event-driven implementation of the various actors/services. Events are both a fact and a trigger. Each Fabric actor is a producer for one topic, following the Single Writer Principle, and subscribes to topics from other actors for communication. Messages are exchanged over Kafka using the Apache Avro data serialization system.

## Requirements
- Python 3.9+ (per the package's `requires_python` constraint)
- confluent-kafka
- confluent-kafka[avro]

## Installation
```
$ pip3 install .
```

## Usage
This package implements the interface for producer/consumer APIs to push/read messages to/from Kafka via Avro serialization. 

### Message and Schema
Users are expected to inherit from the IMessage class (message.py) to define their own members and override the to_dict() function. A corresponding Avro schema must also be defined for the derived class; this schema is then used by producers and consumers.

An example schema for the basic IMessage class is available in schema/message.avsc.
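As a rough illustration of the pattern (the real IMessage base class in message.py will differ in detail, and `QueryMessage` and its fields are hypothetical), a subclass might look like this:

```python
from abc import ABC, abstractmethod

# Hypothetical stand-in for the package's IMessage base class (message.py);
# the real interface may expose additional members.
class IMessage(ABC):
    @abstractmethod
    def to_dict(self) -> dict:
        """Return the Avro-serializable representation of the message."""

# Example subclass: define your own members and override to_dict() so the
# returned fields line up with the corresponding Avro schema.
class QueryMessage(IMessage):
    def __init__(self, name: str, message_id: str, properties: dict):
        self.name = name
        self.message_id = message_id
        self.properties = properties

    def to_dict(self) -> dict:
        return {
            "name": self.name,
            "message_id": self.message_id,
            "properties": self.properties,
        }

msg = QueryMessage("query", "msg-1", {"type": "resources"})
print(msg.to_dict()["name"])  # query
```

The dict returned by to_dict() is what the producer feeds to the Avro serializer, so its keys must match the fields declared in the derived class's schema.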

### Producers
The AvroProducerApi class implements the base functionality for an Avro Kafka producer. Users are expected to inherit from this class and override the delivery_report method to handle message delivery for asynchronous produce calls.

An example of usage is available at the end of producer.py.
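The override pattern can be sketched with a simplified stand-in; the real AvroProducerApi wraps confluent-kafka and its produce/callback signatures may differ:

```python
# Simplified stand-in for AvroProducerApi (producer.py): the real class
# serializes via Avro and produces to Kafka asynchronously; here the
# callback is invoked directly just to show the flow.
class AvroProducerApi:
    def produce(self, topic: str, message: dict) -> None:
        # In the real class, delivery_report fires once the broker acks.
        self.delivery_report(err=None, msg=(topic, message))

    def delivery_report(self, err, msg):
        raise NotImplementedError("subclasses must override delivery_report")

class MyProducer(AvroProducerApi):
    def __init__(self):
        self.delivered = []

    def delivery_report(self, err, msg):
        # Handle the outcome of an asynchronous produce call.
        if err is not None:
            print(f"delivery failed: {err}")
        else:
            self.delivered.append(msg)

p = MyProducer()
p.produce("actor-topic", {"name": "query"})
```

Because produce is asynchronous in the real client, delivery_report is the only place where a subclass learns whether a message actually reached the broker.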

### Consumers
The AvroConsumerApi class implements the base functionality for an Avro Kafka consumer. Users are expected to inherit from this class and override the process_message method to handle each incoming message.

An example of usage is available at the end of consumer.py.
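The consumer side mirrors the producer pattern. A simplified stand-in (the real AvroConsumerApi polls Kafka and deserializes each record via Avro; this sketch only shows the dispatch to the override):

```python
# Simplified stand-in for AvroConsumerApi (consumer.py).
class AvroConsumerApi:
    def consume(self, records) -> None:
        # The real class polls the broker in a loop and Avro-deserializes
        # each record before handing it to process_message.
        for record in records:
            self.process_message(record)

    def process_message(self, message):
        raise NotImplementedError("subclasses must override process_message")

class MyConsumer(AvroConsumerApi):
    def __init__(self):
        self.seen = []

    def process_message(self, message):
        # Application-specific handling of each incoming message.
        self.seen.append(message)

c = MyConsumer()
c.consume([{"name": "query"}, {"name": "result"}])
print(len(c.seen))  # 2
```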

### Admin API
The AdminApi class supports basic admin functions such as creating and deleting topics and partitions.


## How to bring up a test Kafka cluster
### Generate Credentials
You must generate CA certificates (or use your own if you already have one) and then generate a keystore and truststore for the brokers and clients.
```
cd $(pwd)/secrets
./create-certs.sh
(Type yes for all "Trust this certificate? [no]:" prompts.)
cd -
```
Set the environment variable for the secrets directory; it is used in later commands. Make sure that you are in the MessageBus directory.
```
export KAFKA_SSL_SECRETS_DIR=$(pwd)/secrets
```
### Bring up the containers
You can use the docker-compose.yaml file to bring up a simple Kafka cluster containing
- broker
- zookeeper 
- schema registry

Use the command below to bring up the cluster:
```
docker-compose up -d
```

This should bring up the following containers:
```
docker ps
CONTAINER ID        IMAGE                                    COMMAND                  CREATED             STATUS              PORTS                                                                                        NAMES
189ba0e70b97        confluentinc/cp-schema-registry:latest   "/etc/confluent/dock…"   58 seconds ago      Up 58 seconds       0.0.0.0:8081->8081/tcp                                                                       schemaregistry
49616f1c9b0a        confluentinc/cp-kafka:latest             "/etc/confluent/dock…"   59 seconds ago      Up 58 seconds       0.0.0.0:9092->9092/tcp, 0.0.0.0:19092->19092/tcp                                             broker1
c9d19c82558d        confluentinc/cp-zookeeper:latest         "/etc/confluent/dock…"   59 seconds ago      Up 59 seconds       2888/tcp, 0.0.0.0:2181->2181/tcp, 3888/tcp                                                   zookeeper
```

            
