Name | python-schema-registry-client |
Version | 2.6.0 |
home_page | None |
Summary | Python Rest Client to interact against Schema Registry confluent server |
upload_time | 2024-06-19 07:20:12 |
maintainer | None |
docs_url | None |
author | Marcos Schroh |
requires_python | <4.0,>=3.8 |
license | MIT |
# Python Rest Client Schema Registry
[![Python package](https://github.com/marcosschroh/python-schema-registry-client/actions/workflows/python-package.yml/badge.svg)](https://github.com/marcosschroh/python-schema-registry-client/actions/workflows/python-package.yml)
[![GitHub license](https://img.shields.io/github/license/marcosschroh/python-schema-registry-client.svg)](https://github.com/marcosschroh/python-schema-registry-client/blob/master/LICENSE)
[![codecov](https://codecov.io/gh/marcosschroh/python-schema-registry-client/branch/master/graph/badge.svg)](https://codecov.io/gh/marcosschroh/python-schema-registry-client)
[![Python Version](https://img.shields.io/badge/python-3.8+-blue.svg)](https://img.shields.io/badge/python-3.8+-blue.svg)
Python REST client to interact with the Confluent [schema-registry](https://docs.confluent.io/current/schema-registry/index.html) server and manage [Avro](https://docs.oracle.com/database/nosql-12.1.3.1/GettingStartedGuide/avroschemas.html) and [JSON](https://json-schema.org/) schema resources.
## Requirements
python 3.8+
## Installation
```bash
pip install python-schema-registry-client
```
If you want the `Faust` functionality:
```bash
pip install python-schema-registry-client[faust]
```
Note that this will automatically add a dependency on the [faust-streaming](https://github.com/faust-streaming/faust) fork of faust. If you want to use the
old faust version, simply install it manually and then install `python-schema-registry-client` without the `faust` extra; the functionality is
the same.
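As a rough sketch of what the Faust integration can look like: the `FaustSerializer` import path and constructor arguments follow the project documentation but may differ between versions, the codec registration uses faust's `codecs.register`, and the app name, broker URL, and topic name are placeholders — verify everything against the documentation linked below.
```python
import faust
from faust.serializers import codecs
from schema_registry.client import SchemaRegistryClient, schema
from schema_registry.serializers import FaustSerializer  # import path may differ; check the docs

client = SchemaRegistryClient(url="http://127.0.0.1:8081")

avro_user_schema = schema.AvroSchema({
    "type": "record",
    "namespace": "com.example",
    "name": "AvroUsers",
    "fields": [
        {"name": "first_name", "type": "string"},
        {"name": "last_name", "type": "string"},
    ],
})

# serializer bound to the "users" subject; encodes/decodes records via the schema server
avro_user_serializer = FaustSerializer(client, "users", avro_user_schema)

# register it as a faust codec so topics can refer to it by name
codecs.register("avro_users", avro_user_serializer)

app = faust.App("users-app", broker="kafka://localhost:9092")  # placeholder app/broker
users_topic = app.topic("users", value_serializer="avro_users")
```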
## Client API, Serializer, Faust Integration and Schema Server description
**Documentation**: [https://marcosschroh.github.io/python-schema-registry-client](https://marcosschroh.github.io/python-schema-registry-client)
## Avro Schema Usage
```python
from schema_registry.client import SchemaRegistryClient, schema
client = SchemaRegistryClient(url="http://127.0.0.1:8081")
deployment_schema = {
"type": "record",
"namespace": "com.kubertenes",
"name": "AvroDeployment",
"fields": [
{"name": "image", "type": "string"},
{"name": "replicas", "type": "int"},
{"name": "port", "type": "int"},
],
}
avro_schema = schema.AvroSchema(deployment_schema)
schema_id = client.register("test-deployment", avro_schema)
```
or async
```python
from schema_registry.client import AsyncSchemaRegistryClient, schema
async_client = AsyncSchemaRegistryClient(url="http://127.0.0.1:8081")
deployment_schema = {
"type": "record",
"namespace": "com.kubertenes",
"name": "AvroDeployment",
"fields": [
{"name": "image", "type": "string"},
{"name": "replicas", "type": "int"},
{"name": "port", "type": "int"},
],
}
avro_schema = schema.AvroSchema(deployment_schema)
schema_id = await async_client.register("test-deployment", avro_schema)
```
## JSON Schema Usage
```python
from schema_registry.client import SchemaRegistryClient, schema
client = SchemaRegistryClient(url="http://127.0.0.1:8081")
deployment_schema = {
"definitions" : {
"JsonDeployment" : {
"type" : "object",
"required" : ["image", "replicas", "port"],
"properties" : {
"image" : {"type" : "string"},
"replicas" : {"type" : "integer"},
"port" : {"type" : "integer"}
}
}
},
"$ref" : "#/definitions/JsonDeployment"
}
json_schema = schema.JsonSchema(deployment_schema)
schema_id = client.register("test-deployment", json_schema)
```
or async
```python
from schema_registry.client import AsyncSchemaRegistryClient, schema
async_client = AsyncSchemaRegistryClient(url="http://127.0.0.1:8081")
deployment_schema = {
"definitions" : {
"JsonDeployment" : {
"type" : "object",
"required" : ["image", "replicas", "port"],
"properties" : {
"image" : {"type" : "string"},
"replicas" : {"type" : "integer"},
"port" : {"type" : "integer"}
}
}
},
"$ref" : "#/definitions/JsonDeployment"
}
json_schema = schema.JsonSchema(deployment_schema)
schema_id = await async_client.register("test-deployment", json_schema)
```
## Usage with dataclasses-avroschema for avro schemas
You can generate the `avro schema` directly from a python class using [dataclasses-avroschema](https://github.com/marcosschroh/dataclasses-avroschema)
and use it in the API to register schemas, check versions, and test compatibility:
```python
import dataclasses
import typing
from dataclasses_avroschema import AvroModel, types
from schema_registry.client import SchemaRegistryClient
client = SchemaRegistryClient(url="http://127.0.0.1:8081")
subject = "dataclasses-avroschema-subject-2"
@dataclasses.dataclass
class UserAdvance(AvroModel):
name: str
age: int
pets: typing.List[str] = dataclasses.field(default_factory=lambda: ["dog", "cat"])
accounts: typing.Dict[str, int] = dataclasses.field(default_factory=lambda: {"key": 1})
has_car: bool = False
favorite_colors: types.Enum = types.Enum(["BLUE", "YELLOW", "GREEN"], default="BLUE")
country: str = "Argentina"
address: str = None
# register the schema
schema_id = client.register(subject, UserAdvance.avro_schema())
print(schema_id)
# >>> 12
result = client.check_version(subject, UserAdvance.avro_schema())
print(result)
# >>> SchemaVersion(subject='dataclasses-avroschema-subject-2', schema_id=12, schema=1, version={"type":"record" ...')
compatibility = client.test_compatibility(subject, UserAdvance.avro_schema())
print(compatibility)
# >>> True
```
## Usage with pydantic for json schemas
You can generate the json schema directly from a python class using pydantic and use it in the API to register schemas, check versions, and test compatibility:
```python
import typing
from enum import Enum
from pydantic import BaseModel
from schema_registry.client import SchemaRegistryClient
client = SchemaRegistryClient(url="http://127.0.0.1:8081")
subject = "pydantic-jsonschema-subject"
class ColorEnum(str, Enum):
BLUE = "BLUE"
YELLOW = "YELLOW"
GREEN = "GREEN"
class UserAdvance(BaseModel):
name: str
age: int
pets: typing.List[str] = ["dog", "cat"]
accounts: typing.Dict[str, int] = {"key": 1}
has_car: bool = False
favorite_colors: ColorEnum = ColorEnum.BLUE
country: str = "Argentina"
address: str = None
# register the schema
schema_id = client.register(subject, UserAdvance.model_json_schema(), schema_type="JSON")
print(schema_id)
# >>> 12
result = client.check_version(subject, UserAdvance.model_json_schema(), schema_type="JSON")
print(result)
# >>> SchemaVersion(subject='pydantic-jsonschema-subject', schema_id=12, schema=1, version=<schema_registry.client.schema.JsonSchema object at 0x7f40354550a0>)
compatibility = client.test_compatibility(subject, UserAdvance.model_json_schema(), schema_type="JSON")
print(compatibility)
# >>> True
```
## Serializers
You can use `AvroMessageSerializer` to encode/decode messages in `avro`
```python
from schema_registry.client import SchemaRegistryClient, schema
from schema_registry.serializers import AvroMessageSerializer
client = SchemaRegistryClient("http://127.0.0.1:8081")
avro_message_serializer = AvroMessageSerializer(client)
avro_user_schema = schema.AvroSchema({
"type": "record",
"namespace": "com.example",
"name": "AvroUsers",
"fields": [
{"name": "first_name", "type": "string"},
{"name": "last_name", "type": "string"},
{"name": "age", "type": "int"},
],
})
# We want to encode the user_record with avro_user_schema
user_record = {
"first_name": "my_first_name",
"last_name": "my_last_name",
"age": 20,
}
# Encode the record
message_encoded = avro_message_serializer.encode_record_with_schema(
"user", avro_user_schema, user_record)
print(message_encoded)
# >>> b'\x00\x00\x00\x00\x01\x1amy_first_name\x18my_last_name('
```
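The serializer can also decode messages back into records; the schema id embedded in the payload is used to fetch the schema from the registry. A small sketch, continuing from the example above and assuming the serializer's `decode_message` method:
```python
# Decode the record; the schema is resolved from the id embedded in the message
message_decoded = avro_message_serializer.decode_message(message_encoded)
print(message_decoded)
# >>> {'first_name': 'my_first_name', 'last_name': 'my_last_name', 'age': 20}
```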
or with `json schemas`
```python
from schema_registry.client import SchemaRegistryClient, schema
from schema_registry.serializers import JsonMessageSerializer
client = SchemaRegistryClient("http://127.0.0.1:8081")
json_message_serializer = JsonMessageSerializer(client)
json_schema = schema.JsonSchema({
"definitions" : {
"record:python.test.basic.basic" : {
"description" : "basic schema for tests",
"type" : "object",
"required" : [ "number", "name" ],
"properties" : {
"number" : {
"oneOf" : [ {
"type" : "integer"
}, {
"type" : "null"
} ]
},
"name" : {
"oneOf" : [ {
"type" : "string"
} ]
}
}
}
},
"$ref" : "#/definitions/record:python.test.basic.basic"
})
# Encode the record
basic_record = {
"number": 10,
"name": "a_name",
}
message_encoded = json_message_serializer.encode_record_with_schema(
"basic", json_schema, basic_record)
print(message_encoded)
# >>> b'\x00\x00\x00\x00\x02{"number": 10, "name": "a_name"}'
```
## When to use this library
Usually, we have a situation like this:
![Confluent Architecture](docs/img/confluent_architecture.png)
So, our producers/consumers have to serialize/deserialize messages every time they send to or receive from Kafka topics. In this picture, we can imagine a `Faust` application receiving messages encoded with an Avro schema that we want to deserialize, so we ask the `schema server` to do that for us. In this scenario, the `MessageSerializer` is a perfect fit.
There could also be a use case where we want an application solely to administer `Avro Schemas` (register them, update compatibility levels, delete old schemas, etc.); for that, the `SchemaRegistryClient` is a perfect fit.
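For that administrative scenario, a minimal sketch of what such an application could do; the subject name and compatibility level are placeholders, and method names such as `update_compatibility`, `get_schema`, and `delete_subject` follow the client API described in the documentation, so verify them against your installed version:
```python
from schema_registry.client import SchemaRegistryClient, schema

client = SchemaRegistryClient(url="http://127.0.0.1:8081")
subject = "test-deployment"  # placeholder subject

# register a (trivial) schema so the subject exists
schema_id = client.register(subject, schema.AvroSchema({"type": "string"}))

# tighten the compatibility level for this subject
client.update_compatibility("FULL", subject)

# fetch the latest registered version for inspection
latest = client.get_schema(subject)
print(latest)

# clean up old schemas by removing the subject entirely
client.delete_subject(subject)
```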
## Development
[Poetry](https://python-poetry.org/docs/) is needed to install the dependencies and develop locally
1. Install dependencies: `poetry install --all-extras`
2. Code linting: `./scripts/format`
3. Run tests: `./scripts/test`
For commit messages we use [commitizen](https://commitizen-tools.github.io/commitizen/) in order to standardize the commit message format
*Note*: The tests are run against the `Schema Server` using `docker compose`, so you will need
`Docker` and `Docker Compose` installed.
In a terminal run `docker-compose up`. Then in a different terminal run the tests:
```bash
./scripts/test
```
All additional args will be passed to pytest, for example:
```bash
./scripts/test ./tests/client/
```
### Tests using the python shell
To experiment from the python shell you can run the project with `docker-compose`:
1. Execute `docker-compose up`. The `schema registry server` will then be running on `http://127.0.0.1:8081` and you can interact with it using the `SchemaRegistryClient`.
1. Open a python interpreter (type `python` in your command line).
1. Play with the `schema server`:
```python
from schema_registry.client import SchemaRegistryClient, schema
client = SchemaRegistryClient(url="http://127.0.0.1:8081")
# do some operations with the client...
deployment_schema = {
"type": "record",
"namespace": "com.kubertenes",
"name": "AvroDeployment",
"fields": [
{"name": "image", "type": "string"},
{"name": "replicas", "type": "int"},
{"name": "port", "type": "int"},
],
}
avro_schema = schema.AvroSchema(deployment_schema)
client.register("test-deployment", avro_schema)
# >>>> Out[5]: 1
```
Then, you can check the schema in your browser by going to `http://127.0.0.1:8081/schemas/ids/1`
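The same lookup can also be done from the python shell with the client, a small sketch assuming the schema id `1` returned by the `register` call above:
```python
registered_schema = client.get_by_id(1)
print(registered_schema)
# prints the AvroDeployment schema registered above
```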
Raw data
{
"_id": null,
"home_page": null,
"name": "python-schema-registry-client",
"maintainer": null,
"docs_url": null,
"requires_python": "<4.0,>=3.8",
"maintainer_email": null,
"keywords": null,
"author": "Marcos Schroh",
"author_email": "schrohm@gmail.com",
"download_url": "https://files.pythonhosted.org/packages/4a/78/840c35e9f17b947ca5f3fee6de7ea1ac5ac7e6ff4963892e7b2d589f58f2/python_schema_registry_client-2.6.0.tar.gz",
"platform": null,
"description": "# Python Rest Client Schema Registry\n\n[![Python package](https://github.com/marcosschroh/python-schema-registry-client/actions/workflows/python-package.yml/badge.svg)](https://github.com/marcosschroh/python-schema-registry-client/actions/workflows/python-package.yml)\n[![GitHub license](https://img.shields.io/github/license/marcosschroh/python-schema-registry-client.svg)](https://github.com/marcosschroh/python-schema-registry-client/blob/master/LICENSE)\n[![codecov](https://codecov.io/gh/marcosschroh/python-schema-registry-client/branch/master/graph/badge.svg)](https://codecov.io/gh/marcosschroh/python-schema-registry-client)\n[![Python Version](https://img.shields.io/badge/python-3.8+-blue.svg)](https://img.shields.io/badge/python-3.8+-blue.svg)\n\nPython Rest Client to interact against [schema-registry](https://docs.confluent.io/current/schema-registry/index.html) confluent server to manage [Avro](https://docs.oracle.com/database/nosql-12.1.3.1/GettingStartedGuide/avroschemas.html) and [JSON](https://json-schema.org/) schemas resources.\n\n## Requirements\n\npython 3.8+\n\n## Installation\n\n```bash\npip install python-schema-registry-client\n```\n\nIf you want the `Faust` functionality:\n\n```bash\npip install python-schema-registry-client[faust]\n```\n\nNote that this will automatically add a dependency on the [faust-streaming](https://github.com/faust-streaming/faust) fork of faust. If you want to use the\nold faust version, simply install it manually and then install `python-schema-registry-client` without the `faust` extra enabled, the functionality will\nbe the same.\n\n## Client API, Serializer, Faust Integration and Schema Server description\n\n**Documentation**: [https://marcosschroh.github.io/python-schema-registry-client.io](https://marcosschroh.github.io/python-schema-registry-client)\n\n## Avro Schema Usage\n\n```python\nfrom schema_registry.client import SchemaRegistryClient, schema\n\nclient = SchemaRegistryClient(url=\"http://127.0.0.1:8081\")\n\ndeployment_schema = {\n \"type\": \"record\",\n \"namespace\": \"com.kubertenes\",\n \"name\": \"AvroDeployment\",\n \"fields\": [\n {\"name\": \"image\", \"type\": \"string\"},\n {\"name\": \"replicas\", \"type\": \"int\"},\n {\"name\": \"port\", \"type\": \"int\"},\n ],\n}\n\navro_schema = schema.AvroSchema(deployment_schema)\n\nschema_id = client.register(\"test-deployment\", avro_schema)\n```\n\nor async\n\n```python\nfrom schema_registry.client import AsyncSchemaRegistryClient, schema\n\nasync_client = AsyncSchemaRegistryClient(url=\"http://127.0.0.1:8081\")\n\ndeployment_schema = {\n \"type\": \"record\",\n \"namespace\": \"com.kubertenes\",\n \"name\": \"AvroDeployment\",\n \"fields\": [\n {\"name\": \"image\", \"type\": \"string\"},\n {\"name\": \"replicas\", \"type\": \"int\"},\n {\"name\": \"port\", \"type\": \"int\"},\n ],\n}\n\navro_schema = schema.AvroSchema(deployment_schema)\n\nschema_id = await async_client.register(\"test-deployment\", avro_schema)\n```\n\n## JSON Schema Usage\n\n```python\nfrom schema_registry.client import SchemaRegistryClient, schema\n\nclient = SchemaRegistryClient(url=\"http://127.0.0.1:8081\")\n\ndeployment_schema = {\n \"definitions\" : {\n \"JsonDeployment\" : {\n \"type\" : \"object\",\n \"required\" : [\"image\", \"replicas\", \"port\"],\n \"properties\" : {\n \"image\" : {\"type\" : \"string\"},\n \"replicas\" : {\"type\" : \"integer\"},\n \"port\" : {\"type\" : \"integer\"}\n }\n }\n },\n \"$ref\" : \"#/definitions/JsonDeployment\"\n}\n\njson_schema = 
schema.JsonSchema(deployment_schema)\n\nschema_id = client.register(\"test-deployment\", json_schema)\n```\n\nor async\n\n```python\nfrom schema_registry.client import AsyncSchemaRegistryClient, schema\n\nasync_client = AsyncSchemaRegistryClient(url=\"http://127.0.0.1:8081\")\n\ndeployment_schema = {\n \"definitions\" : {\n \"JsonDeployment\" : {\n \"type\" : \"object\",\n \"required\" : [\"image\", \"replicas\", \"port\"],\n \"properties\" : {\n \"image\" : {\"type\" : \"string\"},\n \"replicas\" : {\"type\" : \"integer\"},\n \"port\" : {\"type\" : \"integer\"}\n }\n }\n },\n \"$ref\" : \"#/definitions/JsonDeployment\"\n}\n\njson_schema = schema.JsonSchema(deployment_schema)\n\nschema_id = await async_client.register(\"test-deployment\", json_schema)\n```\n\n## Usage with dataclasses-avroschema for avro schemas\n\nYou can generate the `avro schema` directely from a python class using [dataclasses-avroschema](https://github.com/marcosschroh/dataclasses-avroschema)\nand use it in the API for `register schemas`, `check versions` and `test compatibility`:\n\n```python\nimport dataclasses\n\nfrom dataclasses_avroschema import AvroModel, types\n\nfrom schema_registry.client import SchemaRegistryClient\n\nclient = SchemaRegistryClient(url=\"http://127.0.0.1:8081\")\n\n\n@dataclasses.dataclass\nclass UserAdvance(AvroModel):\n name: str\n age: int\n pets: typing.List[str] = dataclasses.field(default_factory=lambda: [\"dog\", \"cat\"])\n accounts: typing.Dict[str, int] = dataclasses.field(default_factory=lambda: {\"key\": 1})\n has_car: bool = False\n favorite_colors: types.Enum = types.Enum([\"BLUE\", \"YELLOW\", \"GREEN\"], default=\"BLUE\")\n country: str = \"Argentina\"\n address: str = None\n\n# register the schema\nschema_id = client.register(subject, UserAdvance.avro_schema())\n\nprint(schema_id)\n# >>> 12\n\nresult = client.check_version(subject, UserAdvance.avro_schema())\nprint(result)\n# >>> SchemaVersion(subject='dataclasses-avroschema-subject-2', schema_id=12, schema=1, version={\"type\":\"record\" ...')\n\ncompatibility = client.test_compatibility(subject, UserAdvance.avro_schema())\nprint(compatibility)\n\n# >>> True\n```\n\n## Usage with pydantic for json schemas\n\nYou can generate the json schema directely from a python class using pydantic and use it in the API for register schemas, check versions and test compatibility:\n\n```python\nimport typing\n\nfrom enum import Enum\n\nfrom pydantic import BaseModel\n\nfrom schema_registry.client import SchemaRegistryClient\n\nclient = SchemaRegistryClient(url=\"http://127.0.0.1:8081\")\n\nclass ColorEnum(str, Enum):\n BLUE = \"BLUE\"\n YELLOW = \"YELLOW\"\n GREEN = \"GREEN\"\n\n\nclass UserAdvance(BaseModel):\n name: str\n age: int\n pets: typing.List[str] = [\"dog\", \"cat\"]\n accounts: typing.Dict[str, int] = {\"key\": 1}\n has_car: bool = False\n favorite_colors: ColorEnum = ColorEnum.BLUE\n country: str = \"Argentina\"\n address: str = None\n\n# register the schema\nschema_id = client.register(subject, UserAdvance.model_json_schema(), schema_type=\"JSON\")\n\nprint(schema_id)\n# >>> 12\n\nresult = client.check_version(subject, UserAdvance.model_json_schema(), schema_type=\"JSON\")\nprint(result)\n# >>> SchemaVersion(subject='pydantic-jsonschema-subject', schema_id=12, schema=1, version=<schema_registry.client.schema.JsonSchema object at 0x7f40354550a0>)\n\ncompatibility = client.test_compatibility(subject, UserAdvance.model_json_schema(), schema_type=\"JSON\")\nprint(compatibility)\n\n# >>> True\n```\n\n## Serializers\n\nYou can use 
`AvroMessageSerializer` to encode/decode messages in `avro`\n\n```python\nfrom schema_registry.client import SchemaRegistryClient, schema\nfrom schema_registry.serializers import AvroMessageSerializer\n\n\nclient = SchemaRegistryClient(\"http://127.0.0.1:8081\")\navro_message_serializer = AvroMessageSerializer(client)\n\navro_user_schema = schema.AvroSchema({\n \"type\": \"record\",\n \"namespace\": \"com.example\",\n \"name\": \"AvroUsers\",\n \"fields\": [\n {\"name\": \"first_name\", \"type\": \"string\"},\n {\"name\": \"last_name\", \"type\": \"string\"},\n {\"name\": \"age\", \"type\": \"int\"},\n\n ],\n})\n\n# We want to encode the user_record with avro_user_schema\nuser_record = {\n \"first_name\": \"my_first_name\",\n \"last_name\": \"my_last_name\",\n \"age\": 20,\n}\n\n# Encode the record\nmessage_encoded = avro_message_serializer.encode_record_with_schema(\n \"user\", avro_user_schema, user_record)\n\nprint(message_encoded)\n# >>> b'\\x00\\x00\\x00\\x00\\x01\\x1amy_first_name\\x18my_last_name('\n```\n\nor with `json schemas`\n\n```python\nfrom schema_registry.client import SchemaRegistryClient, schema\nfrom schema_registry.serializers import JsonMessageSerializer\n\n\nclient = SchemaRegistryClient(\"http://127.0.0.1:8081\")\njson_message_serializer = JsonMessageSerializer(client)\n\njson_schema = schema.JsonSchema({\n \"definitions\" : {\n \"record:python.test.basic.basic\" : {\n \"description\" : \"basic schema for tests\",\n \"type\" : \"object\",\n \"required\" : [ \"number\", \"name\" ],\n \"properties\" : {\n \"number\" : {\n \"oneOf\" : [ {\n \"type\" : \"integer\"\n }, {\n \"type\" : \"null\"\n } ]\n },\n \"name\" : {\n \"oneOf\" : [ {\n \"type\" : \"string\"\n } ]\n }\n }\n }\n },\n \"$ref\" : \"#/definitions/record:python.test.basic.basic\"\n})\n\n# Encode the record\nbasic_record = {\n \"number\": 10,\n \"name\": \"a_name\",\n}\n\nmessage_encoded = json_message_serializer.encode_record_with_schema(\n \"basic\", json_schema, basic_record)\n\nprint(message_encoded)\n# >>> b'\\x00\\x00\\x00\\x00\\x02{\"number\": 10, \"name\": \"a_name\"}'\n```\n\n## When use this library\n\nUsually, we have a situation like this:\n\n![Confluent Architecture](docs/img/confluent_architecture.png)\n\nSo, our producers/consumers have to serialize/deserialize messages every time that they send/receive from Kafka topics. In this picture, we can imagine a `Faust` application receiving messages (encoded with an Avro schema) and we want to deserialize them, so we can ask the `schema server` to do that for us. In this scenario, the `MessageSerializer` is perfect.\n\nAlso, could be a use case that we would like to have an Application only to administrate `Avro Schemas` (register, update compatibilities, delete old schemas, etc.), so the `SchemaRegistryClient` is perfect.\n\n## Development\n\n[Poetry](https://python-poetry.org/docs/) is needed to install the dependencies and develope locally\n\n1. Install dependencies: `poetry install --all-extras`\n2. Code linting: `./scripts/format`\n3. Run tests: `./scripts/test`\n\nFor commit messages we use [commitizen](https://commitizen-tools.github.io/commitizen/) in order to standardize a way of committing rules\n\n*Note*: The tests are run against the `Schema Server` using `docker compose`, so you will need\n`Docker` and `Docker Compose` installed.\n\nIn a terminal run `docker-compose up`. 
Then in a different terminal run the tests:\n\n```bash\n./scripts/test\n```\n\nAll additional args will be passed to pytest, for example:\n\n```bash\n./scripts/test ./tests/client/\n```\n\n### Tests usind the python shell\n\nTo perform tests using the python shell you can run the project using `docker-compose`.\n\n1. Execute `docker-compose up`. Then, the `schema registry server` will run on `http://127.0.0.1:8081`, then you can interact against it using the `SchemaRegistryClient`:\n1. Use the python interpreter (get a python shell typing `python` in your command line)\n1. Play with the `schema server`\n\n```python\nfrom schema_registry.client import SchemaRegistryClient, schema\n\nclient = SchemaRegistryClient(url=\"http://127.0.0.1:8081\")\n\n# do some operations with the client...\ndeployment_schema = {\n \"type\": \"record\",\n \"namespace\": \"com.kubertenes\",\n \"name\": \"AvroDeployment\",\n \"fields\": [\n {\"name\": \"image\", \"type\": \"string\"},\n {\"name\": \"replicas\", \"type\": \"int\"},\n {\"name\": \"port\", \"type\": \"int\"},\n ],\n}\n\navro_schema = schema.AvroSchema(deployment_schema)\nclient.register(\"test-deployment\", avro_schema)\n# >>>> Out[5]: 1\n```\n\nThen, you can check the schema using your browser going to the url `http://127.0.0.1:8081/schemas/ids/1`\n",
"bugtrack_url": null,
"license": "MIT",
"summary": "Python Rest Client to interact against Schema Registry confluent server",
"version": "2.6.0",
"project_urls": null,
"split_keywords": [],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "0e2fa472b6202071dec88154901d975b91491708ba8b3587c55c6eafc1977751",
"md5": "1e35e27efb14cefde649cfb37096870e",
"sha256": "a0688a9cd6e2a616a79fb46a6615c531cd5f9e2a5145f5c95932f792417731cb"
},
"downloads": -1,
"filename": "python_schema_registry_client-2.6.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "1e35e27efb14cefde649cfb37096870e",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": "<4.0,>=3.8",
"size": 22964,
"upload_time": "2024-06-19T07:20:08",
"upload_time_iso_8601": "2024-06-19T07:20:08.123642Z",
"url": "https://files.pythonhosted.org/packages/0e/2f/a472b6202071dec88154901d975b91491708ba8b3587c55c6eafc1977751/python_schema_registry_client-2.6.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "4a78840c35e9f17b947ca5f3fee6de7ea1ac5ac7e6ff4963892e7b2d589f58f2",
"md5": "64cb322d1f92cbe5265d67f74c5566d8",
"sha256": "e5495899c2bf4fd33bc6689a2068a3423dc94875677f3fd343b6e492a7877ba0"
},
"downloads": -1,
"filename": "python_schema_registry_client-2.6.0.tar.gz",
"has_sig": false,
"md5_digest": "64cb322d1f92cbe5265d67f74c5566d8",
"packagetype": "sdist",
"python_version": "source",
"requires_python": "<4.0,>=3.8",
"size": 21745,
"upload_time": "2024-06-19T07:20:12",
"upload_time_iso_8601": "2024-06-19T07:20:12.635085Z",
"url": "https://files.pythonhosted.org/packages/4a/78/840c35e9f17b947ca5f3fee6de7ea1ac5ac7e6ff4963892e7b2d589f58f2/python_schema_registry_client-2.6.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-06-19 07:20:12",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "python-schema-registry-client"
}