confluent-kafka

- Name: confluent-kafka
- Version: 2.12.1
- Summary: Confluent's Python client for Apache Kafka
- Upload time: 2025-10-21 20:49:58
- Requires Python: >=3.8
- Requirements: none recorded
# Confluent Python Client for Apache Kafka

[![Try Confluent Cloud - The Data Streaming Platform](https://images.ctfassets.net/8vofjvai1hpv/10bgcSfn5MzmvS4nNqr94J/af43dd2336e3f9e0c0ca4feef4398f6f/confluent-banner-v2.svg)](https://confluent.cloud/signup?utm_source=github&utm_medium=banner&utm_campaign=tm.plg.cflt-oss-repos&utm_term=confluent-kafka-python)

Confluent's Python Client for Apache Kafka<sup>TM</sup>
=======================================================

**confluent-kafka-python** provides a high-level `Producer`, `Consumer` and `AdminClient` compatible with all [Apache Kafka™](http://kafka.apache.org/) brokers >= v0.8, [Confluent Cloud](https://www.confluent.io/confluent-cloud/) and [Confluent Platform](https://www.confluent.io/product/compare/).

**Recommended for Production:** While this client works with any Kafka deployment, it's optimized for and fully supported with [Confluent Cloud](https://www.confluent.io/confluent-cloud/) (fully managed) and [Confluent Platform](https://www.confluent.io/product/compare/) (self-managed), which provide enterprise-grade security, monitoring, and support.

## Why Choose Confluent's Python Client?

Unlike the basic Apache Kafka Python client, `confluent-kafka-python` provides:

- **Production-Ready Performance**: Built on `librdkafka` (C library) for maximum throughput and minimal latency, significantly outperforming pure Python implementations.
- **Enterprise Features**: Schema Registry integration, transactions, exactly-once semantics, and advanced serialization support out of the box.
- **AsyncIO Support**: Native async/await support for modern Python applications - not available in the Apache Kafka client.
- **Comprehensive Serialization**: Built-in Avro, Protobuf, and JSON Schema support with automatic schema evolution handling.
- **Professional Support**: Backed by Confluent's engineering team with enterprise SLAs and 24/7 support options.
- **Active Development**: Continuously updated with the latest Kafka features and performance optimizations.
- **Battle-Tested**: Used by thousands of organizations in production, from startups to Fortune 500 companies.

**Performance Note:** The Apache Kafka Python client (`kafka-python`) is a pure Python implementation that, while functional, has significant performance limitations for high-throughput production use cases. `confluent-kafka-python` leverages the same high-performance C library (`librdkafka`) used by Confluent's other clients, providing enterprise-grade performance and reliability.

## Key Features

- **High Performance & Reliability**: Built on [`librdkafka`](https://github.com/confluentinc/librdkafka), the battle-tested C client for Apache Kafka, ensuring maximum throughput, low latency, and stability. The client is supported by Confluent and is trusted in mission-critical production environments.
- **Comprehensive Kafka Support**: Full support for the Kafka protocol, transactions, and administration APIs.
- **Experimental AsyncIO Producer**: An experimental fully asynchronous producer (`AIOProducer`) for seamless integration with modern Python applications using `asyncio`.
- **Seamless Schema Registry Integration**: Synchronous and asynchronous clients for Confluent Schema Registry to handle schema management and serialization (Avro, Protobuf, JSON Schema).
- **Improved Error Handling**: Detailed, context-aware error messages and exceptions to speed up debugging and troubleshooting.
- **[Confluent Cloud] Automatic Zone Detection**: Producers automatically connect to brokers in the same availability zone, reducing latency and data transfer costs without requiring manual configuration.
- **[Confluent Cloud] Simplified Configuration Profiles**: Pre-defined configuration profiles optimized for common use cases like high throughput or low latency, simplifying client setup.
- **Enterprise Support**: Backed by Confluent's expert support team with SLAs and 24/7 assistance for production deployments.

## Usage

For a step-by-step guide on using the client, see [Getting Started with Apache Kafka and Python](https://developer.confluent.io/get-started/python/).

### Choosing Your Kafka Deployment

- **[Confluent Cloud](https://www.confluent.io/confluent-cloud/)** - Fully managed service with automatic scaling, security, and monitoring. Best for teams wanting to focus on applications rather than infrastructure.
- **[Confluent Platform](https://www.confluent.io/product/compare/)** - Self-managed deployment with enterprise features, support, and tooling. Ideal for on-premises or hybrid cloud requirements.
- **Apache Kafka** - Open source deployment. Requires manual setup, monitoring, and maintenance.

Additional examples can be found in the [examples](examples) directory or the [confluentinc/examples](https://github.com/confluentinc/examples/tree/master/clients/cloud/python) GitHub repo, which include demonstrations of:

- Exactly once data processing using the transactional API.
- Integration with asyncio.
- (De)serializing Protobuf, JSON, and Avro data with Confluent Schema Registry integration.
- [Confluent Cloud](https://www.confluent.io/confluent-cloud/) configuration.

Also see the [Python client docs](https://docs.confluent.io/kafka-clients/python/current/overview.html) and the [API reference](https://docs.confluent.io/kafka-clients/python/current/).

Finally, the [tests](tests) are useful as a reference for example usage.

### AsyncIO Producer (experimental)

Use the AsyncIO `Producer` inside async applications to avoid blocking the event loop.

```python
import asyncio
from confluent_kafka.experimental.aio import AIOProducer

async def main():
    p = AIOProducer({"bootstrap.servers": "mybroker"})
    try:
        # produce() returns a Future; first await the coroutine to get the Future,
        # then await the Future to get the delivered Message.
        delivery_future = await p.produce("mytopic", value=b"hello")
        delivered_msg = await delivery_future
        # Optionally flush any remaining buffered messages before shutdown
        await p.flush()
    finally:
        await p.close()

asyncio.run(main())
```

Notes:

- Batched async produce buffers messages; delivery callbacks, stats, errors, and logger run on the event loop.
- Per-message headers are not supported in the batched async path. If headers are required, use the synchronous `Producer.produce(...)` (you can offload to a thread in async apps).

For a more detailed example that includes both an async producer and consumer, see
[`examples/asyncio_example.py`](examples/asyncio_example.py).

**Architecture:** For implementation details and component architecture, see the [AIOProducer Architecture Overview](aio_producer_simple_diagram.md).

#### When to use AsyncIO vs synchronous Producer

- **Use AsyncIO `Producer`** when your code runs under an event loop (FastAPI/Starlette, aiohttp, Sanic, asyncio workers) and must not block.
- **Use synchronous `Producer`** for scripts, batch jobs, and highest-throughput pipelines where you control threads/processes and can call `poll()`/`flush()` directly.
- **In async servers**, prefer AsyncIO `Producer`; if you need headers, call sync `produce()` via `run_in_executor` for that path.
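The thread-offload pattern mentioned above can be sketched with plain `asyncio`. In the sketch below, `blocking_produce` is a hypothetical stand-in for a synchronous `Producer.produce()` call with headers; it is not part of the library API:

```python
import asyncio

# Hypothetical stand-in for a blocking call such as the synchronous
# Producer.produce(topic, value, headers=headers) followed by poll().
def blocking_produce(topic: str, value: bytes, headers: list) -> str:
    return f"delivered to {topic}"

async def produce_with_headers(topic: str, value: bytes, headers: list) -> str:
    loop = asyncio.get_running_loop()
    # Offload the blocking call to the default executor so the event loop
    # stays responsive while the produce call runs.
    return await loop.run_in_executor(None, blocking_produce, topic, value, headers)

result = asyncio.run(produce_with_headers("mytopic", b"hello", [("trace-id", b"abc")]))
print(result)  # delivered to mytopic
```

In a real async server you would replace `blocking_produce` with a call into a shared synchronous `Producer` instance.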

#### AsyncIO with Schema Registry

The AsyncIO producer and consumer integrate seamlessly with async Schema Registry serializers. See the [Schema Registry Integration](#schema-registry-integration) section below for full details.

### Basic Producer example

```python
from confluent_kafka import Producer

p = Producer({'bootstrap.servers': 'mybroker1,mybroker2'})

def delivery_report(err, msg):
    """ Called once for each message produced to indicate delivery result.
        Triggered by poll() or flush()."""
    if err is not None:
        print('Message delivery failed: {}'.format(err))
    else:
        print('Message delivered to {} [{}]'.format(msg.topic(), msg.partition()))

for data in some_data_source:
    # Trigger any available delivery report callbacks from previous produce() calls
    p.poll(0)

    # Asynchronously produce a message. The delivery report callback will
    # be triggered from the call to poll() above, or flush() below, when the
    # message has been successfully delivered or failed permanently.
    p.produce('mytopic', data.encode('utf-8'), callback=delivery_report)

# Wait for any outstanding messages to be delivered and delivery report
# callbacks to be triggered.
p.flush()
```

For a discussion on the poll based producer API, refer to the
[Integrating Apache Kafka With Python Asyncio Web Applications](https://www.confluent.io/blog/kafka-python-asyncio-integration/)
blog post.
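The poll-based API is typically bridged to `asyncio` by resolving a Future from the delivery callback. A minimal stdlib sketch of that pattern, where `fake_produce` is a hypothetical stand-in for `Producer.produce()` that invokes the callback immediately rather than from `poll()`:

```python
import asyncio

# Hypothetical stand-in for Producer.produce(); a real client invokes the
# delivery callback later, from poll()/flush().
def fake_produce(topic, value, callback):
    callback(None, f"{topic}:{value!r}")

def produce_awaitable(topic, value):
    loop = asyncio.get_running_loop()
    fut = loop.create_future()

    def on_delivery(err, msg):
        # Delivery callbacks fire on the client's internal thread, so hop
        # back onto the event loop before touching the Future.
        if err is not None:
            loop.call_soon_threadsafe(fut.set_exception, RuntimeError(str(err)))
        else:
            loop.call_soon_threadsafe(fut.set_result, msg)

    fake_produce(topic, value, on_delivery)
    return fut

async def main():
    return await produce_awaitable("mytopic", b"hello")

delivered = asyncio.run(main())
print(delivered)
```

This is the same callback-to-Future bridge the blog post develops in more depth.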

### Schema Registry Integration

This client provides full integration with Schema Registry for schema management and message serialization, and is compatible with both [Confluent Platform](https://docs.confluent.io/platform/current/schema-registry/index.html) and [Confluent Cloud](https://docs.confluent.io/cloud/current/sr/index.html). Both synchronous and asynchronous clients are available.

#### Learn more

- [Getting Started with Apache Kafka and Python](https://developer.confluent.io/get-started/python/) – step-by-step course with videos and labs.
- [Schema Registry Basics](https://developer.confluent.io/learn-kafka/stream-processing/sr-schema-registry/) – short tutorial explaining subjects, compatibility, and serialization patterns.

#### Synchronous Client & Serializers

Use the synchronous `SchemaRegistryClient` with the standard `Producer` and `Consumer`.

```python
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import StringSerializer, SerializationContext, MessageField

# 1. Configure Schema Registry client
schema_registry_conf = {'url': 'http://localhost:8081'}  # Confluent Platform
# For Confluent Cloud, add: 'basic.auth.user.info': '<sr-api-key>:<sr-api-secret>'
# See: https://docs.confluent.io/cloud/current/sr/index.html
schema_registry_client = SchemaRegistryClient(schema_registry_conf)

# 2. Configure AvroSerializer
avro_serializer = AvroSerializer(schema_registry_client,
                                 user_schema_str,
                                 lambda user, ctx: user.to_dict())

# 3. Configure Producer (the plain Producer does not accept
# 'key.serializer'/'value.serializer' config; call the serializers directly)
producer_conf = {'bootstrap.servers': 'localhost:9092'}
producer = Producer(producer_conf)

# 4. Serialize and produce messages
string_serializer = StringSerializer('utf_8')
producer.produce('my-topic',
                 key=string_serializer('user1'),
                 value=avro_serializer(some_user_object,
                                       SerializationContext('my-topic', MessageField.VALUE)))
producer.flush()
```

#### Asynchronous Client & Serializers (AsyncIO)

Use the `AsyncSchemaRegistryClient` and `Async` serializers with `AIOProducer` and `AIOConsumer`. The configuration is the same as the synchronous client.

```python
from confluent_kafka.experimental.aio import AIOProducer
from confluent_kafka.schema_registry import AsyncSchemaRegistryClient
from confluent_kafka.schema_registry._async.avro import AsyncAvroSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

# Setup async Schema Registry client and serializer
# (See configuration options in the synchronous example above)
schema_registry_conf = {'url': 'http://localhost:8081'}
schema_client = AsyncSchemaRegistryClient(schema_registry_conf)
serializer = await AsyncAvroSerializer(schema_client, schema_str=avro_schema)

# Use with AsyncIO producer
producer = AIOProducer({"bootstrap.servers": "localhost:9092"})
serialized_value = await serializer(data, SerializationContext("topic", MessageField.VALUE))
delivery_future = await producer.produce("topic", value=serialized_value)
```

Available async serializers: `AsyncAvroSerializer`, `AsyncJSONSerializer`, `AsyncProtobufSerializer` (and corresponding deserializers).

See also:

- Example: [`examples/asyncio_avro_producer.py`](examples/asyncio_avro_producer.py)

#### Import paths

```python
from confluent_kafka.schema_registry._async.avro import AsyncAvroSerializer, AsyncAvroDeserializer
from confluent_kafka.schema_registry._async.json_schema import AsyncJSONSerializer, AsyncJSONDeserializer
from confluent_kafka.schema_registry._async.protobuf import AsyncProtobufSerializer, AsyncProtobufDeserializer
```

**Client-Side Field Level Encryption (CSFLE):** To use Data Contracts rules (including CSFLE), install the `rules` extra (see Install section), and refer to the encryption examples in [`examples/README.md`](examples/README.md). For CSFLE-specific guidance, see the [Confluent Cloud CSFLE documentation](https://docs.confluent.io/cloud/current/security/encrypt/csfle/overview.html).

**Note:** The async Schema Registry interface mirrors the synchronous client: same configuration options and same calling patterns. Simply add `await` to method calls and use the `Async`-prefixed classes.

#### Troubleshooting

- **401/403 Unauthorized when using Confluent Cloud:** Verify your `basic.auth.user.info` (SR API key/secret) is correct and that the Schema Registry URL is for your specific cluster. Ensure you are using an SR API key, not a Kafka API key.
- **Schema not found:** Check that your `subject.name.strategy` configuration matches how your schemas are registered in Schema Registry, and that the topic and message field (key/value) pairing is correct.
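To make the strategy check concrete, the three standard subject name strategies derive subjects as follows. `subject_name` below is an illustrative helper, not part of the library API:

```python
# Illustrative helper (not confluent-kafka API) showing how the three
# standard subject name strategies derive the Schema Registry subject.
def subject_name(strategy: str, topic: str, record_name: str, field: str = "value") -> str:
    if strategy == "TopicNameStrategy":          # the default
        return f"{topic}-{field}"
    if strategy == "RecordNameStrategy":
        return record_name
    if strategy == "TopicRecordNameStrategy":
        return f"{topic}-{record_name}"
    raise ValueError(f"unknown strategy: {strategy}")

print(subject_name("TopicNameStrategy", "orders", "com.example.Order"))        # orders-value
print(subject_name("RecordNameStrategy", "orders", "com.example.Order"))       # com.example.Order
print(subject_name("TopicRecordNameStrategy", "orders", "com.example.Order"))  # orders-com.example.Order
```

If a producer registered a schema under `orders-value` but the consumer is configured for `RecordNameStrategy`, it will look up `com.example.Order` and fail with "schema not found".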

### Basic Consumer example

```python
from confluent_kafka import Consumer

c = Consumer({
    'bootstrap.servers': 'mybroker',
    'group.id': 'mygroup',
    'auto.offset.reset': 'earliest'
})

c.subscribe(['mytopic'])

while True:
    msg = c.poll(1.0)

    if msg is None:
        continue
    if msg.error():
        print("Consumer error: {}".format(msg.error()))
        continue

    print('Received message: {}'.format(msg.value().decode('utf-8')))

c.close()
```

### Basic AdminClient example

Create topics:

```python
from confluent_kafka.admin import AdminClient, NewTopic

a = AdminClient({'bootstrap.servers': 'mybroker'})

new_topics = [NewTopic(topic, num_partitions=3, replication_factor=1) for topic in ["topic1", "topic2"]]
# Note: In a multi-cluster production scenario, it is more typical to use a replication_factor of 3 for durability.

# Call create_topics to asynchronously create topics. A dict
# of <topic,future> is returned.
fs = a.create_topics(new_topics)

# Wait for each operation to finish.
for topic, f in fs.items():
    try:
        f.result()  # The result itself is None
        print("Topic {} created".format(topic))
    except Exception as e:
        print("Failed to create topic {}: {}".format(topic, e))
```

## Thread safety

The `Producer`, `Consumer`, and `AdminClient` are all thread safe.

## Install

```bash
# Basic installation
pip install confluent-kafka

# With Schema Registry support
pip install "confluent-kafka[avro,schemaregistry]"     # Avro
pip install "confluent-kafka[json,schemaregistry]"     # JSON Schema  
pip install "confluent-kafka[protobuf,schemaregistry]" # Protobuf

# With Data Contract rules (includes CSFLE support)
pip install "confluent-kafka[avro,schemaregistry,rules]"
```

**Note:** Pre-built Linux wheels do not include SASL Kerberos/GSSAPI support. For Kerberos, see the source installation instructions in [INSTALL.md](INSTALL.md).

**Install from source**

For source install, see the *Install from source* section in [INSTALL.md](INSTALL.md).

## Broker compatibility

The Python client (as well as the underlying C library librdkafka) supports
all broker versions >= 0.8.
However, with broker versions 0.8 and 0.9 the Kafka protocol gives a client no safe way
to determine which protocol version the broker actually supports, so you need to hint
the Python client at the protocol version it may use. This is done through two
configuration settings:

- `broker.version.fallback=YOUR_BROKER_VERSION` (default 0.9.0.1)
- `api.version.request=true|false` (default true)

When using a Kafka 0.10 broker or later, you don't need to do anything.
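For example, a configuration for a hypothetical pre-0.10 broker running 0.8.2.1 might look like:

```python
# Example configuration for an older (pre-0.10) broker; the broker address
# and version are assumptions for illustration.
conf = {
    'bootstrap.servers': 'mybroker',
    'api.version.request': False,         # brokers < 0.10 cannot answer version probes
    'broker.version.fallback': '0.8.2.1'  # tell the client what the broker speaks
}
```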

            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "confluent-kafka",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": null,
    "keywords": null,
    "author": null,
    "author_email": "\"Confluent Inc.\" <support@confluent.io>",
    "download_url": null,
    "platform": null,
    "description": "# Confluent Python Client for Apache Kafka\n\n[![Try Confluent Cloud - The Data Streaming Platform](https://images.ctfassets.net/8vofjvai1hpv/10bgcSfn5MzmvS4nNqr94J/af43dd2336e3f9e0c0ca4feef4398f6f/confluent-banner-v2.svg)](https://confluent.cloud/signup?utm_source=github&utm_medium=banner&utm_campaign=tm.plg.cflt-oss-repos&utm_term=confluent-kafka-python)\n\nConfluent's Python Client for Apache Kafka<sup>TM</sup>\n=======================================================\n\n**confluent-kafka-python** provides a high-level `Producer`, `Consumer` and `AdminClient` compatible with all [Apache Kafka\u2122](http://kafka.apache.org/) brokers >= v0.8, [Confluent Cloud](https://www.confluent.io/confluent-cloud/) and [Confluent Platform](https://www.confluent.io/product/compare/).\n\n**Recommended for Production:** While this client works with any Kafka deployment, it's optimized for and fully supported with [Confluent Cloud](https://www.confluent.io/confluent-cloud/) (fully managed) and [Confluent Platform](https://www.confluent.io/product/compare/) (self-managed), which provide enterprise-grade security, monitoring, and support.\n\n## Why Choose Confluent's Python Client?\n\nUnlike the basic Apache Kafka Python client, `confluent-kafka-python` provides:\n\n- **Production-Ready Performance**: Built on `librdkafka` (C library) for maximum throughput and minimal latency, significantly outperforming pure Python implementations.\n- **Enterprise Features**: Schema Registry integration, transactions, exactly-once semantics, and advanced serialization support out of the box.\n- **AsyncIO Support**: Native async/await support for modern Python applications - not available in the Apache Kafka client.\n- **Comprehensive Serialization**: Built-in Avro, Protobuf, and JSON Schema support with automatic schema evolution handling.\n- **Professional Support**: Backed by Confluent's engineering team with enterprise SLAs and 24/7 support options.\n- **Active 
Development**: Continuously updated with the latest Kafka features and performance optimizations.\n- **Battle-Tested**: Used by thousands of organizations in production, from startups to Fortune 500 companies.\n\n**Performance Note:** The Apache Kafka Python client (`kafka-python`) is a pure Python implementation that, while functional, has significant performance limitations for high-throughput production use cases. `confluent-kafka-python` leverages the same high-performance C library (`librdkafka`) used by Confluent's other clients, providing enterprise-grade performance and reliability.\n\n## Key Features\n\n- **High Performance & Reliability**: Built on [`librdkafka`](https://github.com/confluentinc/librdkafka), the battle-tested C client for Apache Kafka, ensuring maximum throughput, low latency, and stability. The client is supported by Confluent and is trusted in mission-critical production environments.\n- **Comprehensive Kafka Support**: Full support for the Kafka protocol, transactions, and administration APIs.\n- **Experimental; AsyncIO Producer**: An experimental fully asynchronous producer (`AIOProducer`) for seamless integration with modern Python applications using `asyncio`.\n- **Seamless Schema Registry Integration**: Synchronous and asynchronous clients for Confluent Schema Registry to handle schema management and serialization (Avro, Protobuf, JSON Schema).\n- **Improved Error Handling**: Detailed, context-aware error messages and exceptions to speed up debugging and troubleshooting.\n- **[Confluent Cloud] Automatic Zone Detection**: Producers automatically connect to brokers in the same availability zone, reducing latency and data transfer costs without requiring manual configuration.\n- **[Confluent Cloud] Simplified Configuration Profiles**: Pre-defined configuration profiles optimized for common use cases like high throughput or low latency, simplifying client setup.\n- **Enterprise Support**: Backed by Confluent's expert support team with 
SLAs and 24/7 assistance for production deployments.\n\n## Usage\n\nFor a step-by-step guide on using the client, see [Getting Started with Apache Kafka and Python](https://developer.confluent.io/get-started/python/).\n\n### Choosing Your Kafka Deployment\n\n- **[Confluent Cloud](https://www.confluent.io/confluent-cloud/)** - Fully managed service with automatic scaling, security, and monitoring. Best for teams wanting to focus on applications rather than infrastructure.\n- **[Confluent Platform](https://www.confluent.io/product/compare/)** - Self-managed deployment with enterprise features, support, and tooling. Ideal for on-premises or hybrid cloud requirements.\n- **Apache Kafka** - Open source deployment. Requires manual setup, monitoring, and maintenance.\n\nAdditional examples can be found in the [examples](examples) directory or the [confluentinc/examples](https://github.com/confluentinc/examples/tree/master/clients/cloud/python) GitHub repo, which include demonstrations of:\n\n- Exactly once data processing using the transactional API.\n- Integration with asyncio.\n- (De)serializing Protobuf, JSON, and Avro data with Confluent Schema Registry integration.\n- [Confluent Cloud](https://www.confluent.io/confluent-cloud/) configuration.\n\nAlso see the [Python client docs](https://docs.confluent.io/kafka-clients/python/current/overview.html) and the [API reference](https://docs.confluent.io/kafka-clients/python/current/).\n\nFinally, the [tests](tests) are useful as a reference for example usage.\n### AsyncIO Producer (experimental)\n\nUse the AsyncIO `Producer` inside async applications to avoid blocking the event loop.\n\n```python\nimport asyncio\nfrom confluent_kafka.experimental.aio import AIOProducer\n\nasync def main():\n    p = AIOProducer({\"bootstrap.servers\": \"mybroker\"})\n    try:\n        # produce() returns a Future; first await the coroutine to get the Future,\n        # then await the Future to get the delivered Message.\n        
delivery_future = await p.produce(\"mytopic\", value=b\"hello\")\n        delivered_msg = await delivery_future\n        # Optionally flush any remaining buffered messages before shutdown\n        await p.flush()\n    finally:\n        await p.close()\n\nasyncio.run(main())\n```\n\nNotes:\n\n- Batched async produce buffers messages; delivery callbacks, stats, errors, and logger run on the event loop.\n- Per-message headers are not supported in the batched async path. If headers are required, use the synchronous `Producer.produce(...)` (you can offload to a thread in async apps).\n\nFor a more detailed example that includes both an async producer and consumer, see\n[`examples/asyncio_example.py`](examples/asyncio_example.py).\n\n**Architecture:** For implementation details and component architecture, see the [AIOProducer Architecture Overview](aio_producer_simple_diagram.md).\n\n#### When to use AsyncIO vs synchronous Producer\n\n- **Use AsyncIO `Producer`** when your code runs under an event loop (FastAPI/Starlette, aiohttp, Sanic, asyncio workers) and must not block.\n- **Use synchronous `Producer`** for scripts, batch jobs, and highest-throughput pipelines where you control threads/processes and can call `poll()`/`flush()` directly.\n- **In async servers**, prefer AsyncIO `Producer`; if you need headers, call sync `produce()` via `run_in_executor` for that path.\n\n#### AsyncIO with Schema Registry\n\nThe AsyncIO producer and consumer integrate seamlessly with async Schema Registry serializers. 
See the [Schema Registry Integration](#schema-registry-integration) section below for full details.\n\n### Basic Producer example\n\n```python\nfrom confluent_kafka import Producer\n\np = Producer({'bootstrap.servers': 'mybroker1,mybroker2'})\n\ndef delivery_report(err, msg):\n    \"\"\" Called once for each message produced to indicate delivery result.\n        Triggered by poll() or flush().\"\"\"\n    if err is not None:\n        print('Message delivery failed: {}'.format(err))\n    else:\n        print('Message delivered to {} [{}]'.format(msg.topic(), msg.partition()))\n\nfor data in some_data_source:\n    # Trigger any available delivery report callbacks from previous produce() calls\n    p.poll(0)\n\n    # Asynchronously produce a message. The delivery report callback will\n    # be triggered from the call to poll() above, or flush() below, when the\n    # message has been successfully delivered or failed permanently.\n    p.produce('mytopic', data.encode('utf-8'), callback=delivery_report)\n\n# Wait for any outstanding messages to be delivered and delivery report\n# callbacks to be triggered.\np.flush()\n```\n\nFor a discussion on the poll based producer API, refer to the\n[Integrating Apache Kafka With Python Asyncio Web Applications](https://www.confluent.io/blog/kafka-python-asyncio-integration/)\nblog post.\n### Schema Registry Integration\n\nThis client provides full integration with Schema Registry for schema management and message serialization, and is compatible with both [Confluent Platform](https://docs.confluent.io/platform/current/schema-registry/index.html) and [Confluent Cloud](https://docs.confluent.io/cloud/current/sr/index.html). 
Both synchronous and asynchronous clients are available.\n\n#### Learn more\n\n- [Getting Started with Apache Kafka and Python](https://developer.confluent.io/get-started/python/) \u2013 step-by-step course with videos and labs.\n- [Schema Registry Basics](https://developer.confluent.io/learn-kafka/stream-processing/sr-schema-registry/) \u2013 short tutorial explaining subjects, compatibility, and serialization patterns.\n\n#### Synchronous Client & Serializers\n\nUse the synchronous `SchemaRegistryClient` with the standard `Producer` and `Consumer`.\n\n```python\nfrom confluent_kafka import Producer\nfrom confluent_kafka.schema_registry import SchemaRegistryClient\nfrom confluent_kafka.schema_registry.avro import AvroSerializer\nfrom confluent_kafka.serialization import StringSerializer, SerializationContext, MessageField\n\n# Configure Schema Registry Client\nschema_registry_conf = {'url': 'http://localhost:8081'}  # Confluent Platform\n# For Confluent Cloud, add: 'basic.auth.user.info': '<sr-api-key>:<sr-api-secret>'\n# See: https://docs.confluent.io/cloud/current/sr/index.html\nschema_registry_client = SchemaRegistryClient(schema_registry_conf)\n\n# 2. Configure AvroSerializer\navro_serializer = AvroSerializer(schema_registry_client,\n                                 user_schema_str,\n                                 lambda user, ctx: user.to_dict())\n\n# 3. Configure Producer\nproducer_conf = {\n    'bootstrap.servers': 'localhost:9092',\n    'key.serializer': StringSerializer('utf_8'),\n    'value.serializer': avro_serializer\n}\nproducer = Producer(producer_conf)\n\n# 4. Produce messages\nproducer.produce('my-topic', key='user1', value=some_user_object)\nproducer.flush()\n```\n\n#### Asynchronous Client & Serializers (AsyncIO)\n\nUse the `AsyncSchemaRegistryClient` and `Async` serializers with `AIOProducer` and `AIOConsumer`. 
The configuration is the same as the synchronous client.\n\n```python\nfrom confluent_kafka.experimental.aio import AIOProducer\nfrom confluent_kafka.schema_registry import AsyncSchemaRegistryClient\nfrom confluent_kafka.schema_registry._async.avro import AsyncAvroSerializer\n\n# Setup async Schema Registry client and serializer\n# (See configuration options in the synchronous example above)\nschema_registry_conf = {'url': 'http://localhost:8081'}\nschema_client = AsyncSchemaRegistryClient(schema_registry_conf)\nserializer = await AsyncAvroSerializer(schema_client, schema_str=avro_schema)\n\n# Use with AsyncIO producer\nproducer = AIOProducer({\"bootstrap.servers\": \"localhost:9092\"})\nserialized_value = await serializer(data, SerializationContext(\"topic\", MessageField.VALUE))\ndelivery_future = await producer.produce(\"topic\", value=serialized_value)\n```\n\nAvailable async serializers: `AsyncAvroSerializer`, `AsyncJSONSerializer`, `AsyncProtobufSerializer` (and corresponding deserializers).\n\nSee also:\n\n- Example: [`examples/asyncio_avro_producer.py`](examples/asyncio_avro_producer.py)\n\n#### Import paths\n\n```python\nfrom confluent_kafka.schema_registry._async.avro import AsyncAvroSerializer, AsyncAvroDeserializer\nfrom confluent_kafka.schema_registry._async.json_schema import AsyncJSONSerializer, AsyncJSONDeserializer\nfrom confluent_kafka.schema_registry._async.protobuf import AsyncProtobufSerializer, AsyncProtobufDeserializer\n```\n\n**Client-Side Field Level Encryption (CSFLE):** To use Data Contracts rules (including CSFLE), install the `rules` extra (see Install section), and refer to the encryption examples in [`examples/README.md`](examples/README.md). 
For CSFLE-specific guidance, see the [Confluent Cloud CSFLE documentation](https://docs.confluent.io/cloud/current/security/encrypt/csfle/overview.html).\n\n**Note:** The async Schema Registry interface mirrors the synchronous client exactly - same configuration options, same calling patterns, no unexpected gotchas or limitations. Simply add `await` to method calls and use the `Async` prefixed classes.\n\n#### Troubleshooting\n\n- **401/403 Unauthorized when using Confluent Cloud:** Verify your `basic.auth.user.info` (SR API key/secret) is correct and that the Schema Registry URL is for your specific cluster. Ensure you are using an SR API key, not a Kafka API key.\n- **Schema not found:** Check that your `subject.name.strategy` configuration matches how your schemas are registered in Schema Registry, and that the topic and message field (key/value) pairing is correct.\n### Basic Consumer example\n\n```python\nfrom confluent_kafka import Consumer\n\nc = Consumer({\n    'bootstrap.servers': 'mybroker',\n    'group.id': 'mygroup',\n    'auto.offset.reset': 'earliest'\n})\n\nc.subscribe(['mytopic'])\n\nwhile True:\n    msg = c.poll(1.0)\n\n    if msg is None:\n        continue\n    if msg.error():\n        print(\"Consumer error: {}\".format(msg.error()))\n        continue\n\n    print('Received message: {}'.format(msg.value().decode('utf-8')))\n\nc.close()\n```\n### Basic AdminClient example\n\nCreate topics:\n\n```python\nfrom confluent_kafka.admin import AdminClient, NewTopic\n\na = AdminClient({'bootstrap.servers': 'mybroker'})\n\nnew_topics = [NewTopic(topic, num_partitions=3, replication_factor=1) for topic in [\"topic1\", \"topic2\"]]\n# Note: In a multi-cluster production scenario, it is more typical to use a replication_factor of 3 for durability.\n\n# Call create_topics to asynchronously create topics. 
A dict
# of <topic,future> is returned.
fs = a.create_topics(new_topics)

# Wait for each operation to finish.
for topic, f in fs.items():
    try:
        f.result()  # The result itself is None
        print("Topic {} created".format(topic))
    except Exception as e:
        print("Failed to create topic {}: {}".format(topic, e))
```

## Thread safety

The `Producer`, `Consumer`, and `AdminClient` are all thread safe.

## Install

```bash
# Basic installation
pip install confluent-kafka

# With Schema Registry support
pip install "confluent-kafka[avro,schemaregistry]"     # Avro
pip install "confluent-kafka[json,schemaregistry]"     # JSON Schema
pip install "confluent-kafka[protobuf,schemaregistry]" # Protobuf

# With Data Contract rules (includes CSFLE support)
pip install "confluent-kafka[avro,schemaregistry,rules]"
```

**Note:** Pre-built Linux wheels do not include SASL Kerberos/GSSAPI support. For Kerberos, see the source installation instructions in [INSTALL.md](INSTALL.md).

**Install from source**

For a source install, see the *Install from source* section in [INSTALL.md](INSTALL.md).

## Broker compatibility

The Python client (as well as the underlying C library, librdkafka) supports
all broker versions >= 0.8.
However, due to the nature of the Kafka protocol in broker versions 0.8 and 0.9,
it is not safe for a client to assume which protocol version the broker actually
supports, so you will need to hint the Python client what protocol version it
may use. This is done through two configuration settings:

- `broker.version.fallback=YOUR_BROKER_VERSION` (default 0.9.0.1)
- `api.version.request=true|false` (default true)

When using a Kafka 0.10 broker or later you don't need to do anything
(`api.version.request=true` is the default).
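
The two settings above can be sketched as plain configuration entries. A minimal example, assuming a hypothetical `mybroker:9092` address and an old 0.8.2.2 broker (with 0.10+ brokers, both keys can be omitted since the defaults already do the right thing):

```python
# Hypothetical client configuration for a legacy 0.8.2.2 broker.
# 'bootstrap.servers' and 'group.id' values are placeholders.
consumer_conf = {
    'bootstrap.servers': 'mybroker:9092',
    'group.id': 'example-group',
    # Disable the ApiVersionRequest, which 0.8/0.9 brokers don't support,
    # and fall back to the feature set of the stated broker version.
    'api.version.request': False,
    'broker.version.fallback': '0.8.2.2',
}
```

This dict would then be passed to `Consumer(consumer_conf)` (or the equivalent keys to a `Producer`).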
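
The thread-safety guarantee noted earlier means one `Producer` instance can be shared across threads instead of creating one per thread. A minimal sketch of that sharing pattern, using a stand-in counter class instead of a real `Producer` (which would need a live broker):

```python
import threading
from concurrent.futures import ThreadPoolExecutor

class StandInProducer:
    """Stand-in for confluent_kafka.Producer: counts produce() calls thread-safely."""
    def __init__(self):
        self._lock = threading.Lock()
        self.produced = 0

    def produce(self, topic, value):
        with self._lock:
            self.produced += 1

# With the real client this would be: Producer({'bootstrap.servers': 'mybroker:9092'})
producer = StandInProducer()

def worker(n):
    # Many threads call produce() on the same shared instance.
    for i in range(n):
        producer.produce('mytopic', value='msg-{}'.format(i).encode())

with ThreadPoolExecutor(max_workers=4) as pool:
    for _ in range(4):
        pool.submit(worker, 100)

print(producer.produced)  # -> 400
```

The same single-shared-instance pattern applies to the real `Producer`; remember to call `flush()` once before shutdown rather than per thread.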
    "bugtrack_url": null,
    "license": null,
    "summary": "Confluent's Python client for Apache Kafka",
    "version": "2.12.1",
    "project_urls": {
        "Homepage": "https://github.com/confluentinc/confluent-kafka-python"
    },
    "split_keywords": [],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "c9ad0d2b906aece9069fc0bead3bd7907a440442209ee3795954359555a51b91",
                "md5": "b09ad69d6b666b592132948a4a70872f",
                "sha256": "418f52c52c3833a2481cb4f576f4bee5183776dd9411f5288286117d794d858c"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp310-cp310-macosx_10_9_x86_64.whl",
            "has_sig": false,
            "md5_digest": "b09ad69d6b666b592132948a4a70872f",
            "packagetype": "bdist_wheel",
            "python_version": "cp310",
            "requires_python": ">=3.8",
            "size": 3593667,
            "upload_time": "2025-10-21T20:49:58",
            "upload_time_iso_8601": "2025-10-21T20:49:58.317605Z",
            "url": "https://files.pythonhosted.org/packages/c9/ad/0d2b906aece9069fc0bead3bd7907a440442209ee3795954359555a51b91/confluent_kafka-2.12.1-cp310-cp310-macosx_10_9_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "c4ae27220c24f96b5cb7d69b01035e3ebe1fc6d6cde54e8d9e6e69a6fd8fcd69",
                "md5": "c9dd43a2ea8e012da056a0999ec6305f",
                "sha256": "6308da6a1feddfc8a5ab757982ab546c4482e33c056e7b4f379deeddfd821c0c"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp310-cp310-macosx_11_0_arm64.whl",
            "has_sig": false,
            "md5_digest": "c9dd43a2ea8e012da056a0999ec6305f",
            "packagetype": "bdist_wheel",
            "python_version": "cp310",
            "requires_python": ">=3.8",
            "size": 3153382,
            "upload_time": "2025-10-21T20:50:01",
            "upload_time_iso_8601": "2025-10-21T20:50:01.627361Z",
            "url": "https://files.pythonhosted.org/packages/c4/ae/27220c24f96b5cb7d69b01035e3ebe1fc6d6cde54e8d9e6e69a6fd8fcd69/confluent_kafka-2.12.1-cp310-cp310-macosx_11_0_arm64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "d15d10e76ee9d88ebbadbaff6dd739704f12c3e3ac859528d2c076fbbc581cf6",
                "md5": "8a938f7983019918fd30c42151ff0688",
                "sha256": "45e5ee77416b82092ba4bf9f928208314ef1e54ac887397790db28b1ea9c39ef"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp310-cp310-manylinux_2_28_aarch64.whl",
            "has_sig": false,
            "md5_digest": "8a938f7983019918fd30c42151ff0688",
            "packagetype": "bdist_wheel",
            "python_version": "cp310",
            "requires_python": ">=3.8",
            "size": 3691555,
            "upload_time": "2025-10-21T20:50:04",
            "upload_time_iso_8601": "2025-10-21T20:50:04.474368Z",
            "url": "https://files.pythonhosted.org/packages/d1/5d/10e76ee9d88ebbadbaff6dd739704f12c3e3ac859528d2c076fbbc581cf6/confluent_kafka-2.12.1-cp310-cp310-manylinux_2_28_aarch64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "6185636ce8295e86b1f740e81b7a6ebd00ae11a605bcb9e590744555444efb2f",
                "md5": "e3a36123e41e115ce3276d8e371056de",
                "sha256": "15c4389d0a16a4946c037f96388f6fbc8dfad38649afc1d583e0826131bb5c2c"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp310-cp310-manylinux_2_28_x86_64.whl",
            "has_sig": false,
            "md5_digest": "e3a36123e41e115ce3276d8e371056de",
            "packagetype": "bdist_wheel",
            "python_version": "cp310",
            "requires_python": ">=3.8",
            "size": 3943461,
            "upload_time": "2025-10-21T20:50:07",
            "upload_time_iso_8601": "2025-10-21T20:50:07.614882Z",
            "url": "https://files.pythonhosted.org/packages/61/85/636ce8295e86b1f740e81b7a6ebd00ae11a605bcb9e590744555444efb2f/confluent_kafka-2.12.1-cp310-cp310-manylinux_2_28_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "32d835ee9d70fa28b144a4bafbc10fda33ef61efbd6fe992724b43a1069b55d8",
                "md5": "1259f32ea3461074631b6afe6b125b14",
                "sha256": "e78ae6689b8df7ef8ffac5039b6c4e8f8b733161a5278bcfba5b4a0da6ed9a2c"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp310-cp310-win_amd64.whl",
            "has_sig": false,
            "md5_digest": "1259f32ea3461074631b6afe6b125b14",
            "packagetype": "bdist_wheel",
            "python_version": "cp310",
            "requires_python": ">=3.8",
            "size": 4067588,
            "upload_time": "2025-10-21T20:50:10",
            "upload_time_iso_8601": "2025-10-21T20:50:10.779802Z",
            "url": "https://files.pythonhosted.org/packages/32/d8/35ee9d70fa28b144a4bafbc10fda33ef61efbd6fe992724b43a1069b55d8/confluent_kafka-2.12.1-cp310-cp310-win_amd64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "46739ab5a5ccb3c89ed46f399b6c09e7ccc5538608ccdc118bcab48b04848dd4",
                "md5": "e494e8950d564dad35b62b9b077923fc",
                "sha256": "5929ac16c1e7c6f632d99201575be5b45780a93bc0ac1b26391a719893ba21ea"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp311-cp311-macosx_10_9_x86_64.whl",
            "has_sig": false,
            "md5_digest": "e494e8950d564dad35b62b9b077923fc",
            "packagetype": "bdist_wheel",
            "python_version": "cp311",
            "requires_python": ">=3.8",
            "size": 3592985,
            "upload_time": "2025-10-21T20:50:13",
            "upload_time_iso_8601": "2025-10-21T20:50:13.807882Z",
            "url": "https://files.pythonhosted.org/packages/46/73/9ab5a5ccb3c89ed46f399b6c09e7ccc5538608ccdc118bcab48b04848dd4/confluent_kafka-2.12.1-cp311-cp311-macosx_10_9_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "cebce63edbea266797108a7a7640cdfdccf28f96db84d9bd1ad834414e225993",
                "md5": "43e40a11e6c83841daa0759422adb2ab",
                "sha256": "8df2feafbac73f624121073ad49fe1ce890eb4833a5237b37fe7dd24fc5ba824"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp311-cp311-macosx_11_0_arm64.whl",
            "has_sig": false,
            "md5_digest": "43e40a11e6c83841daa0759422adb2ab",
            "packagetype": "bdist_wheel",
            "python_version": "cp311",
            "requires_python": ">=3.8",
            "size": 3152939,
            "upload_time": "2025-10-21T20:50:15",
            "upload_time_iso_8601": "2025-10-21T20:50:15.927912Z",
            "url": "https://files.pythonhosted.org/packages/ce/bc/e63edbea266797108a7a7640cdfdccf28f96db84d9bd1ad834414e225993/confluent_kafka-2.12.1-cp311-cp311-macosx_11_0_arm64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "d3cf554207922e8d13330da1d539e9bef59a89b6be0838a1d42a4dd327c63bd1",
                "md5": "c67a164e8157bbcf5c4ba3d518a7589a",
                "sha256": "e497f9fc780b21d72eedeb5dd0648966417467c42358a20b692d6c0bff85f533"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp311-cp311-manylinux_2_28_aarch64.whl",
            "has_sig": false,
            "md5_digest": "c67a164e8157bbcf5c4ba3d518a7589a",
            "packagetype": "bdist_wheel",
            "python_version": "cp311",
            "requires_python": ">=3.8",
            "size": 3691286,
            "upload_time": "2025-10-21T20:50:18",
            "upload_time_iso_8601": "2025-10-21T20:50:18.252766Z",
            "url": "https://files.pythonhosted.org/packages/d3/cf/554207922e8d13330da1d539e9bef59a89b6be0838a1d42a4dd327c63bd1/confluent_kafka-2.12.1-cp311-cp311-manylinux_2_28_aarch64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "3ab95cd5e26e66d2ac0470dd10d21660b18e2e8e201a1a0cba823b7ceba985e7",
                "md5": "3be3a47009a064e490abcc9d75974b3c",
                "sha256": "5958feea006c92ac4d5f37cc9b730eb1824a68d7536be7ec274d6841ce0412af"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp311-cp311-manylinux_2_28_x86_64.whl",
            "has_sig": false,
            "md5_digest": "3be3a47009a064e490abcc9d75974b3c",
            "packagetype": "bdist_wheel",
            "python_version": "cp311",
            "requires_python": ">=3.8",
            "size": 3942926,
            "upload_time": "2025-10-21T20:50:20",
            "upload_time_iso_8601": "2025-10-21T20:50:20.915949Z",
            "url": "https://files.pythonhosted.org/packages/3a/b9/5cd5e26e66d2ac0470dd10d21660b18e2e8e201a1a0cba823b7ceba985e7/confluent_kafka-2.12.1-cp311-cp311-manylinux_2_28_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "06f4292c56739663ec6150c2adf01bf256e0b0eeda8ee848d621c1f7640c87c8",
                "md5": "23ad63f7aa26da2a1362e0be982c92cf",
                "sha256": "6c344b1e9a98baee9a221ef910b4817a1f4bb3b81ebe3304d4a9b3be3b99ddc1"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp311-cp311-win_amd64.whl",
            "has_sig": false,
            "md5_digest": "23ad63f7aa26da2a1362e0be982c92cf",
            "packagetype": "bdist_wheel",
            "python_version": "cp311",
            "requires_python": ">=3.8",
            "size": 4067585,
            "upload_time": "2025-10-21T20:50:23",
            "upload_time_iso_8601": "2025-10-21T20:50:23.845008Z",
            "url": "https://files.pythonhosted.org/packages/06/f4/292c56739663ec6150c2adf01bf256e0b0eeda8ee848d621c1f7640c87c8/confluent_kafka-2.12.1-cp311-cp311-win_amd64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "daa7bbce88f7d347d706ce8140bdb625467465f2cace54ad57457ffbc66f2e82",
                "md5": "7d898bcb8f8bedd6c7c042546499b0ed",
                "sha256": "197b1222535a2f9c8b32cd294e9e01af919bc315264015ba6bb1f18dabb638c4"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp312-cp312-macosx_10_9_x86_64.whl",
            "has_sig": false,
            "md5_digest": "7d898bcb8f8bedd6c7c042546499b0ed",
            "packagetype": "bdist_wheel",
            "python_version": "cp312",
            "requires_python": ">=3.8",
            "size": 3598616,
            "upload_time": "2025-10-21T20:50:26",
            "upload_time_iso_8601": "2025-10-21T20:50:26.555469Z",
            "url": "https://files.pythonhosted.org/packages/da/a7/bbce88f7d347d706ce8140bdb625467465f2cace54ad57457ffbc66f2e82/confluent_kafka-2.12.1-cp312-cp312-macosx_10_9_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "e0cbf588c1e6a4be51c3cb932eb5f58a4545b5ad2555106e19a073b6f30d400c",
                "md5": "16239d6462e8e10817fd062c192c442b",
                "sha256": "0c0d6d3c0bd96d2923ac0a6e74df6d86763cba881383bc882972de1bdd48a57a"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp312-cp312-macosx_11_0_arm64.whl",
            "has_sig": false,
            "md5_digest": "16239d6462e8e10817fd062c192c442b",
            "packagetype": "bdist_wheel",
            "python_version": "cp312",
            "requires_python": ">=3.8",
            "size": 3157975,
            "upload_time": "2025-10-21T20:50:29",
            "upload_time_iso_8601": "2025-10-21T20:50:29.177451Z",
            "url": "https://files.pythonhosted.org/packages/e0/cb/f588c1e6a4be51c3cb932eb5f58a4545b5ad2555106e19a073b6f30d400c/confluent_kafka-2.12.1-cp312-cp312-macosx_11_0_arm64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "5e486a7ab7968f601cd074e679523ca78ccb3ee4e35ced1f7f737bc2adb3f309",
                "md5": "0d7f16b1e9c3de7b6cd8d2aed4ccb0f2",
                "sha256": "d028bb2cffba425fd204435a1153466b7693e81c874bf9cdfc8147cb0aa285c7"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp312-cp312-manylinux_2_28_aarch64.whl",
            "has_sig": false,
            "md5_digest": "0d7f16b1e9c3de7b6cd8d2aed4ccb0f2",
            "packagetype": "bdist_wheel",
            "python_version": "cp312",
            "requires_python": ">=3.8",
            "size": 3695681,
            "upload_time": "2025-10-21T20:50:34",
            "upload_time_iso_8601": "2025-10-21T20:50:34.376486Z",
            "url": "https://files.pythonhosted.org/packages/5e/48/6a7ab7968f601cd074e679523ca78ccb3ee4e35ced1f7f737bc2adb3f309/confluent_kafka-2.12.1-cp312-cp312-manylinux_2_28_aarch64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "b49bf9a02935d136df317e7d7628d58cd44ce43e6efcaab38b17233eeb37ddd7",
                "md5": "a6868067f884c64a2f30c9655ae0e6c7",
                "sha256": "aa06b7ec20b2b168c6ba20b32a90c9f87f08c9017a125373c3217f2aa011a504"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp312-cp312-manylinux_2_28_x86_64.whl",
            "has_sig": false,
            "md5_digest": "a6868067f884c64a2f30c9655ae0e6c7",
            "packagetype": "bdist_wheel",
            "python_version": "cp312",
            "requires_python": ">=3.8",
            "size": 3947299,
            "upload_time": "2025-10-21T20:50:36",
            "upload_time_iso_8601": "2025-10-21T20:50:36.691475Z",
            "url": "https://files.pythonhosted.org/packages/b4/9b/f9a02935d136df317e7d7628d58cd44ce43e6efcaab38b17233eeb37ddd7/confluent_kafka-2.12.1-cp312-cp312-manylinux_2_28_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "8ca2871cdb1ee19312b2a09702f30af07462a5c6a1a6adea8055f93cc6525971",
                "md5": "317516e7a8cd30234ebb97d60d2848d0",
                "sha256": "6209fe4daed05d7ca578e99ac8f1e8c5bc3aa434e8183814c3b103bc10ed1c71"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp312-cp312-win_amd64.whl",
            "has_sig": false,
            "md5_digest": "317516e7a8cd30234ebb97d60d2848d0",
            "packagetype": "bdist_wheel",
            "python_version": "cp312",
            "requires_python": ">=3.8",
            "size": 4068259,
            "upload_time": "2025-10-21T20:50:39",
            "upload_time_iso_8601": "2025-10-21T20:50:39.337642Z",
            "url": "https://files.pythonhosted.org/packages/8c/a2/871cdb1ee19312b2a09702f30af07462a5c6a1a6adea8055f93cc6525971/confluent_kafka-2.12.1-cp312-cp312-win_amd64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "2d315221c9261903430e23b30c8f8776d00db881d4df12580855febd4f60a766",
                "md5": "af421f214904be600865bfe6b2dc5694",
                "sha256": "9620a7327d7259549a696272b91cc95689a26e5c770f5352c1fb5c1017f28dfc"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp313-cp313-macosx_13_0_arm64.whl",
            "has_sig": false,
            "md5_digest": "af421f214904be600865bfe6b2dc5694",
            "packagetype": "bdist_wheel",
            "python_version": "cp313",
            "requires_python": ">=3.8",
            "size": 3163059,
            "upload_time": "2025-10-21T20:50:41",
            "upload_time_iso_8601": "2025-10-21T20:50:41.405605Z",
            "url": "https://files.pythonhosted.org/packages/2d/31/5221c9261903430e23b30c8f8776d00db881d4df12580855febd4f60a766/confluent_kafka-2.12.1-cp313-cp313-macosx_13_0_arm64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "abe0c9f4a3f2b6d0a8da8ec24125874e012523176e23192bd2bb439b0ff7adea",
                "md5": "cfd69c6d2cf00266542121841f6f7040",
                "sha256": "0e8e434ed453c9c5eb5d2d236b085b6dff91b39efb2ca962bf63581a9bb2f7a2"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp313-cp313-macosx_13_0_x86_64.whl",
            "has_sig": false,
            "md5_digest": "cfd69c6d2cf00266542121841f6f7040",
            "packagetype": "bdist_wheel",
            "python_version": "cp313",
            "requires_python": ">=3.8",
            "size": 3602334,
            "upload_time": "2025-10-21T20:50:44",
            "upload_time_iso_8601": "2025-10-21T20:50:44.744944Z",
            "url": "https://files.pythonhosted.org/packages/ab/e0/c9f4a3f2b6d0a8da8ec24125874e012523176e23192bd2bb439b0ff7adea/confluent_kafka-2.12.1-cp313-cp313-macosx_13_0_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "bbdf534e39ee496e17907bc19e0fda97ff65cd4a56a1379988602ad7339347e8",
                "md5": "eb1ebe8300f7e1d3fbbe7640c3c56599",
                "sha256": "23716c2f8f05815d1ce7a0caf2aa98bf71b7c73265295080d58882a091acddce"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp313-cp313-manylinux_2_28_aarch64.whl",
            "has_sig": false,
            "md5_digest": "eb1ebe8300f7e1d3fbbe7640c3c56599",
            "packagetype": "bdist_wheel",
            "python_version": "cp313",
            "requires_python": ">=3.8",
            "size": 3696062,
            "upload_time": "2025-10-21T20:50:47",
            "upload_time_iso_8601": "2025-10-21T20:50:47.214689Z",
            "url": "https://files.pythonhosted.org/packages/bb/df/534e39ee496e17907bc19e0fda97ff65cd4a56a1379988602ad7339347e8/confluent_kafka-2.12.1-cp313-cp313-manylinux_2_28_aarch64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "ba112bfdde02bd9139aec6d0d3961a06b84c50dfbaec46bf09402c5a5c710884",
                "md5": "f5b4f8b252a8eaa2faf6676bbbfd2f52",
                "sha256": "537346d05271f6966a2559011d433632178367711d71e3c79d32246ac09c775e"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp313-cp313-manylinux_2_28_x86_64.whl",
            "has_sig": false,
            "md5_digest": "f5b4f8b252a8eaa2faf6676bbbfd2f52",
            "packagetype": "bdist_wheel",
            "python_version": "cp313",
            "requires_python": ">=3.8",
            "size": 3947699,
            "upload_time": "2025-10-21T20:50:49",
            "upload_time_iso_8601": "2025-10-21T20:50:49.878740Z",
            "url": "https://files.pythonhosted.org/packages/ba/11/2bfdde02bd9139aec6d0d3961a06b84c50dfbaec46bf09402c5a5c710884/confluent_kafka-2.12.1-cp313-cp313-manylinux_2_28_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "3181686be6b4a176f5299d1b02fe26a16071c1efc8043431e78402ae7c7f07b3",
                "md5": "b93f153d2dc0f35ea2451acdf92063bb",
                "sha256": "4391dd12f2bd56ab2d075bce4b91982b7a2684abf9f12ab4bc4227edefda7c19"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp313-cp313-win_amd64.whl",
            "has_sig": false,
            "md5_digest": "b93f153d2dc0f35ea2451acdf92063bb",
            "packagetype": "bdist_wheel",
            "python_version": "cp313",
            "requires_python": ">=3.8",
            "size": 4127934,
            "upload_time": "2025-10-21T20:50:53",
            "upload_time_iso_8601": "2025-10-21T20:50:53.367561Z",
            "url": "https://files.pythonhosted.org/packages/31/81/686be6b4a176f5299d1b02fe26a16071c1efc8043431e78402ae7c7f07b3/confluent_kafka-2.12.1-cp313-cp313-win_amd64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "035caf272d90e0f75f5516087df97cd1e2e04673df2245c0e43754e0ec5cde48",
                "md5": "b4c30183a2eda06dd83c080bcdda05b1",
                "sha256": "8c708d3f774217871a3959443606b28a667587ed783b70c3c01f057a38eaeedb"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp314-cp314-macosx_13_0_arm64.whl",
            "has_sig": false,
            "md5_digest": "b4c30183a2eda06dd83c080bcdda05b1",
            "packagetype": "bdist_wheel",
            "python_version": "cp314",
            "requires_python": ">=3.8",
            "size": 3162878,
            "upload_time": "2025-10-21T20:50:55",
            "upload_time_iso_8601": "2025-10-21T20:50:55.867471Z",
            "url": "https://files.pythonhosted.org/packages/03/5c/af272d90e0f75f5516087df97cd1e2e04673df2245c0e43754e0ec5cde48/confluent_kafka-2.12.1-cp314-cp314-macosx_13_0_arm64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "8dcf768d89e471572f348bf94802b2381b64fc7105315d76be1a611bb6f1d62b",
                "md5": "a74458eaee54f1349dc19caf6c6349e4",
                "sha256": "e26739ed8d19c7def3dde37fec621373cf15682a1a80b37add51ae435ee19d5f"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp314-cp314-macosx_13_0_x86_64.whl",
            "has_sig": false,
            "md5_digest": "a74458eaee54f1349dc19caf6c6349e4",
            "packagetype": "bdist_wheel",
            "python_version": "cp314",
            "requires_python": ">=3.8",
            "size": 3602141,
            "upload_time": "2025-10-21T20:50:59",
            "upload_time_iso_8601": "2025-10-21T20:50:59.011977Z",
            "url": "https://files.pythonhosted.org/packages/8d/cf/768d89e471572f348bf94802b2381b64fc7105315d76be1a611bb6f1d62b/confluent_kafka-2.12.1-cp314-cp314-macosx_13_0_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "71a18ad5e17883bcc07022e40859f76e7bc595524ebbf1fc71b9d54e148e9806",
                "md5": "2238eae1ddb38abd315156491c6678da",
                "sha256": "1460e51e72a03cafc10cebd52d25b84cc494983e1f51c46ccd043f46fab6f1d5"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp314-cp314-manylinux_2_28_aarch64.whl",
            "has_sig": false,
            "md5_digest": "2238eae1ddb38abd315156491c6678da",
            "packagetype": "bdist_wheel",
            "python_version": "cp314",
            "requires_python": ">=3.8",
            "size": 3695891,
            "upload_time": "2025-10-21T20:51:01",
            "upload_time_iso_8601": "2025-10-21T20:51:01.738200Z",
            "url": "https://files.pythonhosted.org/packages/71/a1/8ad5e17883bcc07022e40859f76e7bc595524ebbf1fc71b9d54e148e9806/confluent_kafka-2.12.1-cp314-cp314-manylinux_2_28_aarch64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "0f2da8077855fd282eed2d29faef6522d96937ea64aa789ad79c3d0fbf3d95c1",
                "md5": "61b2168de32e4ffef5163c584da9fe3e",
                "sha256": "77e1013062934294a374ef4f694179e0365e36317b4f8d3f84b009e4136cdc2f"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp314-cp314-manylinux_2_28_x86_64.whl",
            "has_sig": false,
            "md5_digest": "61b2168de32e4ffef5163c584da9fe3e",
            "packagetype": "bdist_wheel",
            "python_version": "cp314",
            "requires_python": ">=3.8",
            "size": 3947450,
            "upload_time": "2025-10-21T20:51:04",
            "upload_time_iso_8601": "2025-10-21T20:51:04.653797Z",
            "url": "https://files.pythonhosted.org/packages/0f/2d/a8077855fd282eed2d29faef6522d96937ea64aa789ad79c3d0fbf3d95c1/confluent_kafka-2.12.1-cp314-cp314-manylinux_2_28_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "7fe8db65e7e9b5bf45f97fefa7c78039826ed3c6d32c02e599ee913f8cede480",
                "md5": "813e9e5cd975b78dea3dbc040a88aebb",
                "sha256": "b711de5a45eb8e54b4c1cb5bf841b969e53e680e971514003b210f0e544f96d9"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp314-cp314t-macosx_13_0_arm64.whl",
            "has_sig": false,
            "md5_digest": "813e9e5cd975b78dea3dbc040a88aebb",
            "packagetype": "bdist_wheel",
            "python_version": "cp314",
            "requires_python": ">=3.8",
            "size": 3161515,
            "upload_time": "2025-10-21T20:51:09",
            "upload_time_iso_8601": "2025-10-21T20:51:09.893431Z",
            "url": "https://files.pythonhosted.org/packages/7f/e8/db65e7e9b5bf45f97fefa7c78039826ed3c6d32c02e599ee913f8cede480/confluent_kafka-2.12.1-cp314-cp314t-macosx_13_0_arm64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "64ba63526ae0c56b95892b73f66a81b2aae9af417c642b0818ec9e4e7e944223",
                "md5": "21a6aab323d85c5580f8b13b372ada32",
                "sha256": "2672953ac47d80b6f5de9f4c182766a7cd26615afb0aaa1b8b00f4a7cfc62943"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp314-cp314t-macosx_13_0_x86_64.whl",
            "has_sig": false,
            "md5_digest": "21a6aab323d85c5580f8b13b372ada32",
            "packagetype": "bdist_wheel",
            "python_version": "cp314",
            "requires_python": ">=3.8",
            "size": 3600003,
            "upload_time": "2025-10-21T20:51:12",
            "upload_time_iso_8601": "2025-10-21T20:51:12.785353Z",
            "url": "https://files.pythonhosted.org/packages/64/ba/63526ae0c56b95892b73f66a81b2aae9af417c642b0818ec9e4e7e944223/confluent_kafka-2.12.1-cp314-cp314t-macosx_13_0_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "a61e19b6eb0927d853d7149b97f88e7a4585f3a4d06cf6d8b575e8a80fcffce2",
                "md5": "c8610203e53ca77b2aedc9a7d1cb0333",
                "sha256": "43db6424ac66ff88bbe6ba1a686196d1f6bd79aa54d5611f1f0473bf5be47efa"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp314-cp314t-manylinux_2_28_aarch64.whl",
            "has_sig": false,
            "md5_digest": "c8610203e53ca77b2aedc9a7d1cb0333",
            "packagetype": "bdist_wheel",
            "python_version": "cp314",
            "requires_python": ">=3.8",
            "size": 3694769,
            "upload_time": "2025-10-21T20:51:15",
            "upload_time_iso_8601": "2025-10-21T20:51:15.430310Z",
            "url": "https://files.pythonhosted.org/packages/a6/1e/19b6eb0927d853d7149b97f88e7a4585f3a4d06cf6d8b575e8a80fcffce2/confluent_kafka-2.12.1-cp314-cp314t-manylinux_2_28_aarch64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "33193452a5417ca856d37d6cf9b063764c4e11ce4dcec412c28cb12336ffd3aa",
                "md5": "e35b5c6fb8f99113388438d304e13614",
                "sha256": "7e58b64832b31e8a91ae933c1c8d44de41d80b4352507b79b7c67d908845a6b3"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp314-cp314t-manylinux_2_28_x86_64.whl",
            "has_sig": false,
            "md5_digest": "e35b5c6fb8f99113388438d304e13614",
            "packagetype": "bdist_wheel",
            "python_version": "cp314",
            "requires_python": ">=3.8",
            "size": 3946110,
            "upload_time": "2025-10-21T20:51:18",
            "upload_time_iso_8601": "2025-10-21T20:51:18.233936Z",
            "url": "https://files.pythonhosted.org/packages/33/19/3452a5417ca856d37d6cf9b063764c4e11ce4dcec412c28cb12336ffd3aa/confluent_kafka-2.12.1-cp314-cp314t-manylinux_2_28_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "8f40fdc769717baa73e8aad16c2d2b523d1a702b1dc8e9cfc243dbfe9d3a152e",
                "md5": "4f35967fa8a8df86757b764bbafb0f48",
                "sha256": "0b9c1bc62e97123c969c8ec3f3623ad1ac85f3c1953da7b3b288cb759b3e21aa"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp314-cp314-win_amd64.whl",
            "has_sig": false,
            "md5_digest": "4f35967fa8a8df86757b764bbafb0f48",
            "packagetype": "bdist_wheel",
            "python_version": "cp314",
            "requires_python": ">=3.8",
            "size": 4245354,
            "upload_time": "2025-10-21T20:51:07",
            "upload_time_iso_8601": "2025-10-21T20:51:07.303140Z",
            "url": "https://files.pythonhosted.org/packages/8f/40/fdc769717baa73e8aad16c2d2b523d1a702b1dc8e9cfc243dbfe9d3a152e/confluent_kafka-2.12.1-cp314-cp314-win_amd64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "28f9540ed82efe753344bee1308018e56d57960d64d34e07d3729aa0145f4489",
                "md5": "e22075e812279ab2be284e2f8bb687c3",
                "sha256": "073af99734dc6198d95ae6d41ada304cf791bd49cb9a5b29be665740a0028b28"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp38-cp38-macosx_10_9_x86_64.whl",
            "has_sig": false,
            "md5_digest": "e22075e812279ab2be284e2f8bb687c3",
            "packagetype": "bdist_wheel",
            "python_version": "cp38",
            "requires_python": ">=3.8",
            "size": 3592244,
            "upload_time": "2025-10-21T20:51:20",
            "upload_time_iso_8601": "2025-10-21T20:51:20.603299Z",
            "url": "https://files.pythonhosted.org/packages/28/f9/540ed82efe753344bee1308018e56d57960d64d34e07d3729aa0145f4489/confluent_kafka-2.12.1-cp38-cp38-macosx_10_9_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "00b86a241f56c47edc02723fbc3c72d613927d26b51efea8283df932168ae60e",
                "md5": "2ad65fa95054a2a3467811003d3568c9",
                "sha256": "7626d800526489dd3ec1a9eac030e2cee0eda610e062732542f7b2bff07062a9"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp38-cp38-macosx_11_0_arm64.whl",
            "has_sig": false,
            "md5_digest": "2ad65fa95054a2a3467811003d3568c9",
            "packagetype": "bdist_wheel",
            "python_version": "cp38",
            "requires_python": ">=3.8",
            "size": 3151123,
            "upload_time": "2025-10-21T20:51:23",
            "upload_time_iso_8601": "2025-10-21T20:51:23.464473Z",
            "url": "https://files.pythonhosted.org/packages/00/b8/6a241f56c47edc02723fbc3c72d613927d26b51efea8283df932168ae60e/confluent_kafka-2.12.1-cp38-cp38-macosx_11_0_arm64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "b36a9bc394457a0310cdb4af8879db1f80afb06d6e07928bc93a4951ee608655",
                "md5": "543da0c8b8b2880a61943256032fe005",
                "sha256": "11b8e599b4cda812df288af736a92fb136c5a3f42803b64a7811066db8189cd6"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp38-cp38-manylinux_2_28_aarch64.whl",
            "has_sig": false,
            "md5_digest": "543da0c8b8b2880a61943256032fe005",
            "packagetype": "bdist_wheel",
            "python_version": "cp38",
            "requires_python": ">=3.8",
            "size": 3898333,
            "upload_time": "2025-10-21T20:51:26",
            "upload_time_iso_8601": "2025-10-21T20:51:26.587092Z",
            "url": "https://files.pythonhosted.org/packages/b3/6a/9bc394457a0310cdb4af8879db1f80afb06d6e07928bc93a4951ee608655/confluent_kafka-2.12.1-cp38-cp38-manylinux_2_28_aarch64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "c7b7e32f08408120f84675c6d969cba8851acdb0929da70fbb40a518f9535329",
                "md5": "60bae49fe685b2f8d4f48f3259189c2e",
                "sha256": "5b021343cb27e08c145835602179702ccfa992885c7e500e92e46543c73bc7db"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp38-cp38-manylinux_2_28_x86_64.whl",
            "has_sig": false,
            "md5_digest": "60bae49fe685b2f8d4f48f3259189c2e",
            "packagetype": "bdist_wheel",
            "python_version": "cp38",
            "requires_python": ">=3.8",
            "size": 4146708,
            "upload_time": "2025-10-21T20:51:29",
            "upload_time_iso_8601": "2025-10-21T20:51:29.995478Z",
            "url": "https://files.pythonhosted.org/packages/c7/b7/e32f08408120f84675c6d969cba8851acdb0929da70fbb40a518f9535329/confluent_kafka-2.12.1-cp38-cp38-manylinux_2_28_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "2e1f6cf2848450c0c6d150317e1f2ce817e4f1cd2ebbc5ceff1e0f26b20c6e9a",
                "md5": "6e626adde44df133f1c9924c9ee38bb1",
                "sha256": "a8ab38f8e68c8a806682436e8bfa54537e39f349963a998fac43281af31f38fa"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp38-cp38-win_amd64.whl",
            "has_sig": false,
            "md5_digest": "6e626adde44df133f1c9924c9ee38bb1",
            "packagetype": "bdist_wheel",
            "python_version": "cp38",
            "requires_python": ">=3.8",
            "size": 4068497,
            "upload_time": "2025-10-21T20:51:32",
            "upload_time_iso_8601": "2025-10-21T20:51:32.910369Z",
            "url": "https://files.pythonhosted.org/packages/2e/1f/6cf2848450c0c6d150317e1f2ce817e4f1cd2ebbc5ceff1e0f26b20c6e9a/confluent_kafka-2.12.1-cp38-cp38-win_amd64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "1ab8f90c6c4a27a97f9e129d20a54badbf728ee5178a4a53f69474d60858e2d3",
                "md5": "c3885c5c9c2b4a7e406395580ad5a7d2",
                "sha256": "5a751d39a9efa82b49d212d17a88a3d3fe075b70c80fa39313443ac3ac95a967"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp39-cp39-macosx_10_9_x86_64.whl",
            "has_sig": false,
            "md5_digest": "c3885c5c9c2b4a7e406395580ad5a7d2",
            "packagetype": "bdist_wheel",
            "python_version": "cp39",
            "requires_python": ">=3.8",
            "size": 3593788,
            "upload_time": "2025-10-21T20:51:35",
            "upload_time_iso_8601": "2025-10-21T20:51:35.541129Z",
            "url": "https://files.pythonhosted.org/packages/1a/b8/f90c6c4a27a97f9e129d20a54badbf728ee5178a4a53f69474d60858e2d3/confluent_kafka-2.12.1-cp39-cp39-macosx_10_9_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "6bcef7245d10022dac67d4bc62bf3da11ed1c576e2d1dd9a902b308a005e9154",
                "md5": "00ac551576e47c1290171349ce6bd377",
                "sha256": "e230fbd5a0e4d4228ebb9e0b6b92757c84757904f01060bc01ac4c42c50f619a"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp39-cp39-macosx_11_0_arm64.whl",
            "has_sig": false,
            "md5_digest": "00ac551576e47c1290171349ce6bd377",
            "packagetype": "bdist_wheel",
            "python_version": "cp39",
            "requires_python": ">=3.8",
            "size": 3153466,
            "upload_time": "2025-10-21T20:51:40",
            "upload_time_iso_8601": "2025-10-21T20:51:40.139119Z",
            "url": "https://files.pythonhosted.org/packages/6b/ce/f7245d10022dac67d4bc62bf3da11ed1c576e2d1dd9a902b308a005e9154/confluent_kafka-2.12.1-cp39-cp39-macosx_11_0_arm64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "956aeeaebdf3c06ed2d9f5395b644d2505032847b6de25647d8511734c7530b9",
                "md5": "077916bb6a81079d0e755f6b1a35e16b",
                "sha256": "1ca2519bf27ea71ca32feb01859a086183614c44f8d843a917fcc1c135c24706"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp39-cp39-manylinux_2_28_aarch64.whl",
            "has_sig": false,
            "md5_digest": "077916bb6a81079d0e755f6b1a35e16b",
            "packagetype": "bdist_wheel",
            "python_version": "cp39",
            "requires_python": ">=3.8",
            "size": 3691634,
            "upload_time": "2025-10-21T20:51:42",
            "upload_time_iso_8601": "2025-10-21T20:51:42.334053Z",
            "url": "https://files.pythonhosted.org/packages/95/6a/eeaebdf3c06ed2d9f5395b644d2505032847b6de25647d8511734c7530b9/confluent_kafka-2.12.1-cp39-cp39-manylinux_2_28_aarch64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "1e0982941fedca144b5d54480ae4b9e54b7aa743e977f8c2d53a3d4c4f1af0f4",
                "md5": "303f63bd93febf09b9d40e7234769ba8",
                "sha256": "2b4bb9fefa85462aa0b1f5d96abadffad911d93033b470b3b6fc46d63f2a28e9"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp39-cp39-manylinux_2_28_x86_64.whl",
            "has_sig": false,
            "md5_digest": "303f63bd93febf09b9d40e7234769ba8",
            "packagetype": "bdist_wheel",
            "python_version": "cp39",
            "requires_python": ">=3.8",
            "size": 3943497,
            "upload_time": "2025-10-21T20:51:44",
            "upload_time_iso_8601": "2025-10-21T20:51:44.881349Z",
            "url": "https://files.pythonhosted.org/packages/1e/09/82941fedca144b5d54480ae4b9e54b7aa743e977f8c2d53a3d4c4f1af0f4/confluent_kafka-2.12.1-cp39-cp39-manylinux_2_28_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "fe5be1aa9991d29ad9879dca916ce89c550be67e5a701d11bbd7eec41b45682f",
                "md5": "98c564069109edbe47013eee0be69e57",
                "sha256": "64a5798ec7fe5a101e055c2729c6ac6d6fd9d462a93a1fb3e314866a70ce804b"
            },
            "downloads": -1,
            "filename": "confluent_kafka-2.12.1-cp39-cp39-win_amd64.whl",
            "has_sig": false,
            "md5_digest": "98c564069109edbe47013eee0be69e57",
            "packagetype": "bdist_wheel",
            "python_version": "cp39",
            "requires_python": ">=3.8",
            "size": 4068560,
            "upload_time": "2025-10-21T20:51:48",
            "upload_time_iso_8601": "2025-10-21T20:51:48.074759Z",
            "url": "https://files.pythonhosted.org/packages/fe/5b/e1aa9991d29ad9879dca916ce89c550be67e5a701d11bbd7eec41b45682f/confluent_kafka-2.12.1-cp39-cp39-win_amd64.whl",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-10-21 20:49:58",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "confluentinc",
    "github_project": "confluent-kafka-python",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "tox": true,
    "lcname": "confluent-kafka"
}
        