# Azure Event Hubs client library for Python
Azure Event Hubs is a highly scalable publish-subscribe service that can ingest millions of events per second and stream
them to multiple consumers. This lets you process and analyze the massive amounts of data produced by your connected
devices and applications. Once Event Hubs has collected the data, you can retrieve, transform, and store it by using
any real-time analytics provider or with batching/storage adapters. To learn more about Azure Event Hubs,
see [What is Event Hubs?](https://docs.microsoft.com/azure/event-hubs/event-hubs-about).
The Azure Event Hubs client library allows for publishing and consuming of Azure Event Hubs events and may be used to:
- Emit telemetry about your application for business intelligence and diagnostic purposes.
- Publish facts about the state of your application which interested parties may observe and use as a trigger for taking action.
- Observe interesting operations and interactions happening within your business or other ecosystem, allowing loosely coupled systems to interact without the need to bind them together.
- Receive events from one or more publishers, transform them to better meet the needs of your ecosystem, then publish the transformed events to a new stream for consumers to observe.
[Source code](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventhub/azure-eventhub/)
| [Package (PyPI)](https://pypi.org/project/azure-eventhub/)
| [Package (Conda)](https://anaconda.org/microsoft/azure-eventhub/)
| [API reference documentation][api_reference]
| [Product documentation](https://docs.microsoft.com/azure/event-hubs/)
| [Samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/eventhub/azure-eventhub/samples)
## Getting started
### Prerequisites
- Python 3.8 or later.
- **Microsoft Azure Subscription:** To use Azure services, including Azure Event Hubs, you'll need a subscription.
If you do not have an existing Azure account, you may sign up for a free trial or use your MSDN subscriber benefits when you [create an account](https://azure.microsoft.com/free/).
- **Event Hubs namespace with an Event Hub:** To interact with Azure Event Hubs, you'll also need to have a namespace and Event Hub available.
If you are not familiar with creating Azure resources, you may wish to follow the step-by-step guide
for [creating an Event Hub using the Azure portal](https://docs.microsoft.com/azure/event-hubs/event-hubs-create).
There, you can also find detailed instructions for using the Azure CLI, Azure PowerShell, or Azure Resource Manager (ARM) templates to create an Event Hub.
### Install the package
Install the Azure Event Hubs client library for Python with pip:
```
$ pip install azure-eventhub
```
### Authenticate the client
Interaction with Event Hubs starts with an instance of the EventHubConsumerClient or EventHubProducerClient class. To instantiate a client object, you need either the host name, a SAS/AAD credential, and the event hub name, or a connection string.
**[Create client from connection string:](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventhub/azure-eventhub/samples/sync_samples/connection_string_authentication.py)**
The easiest way for the Event Hubs client library to interact with an Event Hub is to use a connection string, which is created automatically when you create an Event Hubs namespace.
If you aren't familiar with shared access policies in Azure, you may wish to follow the step-by-step guide to [get an Event Hubs connection string](https://docs.microsoft.com/azure/event-hubs/event-hubs-get-connection-string).
- The `from_connection_string` method takes the connection string of the form
`Endpoint=sb://<yournamespace>.servicebus.windows.net/;SharedAccessKeyName=<yoursharedaccesskeyname>;SharedAccessKey=<yoursharedaccesskey>` and the
entity name of your Event Hub instance, as shown in the sketch below. You can get the connection string from the [Azure portal](https://docs.microsoft.com/azure/event-hubs/event-hubs-get-connection-string#get-connection-string-from-the-portal).
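As a quick illustration, a minimal sketch of creating clients from a connection string might look like the following; the placeholder values are assumptions to be replaced with your own connection string and event hub name:
```python
from azure.eventhub import EventHubConsumerClient, EventHubProducerClient

# Placeholders; substitute your own connection string and event hub name.
connection_str = '<< CONNECTION STRING FOR THE EVENT HUBS NAMESPACE >>'
eventhub_name = '<< NAME OF THE EVENT HUB >>'

producer_client = EventHubProducerClient.from_connection_string(connection_str, eventhub_name=eventhub_name)
consumer_client = EventHubConsumerClient.from_connection_string(connection_str, consumer_group='$Default', eventhub_name=eventhub_name)
```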
**[Create client using the azure-identity library:](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventhub/azure-eventhub/samples/sync_samples/client_identity_authentication.py)**
Alternatively, you can use a credential object to authenticate via AAD with the azure-identity package.
- The constructor demonstrated in the sample linked above takes the host name and entity name of your Event Hub instance, along with a credential that implements the
[TokenCredential](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/core/azure-core/azure/core/credentials.py)
protocol. There are implementations of the `TokenCredential` protocol available in the
[azure-identity package](https://pypi.org/project/azure-identity/). The host name is of the format `<yournamespace>.servicebus.windows.net`.
- To use the credential types provided by `azure-identity`, please install the package:
```pip install azure-identity```
- Additionally, to use the async API, you must first install an async transport, such as [`aiohttp`](https://pypi.org/project/aiohttp/):
```pip install aiohttp```
- When using Azure Active Directory, your principal must be assigned a role which allows access to Event Hubs, such as the
Azure Event Hubs Data Owner role. For more information about using Azure Active Directory authorization with Event Hubs,
please refer to [the associated documentation](https://docs.microsoft.com/azure/event-hubs/authorize-access-azure-active-directory).
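As a quick illustration, a minimal sketch of constructing a producer client with `DefaultAzureCredential` from `azure-identity` might look like this; the namespace and event hub name are placeholders to substitute with your own values:
```python
from azure.eventhub import EventHubProducerClient
from azure.identity import DefaultAzureCredential

# Placeholders; replace with your fully qualified namespace and event hub name.
fully_qualified_namespace = '<< EVENT HUBS FULLY QUALIFIED NAMESPACE >>'
eventhub_name = '<< NAME OF THE EVENT HUB >>'

producer_client = EventHubProducerClient(
    fully_qualified_namespace=fully_qualified_namespace,
    eventhub_name=eventhub_name,
    credential=DefaultAzureCredential(),
)
```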
## Key concepts
- An **EventHubProducerClient** is a source of telemetry data, diagnostics information, usage logs, or other log data,
as part of an embedded device solution, a mobile device application, a game title running on a console or other device,
some client or server based business solution, or a web site.
- An **EventHubConsumerClient** picks up such information from the Event Hub and processes it. Processing may involve aggregation,
complex computation, and filtering. Processing may also involve distribution or storage of the information in a raw or transformed fashion.
Event Hub consumers are often robust and high-scale platform infrastructure parts with built-in analytics capabilities,
like Azure Stream Analytics, Apache Spark, or Apache Storm.
- A **partition** is an ordered sequence of events that is held in an Event Hub. Azure Event Hubs provides message streaming
through a partitioned consumer pattern in which each consumer only reads a specific subset, or partition, of the message stream.
As newer events arrive, they are added to the end of this sequence. The number of partitions is specified at the time an Event Hub is created and cannot be changed.
- A **consumer group** is a view of an entire Event Hub. Consumer groups enable multiple consuming applications to each
have a separate view of the event stream, and to read the stream independently at their own pace and from their own position.
There can be at most 5 concurrent readers on a partition per consumer group; however, it is recommended that there be only
one active consumer for a given partition and consumer group pairing. Each active reader receives all of the events from
its partition; if there are multiple readers on the same partition, they will receive duplicate events.
For more concepts and deeper discussion, see: [Event Hubs Features](https://docs.microsoft.com/azure/event-hubs/event-hubs-features).
Also, the concepts for AMQP are well documented in [OASIS Advanced Messaging Queuing Protocol (AMQP) Version 1.0](https://docs.oasis-open.org/amqp/core/v1.0/os/amqp-core-overview-v1.0-os.html).
### Thread safety
We do not guarantee that the EventHubProducerClient or EventHubConsumerClient are thread-safe. We do not recommend reusing these instances across threads. It is up to the running application to use these classes in a thread-safe manner.
The data model type, `EventDataBatch`, is not thread-safe. It should not be shared across threads or used concurrently with client methods.
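For example, when publishing from multiple threads, a minimal sketch under these constraints is to give each thread its own producer client rather than sharing one instance; the connection string and event hub name below are placeholders, not values from this document:
```python
import threading

from azure.eventhub import EventHubProducerClient, EventData

# Placeholders; replace with your own values.
connection_str = '<< CONNECTION STRING FOR THE EVENT HUBS NAMESPACE >>'
eventhub_name = '<< NAME OF THE EVENT HUB >>'

def worker():
    # Each thread creates and owns its own client instance.
    producer = EventHubProducerClient.from_connection_string(connection_str, eventhub_name=eventhub_name)
    with producer:
        batch = producer.create_batch()
        batch.add(EventData("event published from thread {}".format(threading.current_thread().name)))
        producer.send_batch(batch)

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```
Alternatively, a single shared client may be used if the application serializes access to it with its own synchronization.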
## Examples
The following sections provide several code snippets covering some of the most common Event Hubs tasks, including:
- [Inspect an Event Hub](#inspect-an-event-hub)
- [Publish events to an Event Hub](#publish-events-to-an-event-hub)
- [Consume events from an Event Hub](#consume-events-from-an-event-hub)
- [Consume events from an Event Hub in batches](#consume-events-from-an-event-hub-in-batches)
- [Publish events to an Event Hub asynchronously](#publish-events-to-an-event-hub-asynchronously)
- [Consume events from an Event Hub asynchronously](#consume-events-from-an-event-hub-asynchronously)
- [Consume events from an Event Hub in batches asynchronously](#consume-events-from-an-event-hub-in-batches-asynchronously)
- [Consume events and save checkpoints using a checkpoint store](#consume-events-and-save-checkpoints-using-a-checkpoint-store)
- [Use EventHubConsumerClient to work with IoT Hub](#use-eventhubconsumerclient-to-work-with-iot-hub)
### Inspect an Event Hub
Get the partition ids of an Event Hub.
```python
import os
from azure.eventhub import EventHubConsumerClient
from azure.identity import DefaultAzureCredential

FULLY_QUALIFIED_NAMESPACE = os.environ["EVENT_HUB_HOSTNAME"]
EVENTHUB_NAME = os.environ['EVENT_HUB_NAME']

consumer_client = EventHubConsumerClient(
    fully_qualified_namespace=FULLY_QUALIFIED_NAMESPACE,
    consumer_group='$Default',
    eventhub_name=EVENTHUB_NAME,
    credential=DefaultAzureCredential(),
)

with consumer_client:
    partition_ids = consumer_client.get_partition_ids()
```
### Publish events to an Event Hub
Use the `create_batch` method on `EventHubProducerClient` to create an `EventDataBatch` object which can then be sent using the `send_batch` method.
Events may be added to the `EventDataBatch` using the `add` method until the maximum batch size limit in bytes has been reached.
<!-- SNIPPET:send.send_event_data_batch -->
```python
def send_event_data_batch(producer):
    # Without specifying partition_id or partition_key
    # the events will be distributed to available partitions via round-robin.
    event_data_batch = producer.create_batch()
    event_data_batch.add(EventData('Single message'))
    producer.send_batch(event_data_batch)
```
<!-- END SNIPPET -->
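To fill a batch before sending, a common pattern is to keep calling `add` until it raises `ValueError`, which signals that the batch has reached its maximum size in bytes. The sketch below assumes a `producer` created as an `EventHubProducerClient`, mirroring the async batching sample later in this README:
```python
from azure.eventhub import EventData

def send_full_batch(producer):
    # Create a batch and keep adding events until the batch reports it is full.
    event_data_batch = producer.create_batch()
    while True:
        try:
            event_data_batch.add(EventData("Message inside EventDataBatch"))
        except ValueError:
            # The batch has reached its maximum size in bytes; stop adding events.
            break
    producer.send_batch(event_data_batch)
```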
### Consume events from an Event Hub
There are multiple ways to consume events from an Event Hub. To simply trigger a callback when an event is received,
use the `EventHubConsumerClient.receive` method as follows:
```python
import logging
from azure.eventhub import EventHubConsumerClient
from azure.identity import DefaultAzureCredential

fully_qualified_namespace = '<< EVENT HUBS FULLY QUALIFIED NAMESPACE >>'
consumer_group = '<< CONSUMER GROUP >>'
eventhub_name = '<< NAME OF THE EVENT HUB >>'

client = EventHubConsumerClient(
    fully_qualified_namespace=fully_qualified_namespace,
    eventhub_name=eventhub_name,
    consumer_group=consumer_group,
    credential=DefaultAzureCredential(),
)

logger = logging.getLogger("azure.eventhub")
logging.basicConfig(level=logging.INFO)

def on_event(partition_context, event):
    logger.info("Received event from partition {}".format(partition_context.partition_id))
    partition_context.update_checkpoint(event)

with client:
    client.receive(
        on_event=on_event,
        starting_position="-1",  # "-1" is from the beginning of the partition.
    )
    # receive events from specified partition:
    # client.receive(on_event=on_event, partition_id='0')
```
### Consume events from an Event Hub in batches
Whereas the above sample triggers the callback for each event as it is received, the following sample
triggers the callback on a batch of events, attempting to receive a number of events at a time.
```python
import logging
from azure.eventhub import EventHubConsumerClient
from azure.identity import DefaultAzureCredential

fully_qualified_namespace = '<< EVENT HUBS FULLY QUALIFIED NAMESPACE >>'
consumer_group = '<< CONSUMER GROUP >>'
eventhub_name = '<< NAME OF THE EVENT HUB >>'

client = EventHubConsumerClient(
    fully_qualified_namespace=fully_qualified_namespace,
    eventhub_name=eventhub_name,
    consumer_group=consumer_group,
    credential=DefaultAzureCredential(),
)

logger = logging.getLogger("azure.eventhub")
logging.basicConfig(level=logging.INFO)

def on_event_batch(partition_context, events):
    logger.info("Received event from partition {}".format(partition_context.partition_id))
    partition_context.update_checkpoint()

with client:
    client.receive_batch(
        on_event_batch=on_event_batch,
        starting_position="-1",  # "-1" is from the beginning of the partition.
    )
    # receive events from specified partition:
    # client.receive_batch(on_event_batch=on_event_batch, partition_id='0')
```
### Publish events to an Event Hub asynchronously
Use the `create_batch` method on `EventHubProducerClient` to create an `EventDataBatch` object which can then be sent using the `send_batch` method.
Events may be added to the `EventDataBatch` using the `add` method until the maximum batch size limit in bytes has been reached.
```python
import asyncio

from azure.eventhub.aio import EventHubProducerClient  # The package name suffixed with ".aio" for async
from azure.eventhub import EventData
from azure.identity.aio import DefaultAzureCredential

fully_qualified_namespace = '<< EVENT HUBS FULLY QUALIFIED NAMESPACE >>'
eventhub_name = '<< NAME OF THE EVENT HUB >>'

async def create_batch(client):
    event_data_batch = await client.create_batch()
    can_add = True
    while can_add:
        try:
            event_data_batch.add(EventData('Message inside EventDataBatch'))
        except ValueError:
            can_add = False  # EventDataBatch object reaches max_size.
    return event_data_batch

async def send():
    client = EventHubProducerClient(
        fully_qualified_namespace=fully_qualified_namespace,
        eventhub_name=eventhub_name,
        credential=DefaultAzureCredential(),
    )
    batch_data = await create_batch(client)
    async with client:
        await client.send_batch(batch_data)

if __name__ == '__main__':
    asyncio.run(send())
```
### Consume events from an Event Hub asynchronously
This SDK supports both synchronous and asyncio-based code. To receive events as demonstrated in the samples above, but within
asyncio, you would need the following:
```python
import logging
import asyncio

from azure.eventhub.aio import EventHubConsumerClient
from azure.identity.aio import DefaultAzureCredential

fully_qualified_namespace = '<< EVENT HUBS FULLY QUALIFIED NAMESPACE >>'
consumer_group = '<< CONSUMER GROUP >>'
eventhub_name = '<< NAME OF THE EVENT HUB >>'

logger = logging.getLogger("azure.eventhub")
logging.basicConfig(level=logging.INFO)

async def on_event(partition_context, event):
    logger.info("Received event from partition {}".format(partition_context.partition_id))
    await partition_context.update_checkpoint(event)

async def receive():
    client = EventHubConsumerClient(
        fully_qualified_namespace=fully_qualified_namespace,
        eventhub_name=eventhub_name,
        consumer_group=consumer_group,
        credential=DefaultAzureCredential(),
    )
    async with client:
        await client.receive(
            on_event=on_event,
            starting_position="-1",  # "-1" is from the beginning of the partition.
        )
        # receive events from specified partition:
        # await client.receive(on_event=on_event, partition_id='0')

if __name__ == '__main__':
    asyncio.run(receive())
```
### Consume events from an Event Hub in batches asynchronously
All synchronous functions are supported within asyncio as well. As demonstrated above for synchronous batch receipt, you can accomplish
the same within asyncio as follows:
```python
import logging
import asyncio

from azure.eventhub.aio import EventHubConsumerClient
from azure.identity.aio import DefaultAzureCredential

fully_qualified_namespace = '<< EVENT HUBS FULLY QUALIFIED NAMESPACE >>'
consumer_group = '<< CONSUMER GROUP >>'
eventhub_name = '<< NAME OF THE EVENT HUB >>'

logger = logging.getLogger("azure.eventhub")
logging.basicConfig(level=logging.INFO)

async def on_event_batch(partition_context, events):
    logger.info("Received event from partition {}".format(partition_context.partition_id))
    await partition_context.update_checkpoint()

async def receive_batch():
    client = EventHubConsumerClient(
        fully_qualified_namespace=fully_qualified_namespace,
        eventhub_name=eventhub_name,
        consumer_group=consumer_group,
        credential=DefaultAzureCredential(),
    )
    async with client:
        await client.receive_batch(
            on_event_batch=on_event_batch,
            starting_position="-1",  # "-1" is from the beginning of the partition.
        )
        # receive events from specified partition:
        # await client.receive_batch(on_event_batch=on_event_batch, partition_id='0')

if __name__ == '__main__':
    asyncio.run(receive_batch())
```
### Consume events and save checkpoints using a checkpoint store
`EventHubConsumerClient` is a high level construct which allows you to receive events from multiple partitions at once
and load balance with other consumers using the same Event Hub and consumer group.
This also allows the user to track progress when events are processed using checkpoints.
A checkpoint is meant to represent the last successfully processed event by the user from a particular partition of
a consumer group in an Event Hub instance. The `EventHubConsumerClient` uses an instance of `CheckpointStore` to update checkpoints
and to store the relevant information required by the load balancing algorithm.
Search PyPI with the prefix `azure-eventhub-checkpointstore` to
find packages that support this, and use the `CheckpointStore` implementation from one such package. Please note that both sync and async libraries are provided.
In the below example, we create an instance of `EventHubConsumerClient` and use a `BlobCheckpointStore`. You need
to [create an Azure Storage account](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-portal)
and a [Blob Container](https://docs.microsoft.com/azure/storage/blobs/storage-quickstart-blobs-portal#create-a-container) to run the code.
[Azure Blob Storage Checkpoint Store Async](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventhub/azure-eventhub-checkpointstoreblob-aio)
and [Azure Blob Storage Checkpoint Store Sync](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventhub/azure-eventhub-checkpointstoreblob)
are the `CheckpointStore` implementations we provide that use Azure Blob Storage as the persistent store.
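The example below uses `BlobCheckpointStore` from the async blob storage checkpoint store package; assuming you choose that implementation, it can be installed with pip:
```
$ pip install azure-eventhub-checkpointstoreblob-aio
```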
```python
import asyncio

from azure.eventhub.aio import EventHubConsumerClient
from azure.eventhub.extensions.checkpointstoreblobaio import BlobCheckpointStore
from azure.identity.aio import DefaultAzureCredential

fully_qualified_namespace = '<< EVENT HUBS FULLY QUALIFIED NAMESPACE >>'
consumer_group = '<< CONSUMER GROUP >>'
eventhub_name = '<< NAME OF THE EVENT HUB >>'
blob_account_url = '<< STORAGE ACCOUNT URL >>'
container_name = '<< NAME OF THE BLOB CONTAINER >>'

async def on_event(partition_context, event):
    # Do something with the event.
    await partition_context.update_checkpoint(event)  # Or update_checkpoint every N events for better performance.

async def receive(client):
    await client.receive(
        on_event=on_event,
        starting_position="-1",  # "-1" is from the beginning of the partition.
    )

async def main():
    checkpoint_store = BlobCheckpointStore(
        blob_account_url=blob_account_url,
        container_name=container_name,
        credential=DefaultAzureCredential()
    )
    client = EventHubConsumerClient(
        fully_qualified_namespace=fully_qualified_namespace,
        eventhub_name=eventhub_name,
        credential=DefaultAzureCredential(),
        consumer_group=consumer_group,
        checkpoint_store=checkpoint_store,  # For load balancing and checkpoint. Leave None for no load balancing.
    )
    async with client:
        await receive(client)

if __name__ == '__main__':
    asyncio.run(main())
```
### Use EventHubConsumerClient to work with IoT Hub
You can use `EventHubConsumerClient` to work with IoT Hub as well. This is useful for receiving telemetry data of IoT Hub from the
linked Event Hub. The associated connection string will not have send claims, hence sending events is not possible.
Please note that the connection string needs to be for an [Event Hub-compatible endpoint](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-messages-read-builtin),
e.g. "Endpoint=sb://my-iothub-namespace-[uid].servicebus.windows.net/;SharedAccessKeyName=my-SA-name;SharedAccessKey=my-SA-key;EntityPath=my-iot-hub-name"
There are two ways to get the Event Hubs compatible endpoint:
- Manually get the "Built-in endpoints" of the IoT Hub in Azure Portal and receive from it.
```python
from azure.eventhub import EventHubConsumerClient
connection_str = 'Endpoint=sb://my-iothub-namespace-[uid].servicebus.windows.net/;SharedAccessKeyName=my-SA-name;SharedAccessKey=my-SA-key;EntityPath=my-iot-hub-name'
consumer_group = '<< CONSUMER GROUP >>'
client = EventHubConsumerClient.from_connection_string(connection_str, consumer_group)
partition_ids = client.get_partition_ids()
```
- Programmatically retrieve the built-in Event Hubs compatible endpoint.
Refer to [IoT Hub Connection String Sample](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventhub/azure-eventhub/samples/async_samples/iot_hub_connection_string_receive_async.py).
## Troubleshooting
See the `azure-eventhub` [troubleshooting guide](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventhub/azure-eventhub/TROUBLESHOOTING.md) for details on how to diagnose various failure scenarios.
### Logging
- Enable the `azure.eventhub` logger to collect traces from the library.
- Enable AMQP frame level trace by setting `logging_enable=True` when creating the client.
- Refer to [this guide](https://learn.microsoft.com/azure/developer/python/sdk/azure-sdk-logging) on configuring logging for Azure libraries for Python for additional information.
```python
import logging
import sys
handler = logging.StreamHandler(stream=sys.stdout)
logger = logging.getLogger('azure.eventhub')
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)
...
from azure.eventhub import EventHubProducerClient, EventHubConsumerClient
producer = EventHubProducerClient(..., logging_enable=True)
consumer = EventHubConsumerClient(..., logging_enable=True)
```
## Next steps
### More sample code
Please take a look at the [samples](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventhub/azure-eventhub/samples) directory for detailed examples of how to use this library to send and receive events to/from Event Hubs.
### Documentation
Reference documentation is available [here](https://azuresdkdocs.blob.core.windows.net/$web/python/azure-eventhub/latest/azure.eventhub.html).
### Schema Registry and Avro Encoder
The Event Hubs SDK integrates nicely with the [Schema Registry][schemaregistry_service] service and [Avro][avro].
For more information, please refer to [Schema Registry SDK][schemaregistry_repo] and [Schema Registry Avro Encoder SDK][schemaregistry_avroencoder_repo].
### Pure Python AMQP Transport and Backward Compatibility Support
The Azure Event Hubs client library is now based on a pure Python AMQP implementation. `uAMQP` has been removed as a required dependency.
To use `uAMQP` as the underlying transport:
1. Install `uamqp` with pip.
```
$ pip install uamqp
```
2. Pass `uamqp_transport=True` during client construction.
```python
from azure.eventhub import EventHubProducerClient, EventHubConsumerClient
from azure.identity import DefaultAzureCredential

fully_qualified_namespace = '<< EVENT HUBS FULLY QUALIFIED NAMESPACE >>'
consumer_group = '<< CONSUMER GROUP >>'
eventhub_name = '<< NAME OF THE EVENT HUB >>'

producer_client = EventHubProducerClient(
    fully_qualified_namespace=fully_qualified_namespace,
    eventhub_name=eventhub_name,
    credential=DefaultAzureCredential(),
    uamqp_transport=True
)

consumer_client = EventHubConsumerClient(
    fully_qualified_namespace=fully_qualified_namespace,
    eventhub_name=eventhub_name,
    credential=DefaultAzureCredential(),
    consumer_group=consumer_group,
    uamqp_transport=True
)
```
Note: The `message` attribute on `EventData`/`EventDataBatch`, which previously exposed the `uamqp.Message`, has been deprecated.
The "Legacy" objects returned by `EventData.message`/`EventDataBatch.message` have been introduced to help facilitate the transition.
### Building uAMQP wheel from source
If [uAMQP](https://pypi.org/project/uamqp/) is intended to be used as the underlying AMQP protocol implementation for `azure-eventhub`,
uAMQP wheels can be found for most major operating systems.
If you intend to use `uAMQP` and you're running on a platform for which uAMQP wheels are not provided, please follow
the [uAMQP Installation](https://github.com/Azure/azure-uamqp-python#installation) guidance to install from source.
### Provide Feedback
If you encounter any bugs or have suggestions, please file an issue in the [Issues](https://github.com/Azure/azure-sdk-for-python/issues) section of the project.
## Contributing
This project welcomes contributions and suggestions. Most contributions require you to agree to a
Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the
PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.
[avro]: https://avro.apache.org/
[api_reference]: https://docs.microsoft.com/python/api/overview/azure/eventhub-readme
[schemaregistry_service]: https://aka.ms/schemaregistry
[schemaregistry_repo]: https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/schemaregistry/azure-schemaregistry
[schemaregistry_avroencoder_repo]: https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/schemaregistry/azure-schemaregistry-avroencoder
![Impressions](https://azure-sdk-impressions.azurewebsites.net/api/impressions/azure-sdk-for-python/sdk/eventhub/azure-eventhub/README.png)
# Release History
## 5.12.2 (2024-10-02)
### Bugs Fixed
- Implemented backpressure for async consumer to address a memory leak issue. ([#36398](https://github.com/Azure/azure-sdk-for-python/issues/36398))
## 5.12.1 (2024-06-11)
### Bugs Fixed
- Fixed a bug where the correct URI was not being used for consumer authentication, causing issues when assigning roles at the consumer group level. ([#35337](https://github.com/Azure/azure-sdk-for-python/issues/35337))
## 5.12.0 (2024-05-16)
### Features Added
- Added support for non-TLS connections (#34272)
## 5.11.7 (2024-04-10)
### Bugs Fixed
- Fixed a bug where using `EventHubProducerClient` in buffered mode could potentially drop a buffered message without actually sending it. ([#34712](https://github.com/Azure/azure-sdk-for-python/pull/34712))
### Other Changes
- Updated network trace logging to replace `None` values in AMQP connection info with empty strings as per the OpenTelemetry specification.
## 5.11.6 (2024-02-12)
This version and all future versions will require Python 3.8+. Python 3.7 is no longer supported.
### Features Added
- Added `keep_alive` functionality on EventHubProducerClient to allow for long-living producers. [#33726](https://github.com/Azure/azure-sdk-for-python/issues/33726)
### Other Changes
- Added support for Python 3.12.
## 5.11.5 (2023-11-13)
### Bugs Fixed
- Fixed a pyAMQP error where events split across multiple TransferFrames were depleting the link credit by more than 1 credit per message. ([#32767](https://github.com/Azure/azure-sdk-for-python/pull/32767))
## 5.11.4 (2023-08-08)
### Features Added
- A new float keyword argument `socket_timeout` has been added to the sync and async `EventHubConsumerClient` and `EventHubProducerClient`.
### Bugs Fixed
- Fixed bug [#31258](https://github.com/Azure/azure-sdk-for-python/issues/31258) where async `BlobCheckpointStore` was reprocessing old events after an error.
## 5.11.3 (2023-07-12)
### Bugs Fixed
- Fixed the error `end frame received on invalid channel` which was raised when a disconnect was sent by the service ([#30860](https://github.com/Azure/azure-sdk-for-python/pull/30860))
- Fixed the error `link already closed` which was raised when the client was closing and disconnecting from the service ([#30836](https://github.com/Azure/azure-sdk-for-python/pull/30836))
### Other Changes
- Updated tracing ([#29934](https://github.com/Azure/azure-sdk-for-python/pull/29934)):
  - Span names renamed:
    - `Azure.EventHubs.send` to `EventHubs.send`
    - `Azure.EventHubs.message` to `EventHubs.message`
    - `Azure.EventHubs.process` to `EventHubs.process`
  - An `EventHubs.receive` span will be created upon receiving events.
  - Additional attributes added to spans:
    - `messaging.system` - messaging system (i.e., `eventhubs`)
    - `messaging.operation` - type of operation (i.e., `publish`, `receive`, or `process`)
    - `messaging.batch.message_count` - number of messages sent, received, or processed (if more than one)
  - The `component` attribute was removed from all spans.
  - All `send` spans now contain links to `message` spans. `message` spans will no longer contain a link to the `send` span.
  - Message application properties will now contain values for `traceparent` (and `tracestate`, if applicable).
  - Process spans will now be direct children of message span contexts when events are handled on a per-message basis. ([#30537](https://github.com/Azure/azure-sdk-for-python/pull/30537))
## 5.11.2 (2023-03-20)
### Bugs Fixed
- Fixed a bug that would prevent reconnect after a long idle period or a network drop (issue #28996)
## 5.11.1 (2023-01-25)
### Bugs Fixed
- Fixed a bug where, when `websocket-client` was not installed, the error was not caught/raised properly (issue #28453).
## 5.11.0 (2023-01-19)
Version 5.11.0 is our first stable release of the Azure Event Hubs client library based on a pure-Python AMQP stack.
### Features Added
- A new boolean keyword argument `uamqp_transport` has been added to sync and async `EventHubProducerClient`/`EventHubConsumerClient` constructors which indicates whether to use the `uamqp` library or the default pure Python AMQP library as the underlying transport.
### Bugs Fixed
- Fixed a bug that caused an error when sending batches with tracing enabled (issue #27986).
- Fixed a bug where `EventHubSharedKeyCredential` returned an `AccessToken.token` of type `bytes` and not `str`, now matching the documentation.
### Other Changes
- The `message` attribute on `EventData`/`EventDataBatch`, which previously exposed the `uamqp.Message`, has been deprecated.
- `LegacyMessage`/`LegacyBatchMessage` objects returned by the `message` attribute on `EventData`/`EventDataBatch` have been introduced to help facilitate the transition.
- Removed uAMQP from required dependencies.
- Adding `uamqp >= 1.6.3` as an optional dependency for use with the `uamqp_transport` keyword.
- Added support for Python 3.11.
## 5.8.0b2 (2022-10-11)
### Features Added
- Updated the optional dependency for async transport using AMQP over WebSocket from `websocket-client` to `aiohttp` (Issue #24315, thanks @hansmbakker for the suggestion).
## 5.8.0b1 (2022-09-22)
This version and all future versions will require Python 3.7+. Python 3.6 is no longer supported.
### Other Changes
- Added the `uamqp_transport` optional parameter to the clients, to allow switching to the `uamqp` library as the transport.
## 5.10.1 (2022-08-22)
This version and all future versions will require Python 3.7+. Python 3.6 is no longer supported.
### Bugs Fixed
- Fixed a bug in async `BufferedProducer` that would block when flushing the queue causing the client to freeze up (issue #23510).
- Fixed a bug in the async `EventHubProducerClient` and `EventHubConsumerClient` that set the default value of the `transport_type` parameter in the `from_connection_string` methods to `None` rather than `TransportType.Amqp`.
### Other Changes
- Internal refactoring to support upcoming Pure Python AMQP-based release.
- Updated uAMQP dependency to 1.6.0.
## 5.8.0a5 (2022-07-19)
### Bugs Fixed
- Fixed bug that prevented token refresh at regular intervals.
- Fixed bug that was improperly passing the debug keyword argument, so that network trace debug logs are output when requested.
### Other Changes
- Added logging to track token refreshes and fetches, and to output the exception reason for producer init failures.
## 5.10.0 (2022-06-08)
### Features Added
- Includes the following features related to buffered sending of events:
  - A new method `send_event` to `EventHubProducerClient` which allows sending single `EventData` or `AmqpAnnotatedMessage`.
  - Buffered mode sending to `EventHubProducerClient` which is intended to allow for efficient publishing of events
  without having to explicitly manage batches in the application.
  - The constructor of `EventHubProducerClient` and the `from_connection_string` method take the following new keyword arguments
  for configuration:
    - `buffered_mode`: The flag to enable/disable buffered mode sending.
    - `on_success`: The callback to be called once events have been successfully published.
    - `on_error`: The callback to be called once events have failed to be published.
    - `max_buffer_length`: The total number of events per partition that can be buffered before a flush will be triggered.
    - `max_wait_time`: The amount of time to wait for a batch to be built with events in the buffer before publishing.
  - A new method `EventHubProducerClient.flush` which flushes events in the buffer to be sent immediately.
  - A new method `EventHubProducerClient.get_buffered_event_count` which returns the number of events that are buffered and waiting to be published for a given partition.
  - A new property `EventHubProducerClient.total_buffered_event_count` which returns the total number of events that are currently buffered and waiting to be published, across all partitions.
  - A new boolean keyword argument `flush` to `EventHubProducerClient.close` which indicates whether to flush the buffer or not while closing.
## 5.8.0a4 (2022-06-07)
### Features Added
- Added support for connection using websocket and http proxy.
- Added support for custom endpoint connection over websocket.
## 5.9.0 (2022-05-10)
### Features Added
- The classmethod `from_message_content` has been added to `EventData` for interoperability with the Schema Registry Avro Encoder library, and takes `content` and `content_type` as positional parameters.
### Other Changes
- Features related to buffered sending of events are still in beta and will not be included in this release.
## 5.9.0b3 (2022-04-20)
### Features Added
- Introduced new method `send_event` to `EventHubProducerClient` which allows sending single `EventData` or `AmqpAnnotatedMessage`.
- Introduced buffered mode sending to `EventHubProducerClient` which is intended to allow for efficient publishing of events
without having to explicitly manage batches in the application.
- The constructor of `EventHubProducerClient` and the `from_connection_string` method now take the following new keyword arguments
  for configuration:
  - `buffered_mode`: The flag to enable/disable buffered mode sending.
  - `on_success`: The callback to be called once events have been successfully published.
  - `on_error`: The callback to be called once events have failed to be published.
  - `max_buffer_length`: The total number of events per partition that can be buffered before a flush will be triggered.
  - `max_wait_time`: The amount of time to wait for a batch to be built with events in the buffer before publishing.
- Introduced new method `EventHubProducerClient.flush` which flushes events in the buffer to be sent immediately.
- Introduced new method `EventHubProducerClient.get_buffered_event_count` which returns the number of events that are buffered and waiting to be published for a given partition.
- Introduced new property `EventHubProducerClient.total_buffered_event_count` which returns the total number of events that are currently buffered and waiting to be published, across all partitions.
- Introduced new boolean keyword argument `flush` to `EventHubProducerClient.close` which indicates whether to flush the buffer or not while closing.
### Other Changes
- Updated `EventData` internals for interoperability with the Schema Registry Avro Encoder library.
## 5.9.0b2 (2022-03-09)
### Breaking Changes
- `from_message_data` on `EventData` has been renamed `from_message_content` for interoperability with the Schema Registry Avro Encoder library. The `data` parameter has been renamed to `content`.
## 5.8.0a3 (2022-03-08)
### Other Changes
- Improved the performance of async sending and receiving.
## 5.9.0b1 (2022-02-09)
- The following features have been temporarily pulled out of async `EventHubProducerClient` and `EventHubConsumerClient` which will be added back in future previews as we work towards a stable release:
- Passing the following keyword arguments to the constructors and `from_connection_string` methods of the `EventHubProducerClient` and `EventHubConsumerClient` is not supported: `transport_type`, `http_proxy`, `custom_endpoint_address`, and `connection_verify`.
## 5.8.0a2 (2022-02-09)
### Features Added
- Added support for async `EventHubProducerClient` and `EventHubConsumerClient`.
## 5.8.0a1 (2022-01-13)
Version 5.8.0a1 is our first effort to build an Azure Event Hubs client library based on a pure-Python AMQP stack.
### Breaking changes
- The following features have been temporarily pulled out which will be added back in future previews as we work towards a stable release:
- Async is not supported.
- Passing the following keyword arguments to the constructors and `from_connection_string` methods of the `EventHubProducerClient` and `EventHubConsumerClient` is not supported: `transport_type`, `http_proxy`, `custom_endpoint_address`, and `connection_verify`.
### Other Changes
- uAMQP dependency is removed.
## 5.7.0 (2022-01-12)
This version and all future versions will require Python 3.6+. Python 2.7 is no longer supported.
### Features Added
- Added support for fixed (linear) retry backoff:
- Sync/async `EventHubProducerClient` and `EventHubConsumerClient` constructors and `from_connection_string` take `retry_mode` as a keyword argument.
### Bugs Fixed
- Fixed a bug so that a previously closed `EventHubProducerClient` can be reopened for sending events instead of raising a `KeyError` (issue #21849).
### Other Changes
- Improved token refresh timing to prevent potentially blocking the main flow when the token is about to expire.
- Updated uAMQP dependency to 1.5.1.
## 5.6.1 (2021-10-06)
### Bugs Fixed
- Fixed a bug for checking that `azure.eventhub.amqp.AmqpMessageHeader` and `azure.eventhub.amqp.AmqpMessageProperties` contain specific properties using the `in` keyword.
### Other Changes
- Updated uAMQP dependency to 1.4.3.
- Added support for Python 3.10.
- Fixed memory leak in win32 socketio and tlsio (issue #19777).
- Fixed memory leak in the process of converting AMQPValue into string (issue #19777).
## 5.6.0 (2021-07-07)
### Features Added
- Added support for sending AMQP annotated message which allows full access to the AMQP message fields.
- Introduced new namespace `azure.eventhub.amqp`.
- Added new enum class `azure.eventhub.amqp.AmqpMessageBodyType` to represent the body type of the message which includes:
- `DATA`: The body of message consists of one or more data sections and each section contains opaque binary data.
- `SEQUENCE`: The body of message consists of one or more sequence sections and each section contains an arbitrary number of structured data elements.
- `VALUE`: The body of message consists of one amqp-value section and the section contains a single AMQP value.
- Introduced new class `azure.eventhub.amqp.AmqpAnnotatedMessage` for accessing low-level amqp message sections which can be instantiated for sending.
- Introduced new classes `azure.eventhub.amqp.AmqpMessageHeader` and `azure.eventhub.amqp.AmqpMessageProperties` for accessing amqp header and properties.
- Added new property `body_type` on `azure.eventhub.EventData` which returns `azure.eventhub.amqp.AmqpMessageBodyType`.
- Added new read-only property `raw_amqp_message` on `azure.eventhub.EventData` which returns `azure.eventhub.amqp.AmqpAnnotatedMessage`.
### Fixed
- Updated uAMQP dependency to 1.4.1.
- Fixed a bug so that the attributes `creation_time`, `absolute_expiry_time`, and `group_sequence` on `MessageProperties` are compatible with integer types on Python 2.7.
## 5.5.0 (2021-05-13)
**New Features**
- Added support for using `azure.core.credentials.AzureNamedKeyCredential` as credential for authenticating producer and consumer clients.
**Bug Fixes**
- Fixed a bug so that a custom user agent string is put in front of the built-in user agent string instead of being appended.
- Updated uAMQP dependency to 1.4.0.
- Fixed memory leaks in the process of link attach where source and target cython objects are not properly deallocated (#15747).
- Improved the management operation callback to not parse description values of non-`AMQP_TYPE_STRING` types as strings (#18361).
**Notes**
- Updated azure-core dependency to 1.14.0.
## 5.4.0 (2021-04-07)
This version follows from version 5.3.1, rather than 5.4.0b1 so that the preview idempotent producer feature is not included.
**New Features**
- Added support for using `azure.core.credentials.AzureSasCredential` as credential for authenticating producer and consumer clients.
- Updated `list_ownership`, `claim_ownership`, `update_checkpoint`, `list_checkpoints` on sync and async `CheckpointStore` to support taking `**kwargs`.
- WARNING: Implementing a custom checkpointstore that does not support taking `**kwargs` in the methods listed previously will result in the following pylint error: `W0221: Parameters differ from overridden ________ method (arguments-differ)`.
- Updated `update_checkpoint` on sync and async `PartitionContext` to support taking `**kwargs`.
**Bug Fixes**
- Updated uAMQP dependency to 1.3.0.
- Fixed a bug where sending a message of large size triggered a segmentation fault when the underlying socket connection was lost (#13739, #14543).
- Fixed a bug in link flow control so that link credit and delivery count are calculated per message instead of per transfer frame (#16934).
**Notes**
- Updated azure-core dependency to 1.13.0.
## 5.4.0b1 (2021-03-09)
This version and all future versions will require Python 2.7 or Python 3.6+. Python 3.5 is no longer supported.
**New Features**
- Added support for idempotent publishing, which is supported by the service to reduce the number of duplicate
events that are published.
- The `EventHubProducerClient` constructor accepts two new parameters for idempotent publishing:
  - `enable_idempotent_partitions`: A boolean value to tell the `EventHubProducerClient` whether to enable idempotency.
  - `partition_config`: The set of configurations that can be specified to influence publishing behavior
  specific to the configured Event Hub partition.
- Introduced a new method `get_partition_publishing_properties` on `EventHubProducerClient` to inspect the information
about the state of publishing for a partition.
- Introduced a new property `published_sequence_number` on `EventData` to get the publishing sequence number assigned
to the event at the time it was successfully published.
- Introduced a new property `starting_published_sequence_number` on `EventDataBatch` to get the publishing sequence
number assigned to the first event in the batch at the time the batch was successfully published.
- Introduced a new class `azure.eventhub.PartitionPublishingConfiguration` which is a set of configurations that can be
specified to influence the behavior when publishing directly to an Event Hub partition.
**Notes**
- Updated uAMQP dependency to 1.2.15.
## 5.3.1 (2021-03-09)
This version will be the last version to officially support Python 3.5, future versions will require Python 2.7 or Python 3.6+.
**Bug fixes**
- Sending an empty `EventDataBatch` is now a no-op instead of raising an error.
## 5.3.0 (2021-02-08)
**New Features**
- Added a `parse_connection_string` method which parses a connection string into a properties bag, `EventHubConnectionStringProperties`, containing its component parts.
- The constructor and `from_connection_string` method of `EventHubConsumerClient` and `EventHubProducerClient` now accept two new optional arguments:
- `custom_endpoint_address` which allows for specifying a custom endpoint to use when communicating with the Event Hubs service,
and is useful when your network does not allow communicating to the standard Event Hubs endpoint.
- `connection_verify` which allows for specifying the path to the custom CA_BUNDLE file of the SSL certificate which is used to authenticate
the identity of the connection endpoint.
**Notes**
- Updated uAMQP dependency to 1.2.14.
## 5.2.1 (2021-01-11)
**Bug fixes**
- Updated `azure.eventhub.extension.__init__.py` to be compatible with pkgutil-style namespace (PR #13210, thanks @pjachowi).
- Updated uAMQP dependency to 1.2.13
- Added support for Python 3.9.
- Fixed a bug where macOS was unable to detect network errors (#15473).
- Fixed a bug where `uamqp.ReceiveClient` and `uamqp.ReceiveClientAsync` received messages during connection establishment (#15555).
- Fixed a bug where connection establishment on macOS with Clang 12 triggered an unrecognized selector exception (#15567).
- Fixed a bug where accessing message properties triggered a segmentation fault when the underlying C bytes are NULL (#15568).
## 5.2.0 (2020-09-08)
**New Features**
- Connection strings used with `from_connection_string` methods now support using the `SharedAccessSignature` key in lieu of `sharedaccesskey` and `sharedaccesskeyname`, taking the string of the properly constructed token as its value.
## 5.2.0b1 (2020-07-06)
**New Features**
- `EventHubConsumerClient` constructor accepts two new parameters for the load balancer:
  - `load_balancing_strategy`, which can be "greedy" or "balanced".
  With the greedy strategy, one execution of load balancing will claim as many partitions as required to balance the load,
  whereas with the balanced strategy one execution of load balancing will claim at most 1 partition.
  - `partition_ownership_expiration_interval`, which allows you to customize the partition ownership expiration for load balancing.
  A consumer client may lose its owned partitions more often with a smaller expiration interval, but a larger interval
  may result in idle partitions not being claimed for a longer time.
- Added enum class `azure.eventhub.LoadBalancingStrategy` for `load_balancing_strategy`.
## 5.1.0 (2020-05-04)
**New Features**
- `EventHubProducerClient.send_batch` accepts either an `EventDataBatch` or a finite list of `EventData`. #9181
- Added enqueueTime to span links of distributed tracing. #9599
**Bug fixes**
- Fixed a bug that turned `azure.eventhub.EventHubConsumerClient` into an exclusive receiver when it had no checkpoint store. #11181
- Updated uAMQP dependency to 1.2.7.
- Fixed a bug in setting the certificate of tlsio on macOS. #7201
- Fixed a bug that caused a segmentation fault in network tracing on macOS when setting `logging_enable` to `True` in `EventHubConsumerClient` and `EventHubProducerClient`.
## 5.1.0b1 (2020-04-06)
**New Features**
- Added `EventHubConsumerClient.receive_batch()` to receive and process events in batches instead of one by one. #9184
- `EventHubConsumerClient.receive()` has a new param `max_wait_time`.
`on_event` is called every `max_wait_time` when no events are received and `max_wait_time` is not `None` or 0.
- The `event` param of `PartitionContext.update_checkpoint` is now optional. The last received event is used when the param is not passed in.
- `EventData.system_properties` has added missing properties when consuming messages from IoT Hub. #10408
## 5.0.1 (2020-03-09)
**Bug fixes**
- Fixed a bug that swallowed errors when receiving events with `azure.eventhub.EventHubConsumerClient` #9660
- Fixed a bug that caused `get_eventhub_properties`, `get_partition_ids`, and `get_partition_properties` to raise
an error on Azure Stack #9920
## 5.0.0 (2020-01-13)
**Breaking changes**
- `EventData`
- Removed deprecated property `application_properties` and deprecated method `encode_message()`.
- `EventHubConsumerClient`
- `on_error` would be called when `EventHubConsumerClient` failed to claim ownership of partitions.
- `on_partition_close` and `on_partition_initialize` would be called in the case of exceptions raised by `on_event` callback.
- `EventHubConsumerClient` would close and re-open the internal partition receiver in this case.
- Default starting position from where `EventHubConsumerClient` should resume receiving after recovering from an error has been re-prioritized.
- If there is checkpoint, it will resume from the checkpoint.
- If there is no checkpoint but `starting_position` is provided, it will resume from `starting_position`.
- If there is no checkpoint or `starting_position`, it will resume from the latest position.
- `PartitionContext`
- `update_checkpoint` would do an in-memory checkpoint instead of doing nothing when a checkpoint store is not explicitly provided.
- The in-memory checkpoints would be used for `EventHubConsumerClient` receive recovery.
- `get_partition_ids`, `get_partition_properties`, `get_eventhub_properties` would raise an error when the service returns an error status code.
- `AuthenticationError` would be raised when the service returns error code 401.
- `ConnectError` would be raised when the service returns error code 404.
- `EventHubError` would be raised when the service returns other error codes.
## 5.0.0b6 (2019-12-03)
**Breaking changes**
- All exceptions should now be imported from `azure.eventhub.exceptions`.
- Introduced separate `EventHubSharedKeyCredential` objects for synchronous and asynchronous operations.
For async, import the credentials object from the `azure.eventhub.aio` namespace.
- `EventData`
- Renamed property `application_properties` to `properties`.
- `EventData` no longer has attribute `last_enqueued_event_properties` - use this on `PartitionContext` instead.
- `EventDataBatch`
- `EventDataBatch.try_add` has been renamed to `EventDataBatch.add`.
- Renamed property `size` to `size_in_bytes`.
- Renamed attribute `max_size` to `max_size_in_bytes`.
- `EventHubConsumerClient` and `EventHubProducerClient`
- Renamed method `get_properties` to `get_eventhub_properties`.
- Renamed parameters in constructor: `host` to `fully_qualified_namespace`, `event_hub_path` to `eventhub_name`.
- Renamed parameters in `get_partition_properties`: `partition` to `partition_id`.
- Renamed parameter `consumer_group_name` to `consumer_group` and moved that parameter from `receive` method to the constructor of `EventHubConsumerClient`.
- Renamed parameter `initial_event_position` to `starting_position` on the `receive` method of `EventHubConsumerClient`.
- Renamed parameter `event_hub_path` to `eventhub_name` in constructor and `from_connection_string` method of the client object.
- `EventHubProducerClient.send` has been renamed to `send_batch` which will only accept `EventDataBatch` object as input.
- `EventHubProducerClient.create_batch` now also takes the `partition_id` and `partition_key` as optional parameters (which are no longer specified at send).
- Renamed module `PartitionManager` to `CheckpointStore`.
- Receive event callback parameter has been renamed to `on_event` and now operates on a single event rather than a list of events.
- Removed class `EventPosition`.
- The `starting_position` parameter of the `receive` method accepts offset (`str`), sequence number (`int`), datetime (`datetime.datetime`), or a `dict` of these types.
- The `starting_position_inclusive` parameter of the `receive` method accepts `bool` or `dict` indicating whether the given event position is inclusive or not.
- `PartitionContext` no longer has attribute `owner_id`.
- `PartitionContext` now has attribute `last_enqueued_event_properties` which is populated if `track_last_enqueued_event_properties` is set to `True` in the `receive` method.
**New features**
- Added a new parameter `idle_timeout` to the constructors and `from_connection_string` methods of `EventHubConsumerClient` and `EventHubProducerClient`,
after which the underlying connection will close if there is no further activity.
## 5.0.0b5 (2019-11-04)
**Breaking changes**
- `EventHubClient`, `EventHubConsumer` and `EventHubProducer` have been removed. Use `EventHubProducerClient` and `EventHubConsumerClient` instead.
- Construction of both objects is the same as it was for the previous client.
- Introduced `EventHubProducerClient` as a substitution for `EventHubProducer`.
- `EventHubProducerClient` supports sending events to different partitions.
- Introduced `EventHubConsumerClient` as a substitution for `EventHubConsumer`.
- `EventHubConsumerClient` supports receiving events from single/all partitions.
- There are no longer methods which directly return `EventData`, all receiving is done via callback method: `on_events`.
- `EventHubConsumerClient` has taken on the responsibility of `EventProcessor`.
- `EventHubConsumerClient` now accepts `PartitionManager` to do load-balancing and checkpoint.
- Replaced `PartitionProcessor` by four independent callback methods accepted by the `receive` method on `EventHubConsumerClient`.
- `on_events(partition_context, events)` called when events are received.
- `on_error(partition_context, exception)` called when errors occur.
- `on_partition_initialize(partition_context)` called when a partition consumer is opened.
- `on_partition_close(partition_context, reason)` called when a partition consumer is closed.
- Some modules and classes that were importable from several different places have been removed:
- `azure.eventhub.common` has been removed. Import from `azure.eventhub` instead.
- `azure.eventhub.client_abstract` has been removed. Use `azure.eventhub.EventHubProducerClient` or `azure.eventhub.EventHubConsumerClient` instead.
- `azure.eventhub.client` has been removed. Use `azure.eventhub.EventHubProducerClient` or `azure.eventhub.EventHubConsumerClient` instead.
- `azure.eventhub.producer` has been removed. Use `azure.eventhub.EventHubProducerClient` instead.
- `azure.eventhub.consumer` has been removed. Use `azure.eventhub.EventHubConsumerClient` instead.
- `azure.eventhub.aio.client_async` has been removed. Use `azure.eventhub.aio.EventHubProducerClient` or `azure.eventhub.aio.EventHubConsumerClient` instead.
- `azure.eventhub.aio.producer_async` has been removed. Use `azure.eventhub.aio.EventHubProducerClient` instead.
- `azure.eventhub.aio.consumer_async` has been removed. Use `azure.eventhub.aio.EventHubConsumerClient` instead.
- `azure.eventhub.aio.event_processor.event_processor` has been removed. Use `azure.eventhub.aio.EventHubConsumerClient` instead.
- `azure.eventhub.aio.event_processor.partition_processor` has been removed. Use callback methods instead.
- `azure.eventhub.aio.event_processor.partition_manager` has been removed. Import from `azure.eventhub.aio` instead.
- `azure.eventhub.aio.event_processor.partition_context` has been removed. Import from `azure.eventhub.aio` instead.
- `azure.eventhub.aio.event_processor.sample_partition_manager` has been removed.
**Bug fixes**
- Fixed bug in user-agent string not being parsed.
## 5.0.0b4 (2019-10-08)
**New features**
- Added support for tracing (issue #7153).
- Added the capability of tracking last enqueued event properties of the partition to `EventHubConsumer`.
- Added a new boolean-type parameter `track_last_enqueued_event_properties` in method `EventHubClient.create_consumer()`.
- Added a new property `last_enqueued_event_properties` of `EventHubConsumer` which contains sequence_number, offset, enqueued_time and retrieval_time information.
- By default the capability is disabled, as it will cost extra bandwidth for transferring more information if turned on.
**Breaking changes**
- Removed support for IoT Hub direct connection.
- [EventHubs compatible connection string](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-messages-read-builtin) of an IotHub can be used to create `EventHubClient` and read properties or events from an IoT Hub.
- Removed support for sending EventData to IoT Hub.
- Removed parameter `exception` in method `close()` of `EventHubConsumer` and `EventHubProducer`.
- Updated uAMQP dependency to 1.2.3.
## 5.0.0b3 (2019-09-10)
**New features**
- Added support for automatic load balancing among multiple `EventProcessor` instances.
- Added `BlobPartitionManager` which implements `PartitionManager`.
  - Azure Blob Storage is used for storing data needed by `EventProcessor`.
  - Packaged separately as a plug-in to `EventProcessor`.
  - For details, please refer to [Azure Blob Storage Partition Manager](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/eventhub/azure-eventhub-checkpointstoreblob-aio). A sketch using the current checkpoint-store API follows this list.
- Added property `system_properties` on `EventData`.
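`PartitionManager`/`BlobPartitionManager` were later renamed to `CheckpointStore`/`BlobCheckpointStore`. As a rough sketch of the same load-balancing and checkpointing flow with the current packages (`azure-eventhub` plus `azure-eventhub-checkpointstoreblob-aio`); connection strings and names below are placeholders.

```python
import asyncio

from azure.eventhub.aio import EventHubConsumerClient
from azure.eventhub.extensions.checkpointstoreblobaio import BlobCheckpointStore

EVENTHUB_CONN_STR = "<< EVENT HUBS CONNECTION STRING >>"
STORAGE_CONN_STR = "<< AZURE STORAGE CONNECTION STRING >>"

async def on_event(partition_context, event):
    # Persist progress; the store is also used for partition ownership (load balancing).
    await partition_context.update_checkpoint(event)

async def main():
    checkpoint_store = BlobCheckpointStore.from_connection_string(
        STORAGE_CONN_STR, container_name="<< BLOB CONTAINER NAME >>"
    )
    client = EventHubConsumerClient.from_connection_string(
        EVENTHUB_CONN_STR,
        consumer_group="$Default",
        eventhub_name="<< NAME OF THE EVENT HUB >>",
        checkpoint_store=checkpoint_store,
    )
    async with client:
        await client.receive(on_event=on_event, starting_position="-1")

if __name__ == "__main__":
    asyncio.run(main())
```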
**Breaking changes**
- Removed constructor method of `PartitionProcessor`. For initialization please implement the method `initialize`.
- Replaced `CheckpointManager` by `PartitionContext`.
- `PartitionContext` has partition context information and method `update_checkpoint`.
- Updated all methods of `PartitionProcessor` to include `PartitionContext` as part of the arguments.
- Updated accessibility of class members in `EventHub/EventHubConsumer/EventHubProducer` to be private.
- Moved `azure.eventhub.eventprocessor` under `aio` package, which now becomes `azure.eventhub.aio.eventprocessor`.
## 5.0.0b2 (2019-08-06)
**New features**
- Added method `create_batch` on the `EventHubProducer` to create an `EventDataBatch` that can then be used to add events until the maximum size is reached (see the sketch after this list).
  - This batch object can then be passed to the `send()` method to send all the added events to Event Hubs.
  - This allows publishers to build batches without the risk of a message-size-exceeded error when sending events.
  - It also allows publishers with bandwidth concerns to control the size of each batch published.
- Added new configuration parameters for exponential delay between retry operations.
- `retry_total`: The total number of attempts to redo the failed operation.
- `backoff_factor`: The delay time factor.
- `backoff_max`: The maximum delay time in total.
- Added support for context manager on `EventHubClient`.
- Added new error type `OperationTimeoutError` for send operation.
- Introduced a new class `EventProcessor` which replaces the older concept of [Event Processor Host](https://docs.microsoft.com/azure/event-hubs/event-hubs-event-processor-host). This early preview is intended to allow users to test the new design using a single instance of `EventProcessor`. The ability to checkpoint to a durable store will be added in future updates.
- `EventProcessor`: EventProcessor creates and runs consumers for all partitions of the eventhub.
- `PartitionManager`: PartitionManager defines the interface for getting/claiming ownerships of partitions and updating checkpoints.
- `PartitionProcessor`: PartitionProcessor defines the interface for processing events.
- `CheckpointManager`: CheckpointManager takes responsibility for updating checkpoints during events processing.
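The batch-until-full pattern this preview introduced carries over to the current stable client, where the batch is sent with `send_batch` rather than `send`. A minimal sketch against today's API; the connection string and Event Hub name are placeholders.

```python
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    "<< EVENT HUBS CONNECTION STRING >>",
    eventhub_name="<< NAME OF THE EVENT HUB >>",
)

with producer:
    batch = producer.create_batch()  # enforces the service's maximum batch size
    for i in range(1000):
        try:
            batch.add(EventData("message {}".format(i)))
        except ValueError:  # batch is full: send it and start a new one
            producer.send_batch(batch)
            batch = producer.create_batch()
            batch.add(EventData("message {}".format(i)))
    producer.send_batch(batch)  # send whatever remains
```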
**Breaking changes**
- `EventProcessorHost` was replaced by `EventProcessor`; please see the new features above for details.
- Replaced `max_retries` configuration parameter of the EventHubClient with `retry_total`.
## 5.0.0b1 (2019-06-25)
Version 5.0.0b1 is a preview of our efforts to create a client library that is user friendly and idiomatic to the Python ecosystem. The reasons for most of the changes in this update can be found in the [Azure SDK Design Guidelines for Python](https://azuresdkspecs.z5.web.core.windows.net/PythonSpec.html). For more information, please visit https://aka.ms/azure-sdk-preview1-python.
**New features**
- Added new configuration parameters for creating EventHubClient.
  - `credential`: The credential object used for authentication, which implements the `TokenCredential` interface for getting tokens.
  - `transport_type`: The type of transport protocol that will be used for communicating with the Event Hubs service.
  - `max_retries`: The maximum number of attempts to redo the failed operation when an error occurs.
  - For detailed information about the configuration parameters, please read the reference documentation.
- Added new methods `get_partition_properties` and `get_partition_ids` to EventHubClient.
- Added support for http proxy.
- Added support for authentication using an azure-identity credential (illustrated in the sketch after this list).
- Added support for transport using AMQP over WebSocket.
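The preview `EventHubClient` constructor differs from today's clients, but the same options survive in the stable API. A rough sketch of azure-identity authentication, AMQP over WebSocket, and partition inspection with the current `EventHubConsumerClient`; namespace and Event Hub names are placeholders.

```python
from azure.eventhub import EventHubConsumerClient, TransportType
from azure.identity import DefaultAzureCredential

client = EventHubConsumerClient(
    fully_qualified_namespace="<< NAMESPACE >>.servicebus.windows.net",
    eventhub_name="<< NAME OF THE EVENT HUB >>",
    consumer_group="$Default",
    credential=DefaultAzureCredential(),             # any azure-identity TokenCredential
    transport_type=TransportType.AmqpOverWebsocket,  # AMQP over WebSocket
    # http_proxy={"proxy_hostname": "<< PROXY HOST >>", "proxy_port": 3128},  # optional proxy
)

with client:
    print(client.get_partition_ids())
    print(client.get_partition_properties("0"))
```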
**Breaking changes**
- New error hierarchy
- `azure.error.EventHubError`
- `azure.error.ConnectionLostError`
- `azure.error.ConnectError`
- `azure.error.AuthenticationError`
- `azure.error.EventDataError`
- `azure.error.EventDataSendError`
- Renamed Sender/Receiver to EventHubProducer/EventHubConsumer.
- Renamed `add_sender` to `create_producer` and `add_receiver` to `create_consumer` in EventHubClient.
- EventHubConsumer is now iterable.
- Renamed class `azure.eventhub.Offset` to `azure.eventhub.EventPosition`.
- Renamed method `get_eventhub_info` to `get_properties` on EventHubClient.
- Reorganized connection management, EventHubClient is no longer responsible for opening/closing EventHubProducer/EventHubConsumer.
- Each EventHubProducer/EventHubConsumer is responsible for its own connection management.
- Added support for context manager on EventHubProducer and EventHubConsumer.
- Reorganized async APIs into the "azure.eventhub.aio" namespace and renamed them to drop the "_async" suffix.
- Updated uAMQP dependency to 1.2.
## 1.3.1 (2019-02-28)
**BugFixes**
- Fixed bug where datetime offset filter was using a local timestamp rather than UTC.
- Fixed stackoverflow error in continuous connection reconnect attempts.
## 1.3.0 (2019-01-29)
**BugFixes**
- Added support for auto reconnect on token expiration and other auth errors (issue #89).
**Features**
- Added ability to create ServiceBusClient from an existing SAS auth token, including
providing a function to auto-renew that token on expiry.
- Added support for storing a custom EPH context value in checkpoint (PR #84, thanks @konstantinmiller)
## 1.2.0 (2018-11-29)
- Support for Python 2.7 in azure.eventhub module (azure.eventprocessorhost will not support Python 2.7).
- Parse EventData.enqueued_time as a UTC timestamp (issue #72, thanks @vjrantal)
## 1.1.1 (2018-10-03)
- Fixed bug in Azure namespace package.
## 1.1.0 (2018-09-21)
- Changes to `AzureStorageCheckpointLeaseManager` parameters to support other connection options (issue #61; see the sketch after this list):
- The `storage_account_name`, `storage_account_key` and `lease_container_name` arguments are now optional keyword arguments.
- Added a `sas_token` argument that must be specified with `storage_account_name` in place of `storage_account_key`.
- Added an `endpoint_suffix` argument to support storage endpoints in National Clouds.
- Added a `connection_string` argument that, if specified, overrides all other endpoint arguments.
- The `lease_container_name` argument now defaults to `"eph-leases"` if not specified.
- Fix for clients failing to start if `run` is called multiple times (issue #64).
- Added convenience methods `body_as_str` and `body_as_json` to EventData object for easier processing of message data.
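A hypothetical sketch of constructing `AzureStorageCheckpointLeaseManager` with the new arguments; only the keyword names listed above come from this entry, while the import path and the example values are assumptions about the 1.x `azure-eventhub` package.

```python
from azure.eventprocessorhost import AzureStorageCheckpointLeaseManager

# Option 1: SAS token in place of the account key; endpoint_suffix for national clouds.
lease_manager = AzureStorageCheckpointLeaseManager(
    storage_account_name="<< STORAGE ACCOUNT NAME >>",
    sas_token="<< SAS TOKEN >>",
    lease_container_name="eph-leases",        # also the default when omitted
    endpoint_suffix="core.chinacloudapi.cn",  # example national-cloud suffix
)

# Option 2: a connection string, which overrides all other endpoint arguments.
lease_manager = AzureStorageCheckpointLeaseManager(
    connection_string="<< STORAGE CONNECTION STRING >>",
)
```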
## 1.0.0 (2018-08-22)
- API stable.
- Renamed internal `_async` module to `async_ops` for docs generation.
- Added optional `auth_timeout` parameter to `EventHubClient` and `EventHubClientAsync` to configure how long to allow for token
negotiation to complete. Default is 60 seconds.
- Added optional `send_timeout` parameter to `EventHubClient.add_sender` and `EventHubClientAsync.add_async_sender` to determine the
timeout for Events to be successfully sent. Default value is 60 seconds.
- Reformatted logging for performance.
## 0.2.0 (2018-08-06)
- Stability improvements for EPH.
- Updated uAMQP version.
- Added new configuration options for Sender and Receiver: `keep_alive` and `auto_reconnect`.
These flags have been added to the following:
- `EventHubClient.add_receiver`
- `EventHubClient.add_sender`
- `EventHubClientAsync.add_async_receiver`
- `EventHubClientAsync.add_async_sender`
- `EPHOptions.keep_alive_interval`
- `EPHOptions.auto_reconnect_on_error`
## 0.2.0rc2 (2018-07-29)
- **Breaking change** `EventData.offset` will now return an object of type `~uamqp.common.Offset` rather than str.
The original string value can be retrieved from `~uamqp.common.Offset.value`.
- Each sender/receiver will now run in its own independent connection.
- Updated uAMQP dependency to 0.2.0
- Fixed issue with IoTHub clients not being able to retrieve partition information.
- Added support for HTTP proxy settings to both EventHubClient and EPH.
- Added error handling policy to automatically reconnect on retryable error.
- Added keep-alive thread for maintaining an unused connection.
## 0.2.0rc1 (2018-07-06)
- **Breaking change** Restructured library to support Python 3.7. Submodule `async` has been renamed and all classes from
this module can now be imported from azure.eventhub directly.
- **Breaking change** Removed optional `callback` argument from `Receiver.receive` and `AsyncReceiver.receive`.
  This removes the potential for messages to be processed via callback but not yet returned in the batch.
- **Breaking change** `EventData.properties` has been renamed to `EventData.application_properties`.
- Updated uAMQP dependency to v0.1.0
- Added support for constructing IoTHub connections.
- Fixed memory leak in receive operations.
- Dropped Python 2.7 wheel support.
## 0.2.0b2 (2018-05-29)
- Added `namespace_suffix` to EventHubConfig() to support national clouds.
- Added `device_id` attribute to EventData to support IoT Hub use cases.
- Added message header to work around a service bug for PartitionKey support.
- Updated uAMQP dependency to vRC1.
## 0.2.0b1 (2018-04-20)
- Updated uAMQP to latest version.
- Further testing and minor bug fixes.
## 0.2.0a2 (2018-04-02)
- Updated uAMQP dependency.
Raw data
{
"_id": null,
"home_page": "https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/eventhub/azure-eventhub",
"name": "azure-eventhub",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.8",
"maintainer_email": null,
"keywords": "azure, azure sdk",
"author": "Microsoft Corporation",
"author_email": "azpysdkhelp@microsoft.com",
"download_url": "https://files.pythonhosted.org/packages/65/31/bf73f1f2816dc2b606b06b85b557637e7322bd4cac963d9f720aedb570b0/azure_eventhub-5.12.2.tar.gz",
"platform": null,
"description": "# Azure Event Hubs client library for Python\n\nAzure Event Hubs is a highly scalable publish-subscribe service that can ingest millions of events per second and stream\nthem to multiple consumers. This lets you process and analyze the massive amounts of data produced by your connected\ndevices and applications. Once Event Hubs has collected the data, you can retrieve, transform, and store it by using\nany real-time analytics provider or with batching/storage adapters. If you would like to know more about Azure Event Hubs,\nyou may wish to review: [What is Event Hubs](https://docs.microsoft.com/azure/event-hubs/event-hubs-about)?\n\nThe Azure Event Hubs client library allows for publishing and consuming of Azure Event Hubs events and may be used to:\n\n- Emit telemetry about your application for business intelligence and diagnostic purposes.\n- Publish facts about the state of your application which interested parties may observe and use as a trigger for taking action.\n- Observe interesting operations and interactions happening within your business or other ecosystem, allowing loosely coupled systems to interact without the need to bind them together.\n- Receive events from one or more publishers, transform them to better meet the needs of your ecosystem, then publish the transformed events to a new stream for consumers to observe.\n\n[Source code](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventhub/azure-eventhub/)\n| [Package (PyPi)](https://pypi.org/project/azure-eventhub/)\n| [Package (Conda)](https://anaconda.org/microsoft/azure-eventhub/)\n| [API reference documentation][api_reference]\n| [Product documentation](https://docs.microsoft.com/azure/event-hubs/)\n| [Samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/eventhub/azure-eventhub/samples)\n\n## Getting started\n\n### Prerequisites\n\n- Python 3.8 or later.\n- **Microsoft Azure Subscription:** To use Azure services, including Azure Event Hubs, you'll need a subscription.\nIf you do not have an existing Azure account, you may sign up for a free trial or use your MSDN subscriber benefits when you [create an account](https://azure.microsoft.com/free/).\n\n- **Event Hubs namespace with an Event Hub:** To interact with Azure Event Hubs, you'll also need to have a namespace and Event Hub available.\nIf you are not familiar with creating Azure resources, you may wish to follow the step-by-step guide\nfor [creating an Event Hub using the Azure portal](https://docs.microsoft.com/azure/event-hubs/event-hubs-create).\nThere, you can also find detailed instructions for using the Azure CLI, Azure PowerShell, or Azure Resource Manager (ARM) templates to create an Event Hub.\n\n### Install the package\n\nInstall the Azure Event Hubs client library for Python with pip:\n\n```\n$ pip install azure-eventhub\n```\n\n### Authenticate the client\n\nInteraction with Event Hubs starts with an instance of EventHubConsumerClient or EventHubProducerClient class. 
You need either the host name, SAS/AAD credential and event hub name or a connection string to instantiate the client object.\n\n**[Create client from connection string:](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventhub/azure-eventhub/samples/sync_samples/connection_string_authentication.py)**\n\nFor the Event Hubs client library to interact with an Event Hub, the easiest means is to use a connection string, which is created automatically when creating an Event Hubs namespace.\nIf you aren't familiar with shared access policies in Azure, you may wish to follow the step-by-step guide to [get an Event Hubs connection string](https://docs.microsoft.com/azure/event-hubs/event-hubs-get-connection-string).\n\n- The `from_connection_string` method takes the connection string of the form\n`Endpoint=sb://<yournamespace>.servicebus.windows.net/;SharedAccessKeyName=<yoursharedaccesskeyname>;SharedAccessKey=<yoursharedaccesskey>` and\nentity name to your Event Hub instance. You can get the connection string from the [Azure portal](https://docs.microsoft.com/azure/event-hubs/event-hubs-get-connection-string#get-connection-string-from-the-portal).\n\n**[Create client using the azure-identity library:](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventhub/azure-eventhub/samples/sync_samples/client_identity_authentication.py)**\n\nAlternately, one can use a Credential object to authenticate via AAD with the azure-identity package.\n\n- This constructor demonstrated in the sample linked above takes the host name and entity name of your Event Hub instance and credential that implements the\n[TokenCredential](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/core/azure-core/azure/core/credentials.py)\nprotocol. There are implementations of the `TokenCredential` protocol available in the\n[azure-identity package](https://pypi.org/project/azure-identity/). The host name is of the format `<yournamespace.servicebus.windows.net>`.\n- To use the credential types provided by `azure-identity`, please install the package:\n```pip install azure-identity```\n- Additionally, to use the async API, you must first install an async transport, such as [`aiohttp`](https://pypi.org/project/aiohttp/):\n```pip install aiohttp```\n- When using Azure Active Directory, your principal must be assigned a role which allows access to Event Hubs, such as the\nAzure Event Hubs Data Owner role. For more information about using Azure Active Directory authorization with Event Hubs,\nplease refer to [the associated documentation](https://docs.microsoft.com/azure/event-hubs/authorize-access-azure-active-directory).\n\n## Key concepts\n\n- An **EventHubProducerClient** is a source of telemetry data, diagnostics information, usage logs, or other log data,\nas part of an embedded device solution, a mobile device application, a game title running on a console or other device,\nsome client or server based business solution, or a web site.\n\n- An **EventHubConsumerClient** picks up such information from the Event Hub and processes it. Processing may involve aggregation,\ncomplex computation, and filtering. Processing may also involve distribution or storage of the information in a raw or transformed fashion.\nEvent Hub consumers are often robust and high-scale platform infrastructure parts with built-in analytics capabilities,\nlike Azure Stream Analytics, Apache Spark, or Apache Storm.\n\n- A **partition** is an ordered sequence of events that is held in an Event Hub. 
Azure Event Hubs provides message streaming\nthrough a partitioned consumer pattern in which each consumer only reads a specific subset, or partition, of the message stream.\nAs newer events arrive, they are added to the end of this sequence. The number of partitions is specified at the time an Event Hub is created and cannot be changed.\n\n- A **consumer group** is a view of an entire Event Hub. Consumer groups enable multiple consuming applications to each\nhave a separate view of the event stream, and to read the stream independently at their own pace and from their own position.\nThere can be at most 5 concurrent readers on a partition per consumer group; however it is recommended that there is only\none active consumer for a given partition and consumer group pairing. Each active reader receives all of the events from\nits partition; if there are multiple readers on the same partition, then they will receive duplicate events.\n\nFor more concepts and deeper discussion, see: [Event Hubs Features](https://docs.microsoft.com/azure/event-hubs/event-hubs-features).\nAlso, the concepts for AMQP are well documented in [OASIS Advanced Messaging Queuing Protocol (AMQP) Version 1.0](https://docs.oasis-open.org/amqp/core/v1.0/os/amqp-core-overview-v1.0-os.html).\n\n### Thread safety\n\nWe do not guarantee that the EventHubProducerClient or EventHubConsumerClient are thread-safe. We do not recommend reusing these instances across threads. It is up to the running application to use these classes in a thread-safe manner.\n\nThe data model type, `EventDataBatch` is not thread-safe. It should not be shared across threads nor used concurrently with client methods.\n\n## Examples\n\nThe following sections provide several code snippets covering some of the most common Event Hubs tasks, including:\n\n- [Inspect an Event Hub](#inspect-an-event-hub)\n- [Publish events to an Event Hub](#publish-events-to-an-event-hub)\n- [Consume events from an Event Hub](#consume-events-from-an-event-hub)\n- [Consume events from an Event Hub in batches](#consume-events-from-an-event-hub-in-batches)\n- [Publish events to an Event Hub asynchronously](#publish-events-to-an-event-hub-asynchronously)\n- [Consume events from an Event Hub asynchronously](#consume-events-from-an-event-hub-asynchronously)\n- [Consume events from an Event Hub in batches asynchronously](#consume-events-from-an-event-hub-in-batches-asynchronously)\n- [Consume events and save checkpoints using a checkpoint store](#consume-events-and-save-checkpoints-using-a-checkpoint-store)\n- [Use EventHubConsumerClient to work with IoT Hub](#use-eventhubconsumerclient-to-work-with-iot-hub)\n\n### Inspect an Event Hub\n\nGet the partition ids of an Event Hub.\n\n```python\nimport os\nfrom azure.eventhub import EventHubConsumerClient\nfrom azure.identity import DefaultAzureCredential\n\nFULLY_QUALIFIED_NAMESPACE = os.environ[\"EVENT_HUB_HOSTNAME\"]\nEVENTHUB_NAME = os.environ['EVENT_HUB_NAME']\n\nconsumer_client = EventHubConsumerClient(\n fully_qualified_namespace=FULLY_QUALIFIED_NAMESPACE,\n consumer_group='$Default',\n eventhub_name=EVENTHUB_NAME,\n credential=DefaultAzureCredential(),\n)\n\nwith consumer_client:\n pass # consumer_client is now ready to be used.\n```\n\n### Publish events to an Event Hub\n\nUse the `create_batch` method on `EventHubProducerClient` to create an `EventDataBatch` object which can then be sent using the `send_batch` method.\nEvents may be added to the `EventDataBatch` using the `add` method until the maximum batch size limit in bytes 
has been reached.\n\n<!-- SNIPPET:send.send_event_data_batch -->\n\n```python\ndef send_event_data_batch(producer):\n # Without specifying partition_id or partition_key\n # the events will be distributed to available partitions via round-robin.\n event_data_batch = producer.create_batch()\n event_data_batch.add(EventData('Single message'))\n producer.send_batch(event_data_batch)\n```\n\n<!-- END SNIPPET -->\n\n### Consume events from an Event Hub\n\nThere are multiple ways to consume events from an EventHub. To simply trigger a callback when an event is received,\nthe `EventHubConsumerClient.receive` method will be of use as follows:\n\n```python\nimport logging\nfrom azure.eventhub import EventHubConsumerClient\nfrom azure.identity import DefaultAzureCredential\n\nfully_qualified_namespace = '<< EVENT HUBS FULLY QUALIFIED NAMESPACE >>'\nconsumer_group = '<< CONSUMER GROUP >>'\neventhub_name = '<< NAME OF THE EVENT HUB >>'\nclient = EventHubConsumerClient(\n fully_qualified_namespace=fully_qualified_namespace,\n eventhub_name=eventhub_name,\n consumer_group=consumer_group,\n credential=DefaultAzureCredential(),\n)\n\nlogger = logging.getLogger(\"azure.eventhub\")\nlogging.basicConfig(level=logging.INFO)\n\ndef on_event(partition_context, event):\n logger.info(\"Received event from partition {}\".format(partition_context.partition_id))\n partition_context.update_checkpoint(event)\n\nwith client:\n client.receive(\n on_event=on_event,\n starting_position=\"-1\", # \"-1\" is from the beginning of the partition.\n )\n # receive events from specified partition:\n # client.receive(on_event=on_event, partition_id='0')\n```\n\n### Consume events from an Event Hub in batches\n\nWhereas the above sample triggers the callback for each message as it is received, the following sample\ntriggers the callback on a batch of events, attempting to receive a number at a time.\n\n```python\nimport logging\nfrom azure.eventhub import EventHubConsumerClient\nfrom azure.identity import DefaultAzureCredential\n\nfully_qualified_namespace = '<< EVENT HUBS FULLY QUALIFIED NAMESPACE >>'\nconsumer_group = '<< CONSUMER GROUP >>'\neventhub_name = '<< NAME OF THE EVENT HUB >>'\nclient = EventHubConsumerClient(\n fully_qualified_namespace=fully_qualified_namespace,\n eventhub_name=eventhub_name,\n consumer_group=consumer_group,\n credential=DefaultAzureCredential(),\n)\n\nlogger = logging.getLogger(\"azure.eventhub\")\nlogging.basicConfig(level=logging.INFO)\n\ndef on_event_batch(partition_context, events):\n logger.info(\"Received event from partition {}\".format(partition_context.partition_id))\n partition_context.update_checkpoint()\n\nwith client:\n client.receive_batch(\n on_event_batch=on_event_batch,\n starting_position=\"-1\", # \"-1\" is from the beginning of the partition.\n )\n # receive events from specified partition:\n # client.receive_batch(on_event_batch=on_event_batch, partition_id='0')\n```\n\n### Publish events to an Event Hub asynchronously\n\nUse the `create_batch` method on `EventHubProducer` to create an `EventDataBatch` object which can then be sent using the `send_batch` method.\nEvents may be added to the `EventDataBatch` using the `add` method until the maximum batch size limit in bytes has been reached.\n```python\nimport asyncio\nfrom azure.eventhub.aio import EventHubProducerClient # The package name suffixed with \".aio\" for async\nfrom azure.eventhub import EventData\nfrom azure.identity.aio import DefaultAzureCredential\n\nfully_qualified_namespace = '<< EVENT HUBS FULLY QUALIFIED 
NAMESPACE >>'\neventhub_name = '<< NAME OF THE EVENT HUB >>'\n\nasync def create_batch(client):\n event_data_batch = await client.create_batch()\n can_add = True\n while can_add:\n try:\n event_data_batch.add(EventData('Message inside EventBatchData'))\n except ValueError:\n can_add = False # EventDataBatch object reaches max_size.\n return event_data_batch\n\nasync def send():\n client = EventHubProducerClient(\n fully_qualified_namespace=fully_qualified_namespace,\n eventhub_name=eventhub_name,\n credential=DefaultAzureCredential(),\n )\n batch_data = await create_batch(client)\n async with client:\n await client.send_batch(batch_data)\n\nif __name__ == '__main__':\n loop = asyncio.get_event_loop()\n loop.run_until_complete(send())\n```\n\n### Consume events from an Event Hub asynchronously\n\nThis SDK supports both synchronous and asyncio based code. To receive as demonstrated in the samples above, but within\naio, one would need the following:\n\n```python\nimport logging\nimport asyncio\nfrom azure.eventhub.aio import EventHubConsumerClient\nfrom azure.identity.aio import DefaultAzureCredential\n\nfully_qualified_namespace = '<< EVENT HUBS FULLY QUALIFIED NAMESPACE >>'\nconsumer_group = '<< CONSUMER GROUP >>'\neventhub_name = '<< NAME OF THE EVENT HUB >>'\n\nlogger = logging.getLogger(\"azure.eventhub\")\nlogging.basicConfig(level=logging.INFO)\n\nasync def on_event(partition_context, event):\n logger.info(\"Received event from partition {}\".format(partition_context.partition_id))\n await partition_context.update_checkpoint(event)\n\nasync def receive():\n client = EventHubConsumerClient(\n fully_qualified_namespace=fully_qualified_namespace,\n eventhub_name=eventhub_name,\n consumer_group=consumer_group,\n credential=DefaultAzureCredential(),\n )\n async with client:\n await client.receive(\n on_event=on_event,\n starting_position=\"-1\", # \"-1\" is from the beginning of the partition.\n )\n # receive events from specified partition:\n # await client.receive(on_event=on_event, partition_id='0')\n\nif __name__ == '__main__':\n loop = asyncio.get_event_loop()\n loop.run_until_complete(receive())\n```\n\n### Consume events from an Event Hub in batches asynchronously\n\nAll synchronous functions are supported in aio as well. 
As demonstrated above for synchronous batch receipt, one can accomplish\nthe same within asyncio as follows:\n\n```python\nimport logging\nimport asyncio\nfrom azure.eventhub.aio import EventHubConsumerClient\nfrom azure.identity.aio import DefaultAzureCredential\n\nfully_qualified_namespace = '<< EVENT HUBS FULLY QUALIFIED NAMESPACE >>'\nconsumer_group = '<< CONSUMER GROUP >>'\neventhub_name = '<< NAME OF THE EVENT HUB >>'\n\nlogger = logging.getLogger(\"azure.eventhub\")\nlogging.basicConfig(level=logging.INFO)\n\nasync def on_event_batch(partition_context, events):\n logger.info(\"Received event from partition {}\".format(partition_context.partition_id))\n await partition_context.update_checkpoint()\n\nasync def receive_batch():\n client = EventHubConsumerClient(\n fully_qualified_namespace=fully_qualified_namespace,\n eventhub_name=eventhub_name,\n consumer_group=consumer_group,\n credential=DefaultAzureCredential(),\n )\n async with client:\n await client.receive_batch(\n on_event_batch=on_event_batch,\n starting_position=\"-1\", # \"-1\" is from the beginning of the partition.\n )\n # receive events from specified partition:\n # await client.receive_batch(on_event_batch=on_event_batch, partition_id='0')\n\nif __name__ == '__main__':\n loop = asyncio.get_event_loop()\n loop.run_until_complete(receive_batch())\n```\n\n### Consume events and save checkpoints using a checkpoint store\n\n`EventHubConsumerClient` is a high level construct which allows you to receive events from multiple partitions at once\nand load balance with other consumers using the same Event Hub and consumer group.\n\nThis also allows the user to track progress when events are processed using checkpoints.\n\nA checkpoint is meant to represent the last successfully processed event by the user from a particular partition of\na consumer group in an Event Hub instance. The `EventHubConsumerClient` uses an instance of `CheckpointStore` to update checkpoints\nand to store the relevant information required by the load balancing algorithm.\n\nSearch pypi with the prefix `azure-eventhub-checkpointstore` to\nfind packages that support this and use the `CheckpointStore` implementation from one such package. Please note that both sync and async libraries are provided.\n\nIn the below example, we create an instance of `EventHubConsumerClient` and use a `BlobCheckpointStore`. 
You need\nto [create an Azure Storage account](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-portal)\nand a [Blob Container](https://docs.microsoft.com/azure/storage/blobs/storage-quickstart-blobs-portal#create-a-container) to run the code.\n\n[Azure Blob Storage Checkpoint Store Async](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventhub/azure-eventhub-checkpointstoreblob-aio)\nand [Azure Blob Storage Checkpoint Store Sync](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventhub/azure-eventhub-checkpointstoreblob)\nare one of the `CheckpointStore` implementations we provide that applies Azure Blob Storage as the persistent store.\n\n\n```python\nimport asyncio\n\nfrom azure.eventhub.aio import EventHubConsumerClient\nfrom azure.eventhub.extensions.checkpointstoreblobaio import BlobCheckpointStore\nfrom azure.identity.aio import DefaultAzureCredential\n\nfully_qualified_namespace = '<< EVENT HUBS FULLY QUALIFIED NAMESPACE >>'\nconsumer_group = '<< CONSUMER GROUP >>'\neventhub_name = '<< NAME OF THE EVENT HUB >>'\nblob_account_url = '<< STORAGE ACCOUNT URL >>'\ncontainer_name = '<<NAME OF THE BLOB CONTAINER>>'\n\nasync def on_event(partition_context, event):\n # do something\n await partition_context.update_checkpoint(event) # Or update_checkpoint every N events for better performance.\n\nasync def receive(client):\n await client.receive(\n on_event=on_event,\n starting_position=\"-1\", # \"-1\" is from the beginning of the partition.\n )\n\nasync def main():\n checkpoint_store = BlobCheckpointStore(\n blob_account_url=blob_account_url,\n container_name=container_name,\n credential=DefaultAzureCredential()\n )\n client = EventHubConsumerClient(\n fully_qualified_namespace=fully_qualified_namespace,\n eventhub_name=eventhub_name,\n credential=DefaultAzureCredential(),\n consumer_group=consumer_group,\n checkpoint_store=checkpoint_store, # For load balancing and checkpoint. Leave None for no load balancing\n )\n async with client:\n await receive(client)\n\nif __name__ == '__main__':\n loop = asyncio.get_event_loop()\n loop.run_until_complete(main())\n```\n\n### Use EventHubConsumerClient to work with IoT Hub\n\nYou can use `EventHubConsumerClient` to work with IoT Hub as well. This is useful for receiving telemetry data of IoT Hub from the\nlinked EventHub. The associated connection string will not have send claims, hence sending events is not possible.\n\nPlease notice that the connection string needs to be for an [Event Hub-compatible endpoint](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-messages-read-builtin),\ne.g. 
\"Endpoint=sb://my-iothub-namespace-[uid].servicebus.windows.net/;SharedAccessKeyName=my-SA-name;SharedAccessKey=my-SA-key;EntityPath=my-iot-hub-name\"\n\nThere are two ways to get the Event Hubs compatible endpoint:\n- Manually get the \"Built-in endpoints\" of the IoT Hub in Azure Portal and receive from it.\n```python\nfrom azure.eventhub import EventHubConsumerClient\n\nconnection_str = 'Endpoint=sb://my-iothub-namespace-[uid].servicebus.windows.net/;SharedAccessKeyName=my-SA-name;SharedAccessKey=my-SA-key;EntityPath=my-iot-hub-name'\nconsumer_group = '<< CONSUMER GROUP >>'\nclient = EventHubConsumerClient.from_connection_string(connection_str, consumer_group)\n\npartition_ids = client.get_partition_ids()\n```\n- Programmatically retrieve the built-in Event Hubs compatible endpoint.\nRefer to [IoT Hub Connection String Sample](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventhub/azure-eventhub/samples/async_samples/iot_hub_connection_string_receive_async.py).\n\n## Troubleshooting\n\nSee the `azure-eventhub` [troubleshooting guide](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventhub/azure-eventhub/TROUBLESHOOTING.md) for details on how to diagnose various failure scenarios.\n\n### Logging\n\n- Enable `azure.eventhub` logger to collect traces from the library.\n- Enable AMQP frame level trace by setting `logging_enable=True` when creating the client.\n- Refer to [this guide](https://learn.microsoft.com/azure/developer/python/sdk/azure-sdk-logging) on configuring logging for Azure libraries for Python for additional information.\n\n```python\nimport logging\nimport sys\n\nhandler = logging.StreamHandler(stream=sys.stdout)\nlogger = logging.getLogger('azure.eventhub')\nlogger.setLevel(logging.DEBUG)\nlogger.addHandler(handler)\n\n...\n\nfrom azure.eventhub import EventHubProducerClient, EventHubConsumerClient\n\nproducer = EventHubProducerClient(..., logging_enable=True)\nconsumer = EventHubConsumerClient(..., logging_enable=True)\n```\n\n## Next steps\n\n### More sample code\n\nPlease take a look at the [samples](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/eventhub/azure-eventhub/samples) directory for detailed examples of how to use this library to send and receive events to/from Event Hubs.\n\n### Documentation\n\nReference documentation is available [here](https://azuresdkdocs.blob.core.windows.net/$web/python/azure-eventhub/latest/azure.eventhub.html).\n\n### Schema Registry and Avro Encoder\n\nThe EventHubs SDK integrates nicely with the [Schema Registry][schemaregistry_service] service and [Avro][avro].\nFor more information, please refer to [Schema Registry SDK][schemaregistry_repo] and [Schema Registry Avro Encoder SDK][schemaregistry_avroencoder_repo].\n\n### Pure Python AMQP Transport and Backward Compatibility Support\n\nThe Azure Event Hubs client library is now based on a pure Python AMQP implementation. `uAMQP` has been removed as required dependency.\n\nTo use `uAMQP` as the underlying transport:\n\n1. Install `uamqp` with pip.\n\n```\n$ pip install uamqp \n```\n\n2. 
Pass `uamqp_transport=True` during client construction.\n\n```python\nfrom azure.eventhub import EventHubProducerClient, EventHubConsumerClient\nfrom azure.identity import DefaultAzureCredential\n\nfully_qualified_namespace = '<< EVENT HUBS FULLY QUALIFIED NAMESPACE >>'\nconsumer_group = '<< CONSUMER GROUP >>'\neventhub_name = '<< NAME OF THE EVENT HUB >>'\n\nclient = EventHubProducerClient(\n fully_qualified_namespace=fully_qualified_namespace,\n eventhub_name=eventhub_name,\n credential=DefaultAzureCredential(),\n uamqp_transport=True\n)\nclient = EventHubConsumerClient(\n fully_qualified_namespace=fully_qualified_namespace,\n eventhub_name=eventhub_name,\n credential=DefaultAzureCredential(),\n consumer_group=consumer_group,\n uamqp_transport=True\n)\n```\n\nNote: The `message` attribute on `EventData`/`EventDataBatch`, which previously exposed the `uamqp.Message`, has been deprecated.\n The \"Legacy\" objects returned by `EventData.message`/`EventDataBatch.message` have been introduced to help facilitate the transition.\n\n### Building uAMQP wheel from source\n\nIf [uAMQP](https://pypi.org/project/uamqp/) is intended to be used as the underlying AMQP protocol implementation for `azure-eventhub`,\nuAMQP wheels can be found for most major operating systems.\n\nIf you intend to use `uAMQP` and you're running on a platform for which uAMQP wheels are not provided, please follow\n the [uAMQP Installation](https://github.com/Azure/azure-uamqp-python#installation) guidance to install from source.\n\n### Provide Feedback\n\nIf you encounter any bugs or have suggestions, please file an issue in the [Issues](https://github.com/Azure/azure-sdk-for-python/issues) section of the project.\n\n## Contributing\n\nThis project welcomes contributions and suggestions. Most contributions require you to agree to a\nContributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.\n\nWhen you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the\nPR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.\n\nThis project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).\nFor more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.\n\n[avro]: https://avro.apache.org/\n[api_reference]: https://docs.microsoft.com/python/api/overview/azure/eventhub-readme\n[schemaregistry_service]: https://aka.ms/schemaregistry\n[schemaregistry_repo]: https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/schemaregistry/azure-schemaregistry\n[schemaregistry_avroencoder_repo]: https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/schemaregistry/azure-schemaregistry-avroencoder\n\n![Impressions](https://azure-sdk-impressions.azurewebsites.net/api/impressions/azure-sdk-for-python/sdk/eventhub/azure-eventhub/README.png)\n\n\n# Release History\n\n## 5.12.2 (2024-10-02)\n\n### Bugs Fixed\n- Implemented backpressure for async consumer to address a memory leak issue. 
([#36398](https://github.com/Azure/azure-sdk-for-python/issues/36398))\n\n## 5.12.1 (2024-06-11)\n\n### Bugs Fixed\n\n- Fixed a bug where the correct URI was not being used for consumer authentication, causing issues when assigning roles at the consumer group level. ([#35337](https://github.com/Azure/azure-sdk-for-python/issues/35337))\n\n## 5.12.0 (2024-05-16)\n\n### Features Added\n\n- Added support for non-tls connections (#34272)\n\n## 5.11.7 (2024-04-10)\n\n### Bugs Fixed\n\n- Fixed a bug where using `EventHubProducerClient` in buffered mode could potentially drop a buffered message without actually sending it. ([#34712](https://github.com/Azure/azure-sdk-for-python/pull/34712))\n\n### Other Changes\n\n- Updated network trace logging to replace `None` values in AMQP connection info with empty strings as per the OpenTelemetry specification.\n\n## 5.11.6 (2024-02-12)\n\nThis version and all future versions will require Python 3.8+. Python 3.7 is no longer supported.\n\n### Features Added\n\n- Added `keep_alive` functionality on EventHubProducerClient to allow for long-living producers. [#33726](https://github.com/Azure/azure-sdk-for-python/issues/33726)\n\n### Other Changes\n\n- Added support for Python 3.12.\n\n## 5.11.5 (2023-11-13)\n\n### Bugs Fixed\n\n- Fixed a pyAMQP error where events split across multiple TransferFrames were depleting the link credit by more than 1 credit per message. ([#32767](https://github.com/Azure/azure-sdk-for-python/pull/32767))\n\n## 5.11.4 (2023-08-08)\n\n### Features Added\n\n- A new float keyword argument `socket_timeout` has been added to the sync and async `EventHubConsumerClient` and `EventHubProducerClient`.\n\n### Bugs Fixed\n\n- Fixed bug [#31258](https://github.com/Azure/azure-sdk-for-python/issues/31258) where async `BlobCheckpointStore` was reprocessing old events after an error.\n\n## 5.11.3 (2023-07-12)\n\n### Bugs Fixed\n\n- Fixed the error `end frame received on invalid channel` which was raised when a disconnect was sent by the service ([#30860](https://github.com/Azure/azure-sdk-for-python/pull/30860))\n- Fixed the error `link already closed` which was raised when the client was closing and disconnecting from the service ([#30836](https://github.com/Azure/azure-sdk-for-python/pull/30836))\n\n### Other Changes\n\n - Updated tracing ([#29934](https://github.com/Azure/azure-sdk-for-python/pull/29934)):\n - Span names renamed:\n - `Azure.EventHubs.send` to `EventHubs.send`\n - `Azure.EventHubs.message` to `EventHubs.message`\n - `Azure.EventHubs.process` to `EventHubs.process`\n - An `EventHubs.receive` span will be created upon receiving events.\n - Additional attributes added to spans:\n - `messaging.system` - messaging system (i.e., `eventhubs`)\n - `messaging.operation` - type of operation (i.e., `publish`, `receive`, or `process`)\n - `messaging.batch.message_count` - number of messages sent, received, or processed (if more than one)\n - The `component` attribute was removed from all spans.\n - All `send` spans now contain links to `message` spans. Now, `message` spans will no longer contain a link to the `send` span.\n - Message application properties will now contain values for `traceparent` (and `tracestate` if applicable)\n - Process spans will now be a direct children of message span contexts in when event handling on a per-message basis. 
([#30537](https://github.com/Azure/azure-sdk-for-python/pull/30537))\n\n## 5.11.2 (2023-03-20)\n\n### Bugs Fixed\n\n- Fixed a bug that would prevent reconnect after a long idle period, network drop (issue #28996)\n\n## 5.11.1 (2023-01-25)\n\n### Bugs Fixed\n\n- Fixed a bug where, when `websocket-client` was not installed, the error was not caught/raised properly (issue #28453).\n\n## 5.11.0 (2023-01-19)\n\nVersion 5.11.0 is our first stable release of the Azure Event Hubs client library based on a pure Python implemented AMQP stack.\n\n### Features Added\n\n- A new boolean keyword argument `uamqp_transport` has been added to sync and async `EventHubProducerClient`/`EventHubConsumerClient` constructors which indicates whether to use the `uamqp` library or the default pure Python AMQP library as the underlying transport.\n\n### Bugs Fixed\n\n- Fixed a bug that caused an error when sending batches with tracing enabled (issue #27986).\n- Fixed a bug where `EventHubSharedKeyCredential` returned an `AccessToken.token` of type `bytes` and not `str`, now matching the documentation.\n\n### Other Changes\n\n- The `message` attribute on `EventData`/`EventDataBatch`, which previously exposed the `uamqp.Message`, has been deprecated.\n - `LegacyMessage`/`LegacyBatchMessage` objects returned by the `message` attribute on `EventData`/`EventDataBatch` have been introduced to help facilitate the transition.\n- Removed uAMQP from required dependencies.\n- Adding `uamqp >= 1.6.3` as an optional dependency for use with the `uamqp_transport` keyword.\n - Added support for Python 3.11.\n\n## 5.8.0b2 (2022-10-11)\n\n### Features Added\n\n- Updated the optional dependency for async transport using AMQP over WebSocket from `websocket-client` to `aiohttp` (Issue #24315, thanks @hansmbakker for the suggestion).\n\n## 5.8.0b1 (2022-09-22)\n\nThis version and all future versions will require Python 3.7+. 
Python 3.6 is no longer supported.\n\n### Other Changes\n\n- Added the `uamqp_transport` optional parameter to the clients, to allow switching to the `uamqp` library as the transport.\n\n## 5.10.1 (2022-08-22)\n\nThis version and all future versions will require Python 3.7+, Python 3.6 is no longer supported.\n\n### Bugs Fixed\n\n- Fixed a bug in async `BufferedProducer` that would block when flushing the queue causing the client to freeze up (issue #23510).\n- Fixed a bug in the async `EventHubProducerClient` and `EventHubConsumerClient` that set the default value of the `transport_type` parameter in the `from_connection_string` methods to `None` rather than `TransportType.Amqp`.\n\n### Other Changes\n\n- Internal refactoring to support upcoming Pure Python AMQP-based release.\n- Updated uAMQP dependency to 1.6.0.\n\n## 5.8.0a5 (2022-07-19)\n\n### Bugs Fixed\n\n- Fixed bug that prevented token refresh at regular intervals.\n- Fixed bug that was improperly passing the debug keyword argument, so that network trace debug logs are output when requested.\n\n### Other Changes\n\n- Added logging added in to track proper token refreshes & fetches, output exception reason for producer init failure.\n\n## 5.10.0 (2022-06-08)\n\n### Features Added\n\n- Includes the following features related to buffered sending of events:\n - A new method `send_event` to `EventHubProducerClient` which allows sending single `EventData` or `AmqpAnnotatedMessage`.\n - Buffered mode sending to `EventHubProducerClient` which is intended to allow for efficient publishing of events\n without having to explicitly manage batches in the application.\n - The constructor of `EventHubProducerClient` and `from_connection_string` method takes the following new keyword arguments\n for configuration:\n - `buffered_mode`: The flag to enable/disable buffered mode sending.\n - `on_success`: The callback to be called once events have been successfully published.\n - `on_error`: The callback to be called once events have failed to be published.\n - `max_buffer_length`: The total number of events per partition that can be buffered before a flush will be triggered.\n - `max_wait_time`: The amount of time to wait for a batch to be built with events in the buffer before publishing.\n - A new method `EventHubProducerClient.flush` which flushes events in the buffer to be sent immediately.\n - A new method `EventHubProducerClient.get_buffered_event_count` which returns the number of events that are buffered and waiting to be published for a given partition.\n - A new property `EventHubProducerClient.total_buffered_event_count` which returns the total number of events that are currently buffered and waiting to be published, across all partitions.\n - A new boolean keyword argument `flush` to `EventHubProducerClient.close` which indicates whether to flush the buffer or not while closing.\n\n## 5.8.0a4 (2022-06-07)\n\n### Features Added\n\n- Added support for connection using websocket and http proxy.\n- Added support for custom endpoint connection over websocket.\n\n## 5.9.0 (2022-05-10)\n\n### Features Added\n\n- The classmethod `from_message_content` has been added to `EventData` for interoperability with the Schema Registry Avro Encoder library, and takes `content` and `content_type` as positional parameters.\n\n### Other Changes\n\n- Features related to buffered sending of events are still in beta and will not be included in this release.\n\n## 5.9.0b3 (2022-04-20)\n\n### Features Added\n\n- Introduced new method `send_event` to 
`EventHubProducerClient` which allows sending single `EventData` or `AmqpAnnotatedMessage`.\n- Introduced buffered mode sending to `EventHubProducerClient` which is intended to allow for efficient publishing of events\n without having to explicitly manage batches in the application.\n - The constructor of `EventHubProducerClient` and `from_connection_string` method now takes the following new keyword arguments\n for configuration:\n - `buffered_mode`: The flag to enable/disable buffered mode sending.\n - `on_success`: The callback to be called once events have been successfully published.\n - `on_error`: The callback to be called once events have failed to be published.\n - `max_buffer_length`: The total number of events per partition that can be buffered before a flush will be triggered.\n - `max_wait_time`: The amount of time to wait for a batch to be built with events in the buffer before publishing.\n - Introduced new method `EventHubProducerClient.flush` which flushes events in the buffer to be sent immediately.\n - Introduced new method `EventHubProducerClient.get_buffered_event_count` which returns the number of events that are buffered and waiting to be published for a given partition.\n - Introduced new property `EventHubProducerClient.total_buffered_event_count` which returns the total number of events that are currently buffered and waiting to be published, across all partitions.\n - Introduced new boolean keyword argument `flush` to `EventHubProducerClient.close` which indicates whether to flush the buffer or not while closing.\n\n### Other Changes\n\n- Updated `EventData` internals for interoperability with the Schema Registry Avro Encoder library.\n\n## 5.9.0b2 (2022-03-09)\n\n### Breaking Changes\n\n- `from_message_data` on `EventData` has been renamed `from_message_content` for interoperability with the Schema Registry Avro Encoder library. The `data` parameter has been renamed to `content`.\n\n## 5.8.0a3 (2022-03-08)\n\n### Other Changes\n\n- Improved the performance of async sending and receiving.\n\n## 5.9.0b1 (2022-02-09)\n\n- The following features have been temporarily pulled out of async `EventHubProducerClient` and `EventHubConsumerClient` which will be added back in future previews as we work towards a stable release:\n - Passing the following keyword arguments to the constructors and `from_connection_string` methods of the `EventHubProducerClient` and `EventHubConsumerClient` is not supported: `transport_type`, `http_proxy`, `custom_endpoint_address`, and `connection_verify`.\n\n## 5.8.0a2 (2022-02-09)\n\n### Features Added\n\n- Added support for async `EventHubProducerClient` and `EventHubConsumerClient`.\n\n## 5.8.0a1 (2022-01-13)\n\nVersion 5.8.0a1 is our first efforts to build an Azure Event Hubs client library based on pure python implemented AMQP stack.\n\n### Breaking changes\n\n- The following features have been temporarily pulled out which will be added back in future previews as we work towards a stable release:\n - Async is not supported.\n - Passing the following keyword arguments to the constructors and `from_connection_string` methods of the `EventHubProducerClient` and `EventHubConsumerClient` is not supported: `transport_type`, `http_proxy`, `custom_endpoint_address`, and `connection_verify`.\n\n### Other Changes\n\n- uAMQP dependency is removed.\n\n## 5.7.0 (2022-01-12)\n\nThis version and all future versions will require Python 3.6+. 
Python 2.7 is no longer supported.\n\n### Features Added\n\n- Added support for fixed (linear) retry backoff:\n - Sync/async `EventHubProducerClient` and `EventHubConsumerClient` constructors and `from_connection_string` take `retry_mode` as a keyword argument.\n\n### Bugs Fixed\n\n- Fixed a bug that `EventHubProducerClient` could be reopened for sending events instead of encountering with `KeyError` when the client is previously closed (issue #21849).\n\n### Other Changes\n\n- Improved token refresh timing to prevent potentially blocking main flow when the token is about to get expired soon.\n- Updated uAMQP dependency to 1.5.1.\n\n## 5.6.1 (2021-10-06)\n\n### Bugs Fixed\n\n- Fixed a bug for checking that `azure.eventhub.amqp.AmqpMessageHeader` and `azure.eventhub.amqp.AmqpMessageProperties` contain specific properties using the `in` keyword.\n\n### Other Changes\n\n- Updated uAMQP dependency to 1.4.3.\n - Added support for Python 3.10.\n - Fixed memory leak in win32 socketio and tlsio (issue #19777).\n - Fixed memory leak in the process of converting AMQPValue into string (issue #19777).\n\n## 5.6.0 (2021-07-07)\n\n### Features Added\n\n- Added support for sending AMQP annotated message which allows full access to the AMQP message fields.\n - Introduced new namespace `azure.eventhub.amqp`.\n - Added new enum class `azure.eventhub.amqp.AmqpMessageBodyType` to represent the body type of the message which includes:\n - `DATA`: The body of message consists of one or more data sections and each section contains opaque binary data.\n - `SEQUENCE`: The body of message consists of one or more sequence sections and each section contains an arbitrary number of structured data elements.\n - `VALUE`: The body of message consists of one amqp-value section and the section contains a single AMQP value.\n - Introduced new class `azure.eventhub.amqp.AmqpAnnotatedMessage` for accessing low-level amqp message sections which can be instantiated for sending.\n - Introduced new classes `azure.eventhub.amqp.AmqpMessageHeader` and `azure.eventhub.amqp.AmqpMessageProperties` for accessing amqp header and properties.\n - Added new property `body_type` on `azure.eventhub.EventData` which returns `azure.eventhub.amqp.AmqpMessageBodyType`.\n - Added new read-only property `raw_amqp_message` on `azure.eventhub.EventData` which returns `azure.eventhub.amqp.AmqpAnnotatedMessage`.\n\n### Fixed\n\n- Updated uAMQP dependency to 1.4.1.\n - Fixed a bug that attributes creation_time, absolute_expiry_time and group_sequence on MessageProperties should be compatible with integer types on Python 2.7.\n\n## 5.5.0 (2021-05-13)\n\n**New Features**\n\n- Added support for using `azure.core.credentials.AzureNamedKeyCredential` as credential for authenticating producer and consumer clients.\n\n**Bug Fixes**\n\n- Fixed bug that custom user agent string should be put in front of the built-in user agent string instead of being appended.\n- Updated uAMQP dependency to 1.4.0.\n - Fixed memory leaks in the process of link attach where source and target cython objects are not properly deallocated (#15747).\n - Improved management operation callback not to parse description value of non AMQP_TYPE_STRING type as string (#18361).\n\n**Notes**\n\n- Updated azure-core dependency to 1.14.0.\n\n## 5.4.0 (2021-04-07)\n\nThis version follows from version 5.3.1, rather than 5.4.0b1 so that the preview idempotent producer feature is not included.\n\n**New Features**\n\n- Added support for using `azure.core.credentials.AzureSasCredential` as credential 
for authenticating producer and consumer clients.\n- Updated `list_ownership`, `claim_ownership`, `update_checkpoint`, `list_checkpoints` on sync and async `CheckpointStore` to support taking `**kwargs`.\n - WARNING: Implementing a custom checkpointstore that does not support taking `**kwargs` in the methods listed previously will result in the following pylint error: `W0221: Parameters differ from overridden ________ method (arguments-differ)`.\n- Updated `update_checkpoint` on sync and async `PartitionContext` to support taking `**kwargs`.\n\n**Bug Fixes**\n\n- Updated uAMQP dependency to 1.3.0.\n - Fixed bug that sending message of large size triggering segmentation fault when the underlying socket connection is lost (#13739, #14543).\n - Fixed bug in link flow control where link credit and delivery count should be calculated based on per message instead of per transfer frame (#16934).\n\n**Notes**\n\n- Updated azure-core dependency to 1.13.0.\n\n## 5.4.0b1 (2021-03-09)\n\nThis version and all future versions will require Python 2.7 or Python 3.6+, Python 3.5 is no longer supported.\n\n**New Features**\n\n- Added support for idempotent publishing which is supported by the service to endeavor to reduce the number of duplicate\n events that are published.\n - `EventHubProducerClient` constructor accepts two new parameters for idempotent publishing:\n - `enable_idempotent_partitions`: A boolean value to tell the `EventHubProducerClient` whether to enable idempotency.\n - `partition_config`: The set of configurations that can be specified to influence publishing behavior\n specific to the configured Event Hub partition.\n - Introduced a new method `get_partition_publishing_properties` on `EventHubProducerClient` to inspect the information\n about the state of publishing for a partition.\n - Introduced a new property `published_sequence_number` on `EventData` to get the publishing sequence number assigned\n to the event at the time it was successfully published.\n - Introduced a new property `starting_published_sequence_number` on `EventDataBatch` to get the publishing sequence\n number assigned to the first event in the batch at the time the batch was successfully published.\n - Introduced a new class `azure.eventhub.PartitionPublishingConfiguration` which is a set of configurations that can be\n specified to influence the behavior when publishing directly to an Event Hub partition.\n\n**Notes**\n\n- Updated uAMQP dependency to 1.2.15.\n\n## 5.3.1 (2021-03-09)\n\nThis version will be the last version to officially support Python 3.5, future versions will require Python 2.7 or Python 3.6+.\n\n**Bug fixes**\n\n- Sending empty `event_data_batch` will be a no-op now instead of raising error.\n\n## 5.3.0 (2021-02-08)\n\n**New Features**\n\n- Added a `parse_connection_string` method which parses a connection string into a properties bag, `EventHubConnectionStringProperties`, containing its component parts.\n- The constructor and `from_connection_string` method of `EventHubConsumerClient` and `EventHubProducerClient` now accept two new optional arguments:\n - `custom_endpoint_address` which allows for specifying a custom endpoint to use when communicating with the Event Hubs service,\nand is useful when your network does not allow communicating to the standard Event Hubs endpoint.\n - `connection_verify` which allows for specifying the path to the custom CA_BUNDLE file of the SSL certificate which is used to authenticate\nthe identity of the connection endpoint.\n\n**Notes**\n\n- Updated uAMQP 
dependency to 1.2.14.\n\n## 5.2.1 (2021-01-11)\n\n**Bug fixes**\n\n- Updated `azure.eventhub.extension.__init__.py` to be compatible with pkgutil-style namespace (PR #13210, thanks @pjachowi).\n- Updated uAMQP dependency to 1.2.13\n - Added support for Python 3.9.\n - Fixed bug that macOS was unable to detect network error (#15473).\n - Fixed bug that `uamqp.ReceiveClient` and `uamqp.ReceiveClientAsync` receive messages during connection establishment (#15555).\n - Fixed bug where connection establishment on macOS with Clang 12 triggering unrecognized selector exception (#15567).\n - Fixed bug in accessing message properties triggering segmentation fault when the underlying C bytes are NULL (#15568).\n\n## 5.2.0 (2020-09-08)\n\n**New Features**\n\n- Connection strings used with `from_connection_string` methods now supports using the `SharedAccessSignature` key in leiu of `sharedaccesskey` and `sharedaccesskeyname`, taking the string of the properly constructed token as value.\n\n## 5.2.0b1 (2020-07-06)\n\n**New Features**\n\n- `EventHubConsumerClient` constructor accepts two new parameters for the load balancer.\n - `load_balancing_strategy`, which can be \"greedy\" or \"balanced\".\n With greedy strategy, one execution of load balancing will claim as many partitions as required to balance the load\n whereas with balanced strategy one execution of load balancing will claim at most 1 partition.\n - `partition_ownership_expiration_interval`, which allows you to customize the partition ownership expiration for load balancing.\n A consumer client may lose its owned partitions more often with a smaller expiration interval. But a larger interval\n may result in idle partitions not being claimed for longer time.\n- Added enum class `azure.eventhub.LoadBalancingStrategy` for `load_balancing_strategy`.\n\n## 5.1.0 (2020-05-04)\n\n**New Features**\n\n- `EventHubProducerClient.send_batch` accepts either an `EventDataBatch` or a finite list of `EventData`. #9181\n- Added enqueueTime to span links of distributed tracing. #9599\n\n**Bug fixes**\n\n- Fixed a bug that turned `azure.eventhub.EventhubConsumerClient` into an exclusive receiver when it has no checkpoint store. #11181\n- Updated uAMQP dependency to 1.2.7.\n - Fixed bug in setting certificate of tlsio on MacOS. #7201\n - Fixed bug that caused segmentation fault in network tracing on MacOS when setting `logging_enable` to `True` in `EventHubConsumerClient` and `EventHubProducerClient`.\n\n## 5.1.0b1 (2020-04-06)\n\n**New Features**\n\n- Added `EventHubConsumerClient.receive_batch()` to receive and process events in batches instead of one by one. #9184\n- `EventHubConsumerCliuent.receive()` has a new param `max_wait_time`.\n`on_event` is called every `max_wait_time` when no events are received and `max_wait_time` is not `None` or 0.\n- Param event of `PartitionContext.update_checkpoint` is now optional. The last received event is used when param event is not passed in.\n- `EventData.system_properties` has added missing properties when consuming messages from IotHub. 
## 5.0.1 (2020-03-09)

**Bug fixes**

- Fixed a bug that swallowed errors when receiving events with `azure.eventhub.EventHubConsumerClient`. #9660
- Fixed a bug that caused `get_eventhub_properties`, `get_partition_ids`, and `get_partition_properties` to raise an error on Azure Stack. #9920

## 5.0.0 (2020-01-13)

**Breaking changes**

- `EventData`
  - Removed the deprecated property `application_properties` and the deprecated method `encode_message()`.
- `EventHubConsumerClient`
  - `on_error` is now called when `EventHubConsumerClient` fails to claim ownership of partitions.
  - `on_partition_close` and `on_partition_initialize` are now called when the `on_event` callback raises an exception.
    - `EventHubConsumerClient` closes and re-opens the internal partition receiver in this case.
  - The default starting position from which `EventHubConsumerClient` resumes receiving after recovering from an error has been re-prioritized (see the sketch after this section):
    - If there is a checkpoint, it will resume from the checkpoint.
    - If there is no checkpoint but `starting_position` is provided, it will resume from `starting_position`.
    - If there is neither a checkpoint nor a `starting_position`, it will resume from the latest position.
- `PartitionContext`
  - `update_checkpoint` now performs an in-memory checkpoint instead of doing nothing when no checkpoint store is explicitly provided.
    - The in-memory checkpoints are used when `EventHubConsumerClient` recovers its receiving.
- `get_partition_ids`, `get_partition_properties`, and `get_eventhub_properties` now raise an error when the service returns an error status code.
  - `AuthenticationError` is raised when the service returns error code 401.
  - `ConnectError` is raised when the service returns error code 404.
  - `EventHubError` is raised when the service returns other error codes.
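A minimal sketch of the callback and starting-position behavior described in the 5.0.0 entry above; the connection string is a placeholder:

```python
from azure.eventhub import EventHubConsumerClient

def on_event(partition_context, event):
    print("partition", partition_context.partition_id, "->", event)
    partition_context.update_checkpoint(event)  # in-memory checkpoint when no checkpoint store is given

def on_error(partition_context, error):
    # Also invoked when the client fails to claim ownership of partitions.
    print("error:", error)

def on_partition_initialize(partition_context):
    print("partition opened:", partition_context.partition_id)

def on_partition_close(partition_context, reason):
    print("partition closed:", partition_context.partition_id, reason)

consumer = EventHubConsumerClient.from_connection_string("<connection string>", consumer_group="$Default")

with consumer:
    consumer.receive(
        on_event=on_event,
        on_error=on_error,
        on_partition_initialize=on_partition_initialize,
        on_partition_close=on_partition_close,
        starting_position="-1",  # read from the beginning; only used when no checkpoint exists
    )
```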
## 5.0.0b6 (2019-12-03)

**Breaking changes**

- All exceptions should now be imported from `azure.eventhub.exceptions`.
- Introduced separate `EventHubSharedKeyCredential` objects for synchronous and asynchronous operations. For async, import the credentials object from the `azure.eventhub.aio` namespace.
- `EventData`
  - Renamed property `application_properties` to `properties`.
  - `EventData` no longer has the attribute `last_enqueued_event_properties` - use this on `PartitionContext` instead.
- `EventDataBatch`
  - `EventDataBatch.try_add` has been renamed to `EventDataBatch.add`.
  - Renamed property `size` to `size_in_bytes`.
  - Renamed attribute `max_size` to `max_size_in_bytes`.
- `EventHubConsumerClient` and `EventHubProducerClient`
  - Renamed method `get_properties` to `get_eventhub_properties`.
  - Renamed parameters in the constructor: `host` to `fully_qualified_namespace`, `event_hub_path` to `eventhub_name`.
  - Renamed parameter in `get_partition_properties`: `partition` to `partition_id`.
  - Renamed parameter `consumer_group_name` to `consumer_group` and moved that parameter from the `receive` method to the constructor of `EventHubConsumerClient`.
  - Renamed parameter `initial_event_position` to `starting_position` on the `receive` method of `EventHubConsumerClient`.
  - Renamed parameter `event_hub_path` to `eventhub_name` in the constructor and `from_connection_string` method of the client object.
  - `EventHubProducerClient.send` has been renamed to `send_batch`, which only accepts an `EventDataBatch` object as input.
  - `EventHubProducerClient.create_batch` now also takes `partition_id` and `partition_key` as optional parameters (which are no longer specified at send). See the sketch after this section.
- Renamed module `PartitionManager` to `CheckpointStore`.
- The receive event callback parameter has been renamed to `on_event` and now operates on a single event rather than a list of events.
- Removed class `EventPosition`.
  - The `starting_position` parameter of the `receive` method accepts an offset (`str`), sequence number (`int`), datetime (`datetime.datetime`), or a `dict` of these types.
  - The `starting_position_inclusive` parameter of the `receive` method accepts a `bool` or `dict` indicating whether the given event position is inclusive or not.
- `PartitionContext` no longer has the attribute `owner_id`.
- `PartitionContext` now has the attribute `last_enqueued_event_properties`, which is populated if `track_last_enqueued_event_properties` is set to `True` in the `receive` method.

**New features**

- Added a new parameter `idle_timeout` to the constructor and `from_connection_string` method of `EventHubConsumerClient` and `EventHubProducerClient`, after which the underlying connection will close if there is no further activity.
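A minimal sketch of the renamed batch-publishing flow above (`create_batch` -> `EventDataBatch.add` -> `send_batch`); the connection string and event hub name are placeholders:

```python
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    "<connection string>", eventhub_name="<event hub name>"
)

with producer:
    # partition_key (or partition_id) is now specified when the batch is created, not at send time.
    batch = producer.create_batch(partition_key="device-42")
    try:
        batch.add(EventData("telemetry payload"))  # formerly EventDataBatch.try_add
    except ValueError:
        # Raised when adding the event would exceed the batch's max_size_in_bytes.
        pass
    producer.send_batch(batch)  # formerly EventHubProducerClient.send
```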
## 5.0.0b5 (2019-11-04)

**Breaking changes**

- `EventHubClient`, `EventHubConsumer`, and `EventHubProducer` have been removed. Use `EventHubProducerClient` and `EventHubConsumerClient` instead.
  - Construction of both objects is the same as it was for the previous client.
- Introduced `EventHubProducerClient` as a substitute for `EventHubProducer`.
  - `EventHubProducerClient` supports sending events to different partitions.
- Introduced `EventHubConsumerClient` as a substitute for `EventHubConsumer`.
  - `EventHubConsumerClient` supports receiving events from single/all partitions.
  - There are no longer methods which directly return `EventData`; all receiving is done via the callback method `on_events`.
- `EventHubConsumerClient` has taken on the responsibility of `EventProcessor`.
  - `EventHubConsumerClient` now accepts a `PartitionManager` to do load balancing and checkpointing.
- Replaced `PartitionProcessor` with four independent callback methods accepted by the `receive` method on `EventHubConsumerClient`.
  - `on_events(partition_context, events)` is called when events are received.
  - `on_error(partition_context, exception)` is called when errors occur.
  - `on_partition_initialize(partition_context)` is called when a partition consumer is opened.
  - `on_partition_close(partition_context, reason)` is called when a partition consumer is closed.
- Some modules and classes that were importable from several different places have been removed:
  - `azure.eventhub.common` has been removed. Import from `azure.eventhub` instead.
  - `azure.eventhub.client_abstract` has been removed. Use `azure.eventhub.EventHubProducerClient` or `azure.eventhub.EventHubConsumerClient` instead.
  - `azure.eventhub.client` has been removed. Use `azure.eventhub.EventHubProducerClient` or `azure.eventhub.EventHubConsumerClient` instead.
  - `azure.eventhub.producer` has been removed. Use `azure.eventhub.EventHubProducerClient` instead.
  - `azure.eventhub.consumer` has been removed. Use `azure.eventhub.EventHubConsumerClient` instead.
  - `azure.eventhub.aio.client_async` has been removed. Use `azure.eventhub.aio.EventHubProducerClient` or `azure.eventhub.aio.EventHubConsumerClient` instead.
  - `azure.eventhub.aio.producer_async` has been removed. Use `azure.eventhub.aio.EventHubProducerClient` instead.
  - `azure.eventhub.aio.consumer_async` has been removed. Use `azure.eventhub.aio.EventHubConsumerClient` instead.
  - `azure.eventhub.aio.event_processor.event_processor` has been removed. Use `azure.eventhub.aio.EventHubConsumerClient` instead.
  - `azure.eventhub.aio.event_processor.partition_processor` has been removed. Use callback methods instead.
  - `azure.eventhub.aio.event_processor.partition_manager` has been removed. Import from `azure.eventhub.aio` instead.
  - `azure.eventhub.aio.event_processor.partition_context` has been removed. Import from `azure.eventhub.aio` instead.
  - `azure.eventhub.aio.event_processor.sample_partition_manager` has been removed.

**Bug fixes**

- Fixed a bug where the user-agent string was not being parsed.

## 5.0.0b4 (2019-10-08)

**New features**

- Added support for tracing (issue #7153).
- Added the capability of tracking last enqueued event properties of the partition to `EventHubConsumer` (a sketch using the current client follows this section).
  - Added a new boolean parameter `track_last_enqueued_event_properties` to the method `EventHubClient.create_consumer()`.
  - Added a new property `last_enqueued_event_properties` on `EventHubConsumer`, which contains `sequence_number`, `offset`, `enqueued_time`, and `retrieval_time` information.
  - The capability is disabled by default, as it costs extra bandwidth to transfer the additional information when turned on.

**Breaking changes**

- Removed support for IoT Hub direct connection.
  - An [Event Hubs-compatible connection string](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-messages-read-builtin) of an IoT Hub can be used to create an `EventHubClient` and read properties or events from an IoT Hub.
- Removed support for sending `EventData` to IoT Hub.
- Removed the parameter `exception` from the method `close()` of `EventHubConsumer` and `EventHubProducer`.
- Updated uAMQP dependency to 1.2.3.
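The 5.0.0b4 entries above describe the beta-era `EventHubClient`/`EventHubConsumer` API; a minimal sketch of the equivalent option on the current `EventHubConsumerClient` is shown here (placeholder connection string):

```python
from azure.eventhub import EventHubConsumerClient

def on_event(partition_context, event):
    # Populated only because track_last_enqueued_event_properties=True is passed below.
    print(partition_context.last_enqueued_event_properties)

consumer = EventHubConsumerClient.from_connection_string("<connection string>", consumer_group="$Default")

with consumer:
    consumer.receive(on_event=on_event, track_last_enqueued_event_properties=True)
```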
## 5.0.0b3 (2019-09-10)

**New features**

- Added support for automatic load balancing among multiple `EventProcessor` instances.
- Added `BlobPartitionManager`, which implements `PartitionManager`.
  - Azure Blob Storage is used for storing data used by `EventProcessor`.
  - Packaged separately as a plug-in to `EventProcessor`.
  - For details, please refer to [Azure Blob Storage Partition Manager](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/eventhub/azure-eventhub-checkpointstoreblob-aio).
- Added property `system_properties` on `EventData`.

**Breaking changes**

- Removed the constructor method of `PartitionProcessor`. For initialization, please implement the method `initialize`.
- Replaced `CheckpointManager` with `PartitionContext`.
  - `PartitionContext` holds partition context information and the method `update_checkpoint`.
- Updated all methods of `PartitionProcessor` to include `PartitionContext` as part of the arguments.
- Updated the accessibility of class members in `EventHub`/`EventHubConsumer`/`EventHubProducer` to be private.
- Moved `azure.eventhub.eventprocessor` under the `aio` package, so it is now `azure.eventhub.aio.eventprocessor`.

## 5.0.0b2 (2019-08-06)

**New features**

- Added the method `create_batch` on `EventHubProducer` to create an `EventDataBatch` that can then be used to add events until the maximum size is reached.
  - This batch object can then be used in the `send()` method to send all the added events to Event Hubs.
  - This allows publishers to build batches without the possibility of encountering the error around the message size exceeding the supported limit when sending events.
  - It also allows publishers with bandwidth concerns to control the size of each batch published.
- Added new configuration parameters for exponential delay between retry operations.
  - `retry_total`: The total number of attempts to redo the failed operation.
  - `backoff_factor`: The delay time factor.
  - `backoff_max`: The maximum delay time in total.
- Added support for the context manager protocol on `EventHubClient`.
- Added a new error type `OperationTimeoutError` for send operations.
- Introduced a new class `EventProcessor`, which replaces the older concept of [Event Processor Host](https://docs.microsoft.com/azure/event-hubs/event-hubs-event-processor-host). This early preview is intended to allow users to test the new design using a single instance of `EventProcessor`. The ability to checkpoint to a durable store will be added in future updates.
  - `EventProcessor`: creates and runs consumers for all partitions of the Event Hub.
  - `PartitionManager`: defines the interface for getting/claiming ownership of partitions and updating checkpoints.
  - `PartitionProcessor`: defines the interface for processing events.
  - `CheckpointManager`: takes responsibility for updating checkpoints during event processing.

**Breaking changes**

- `EventProcessorHost` was replaced by `EventProcessor`; please read the new features for details.
- Replaced the `max_retries` configuration parameter of the `EventHubClient` with `retry_total`.
## 5.0.0b1 (2019-06-25)

Version 5.0.0b1 is a preview of our efforts to create a client library that is user friendly and idiomatic to the Python ecosystem. The reasons for most of the changes in this update can be found in the [Azure SDK Design Guidelines for Python](https://azuresdkspecs.z5.web.core.windows.net/PythonSpec.html). For more information, please visit https://aka.ms/azure-sdk-preview1-python.

**New features**

- Added new configuration parameters for creating EventHubClient.
  - `credential`: The credential object used for authentication, which implements the `TokenCredential` interface for getting tokens.
  - `transport_type`: The type of transport protocol that will be used for communicating with the Event Hubs service.
  - `max_retries`: The maximum number of attempts to redo the failed operation when an error occurs.
  - For detailed information about the configuration parameters, please read the reference documentation.
- Added new methods `get_partition_properties` and `get_partition_ids` to EventHubClient.
- Added support for HTTP proxy.
- Added support for authentication using an azure-identity credential.
- Added support for transport using AMQP over WebSocket.

**Breaking changes**

- New error hierarchy:
  - `azure.error.EventHubError`
  - `azure.error.ConnectionLostError`
  - `azure.error.ConnectError`
  - `azure.error.AuthenticationError`
  - `azure.error.EventDataError`
  - `azure.error.EventDataSendError`
- Renamed Sender/Receiver to EventHubProducer/EventHubConsumer.
  - Renamed `add_sender` to `create_producer` and `add_receiver` to `create_consumer` in EventHubClient.
  - EventHubConsumer is now iterable.
- Renamed class `azure.eventhub.Offset` to `azure.eventhub.EventPosition`.
- Renamed method `get_eventhub_info` to `get_properties` on EventHubClient.
- Reorganized connection management; EventHubClient is no longer responsible for opening/closing EventHubProducer/EventHubConsumer.
  - Each EventHubProducer/EventHubConsumer is responsible for its own connection management.
  - Added support for the context manager protocol on EventHubProducer and EventHubConsumer.
- Reorganized async APIs into the "azure.eventhub.aio" namespace and renamed to drop the "_async" suffix.
- Updated uAMQP dependency to 1.2.

## 1.3.1 (2019-02-28)

**Bug fixes**

- Fixed a bug where the datetime offset filter was using a local timestamp rather than UTC.
- Fixed a stack overflow error in continuous connection reconnect attempts.

## 1.3.0 (2019-01-29)

**Bug fixes**

- Added support for auto reconnect on token expiration and other auth errors (issue #89).

**Features**

- Added the ability to create ServiceBusClient from an existing SAS auth token, including providing a function to auto-renew that token on expiry.
- Added support for storing a custom EPH context value in checkpoint (PR #84, thanks @konstantinmiller).

## 1.2.0 (2018-11-29)

- Support for Python 2.7 in the azure.eventhub module (azure.eventprocessorhost will not support Python 2.7).
- Parse EventData.enqueued_time as a UTC timestamp (issue #72, thanks @vjrantal).

## 1.1.1 (2018-10-03)

- Fixed a bug in the Azure namespace package.

## 1.1.0 (2018-09-21)

- Changes to `AzureStorageCheckpointLeaseManager` parameters to support other connection options (issue #61):
  - The `storage_account_name`, `storage_account_key`, and `lease_container_name` arguments are now optional keyword arguments.
  - Added a `sas_token` argument that must be specified with `storage_account_name` in place of `storage_account_key`.
  - Added an `endpoint_suffix` argument to support storage endpoints in National Clouds.
  - Added a `connection_string` argument that, if specified, overrides all other endpoint arguments.
  - The `lease_container_name` argument now defaults to `"eph-leases"` if not specified.
- Fix for clients failing to start if run is called multiple times (issue #64).
- Added convenience methods `body_as_str` and `body_as_json` to the EventData object for easier processing of message data (see the sketch after this section).
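A minimal sketch of the two convenience methods above, which are still available on `EventData` in current releases; the payload is a placeholder:

```python
from azure.eventhub import EventData

event = EventData('{"temperature": 21.5}')
print(event.body_as_str())   # the raw body decoded as a string: '{"temperature": 21.5}'
print(event.body_as_json())  # the body parsed as JSON: {'temperature': 21.5}
```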
## 1.0.0 (2018-08-22)

- API stable.
- Renamed the internal `_async` module to `async_ops` for docs generation.
- Added an optional `auth_timeout` parameter to `EventHubClient` and `EventHubClientAsync` to configure how long to allow for token negotiation to complete. Default is 60 seconds.
- Added an optional `send_timeout` parameter to `EventHubClient.add_sender` and `EventHubClientAsync.add_async_sender` to determine the timeout for Events to be successfully sent. Default value is 60 seconds.
- Reformatted logging for performance.

## 0.2.0 (2018-08-06)

- Stability improvements for EPH.
- Updated uAMQP version.
- Added new configuration options for Sender and Receiver: `keep_alive` and `auto_reconnect`. These flags have been added to the following:
  - `EventHubClient.add_receiver`
  - `EventHubClient.add_sender`
  - `EventHubClientAsync.add_async_receiver`
  - `EventHubClientAsync.add_async_sender`
  - `EPHOptions.keep_alive_interval`
  - `EPHOptions.auto_reconnect_on_error`

## 0.2.0rc2 (2018-07-29)

- **Breaking change** `EventData.offset` will now return an object of type `~uamqp.common.Offset` rather than a str. The original string value can be retrieved from `~uamqp.common.Offset.value`.
- Each sender/receiver will now run in its own independent connection.
- Updated uAMQP dependency to 0.2.0.
- Fixed an issue with IoTHub clients not being able to retrieve partition information.
- Added support for HTTP proxy settings to both EventHubClient and EPH.
- Added an error handling policy to automatically reconnect on retryable errors.
- Added a keep-alive thread for maintaining an unused connection.

## 0.2.0rc1 (2018-07-06)

- **Breaking change** Restructured the library to support Python 3.7. The submodule `async` has been renamed and all classes from this module can now be imported from azure.eventhub directly.
- **Breaking change** Removed the optional `callback` argument from `Receiver.receive` and `AsyncReceiver.receive`. This removes the potential for messages to be processed via callback but not yet returned in the batch.
- **Breaking change** `EventData.properties` has been renamed to `EventData.application_properties`.
- Updated uAMQP dependency to v0.1.0.
- Added support for constructing IoTHub connections.
- Fixed a memory leak in receive operations.
- Dropped Python 2.7 wheel support.

## 0.2.0b2 (2018-05-29)

- Added `namespace_suffix` to EventHubConfig() to support national clouds.
- Added the `device_id` attribute to EventData to support IoT Hub use cases.
- Added a message header to work around a service bug for PartitionKey support.
- Updated uAMQP dependency to vRC1.

## 0.2.0b1 (2018-04-20)

- Updated uAMQP to the latest version.
- Further testing and minor bug fixes.

## 0.2.0a2 (2018-04-02)

- Updated uAMQP dependency.
"bugtrack_url": null,
"license": "MIT License",
"summary": "Microsoft Azure Event Hubs Client Library for Python",
"version": "5.12.2",
"project_urls": {
"Homepage": "https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/eventhub/azure-eventhub"
},
"split_keywords": [
"azure",
" azure sdk"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "ea9ab53475d934953e57e9bba18faecd762cc43bbb9525df72fa508001c3713e",
"md5": "48a0aa40a92c2244857a1fff8c860221",
"sha256": "a75fc0e21919cd164b111e9a09b6e3e59749dd35956402294af8d0d2a5bfca27"
},
"downloads": -1,
"filename": "azure_eventhub-5.12.2-py3-none-any.whl",
"has_sig": false,
"md5_digest": "48a0aa40a92c2244857a1fff8c860221",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.8",
"size": 326569,
"upload_time": "2024-10-02T21:08:47",
"upload_time_iso_8601": "2024-10-02T21:08:47.032163Z",
"url": "https://files.pythonhosted.org/packages/ea/9a/b53475d934953e57e9bba18faecd762cc43bbb9525df72fa508001c3713e/azure_eventhub-5.12.2-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "6531bf73f1f2816dc2b606b06b85b557637e7322bd4cac963d9f720aedb570b0",
"md5": "1150cac315c47265588f1f02baf20f3e",
"sha256": "92bd42a73e426947d98e7688837e99df4b5028ab245677680c6135836ca59558"
},
"downloads": -1,
"filename": "azure_eventhub-5.12.2.tar.gz",
"has_sig": false,
"md5_digest": "1150cac315c47265588f1f02baf20f3e",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.8",
"size": 395393,
"upload_time": "2024-10-02T21:08:45",
"upload_time_iso_8601": "2024-10-02T21:08:45.048662Z",
"url": "https://files.pythonhosted.org/packages/65/31/bf73f1f2816dc2b606b06b85b557637e7322bd4cac963d9f720aedb570b0/azure_eventhub-5.12.2.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-10-02 21:08:45",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "Azure",
"github_project": "azure-sdk-for-python",
"travis_ci": false,
"coveralls": true,
"github_actions": true,
"lcname": "azure-eventhub"
}