aiometer

- Name: aiometer
- Version: 0.5.0
- Summary: A Python concurrency scheduling library, compatible with asyncio and trio
- Upload time: 2023-12-11 19:51:58
- Requires Python: >=3.8
- License: MIT
# aiometer

[![Build Status](https://dev.azure.com/florimondmanca/public/_apis/build/status/florimondmanca.aiometer?branchName=master)](https://dev.azure.com/florimondmanca/public/_build/latest?definitionId=4&branchName=master)
[![Coverage](https://codecov.io/gh/florimondmanca/aiometer/branch/master/graph/badge.svg)](https://codecov.io/gh/florimondmanca/aiometer)
![Python versions](https://img.shields.io/pypi/pyversions/aiometer.svg)
[![Package version](https://badge.fury.io/py/aiometer.svg)](https://pypi.org/project/aiometer)

`aiometer` is a concurrency scheduling library compatible with `asyncio` and `trio` and inspired by [Trimeter](https://github.com/python-trio/trimeter). It makes it easier to execute lots of tasks concurrently while controlling concurrency limits (i.e. applying _[backpressure](https://lucumr.pocoo.org/2020/1/1/async-pressure/)_) and collecting results in a predictable manner.

**Content**

- [Example](#example)
- [Installation](#installation)
- [Features](#features)
- [Usage](#usage)
  - [Flow control](#flow-control)
  - [Running tasks](#running-tasks)
- [How To](#how-to)
- [API Reference](#api-reference)
- [Contributing](#contributing)
- [License](#license)

## Example

Let's use [HTTPX](https://github.com/encode/httpx) to make web requests concurrently...

_Try this code interactively using [IPython](https://ipython.org/install.html)._

```python
>>> import asyncio
>>> import functools
>>> import random
>>> import aiometer
>>> import httpx
>>>
>>> client = httpx.AsyncClient()
>>>
>>> async def fetch(client, request):
...     response = await client.send(request)
...     # Simulate extra processing...
...     await asyncio.sleep(2 * random.random())
...     return response.json()["json"]
...
>>> requests = [
...     httpx.Request("POST", "https://httpbin.org/anything", json={"index": index})
...     for index in range(100)
... ]
...
>>> # Send requests, and process responses as they're made available:
>>> async with aiometer.amap(
...     functools.partial(fetch, client),
...     requests,
...     max_at_once=10,  # Limit maximum number of concurrently running tasks.
...     max_per_second=5,  # Limit request rate to not overload the server.
... ) as results:
...     async for data in results:
...         print(data)
...
{'index': 3}
{'index': 4}
{'index': 1}
{'index': 2}
{'index': 0}
...
>>> # Alternatively, fetch and aggregate responses into an (ordered) list...
>>> jobs = [functools.partial(fetch, client, request) for request in requests]
>>> results = await aiometer.run_all(jobs, max_at_once=10, max_per_second=5)
>>> results
[{'index': 0}, {'index': 1}, {'index': 2}, {'index': 3}, {'index': 4}, ...]
```

## Installation

_This project is in beta and maturing. Be sure to pin any dependencies to the latest minor._

```bash
pip install "aiometer==0.5.*"
```

## Features

- Concurrency management and throttling helpers.
- `asyncio` and `trio` support.
- Fully type annotated.
- 100% test coverage.

## Usage

### Flow control

The key highlight of `aiometer` is allowing you to apply flow control strategies in order to limit the degree of concurrency of your programs.

There are two knobs you can play with to fine-tune concurrency:

- `max_at_once`: this is used to limit the maximum number of concurrently running tasks at any given time. (If you have 100 tasks and set `max_at_once=10`, then `aiometer` will ensure that no more than 10 run at the same time.)
- `max_per_second`: this option limits the number of tasks spawned per second. This is useful to not overload I/O resources, such as servers that may have a rate limiting policy in place.

Example usage:

```python
>>> import asyncio
>>> import aiometer
>>> async def make_query(query):
...     await asyncio.sleep(0.05)  # Simulate a database request.
...
>>> queries = ['SELECT * from authors'] * 1000
>>> # Allow at most 5 queries to run concurrently at any given time:
>>> await aiometer.run_on_each(make_query, queries, max_at_once=5)
...
>>> # Make at most 10 queries per second:
>>> await aiometer.run_on_each(make_query, queries, max_per_second=10)
...
>>> # Run at most 10 concurrent jobs, spawning at most one new task every 5 seconds:
>>> async def job(id):
...     await asyncio.sleep(10)  # A very long task.
...
>>> await aiometer.run_on_each(job, range(100), max_at_once=10, max_per_second=0.2)
```

### Running tasks

`aiometer` provides four run functions, each of which runs tasks concurrently in a slightly different way, addressing a variety of use cases. Each function accepts all the options documented in [Flow control](#flow-control). Here's a handy table for reference (see also the [API Reference](#api-reference)):

| Entrypoint      | Use case                                       |
| --------------- | ---------------------------------------------- |
| `run_on_each()` | Execute async callbacks in any order.          |
| `run_all()`     | Return results as an ordered list.             |
| `amap()`        | Iterate over results as they become available. |
| `run_any()`     | Return result of first completed function.     |

To illustrate the behavior of each run function, let's first set up a hello world async program:

```python
>>> import asyncio
>>> import random
>>> from functools import partial
>>> import aiometer
>>>
>>> async def get_greeting(name):
...     await asyncio.sleep(random.random())  # Simulate I/O
...     return f"Hello, {name}!"
...
>>> async def greet(name):
...     greeting = await get_greeting(name)
...     print(greeting)
...
>>> names = ["Robert", "Carmen", "Lucas"]
```

Let's start with `run_on_each()`. It executes an async function once for each item in a list passed as argument:

```python
>>> await aiometer.run_on_each(greet, names)
'Hello, Robert!'
'Hello, Lucas!'
'Hello, Carmen!'
```

If we'd like to get the list of greetings in the same order as `names`, in a fashion similar to [`Promise.all()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/all), we can use `run_all()`:

```python
>>> await aiometer.run_all([partial(get_greeting, name) for name in names])
['Hello, Robert!', 'Hello, Carmen!', 'Hello, Lucas!']
```

`amap()` allows us to process each greeting as it becomes available (which means maintaining order is not guaranteed):

```python
>>> async with aiometer.amap(get_greeting, names) as greetings:
...     async for greeting in greetings:
...         print(greeting)
'Hello, Lucas!'
'Hello, Robert!'
'Hello, Carmen!'
```

Lastly, `run_any()` can be used to run async functions until the first one completes, similarly to [`Promise.any()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/any):

```python
>>> await aiometer.run_any([partial(get_greeting, name) for name in names])
'Hello, Carmen!'
```

As a last fun example, let's use `amap()` to implement a no-threads async version of [sleep sort](https://rosettacode.org/wiki/Sorting_algorithms/Sleep_sort):

```python
>>> import asyncio
>>> from functools import partial
>>> import aiometer
>>> numbers = [0.3, 0.1, 0.6, 0.2, 0.7, 0.5, 0.5, 0.2]
>>> async def process(n):
...     await asyncio.sleep(n)
...     return n
...
>>> async with aiometer.amap(process, numbers) as results:
...     sorted_numbers = [n async for n in results]
...
>>> sorted_numbers
[0.1, 0.2, 0.2, 0.3, 0.5, 0.5, 0.6, 0.7]
```

## How To

### Multiple parametrized values in `run_on_each` and `amap`

`run_on_each` and `amap` only accept functions that take a single positional argument (i.e. `(Any) -> Awaitable`).

So if you have a function that is parametrized by multiple values, you should refactor it to match this form.

This can generally be achieved like this:

1. Build a proxy container type (e.g. a `namedtuple`), say `T`.
2. Refactor your function so that its signature is now `(T) -> Awaitable`.
3. Build a list of these proxy containers, and pass it to `aiometer`.

For example, assuming you have a function that processes X/Y coordinates...

```python
async def process(x: float, y: float) -> None:
    pass

xs = list(range(100))
ys = list(range(100))

for x, y in zip(xs, ys):
    await process(x, y)
```

You could use it with `amap` by refactoring it like this:

```python
from typing import NamedTuple

# Proxy container type:
class Point(NamedTuple):
    x: float
    y: float

# Rewrite to accept a proxy as a single positional argument:
async def process(point: Point) -> None:
    x = point.x
    y = point.y
    ...

xs = list(range(100))
ys = list(range(100))

# Build a list of proxy containers:
points = [Point(x, y) for x, y in zip(xs, ys)]

# Use it:
async with aiometer.amap(process, points) as results:
    ...
```
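Alternatively, `functools.partial` (as used in the [Example](#example) section) can bind the fixed arguments up front, leaving a callable that takes a single positional argument. A minimal sketch, using a hypothetical `process()` coroutine:

```python
import asyncio
import functools

async def process(scale: float, x: float) -> float:
    # 'scale' is fixed across tasks; 'x' varies per task.
    await asyncio.sleep(0)
    return scale * x

# Bind the fixed argument; the result matches '(Any) -> Awaitable'
# and can be passed to aiometer.run_on_each() or aiometer.amap():
job = functools.partial(process, 2.0)

print(asyncio.run(job(3.0)))  # 6.0
```

Note that this only works when the varying value is the final positional argument; otherwise, the proxy-container approach above is the way to go.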

## API Reference

### Common options

* `max_at_once` (_Optional_, `int`): the maximum number of concurrently running tasks at any given time.
* `max_per_second` (_Optional_, `float`): the maximum number of tasks spawned per second. May be fractional, e.g. `0.2` spawns at most one task every 5 seconds.

### `aiometer.run_on_each()`

**Signature**: _async_ aiometer.run_on_each(*async_fn*, *args*, *, *max_at_once=None*, *max_per_second=None*) -> *None*

Concurrently run the equivalent of `async_fn(arg) for arg in args`. Does not return any value. To get return values back, use [`aiometer.run_all()`](#aiometerrun_all).

### `aiometer.run_all()`

**Signature**: _async_ aiometer.run_all(*async_fns*, *, *max_at_once=None*, *max_per_second=None*) -> *list*

Concurrently run the `async_fns` functions, and return the list of results in the same order.

### `aiometer.amap()`

**Signature**: _async_ aiometer.amap(*async_fn*, *args*, *, *max_at_once=None*, *max_per_second=None*) -> *async context manager*

Concurrently run the equivalent of `async_fn(arg) for arg in args`, and return an async context manager that yields an async iterator of results, in the order they become available.

### `aiometer.run_any()`

**Signature**: _async_ aiometer.run_any(*async_fns*, *, *max_at_once=None*, *max_per_second=None*) -> *Any*

Concurrently run the `async_fns` functions, and return the first available result.

## Contributing

See [CONTRIBUTING.md](./CONTRIBUTING.md).

## License

MIT

# Changelog

All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).

## 0.5.0 - 2023-12-11

### Removed

- Drop support for Python 3.7, as it has reached EOL. (Pull #44)

### Added

- Add official support for Python 3.12. (Pull #44)
- Add support for anyio 4. This allows catching exception groups using the native ExceptionGroup. On anyio 3.2+, anyio would throw its own ExceptionGroup type. Compatibility with anyio 3.2+ is retained. (Pull #43)

## 0.4.0 - 2023-01-18

### Removed

- Drop support for Python 3.6, which has reached EOL. (Pull #38)

### Added

- Add official support for Python 3.10 and 3.11. (Pull #38)

### Fixed

- Relax version requirements for `typing_extensions` and address `mypy>=0.981` strict optional changes. (Pull #38)

## 0.3.0 - 2021-07-06

### Changed

- Update `anyio` dependency to v3 (previously v1). (Pull #25)
  - _NB: no API change, but dependency mismatches may occur. Be sure to port your codebase to anyio v3 before upgrading `aiometer`._

### Added

- Add support for Python 3.6 (installs the `contextlib2` backport library there). (Pull #26)
- Officialize support for Python 3.9. (Pull #26)

## 0.2.1 - 2020-03-26

### Fixed

- Improve robustness of the `max_per_second` implementation by using the generic cell rate algorithm (GCRA) instead of leaky bucket. (Pull #5)
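  (For illustration only, a minimal GCRA limiter tracks a "theoretical arrival time" for the next event and computes how long each new event must wait; this is a sketch of the algorithm, not aiometer's actual implementation.)

  ```python
  import time

  class GCRALimiter:
      """Minimal sketch of the generic cell rate algorithm (GCRA)."""

      def __init__(self, rate: float) -> None:
          self.interval = 1.0 / rate  # minimum spacing between events
          self.tat = 0.0              # theoretical arrival time of next event

      def acquire(self) -> float:
          """Return how many seconds to wait before the event may run."""
          now = time.monotonic()
          tat = max(self.tat, now)       # never schedule in the past
          self.tat = tat + self.interval
          return tat - now               # 0.0 means: run immediately

  limiter = GCRALimiter(rate=10)  # at most 10 events per second
  print(limiter.acquire())  # 0.0: first event runs immediately
  print(limiter.acquire())  # ~0.1: next event must wait one interval
  ```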

## 0.2.0 - 2020-03-22

### Added

- Add support for Python 3.7. (Pull #3)

## 0.1.0 - 2020-03-21

### Added

- Add `run_on_each()`, `run_all()`, `amap()` and `run_any()`, with `max_at_once` and `max_per_second` options. (Pull #1)

            
