creature

Name: creature
Version: 0.1.17
Author: pwtail
Requires Python: >=3.10,<4.0
Keywords: greenlet, fibers, async, sqlalchemy
Uploaded: 2023-01-28 04:25:36

# creature

## Intro

This library demonstrates the use of **greenlet as a bridge to async
code**: it lets code written for blocking I/O work with async I/O.
A typical example is an ORM that previously used blocking database
drivers and switches to an async database driver.

It was not originally intended as a demo; it was intended for production use,
so the code quality is good. I have written an async django database
[backend](https://github.com/Bi-Coloured-Python-Rock-Snake/pgbackend) using it.

The same approach is used by sqlalchemy to provide its async features. However,
this library is more advanced in terms of features: it **provides support
for the REPL, context managers, iterators, and context vars**.

However, I decided to discontinue it. The thing is, there are two approaches to
async programming: one uses "colored functions" (a term coined in a [post](
https://journal.stuffwithstuff.com/2015/02/01/what-color-is-your-function/)
by Bob Nystrom), and the other does not. A vivid example is **Python vs.
Ruby**: Python coroutines are colored, while Ruby's Fibers (introduced in
Ruby 1.9) are not.

As you can see, this repository lives in the "Bi-Coloured-Python-Rock-Snake"
group. When I named it that way, I considered function colors a bad thing,
but I have since come to the conclusion that the matter is far from
unambiguous. There are many nuances to this, but in short, I embraced the
**Python way of things**.

Still, the library can serve as a nice demo of what sqlalchemy is doing under
the hood. The name is a tribute to sqlalchemy.

## Description

This library makes it possible for regular (sync-looking) functions to have
an async implementation under the hood.
It does that by using
[greenlet](https://github.com/python-greenlet/greenlet).

The main principle is to separate sync and async code into two different
greenlets. All async tasks are sent to the async greenlet and executed there,
while the sync greenlet doesn't do any I/O itself.
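The bridge described above can be sketched in a few lines with plain greenlet. The names `await_only` and `greenlet_spawn` below follow the pattern sqlalchemy uses internally; they are illustrative, not creature's actual API:

```python
import asyncio

import greenlet


def await_only(coro):
    # Hand the coroutine to the parent (async) greenlet; execution
    # resumes here once the coroutine has been awaited over there.
    return greenlet.getcurrent().parent.switch(coro)


async def greenlet_spawn(fn, *args):
    # Run the sync function in a child greenlet. Every time it hands us
    # a coroutine via await_only, await it and send the result back in.
    g = greenlet.greenlet(fn)
    value = g.switch(*args)
    while not g.dead:
        value = g.switch(await value)
    return value  # fn's return value, once its greenlet finishes


async def fetch_double(x):
    await asyncio.sleep(0)
    return x * 2


def sync_code():
    # A blocking-looking call with an async implementation underneath.
    return await_only(fetch_double(21))


print(asyncio.run(greenlet_spawn(sync_code)))  # prints 42
```

Note that the sync greenlet never touches the event loop: it only switches out, and the async greenlet does the actual awaiting.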

## Install

```
pip install creature
```

## Usage

You can turn an async function into a sync one by using the `exempt` decorator:

```python
import asyncio

import httpx

from creature import exempt


@exempt
async def sleep(secs):
    await asyncio.sleep(secs)
    print(f'Slept for {secs} seconds')


@exempt
async def download(url):
    async with httpx.AsyncClient() as client:
        resp = await client.get(url)
        print(f'Downloaded {len(resp.content)} bytes')
```

"Exempt" means that coroutines are exempted from the current greenlet, and 
sent to another.

Now, to call those functions you have two options: 

**1. as_async decorator**

You can use the `as_async` decorator to make the main function async again and
run it with an event loop:

```python
import asyncio

from creature import as_async


@as_async
def main():
    sleep(0.5)
    download('https://www.python.org')


asyncio.run(main())
```

Which will print

```commandline
Slept for 0.5 seconds
Downloaded 50856 bytes
```

**2. start_loop()**

You can also start an event loop yourself - this may be useful for interactive 
use or scripts.

```python
import creature
creature.start_loop()

sleep(0.5)
download('https://www.python.org')
```

Which will print the same.
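One simple way to keep a loop available for sync callers is to run it in a background thread and submit coroutines to it. To be clear, this is a hypothetical sketch of the "start a loop yourself" idea, not creature's mechanism (creature routes calls through greenlets rather than threads):

```python
import asyncio
import threading


def start_background_loop():
    # Run an event loop forever in a daemon thread, so that sync code
    # in the main thread can submit coroutines to it at any time.
    loop = asyncio.new_event_loop()
    threading.Thread(target=loop.run_forever, daemon=True).start()
    return loop


loop = start_background_loop()
future = asyncio.run_coroutine_threadsafe(
    asyncio.sleep(0.05, result='done'), loop)
print(future.result())  # blocks the sync side until the coroutine finishes
```

The thread-based variant pays for the convenience with cross-thread synchronization on every call, which the greenlet approach avoids.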

**The REPL**

start_loop() doesn't work in the IPython REPL, though it does work in the
PyCharm Console. The reason is that IPython starts the asyncio loop itself,
and prompt_toolkit, which IPython uses, also needs an async loop.

So for IPython there is a different solution:

```python
import creature
creature.ipy.enable()
```

This works in both PyCharm Console and IPython.

**Context vars**

The standard `contextvars` module gives coroutines access to their
"context". creature has its own context vars for the very same purpose.

greenlet itself supports contextvars: each greenlet gets its own context.
However, as you know by now, we have two greenlets, sync and async, and it's
natural for them to share the context.

The class is intentionally called context_var (not ContextVar), so that you
don't import the standard ContextVar by mistake.

```python
import creature
from creature import exempt

creature.start_loop()

var = creature.context_var(__name__, 'var', default=-1)


@exempt
async def f1():
    # set() returns the previous value
    assert var.set(1) == -1


def f2():
    assert var.get() == 1


f1()
f2()
```

As you can see, sync and async functions can use shared context.

**Context managers**

Functions are not the only thing you come across; sometimes you have to
deal with context managers too. creature can map async context managers to
sync ones. Here is how it is done:

```python
import asyncio
from contextlib import asynccontextmanager

from creature import exempt_cm


@exempt_cm
@asynccontextmanager
async def have_rest(before, after):
    await asyncio.sleep(before)
    try:
        yield
    finally:
        await asyncio.sleep(after)

with have_rest(1, 3):
    print('Party!')
```

`exempt_cm` stands for "exempt the context manager", of course. This feature
was useful when I was working on the async backend for django, because the
psycopg3 driver uses context managers extensively.
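The mapping itself boils down to driving `__aenter__` and `__aexit__` from sync code. Here is a hypothetical sketch of that idea, where a private event loop stands in for creature's greenlet bridge (`exempt_cm_sketch` is an illustrative name, not the library's code):

```python
import asyncio
from contextlib import asynccontextmanager, contextmanager

# A loop we own, standing in for the async greenlet.
loop = asyncio.new_event_loop()


def exempt_cm_sketch(factory):
    def wrapper(*args, **kwargs):
        @contextmanager
        def cm():
            acm = factory(*args, **kwargs)
            # Drive the async CM's enter/exit from sync code.
            loop.run_until_complete(acm.__aenter__())
            try:
                yield
            finally:
                loop.run_until_complete(acm.__aexit__(None, None, None))
        return cm()
    return wrapper


@exempt_cm_sketch
@asynccontextmanager
async def have_rest(before, after):
    await asyncio.sleep(before)
    try:
        yield
    finally:
        await asyncio.sleep(after)


with have_rest(0.01, 0.01):
    print('Party!')
```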

**Iterators**

You get the principle, don't you? Similarly, we have `exempt_it` for iterators.

```python
import asyncio

from creature import exempt_it


@exempt_it
async def counter():
    for i in range(3):
        await asyncio.sleep(0.1 * i)
        yield i

assert list(counter()) == [0, 1, 2]
```
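The iterator case reduces to calling `__anext__` repeatedly and translating `StopAsyncIteration` into the end of a sync generator. A hypothetical sketch, again with a private loop standing in for the greenlet bridge (`exempt_it_sketch` is an illustrative name):

```python
import asyncio


def exempt_it_sketch(agen_func):
    def wrapper(*args, **kwargs):
        agen = agen_func(*args, **kwargs)
        loop = asyncio.new_event_loop()
        try:
            while True:
                try:
                    # Drive one step of the async generator synchronously.
                    yield loop.run_until_complete(agen.__anext__())
                except StopAsyncIteration:
                    return  # exhausted: end the sync generator too
        finally:
            loop.close()
    return wrapper


@exempt_it_sketch
async def counter():
    for i in range(3):
        await asyncio.sleep(0)
        yield i


assert list(counter()) == [0, 1, 2]
```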


            
