fan-tools


Name: fan-tools
Version: 4.4.0
Summary: Various python stuff: testing, aio helpers, etc
Author email: cybergrind <cybergind+pypi@gmail.com>
Upload time: 2024-10-03 17:26:00
Keywords: asyncio, testing
            # About this package


[![Build Status](https://img.shields.io/github/workflow/status/micro-fan/fan_tools/master)](https://github.com/micro-fan/fan_tools/actions)
[![PyPi version](https://img.shields.io/pypi/v/fan_tools.svg)](https://pypi.python.org/pypi/fan_tools)


This is a set of internal tools shared between different projects. Most of them originally related to testing, so they provide base classes for various testing cases.

**NOTE: all our tools intentionally support only Python 3.8+.**
Some might work with other versions, but we don't want to carry the crutches needed to backport things like `async/await` to older versions. If a tool happens to work there, fine; if not, feel free to send a PR, but it won't necessarily be merged.

## Testing helpers

### Caching decorators


```python
# cache async function that returns pydantic.BaseModel
from fan_tools.python import cache_async

@cache_async[type(dict)](fname, model, {})
async def func():
    return model


# cache sync function that returns a JSON-serializable response
import json

from fan_tools.python import memoize

@memoize
def func(*args, **kwargs):
    return json.dumps(
        {
            'args': args,
            'kwargs': kwargs,
        }
    )
```

### ApiUrls

Defined in `fan_tools/testing/__init__.py`. Used for defining nested URLs with formatting.

You can use it in fixtures, like:


```python
@pytest.fixture(scope='session')
def api(api_v_base):
    yield ApiUrls('{}/'.format(api_v_base), {
        'password_reset_request': 'password/request/code/',
        'password_reset': 'password/reset/',
        'user_review_list': 'user/{user_id}/review/',
        'user_review': 'user/{user_id}/review/{review_id}/',
        'wine_review': 'wine/{wine_id}/review/',
        'drink_review': 'drink/{drink_id}/review/',
    })


def test_review_list(user, api):
    resp = user.get_json(api.user_review_list(user_id=user.id), {'page_size': 2})
```
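The gist of `ApiUrls` can be sketched as a small helper that joins a base URL with named path templates and formats the placeholders on call. This is a simplified illustration of the idea, not the actual implementation (names here are hypothetical):

```python
class ApiUrlsSketch:
    """Simplified sketch of the ApiUrls idea: nested URLs with formatting."""

    def __init__(self, base, urls):
        self.base = base
        self.urls = urls

    def __getattr__(self, name):
        template = self.base + self.urls[name]
        # each attribute is a callable that fills in the path placeholders
        return lambda **kwargs: template.format(**kwargs)


api = ApiUrlsSketch('/api/v1/', {'user_review': 'user/{user_id}/review/{review_id}/'})
api.user_review(user_id=1, review_id=2)  # '/api/v1/user/1/review/2/'
```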




### PropsMeta

You can find source in `fan_tools/testing/meta.py`.

For now it converts methods whose names start with `prop__` into cached descriptors.

```python
class A(metaclass=PropsMeta):
    def prop__conn(self):
        conn = SomeConnection()
        return conn
```

Becomes:

```python
class A:
    @property
    def conn(self):
        if not hasattr(self, '__conn'):
            setattr(self, '__conn', SomeConnection())
        return self.__conn
```

Thus it allows quite nice style of testing with lazy initialization. Like:

```python
class MyTest(TestCase, metaclass=PropsMeta):
    def prop__conn(self):
        return psycopg2.connect('')

    def prop__cursor(self):
        return self.conn.cursor()

    def test_simple_query(self):
        self.cursor.execute('select 1;')
        row = self.cursor.fetchone()
        assert row[0] == 1, 'Row: {}'.format(row)

```

Here you just use `self.cursor`; the connection and cursor are created on first access and cached.

This is just a simple example; complex tests can build deeper chains of such properties. This approach is much easier and faster than complex `setUp` methods.


## fan_tools.unix helpers

Basic unix helpers:

* `run` - run a command in the shell
* `succ` - wrapper around `run` with return code and stderr checks
* `wait_socket` - wait until a socket becomes available (e.g. you can wait for PostgreSQL with `wait_socket('localhost', 5432)`)
* `asucc` - asynchronous version of `succ` for use with `await`; supports realtime logging
* `source` - acts similar to the bash `source` (or `.`) command
* `cd` - context manager that temporarily changes the working directory
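As an illustration, `cd` can be thought of as a context manager along these lines (a sketch of the idea, not the exact `fan_tools.unix` implementation):

```python
import os
from contextlib import contextmanager


@contextmanager
def cd_sketch(path):
    """Temporarily change the working directory, restoring it on exit."""
    prev = os.getcwd()
    os.chdir(path)
    try:
        yield
    finally:
        # restore even if the body raised
        os.chdir(prev)
```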

#### interpolate_sysenv

Format string with system variables + defaults.

```python
PG_DEFAULTS = {
    'PGDATABASE': 'postgres',
    'PGPORT': 5432,
    'PGHOST': 'localhost',
    'PGUSER': 'postgres',
    'PGPASSWORD': '',
    }
DSN = interpolate_sysenv('postgresql://{PGUSER}:{PGPASSWORD}@{PGHOST}:{PGPORT}/{PGDATABASE}', PG_DEFAULTS)
```
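Under the hood this amounts to formatting the string with the environment layered over the defaults, roughly as below (an illustrative sketch; the real helper lives in `fan_tools.unix`):

```python
import os


def interpolate_sysenv_sketch(template, defaults=None):
    """Format a template with os.environ values, falling back to defaults."""
    params = {**(defaults or {}), **os.environ}
    return template.format(**params)


os.environ['PGHOST'] = 'db.example.com'
# PGHOST comes from the environment, PGPORT falls back to the default
interpolate_sysenv_sketch('host={PGHOST} port={PGPORT}', {'PGPORT': 5432, 'PGHOST': 'localhost'})
```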


## fan_tools.fan_logging.JSFormatter

Enables JSON output with additional fields, suitable for structured logging into ELK or similar systems.

Accepts an `env_vars` key with environment variable names that should be included in the log.

```python
# this example uses safe_logger as handler (pip install safe_logger)
import logging
import logging.config


LOGGING = {
    'version': 1,
    'disable_existing_loggers': True,
    'formatters': {
        'json': {
            '()': 'fan_tools.fan_logging.JSFormatter',
            'env_vars': ['HOME'],
        },
        'standard': {
            'format': '%(asctime)s [%(levelname)s] %(name)s: %(message)s'
        },
    },
    'handlers': {
        'default': {
            'level': 'DEBUG',
            'class': 'safe_logger.TimedRotatingFileHandlerSafe',
            'filename': 'test_json.log',
            'when': 'midnight',
            'interval': 1,
            'backupCount': 30,
            'formatter': 'json',
            },
    },
    'loggers': {
        '': {
            'handlers': ['default'],
            'level': 'DEBUG',
        },
    },
}

logging.config.dictConfig(LOGGING)
log = logging.getLogger('TestLogger')

log.debug('test debug')
log.info('test info')
log.warning('test warn')
log.error('test error')
```


## fan_tools.mon_server.MetricsServer

FastAPI based server that serves metrics in Prometheus format.

```python
import os

import uvicorn
from fastapi import FastAPI

from fan_tools.mon_server import MetricsServer
from fan_tools.mon_server.certs import update_certs_loop

app = FastAPI()
mserver = MetricsServer(app)
mserver.add_task(update_certs_loop, hosts=['perfectlabel.io', 'robopickles.com'])
uvicorn.run(app, host='0.0.0.0', port=int(os.environ.get('MONITORING_PORT', 8000)))
```


## fan_tools.backup

There are two backup helpers: `fan_tools.backup.s3.S3backup` and `fan_tools.backup.gcloud.GCloud`

We're assuming that backup script has access to backup execution and dump directory.

Default setup includes support for docker container that access DB.

By default script provides interface for monitoring (last backup date).

`fan_tools.backup.s3.S3backup` provides an external script called `fan_s3_backup` that accepts some configuration via environment variables.

* *ENABLE_BACKUP* - must be set to a non-`false` value to enable backups, default: false
* *BACKUP_DB_CONTAINER* - container in which the backup command is executed
* *BACKUP_DB_SCRIPT* - command to execute on the DB server above, default: `/create_backup.py`
* *BACKUP_COMMAND* - overrides all of the above
* `-b/--bucket` - defines the bucket, default for s3: environment variable *AWS_BACKUP_BUCKET*
* *BACKUP_PREFIX* or `-p/--prefix` - backup directory prefix, usually a subfolder for dumps, default: `backups/`
* `-d/--daemonize` - whether to run in daemonized mode
* *MONITORING_PORT* - port to listen on when running in daemonized mode, default: 80

S3 specific:

* *AWS_BACKUP_KEY*
* *AWS_BACKUP_SECRET*
* *AWS_BACKUP_BUCKET*



## fan_tools.drf.serializers.EnumSerializer

Allows you to deserialize incoming strings into `Enum` values.
You should add `EnumSerializer` to your serializers by hand.

```python
from enum import IntEnum

from django.db import models
from rest_framework import serializers

from fan_tools.drf.serializers import EnumSerializer


class MyEnum(IntEnum):
  one = 1
  two = 2

class ExampleModel(models.Model):
  value = models.IntegerField(choices=[(x.value, x.name) for x in MyEnum])

class ExampleSerializer(serializers.ModelSerializer):
  value = EnumSerializer(MyEnum)

# this allows you to post value as: {'value': 'one'}
```

Due to the `Enum` and `IntegerField` implementations, you may use `Enum` members directly in querysets:

```python
ExampleModel.objects.filter(value=MyEnum.two)
```

## fan_tools.django.log_requests.LoggerMiddleware

LoggerMiddleware logs the request metadata plus the raw POST data.

For django<1.10 please use `fan_tools.django.log_requests.DeprecatedLoggerMiddleware`


## fan_tools.django.request_uniq

Decorator that adds a dict, unique per uwsgi request, as the first function argument.
For tests, mock `_get_request_unique_cache`.


## fan_tools.django.call_once_on_commit

Makes a function run only once per transaction commit. Here is an example
 where the function `do_some_useful` will be called only once after the
 transaction has been committed.
```python
class SomeModel(models.Model):
    name = models.CharField(max_length=64)

@call_once_on_commit
def do_some_useful():
    pass


def hook(sender, instance, **kwargs):
    do_some_useful()

models.signals.post_save.connect(hook, sender=SomeModel)

with transaction.atomic():
    some_model = SomeModel()
    some_model.name = 'One'
    some_model.save()
    some_model.name = 'Two'
    some_model.save()
```

For tests with nested transactions (where commit is usually not actually
 called) it is useful to override `call_once_on_commit` so that the decorated
 function executes right where it is called.
 To do so, mock the `on_commit` function. Example pytest fixture:
```python
@pytest.fixture(scope='session', autouse=True)
def immediate_on_commit():
    def side_effect():
        return lambda f: f()

    with mock.patch('fan_tools.django.on_commit', side_effect=side_effect) as m:
        yield m

```

## fan_tools.django.fields.ChoicesEnum

Used to provide the `choices` attribute for a model field.

```python
class FooBarEnum(ChoicesEnum):
    foo = 1
    bar = 2

class ExampleModel(models.Model):
    type = models.IntegerField(choices=FooBarEnum.get_choices())
```
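Conceptually, `ChoicesEnum` is an `Enum` with a helper that emits Django-style `(value, label)` pairs; a rough self-contained sketch of the idea (not the packaged implementation):

```python
from enum import IntEnum


class ChoicesEnumSketch(IntEnum):
    """Sketch: an enum that can emit Django-style (value, label) choices."""

    @classmethod
    def get_choices(cls):
        return [(item.value, item.name) for item in cls]


class FooBarEnum(ChoicesEnumSketch):
    foo = 1
    bar = 2


FooBarEnum.get_choices()  # [(1, 'foo'), (2, 'bar')]
```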


## fan_tools.django.db.utils.set_word_similarity_threshold

Allows setting the postgres trigram word similarity threshold for the default django database connection.

```python
set_word_similarity_threshold(0.4)
```


## fan_tools.django.contrib.postgres.models.LTreeModel

Django model containing a postgres `ltree` field.

```python
class LTreeExampleModel(LTreeModel):
    pass
```


## fan_tools.django.contrib.postgres.fields.LTreeDescendants

Lookup for postgres ltree descendants

```python
LTreeExampleModel.objects.filter(path__descendants='root.level1')
```

## fan_tools.django.contrib.postgres.fields.LTreeNlevel

Lookup for postgres ltree by level depth

```python
LTreeExampleModel.objects.filter(path__nlevel=2)
```

## fan_tools.django.db.pgfields.SimilarityLookup

Postgres `text %> text` operator

```python
# Add this import to models.py (file should be imported before lookup usage)
import fan_tools.django.db.pgfields  # noqa

Books.objects.filter(title__similar='Animal Farm')
```

## fan_tools.django.db.pgfields.WordSimilarity

Postgres `text1 <<-> text2` operator. It returns `1 - word_similarity(text1, text2)`

```python
from django.db.models import Value, F

similarity = WordSimilarity(Value('Animal Farm'), F('title'))
Books.objects.annotate(similarity=similarity)
```

## fan_tools.drf.filters.NumberInFilter

Django filter that matches when an integer is in a comma-separated list of integers.

```python
class ExampleFilterSet(FilterSet):
    example_values = NumberInFilter(field_name='example_value', lookup_expr='in')
```

## fan_tools.django.mail.Mail

Sends text and HTML emails using django templates.

```python
Mail(
    recipient_list=[user.email],
    template_name='user/emails/reset_password',
    context={
        'frontend_url': settings.FRONTEND_URL,
    },
).send()
```

## fan_tools.django.url.build_absolute_uri

Gets the domain section of the absolute URL of the current page, using the django request object.

```python
build_absolute_uri(request)
```


## fan_tools.drf.forms.use_form

Helps to harness the power of serializers for simple API checks.


```python
from rest_framework import serializers
from rest_framework.decorators import api_view
from fan_tools.drf import use_form


class SimpleForm(serializers.Serializer):
    test_int = serializers.IntegerField()
    test_str = serializers.CharField()


@api_view(['GET'])
@use_form(SimpleForm)
def my_api(data):
    print(f'Data: {data["test_int"]} and {data["test_str"]}')
```

## fan_tools.drf.pagination.ApiPageNumberPagination

Allows turning off pagination by specifying a zero `page_size`.

```python
REST_FRAMEWORK = {
    'DEFAULT_PAGINATION_CLASS': 'fan_tools.drf.pagination.ApiPageNumberPagination',
    ...
}
```

## fan_tools.rest_framework.renderers.ApiRenderer

Pretty Django Rest Framework API renderer with error codes.

```python
REST_FRAMEWORK = {
    'DEFAULT_RENDERER_CLASSES': (
        'fan_tools.drf.renderers.ApiRenderer',
    ),
    ...
}
```

## fan_tools.rest_framework.handlers.api_exception_handler

Pretty Django Rest Framework API exception handler with error codes.

```python
REST_FRAMEWORK = {
    'EXCEPTION_HANDLER': 'fan_tools.drf.handlers.api_exception_handler',
    ...
}
```

## fan_tools.drf.asserts.assert_validation_error

Helper assert function to be used in tests to match the validation error codes.

```python
assert_validation_error(response, 'email', 'unique')
```

## fan_tools.aio_utils.DbRecordsProcessorWorker

Asyncio worker that waits for new records in a postgres table and processes them.

## fan_tools.aio_utils.dict_query/sql_update
aiopg shortcuts


## fan_tools.python.execfile

Backport of Python 2's `execfile` function.

Usage: `execfile('path/to/file.py', globals(), locals())`

Returns: `True` if the file exists and was executed, `False` if the file doesn't exist.
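A Python 3 equivalent can be sketched with `compile` and `exec` (illustrative; the packaged helper may differ in details):

```python
import os


def execfile_sketch(path, globals_=None, locals_=None):
    """Execute a Python file in the given namespaces; return False if missing."""
    if not os.path.exists(path):
        return False
    with open(path) as f:
        # compile with the filename so tracebacks point at the file
        code = compile(f.read(), path, 'exec')
    exec(code, globals_ if globals_ is not None else {}, locals_)
    return True
```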


## fan_tools.doc_utils.fan_sphinx

Sphinx extensions that generate documentation for Django REST framework serializers and examples for HTTP requests.

To use them, specify the extra dependency during package installation:
```shell
pip install fan_tools[doc_utils]
```

Usage:
```python
# Add to Sphinx conf.py
extensions = [
    # ...
    'fan_tools.doc_utils.fan_sphinx.dyn_serializer',
    'fan_tools.doc_utils.fan_sphinx.http_log'
]
```

## Commands

### fan_env_yaml

Converts a template YAML file, substituting `%{ENV_NAME}` strings with the corresponding environment variables.

Usage: `fan_env_yaml src_file dst_file`
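The substitution itself is conceptually a small regex pass, something like this sketch (illustrative, not the command's actual code):

```python
import os
import re


def substitute_env_sketch(text):
    """Replace %{ENV_NAME} markers with values from the environment."""
    return re.sub(r'%\{(\w+)\}', lambda m: os.environ.get(m.group(1), ''), text)


os.environ['DB_HOST'] = 'localhost'
substitute_env_sketch('host: %{DB_HOST}')  # 'host: localhost'
```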


### fan_ci_script

Helper to run the default CI pipeline. Defaults are set up for [GitLab defaults](https://docs.gitlab.com/ee/ci/variables/#predefined-variables-environment-variables). Includes stages:

* build docker image with temporary name (commit sha by default)
* run tests (optional)
* push branch (by default only for master and staging branches)
* push tag, if there are tags
* cache image with common name
* delete image with temporary name

It's optimized for parallel launches, so you need to use a unique temporary name (`--temp-name`). We want to keep the system clean where possible, so this tag is deleted at the end. But we don't want to repeat the basic steps over and over, so the image is cached under a common cache name (`--cache-name`), replacing the previously cached image.

### fan_wait

Waits for a socket to become available/unavailable, with a timeout.

```shell
# Wait until database port up for 180 seconds
fan_wait -t 180 postgres 5432

# Wait until nginx port down for 30 seconds
fan_wait -t 30 nginx 80
```

### run_filebeat

* checks environment variables `-e KEY=VALUE -e KEY2=VALUE2`
* converts the yaml template: `fan_env_yaml {TEMPLATE} /tmp/filebeat.yml`
* runs `/usr/bin/filebeat /tmp/filebeat.yml`

```shell
run_filebeat -e CHECKME=VALUE path_to_template
```


### doc_serializer

* outputs RST with a list of serializers
* generates documentation artifacts for serializers

```
usage: doc_serializer [-h] [--rst] [--artifacts]

Parse serializers sources

optional arguments:
  -h, --help   show this help message and exit
  --rst        Output rst with serializers
  --artifacts  Write serializers artifacts
```


### image_utils.Transpose

Saves images rotated according to their EXIF orientation tag. Some browsers/applications
don't respect this tag, so it is easier to apply the rotation explicitly.

```python
class Image(models.Model):
    uploaded_by = models.ForeignKey(User, blank=True, null=True, on_delete=models.SET_NULL)

    image = models.ImageField(blank=True, upload_to=image_upload_to)
    thumb_image = models.ImageField(blank=True, upload_to=thumb_upload_to)

    full_url = models.CharField(blank=True, max_length=255)
    thumb_url = models.CharField(blank=True, max_length=255)

    created = models.DateTimeField(auto_now_add=True)
    updated = models.DateTimeField(auto_now=True)

class ImageSerializer(ModelSerializer):
    class Meta:
        model = Image
        fields = ['id', 'created', 'updated', 'full_url', 'thumb_url']

class UploadImageView(views.GenericAPIView):
    permission_classes = [IsAuthenticated]

    def post(self, request, *args, **kwargs):
        image_data = request.data['image']
        # Fix an image orientation based on exif and remove exif from the resulted image.
        transformed_image = Transpose().process(image_data)
        obj = Image.objects.create(uploaded_by=request.user, image=transformed_image)
        obj.full_url = obj.image.url
        obj.save()

        s = ImageSerializer(instance=obj)
        return Response(s.data)
```


### fan_tools.metrics

Helper to send metrics. [Example for datadog](examples/send_dd_metric.py)

Usually you want to set up some kind of notification for the metric named `error_metric`. It is sent by `send_error_metric`.


For DataDog your metric query will look like:

```
sum:error_metric{service:prod*} by {error_type,service}.as_count()
```


# development

```bash
# keep docker container
tox -e py311-django40 -- --keep-db django_tests
tox -e py311-django40 -- --keep-db --docker-skip django_tests
```

            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "fan-tools",
    "maintainer": null,
    "docs_url": null,
    "requires_python": null,
    "maintainer_email": null,
    "keywords": "asyncio, testing",
    "author": null,
    "author_email": "cybergrind <cybergind+pypi@gmail.com>",
    "download_url": "https://files.pythonhosted.org/packages/8a/cf/751a09d30426ced8da9cf5facbedb940bda415cff61647767d6d940fedf1/fan_tools-4.4.0.tar.gz",
    "platform": null,
    "description": "# About this package\n\n\n[![Build Status](https://img.shields.io/github/workflow/status/micro-fan/fan_tools/master)](https://github.com/micro-fan/fan_tools/actions)\n[![PyPi version](https://img.shields.io/pypi/v/fan_tools.svg)](https://pypi.python.org/pypi/fan_tools)\n\n\nHere are set of internal tools that are shared between different projects internally. Originally most tools related to testing, so they provide some base classes for various cases in testing\n\n**NOTE: all our tools are intentially support only 3.8+ python.**\nSome might work with other versions, but we're going to be free from all these crutches to backport things like `async/await` to lower versions, so if it works - fine, if not - feel free to send PR, but it isn't going to be merged all times.\n\n## Testing helpers\n\n### Caching decorators\n\n\n```python\n# cache async function that returns pydantic.BaseModel\nfrom fan_tools.python import cache_async\n\n@cache_async[type(dict)](fname, model, {})\nasync def func():\n    return model\n\n\n# cache sync function that returns json serializable response\nfrom fan_tools.python import memoize\n\n@memoize\ndef func(*args, **kwargs):\n    return json.dumps(\n        {\n            'args': args,\n            'kwargs': kwargs,\n        }\n    )\n```\n\n### ApiUrls\n\nDefined in `fan_tools/testing/__init__.py`. 
Required for defining nested urls with formatting.\n\nYou can use it in fixtures, like:\n\n\n```python\n@pytest.fixture(scope='session')\ndef api(api_v_base):\n    yield ApiUrls('{}/'.format(api_v_base), {\n        'password_reset_request': 'password/request/code/',\n        'password_reset': 'password/reset/',\n        'user_review_list': 'user/{user_id}/review/',\n        'user_review': 'user/{user_id}/review/{review_id}/',\n        'wine_review': 'wine/{wine_id}/review/',\n        'drink_review': 'drink/{drink_id}/review/',\n    })\n\n\ndef test_review_list(user, api):\n    resp = user.get_json(api.user_review_list(user_id=user1.id), {'page_size': 2})\n```\n\n\n\n\n### PropsMeta\n\nYou can find source in `fan_tools/testing/meta.py`.\n\nFor now it convert methods that are started with `prop__` into descriptors with cache.\n\n```python\nclass A(metaclass=PropsMeta):\n    def prop__conn(self):\n        conn = SomeConnection()\n        return conn\n```\n\nBecame:\n\n```python\nclass A:\n    @property\n    def conn(self):\n        if not hasattr(self, '__conn'):\n            setattr(self, '__conn', SomeConnection())\n        return self.__conn\n```\n\nThus it allows quite nice style of testing with lazy initialization. Like:\n\n```python\nclass MyTest(TestCase, metaclass=PropsMeta):\n    def prop__conn(self):\n        return psycopg2.connect('')\n\n    def prop__cursor(self):\n        return self.conn.cursor()\n\n    def test_simple_query(self):\n        self.cursor.execute('select 1;')\n        row = self.cursor.fetchone()\n        assert row[0] == 1, 'Row: {}'.format(row)\n\n```\n\nHere you just get and use `self.cursor`, but automatically you get connection and cursor and cache they.\n\nThis is just simple example, complex tests can use more deep relations in tests. 
And this approach is way more easier and faster than complex `setUp` methods.\n\n\n## fan_tools.unix helpers\n\nBasic unix helpers\n\n* run - run command in shell\n* succ - wrapper around `run` with return code and stderr check\n* wait_socket - wait for socket awailable (eg. you can wait for postgresql with `wait_socket('localhost', 5432)`\n* asucc - asynchronous version of `succ` for use with `await`. supports realtime logging\n* source - acts similar to bash 'source' or '.' commands.\n* cd - contextmanager to do something with temporarily changed directory\n\n#### interpolate_sysenv\n\nFormat string with system variables + defaults.\n\n```python\nPG_DEFAULTS = {\n    'PGDATABASE': 'postgres',\n    'PGPORT': 5432,\n    'PGHOST': 'localhost',\n    'PGUSER': 'postgres',\n    'PGPASSWORD': '',\n    }\nDSN = interpolate_sysenv('postgresql://{PGUSER}:{PGPASSWORD}@{PGHOST}:{PGPORT}/{PGDATABASE}', PG_DEFAULTS)\n```\n\n\n## fan_tools.fan_logging.JSFormatter\n\nEnable json output with additional fields, suitable for structured logging into ELK or similar solutions.\n\nAccepts `env_vars` key with environmental keys that should be included into log.\n\n```python\n# this example uses safe_logger as handler (pip install safe_logger)\nimport logging\nimport logging.config\n\n\nLOGGING = {\n    'version': 1,\n    'disable_existing_loggers': True,\n    'formatters': {\n        'json': {\n            '()': 'fan_tools.fan_logging.JSFormatter',\n            'env_vars': ['HOME'],\n        },\n        'standard': {\n            'format': '%(asctime)s [%(levelname)s] %(name)s: %(message)s'\n        },\n    },\n    'handlers': {\n        'default': {\n            'level': 'DEBUG',\n            'class': 'safe_logger.TimedRotatingFileHandlerSafe',\n            'filename': 'test_json.log',\n            'when': 'midnight',\n            'interval': 1,\n            'backupCount': 30,\n            'formatter': 'json',\n            },\n    },\n    'loggers': {\n        '': {\n            
'handlers': ['default'],\n            'level': 'DEBUG',\n        },\n    },\n}\n\nlogging.config.dictConfig(LOGGING)\nlog = logging.getLogger('TestLogger')\n\nlog.debug('test debug')\nlog.info('test info')\nlog.warn('test warn')\nlog.error('test error')\n```\n\n\n## fan_tools.mon_server.MetricsServer\n\nFastAPI based server that servers metrics in prometheus format.\n\n```\nimport uvicorn\n\nfrom fan_tools.mon_server.certs import update_certs_loop\n\napp = FastAPI()\nmserver = MetricsServer(app)\nmserver.add_task(update_certs_loop, hosts=['perfectlabel.io', 'robopickles.com'])\nuvicorn.run(app, host='0.0.0.0', port=os.environ.get('MONITORING_PORT', 8000))\n```\n\n\n## fan_tools.backup\n\nThere are two backup helpers: `fan_tools.backup.s3.S3backup` and `fan_tools.backup.gcloud.GCloud`\n\nWe're assuming that backup script has access to backup execution and dump directory.\n\nDefault setup includes support for docker container that access DB.\n\nBy default script provides interface for monitoring (last backup date).\n\n`fan_tools.backup.s3.S3backup` provides external script called `fan_s3_backup` that has accepts some configuration via environmental variables.\n\n* *ENABLE_BACKUP* - you need enable this by setting to non `false` value, default: false\n* *BACKUP_DB_CONTAINER* - container for backup command execution\n* *BACKUP_DB_SCRIPT* - command for exectuion on db server from above. default: `/create_backup.py`\n* *BACKUP_COMMAND* - overrides all above\n*  `-b/--bucket` - to define bucket. default for s3: environmental variable *AWS_BACKUP_BUCKET*\n* *BACKUP_PREFIX* or `-p/--prefix` - directory backup prefix, usually it is subfolder for dumps, default: `backups/`\n* `-d/--daemonize` - should we run in daemonized mode\n* *MONITORING_PORT* - port for listen when run in daemonized mode. 
default: 80\n\nS3 specific:\n\n* *AWS_BACKUP_KEY*\n* *AWS_BACKUP_SECRET*\n* *AWS_BACKUP_BUCKET*\n\n\n\n## fan_tools.drf.serializers.EnumSerializer\n\nAllow you to deserealize incoming strings into `Enum` values.\nYou should add `EnumSerializer` into your serializers by hand.\n\n```python\nfrom enum import IntEnum\n\nfrom django.db import models\nfrom rest_framework import serializers\n\nfrom fan_tools.drf.serializers import EnumSerializer\n\n\nclass MyEnum(IntEnum):\n  one = 1\n  two = 2\n\nclass ExampleModel(models.Model):\n  value = models.IntegerField(choices=[(x.name, x.value) for x in MyEnum])\n\nclass ExampleSerializer(serializers.ModelSerializer):\n  value = EnumSerializer(MyEnum)\n\n# this allows you to post value as: {'value': 'one'}\n```\n\nDue to `Enum` and `IntegerField` realizations you may use `Enum.value` in querysets\n\n```python\nExampleModel.objects.filter(value=MyEnum.two)\n```\n\n## fan_tools.django.log_requests.LoggerMiddleware\n\nLoggerMiddleware will log request meta + raw post data into log.\n\nFor django<1.10 please use `fan_tools.django.log_requests.DeprecatedLoggerMiddleware`\n\n\n## fan_tools.django.request_uniq\n\nDecorator adds a unique for each uwsgi request dict as first function\n argument.\nFor tests mock `_get_request_unique_cache`\n\n\n## fan_tools.django.call_once_on_commit\n\nMake function called only once on transaction commit. 
Here is examples\n where function `do_some_useful` will be called only once after\n transaction has been committed.\n```python\nclass SomeModel(models.Model):\n    name = IntegerField()\n\n@call_once_on_commit\ndef do_some_useful():\n    pass\n\n\ndef hook(sender, instance, **kwargs):\n    do_some_useful()\n\nmodels.signals.post_save.connect(hook, sender=SomeModel)\n\nwith transaction.atomic():\n    some_model = SomeModel()\n    some_model.name = 'One'\n    some_model.save()\n    some_model.name = 'Two'\n    some_model.save()\n```\n\nFor tests with nested transactions (commit actually most times is not\n called) it is useful to override behaviour `call_once_on_commit`\n when decorated function executed right in place where it is called.\n To do so mock `on_commit` function. Example pytest fixture:\n```\n@pytest.fixture(scope='session', autouse=True)\ndef immediate_on_commit():\n    def side_effect():\n        return lambda f: f()\n\n    with mock.patch('fan_tools.django.on_commit', side_effect=side_effect) as m:\n        yield m\n\n```\n\n## fan_tools.django.fields.ChoicesEnum\n\nUsed for choices attribute for in model field\n\n```\nclass FooBarEnum(ChoicesEnum):\n    foo = 1\n    bar = 2\n\nclass ExampleModel(models.Model):\n    type = models.IntegerField(choices=FooBarEnum.get_choices())\n```\n\n\n## fan_tools.django.db.utils.set_word_similarity_threshold\n\nAllow to set postgres trigram word similarity threshold for default django database connection\n\n```\nset_word_similarity_threshold(0.4)\n```\n\n\n## fan_tools.django.contrib.postgres.models.LTreeModel\n\nDjango Model containing postgres ltree\n\n```\nclass LTreeExampleModel(LTreeModel):\n```\n\n\n## fan_tools.django.contrib.postgres.fields.LTreeDescendants\n\nLookup for postgres ltree descendants\n\n```\nLTreeExampleModel.objects.filter(path__descendants='root.level1')\n```\n\n## fan_tools.django.contrib.postgres.fields.LTreeNlevel\n\nLookup for postgres ltree by level 
depth

```
LTreeExampleModel.objects.filter(path__nlevel=2)
```

## fan_tools.django.db.pgfields.SimilarityLookup

Postgres `text %> text` operator.

```
# Add this import to models.py (the file should be imported before lookup usage)
import fan_tools.django.db.pgfields  # noqa

Books.objects.filter(title__similar='Animal Farm')
```

## fan_tools.django.db.pgfields.WordSimilarity

Postgres `text1 <<-> text2` operator. It returns `1 - word_similarity(text1, text2)`.

```
from django.db.models import Value, F

similarity = WordSimilarity(Value('Animal Farm'), F('title'))
Books.objects.annotate(similarity=similarity)
```

## fan_tools.drf.filters.NumberInFilter

Django filter that matches if an integer is in a comma-separated list of integers.

```
class ExampleFilterSet(FilterSet):
    example_values = NumberInFilter(field_name='example_value', lookup_expr='in')
```

## fan_tools.django.mail.Mail

Send text and html emails using django templates.

```
Mail(
    recipient_list=[user.email],
    template_name='user/emails/reset_password',
    context={
        'frontend_url': settings.FRONTEND_URL,
    },
).send()
```

## fan_tools.django.url.build_absolute_uri

Get the domain section of the absolute url of the current page using the django request object.

```
build_absolute_uri(request)
```


## fan_tools.drf.forms.use_form

Helps to use the power of serializers for simple API checks.


```python
from rest_framework import serializers
from rest_framework.decorators import api_view
from fan_tools.drf import use_form


class SimpleForm(serializers.Serializer):
    test_int = serializers.IntegerField()
    test_str = serializers.CharField()


@api_view(['GET'])
@use_form(SimpleForm)
def my_api(data):
    print(f'Data: {data["test_int"]} and {data["test_str"]}')
```

## fan_tools.drf.pagination.ApiPageNumberPagination

Allows turning pagination off by specifying a zero `page_size`.

```
REST_FRAMEWORK = {
    'DEFAULT_PAGINATION_CLASS': 'fan_tools.drf.pagination.ApiPageNumberPagination',
    ...
}
```

## fan_tools.rest_framework.renderers.ApiRenderer

Pretty Django Rest Framework API renderer with error codes.

```
REST_FRAMEWORK = {
    'DEFAULT_RENDERER_CLASSES': (
        'fan_tools.drf.renderers.ApiRenderer',
    ),
    ...
}
```

## fan_tools.rest_framework.handlers.api_exception_handler

Pretty Django Rest Framework API exception handler with error codes.

```
REST_FRAMEWORK = {
    'EXCEPTION_HANDLER': 'fan_tools.drf.handlers.api_exception_handler',
    ...
}
```

## fan_tools.drf.asserts.assert_validation_error

Assert helper to be used in tests to match validation error codes.

```
assert_validation_error(response, 'email', 'unique')
```

## fan_tools.aio_utils.DbRecordsProcessorWorker

Asyncio worker that waits for new records in a postgres db table and processes them.

## fan_tools.aio_utils.dict_query/sql_update

aiopg shortcuts.


## fan_tools.python.execfile

Backport of Python 2's `execfile` function.

Usage: `execfile('path/to/file.py', globals(), locals())`

Returns: `True` if the file exists and was executed, `False` if the file doesn't exist.


## fan_tools.doc_utils.fan_sphinx

Sphinx extensions to generate documentation for Django REST framework serializers and examples for http requests.

In order to use them, specify the dependency for package installation:
```
pip install fan_tools[doc_utils]
```

Usage:
```
# Add to Sphinx conf.py
extensions = [
    # ...
    'fan_tools.doc_utils.fan_sphinx.dyn_serializer',
    'fan_tools.doc_utils.fan_sphinx.http_log'
]
```

## Commands

### fan_env_yaml

Convert a template yaml by substituting `%{ENV_NAME}` strings with the corresponding environment variables.

Usage: `fan_env_yaml src_file dst_file`


### fan_ci_script

Helper to run the default CI pipeline. Defaults are set up for [gitlab defaults](https://docs.gitlab.com/ee/ci/variables/#predefined-variables-environment-variables). Includes stages:

* build docker image with a temporary name (commit sha by default)
* run tests (optional)
* push branch (by default only for master and staging branches)
* push tag, if there are tags
* cache image with a common name
* delete image with the temporary name

It's optimized for parallel launches, so you need to use a unique temporary name (`--temp-name`). We want to keep our system clean if possible, so we delete this tag at the end. But we don't want to repeat the basic steps over and over, so we cache the image under a common cache name (`--cache-name`); caching removes the previously cached image.

### fan_wait

Wait for a socket to become available/unavailable, with a timeout.

```
# Wait until database port is up, for 180 seconds
fan_wait -t 180 postgres 5432

# Wait until nginx port is down, for 30 seconds
fan_wait -t 30 nginx 80
```

### run_filebeat

* checks environment variables `-e KEY=VALUE -e KEY2=VALUE2`
* converts the yaml template: `fan_env_yaml {TEMPLATE} /tmp/filebeat.yml`
* runs `/usr/bin/filebeat /tmp/filebeat.yml`

```
run_filebeat -e CHECKME=VALUE path_to_template
```


### doc_serializer

* outputs rst with a list of serializers
* generates documentation artifacts for serializers

```
usage: doc_serializer [-h] [--rst] [--artifacts]

Parse serializers sources

optional arguments:
  -h, --help   show this help message and exit
  --rst        Output rst with serializers
  --artifacts  Write serializers artifacts
```


### image_utils.Transpose

Save images rotated according to their EXIF orientation tag. Some browsers/applications don't respect this tag, so it is easier to apply the rotation explicitly.

```python
class Image(models.Model):
    uploaded_by = models.ForeignKey(User, blank=True, null=True, on_delete=models.SET_NULL)

    image = models.ImageField(blank=True, upload_to=image_upload_to)
    thumb_image = models.ImageField(blank=True, upload_to=thumb_upload_to)

    full_url = models.CharField(blank=True, max_length=255)
    thumb_url = models.CharField(blank=True, max_length=255)

    created = models.DateTimeField(auto_now_add=True)
    updated = models.DateTimeField(auto_now=True)


class ImageSerializer(ModelSerializer):
    class Meta:
        model = Image
        fields = ['id', 'created', 'updated', 'full_url', 'thumb_url']


class UploadImageView(views.GenericAPIView):
    permission_classes = [IsAuthenticated]

    def post(self, request, *args, **kwargs):
        image_data = request.data['image']
        # Fix the image orientation based on EXIF and strip EXIF from the result.
        transformed_image = Transpose().process(image_data)
        obj = Image.objects.create(uploaded_by=request.user, image=transformed_image)
        obj.full_url = obj.image.url
        obj.save()

        s = ImageSerializer(instance=obj)
        return Response(s.data)
```


### fan_tools.metrics

Helper to send metrics. [Example for datadog](examples/send_dd_metric.py)

Usually you want to set up some kind of notification for the metric named `error_metric`. It is sent by `send_error_metric`.


For DataDog your metric query will look like:

```
sum:error_metric{service:prod*} by {error_type,service}.as_count()
```


# development

```bash
# keep docker container
tox -e py311-django40 -- --keep-db django_tests
tox -e py311-django40 -- --keep-db --docker-skip django_tests
```
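The `execfile` contract described above (execute a file if it exists, return `True`/`False` accordingly) can be sketched in a few lines. This is an illustrative reimplementation, not the actual `fan_tools.python.execfile` source:

```python
import os


def execfile(path, globals_=None, locals_=None):
    """Execute a Python file, roughly like Python 2's execfile.

    Returns True if the file exists and was executed, False otherwise.
    """
    if not os.path.exists(path):
        return False
    if globals_ is None:
        globals_ = {}
    with open(path) as f:
        source = f.read()
    # Compile with the file path so tracebacks point at the real file.
    exec(compile(source, path, 'exec'), globals_, locals_)
    return True
```

Calling it with a namespace dict lets you inspect what the executed file defined, e.g. `execfile('settings_local.py', ns)` followed by reading values out of `ns`.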
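The `%{ENV_NAME}` substitution that `fan_env_yaml` performs on yaml templates can be illustrated with a minimal sketch. The real command's behavior may differ (e.g. around missing variables); `render` here is a hypothetical helper, not part of fan_tools:

```python
import os
import re


def render(template: str, env=os.environ) -> str:
    """Replace every %{NAME} in the template with the env variable NAME."""
    def substitute(match):
        name = match.group(1)
        if name not in env:
            # Assumed behavior: fail loudly on a missing variable.
            raise KeyError(f'missing environment variable: {name}')
        return env[name]

    return re.sub(r'%\{([A-Za-z_][A-Za-z0-9_]*)\}', substitute, template)


rendered = render(
    'host: %{DB_HOST}\nport: %{DB_PORT}',
    env={'DB_HOST': 'postgres', 'DB_PORT': '5432'},
)
# rendered == 'host: postgres\nport: 5432'
```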