thread-executor


Name: thread-executor
Version: 0.1.2
Home page: https://pypi.org/project/thread-executor
Summary: A Python library for executing tasks in parallel with threads and queues
Upload time: 2022-12-25 14:15:08
Maintainer: TuanDC
Author: TuanDC
Docs URL: None
Requires Python: >=3.6.2,<4.0.0
Keywords: multi thread, threading, concurency, worker pool
Requirements: No requirements were recorded.
# Executor
Execute tasks quickly in Python with low memory overhead.


## Why do we need Thread Executor?

Python's threading module is a good building block: it lets developers fork a thread to run background tasks.
Python also provides the Queue mechanism to pass data between threads.
So what is the problem?

- First, the threading module forks threads, but Python does not guarantee when they will actually start. That is fine under light traffic, but under high load you run into trouble: creating too many threads puts pressure on memory and slows the whole application down (see the sketch after this list). `waste of time`

- Second, creating and releasing threads over and over costs extra memory and CPU time. Sometimes developers forget to handle exceptions and release the thread, which puts even more pressure on the application. `waste of resources`
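
To make the cost concrete, here is a minimal sketch (not code from this library) of the per-task threading pattern described above: every task forks a fresh thread, and nothing bounds how many exist at once.

```python3
import threading
import time

def handle_request(payload):
    time.sleep(0.1)  # stand-in for real background work

# Naive pattern: one brand-new thread per task. Under heavy load this
# creates thousands of short-lived threads, wasting memory and CPU time.
for payload in range(10_000):
    threading.Thread(target=handle_request, args=(payload,)).start()
```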

## How does Thread Executor solve this?

Here is the approach:

- We create an `exact` or `dynamic` number of threads, then use a `Job`, a unit that carries the task information to a `Worker` for processing. Workers never need to be released: you create them once and only reset them when you update the configuration (see the sketch after this list).

- A `Job` carries two important fields, `func` and `args`; the worker calls `func(*args)` and passes the result to an optional `callback`.
- Your application does not need to create and release threads continuously.
- The API is easy to pick up and use.
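
The core idea can be sketched in a few lines of plain Python. This is only an illustration of the pattern, not this library's actual internals; names like `SimpleJob` and `worker_loop` are made up here.

```python3
import queue
import threading
from dataclasses import dataclass, field
from typing import Callable, Optional

@dataclass
class SimpleJob:
    func: Callable
    args: tuple = ()
    kwargs: dict = field(default_factory=dict)
    callback: Optional[Callable] = None

def worker_loop(jobs: queue.Queue) -> None:
    # Each worker thread is created once and then reused for every job.
    while True:
        job = jobs.get()
        try:
            result = job.func(*job.args, **job.kwargs)
            if job.callback:
                job.callback(result)
        finally:
            jobs.task_done()

jobs = queue.Queue()
for _ in range(4):  # fixed ("exact") number of long-lived workers
    threading.Thread(target=worker_loop, args=(jobs,), daemon=True).start()

jobs.put(SimpleJob(func=print, args=("hello from a reused worker",)))
jobs.join()  # conceptually what Executor.wait() does
```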

## Disadvantages

- If you use a callback, remember to wrap it in try/except so an exception cannot leak the worker thread (see the sketch after this list).
- If the queue is full, you have to wait for a free slot. Set `max_queue_size=0` to make the queue unbounded and avoid this.
- If you restart your app, all the `Jobs` in the `Queue` that have not been processed yet will be `lost`.
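
For the callback caveat above, one option is to catch exceptions inside the callback itself so they never escape into the worker thread. A minimal sketch; `process` is a hypothetical post-processing step, not part of this library:

```python3
import logging

def safe_callback(result):
    try:
        process(result)  # hypothetical post-processing of the job result
    except Exception:
        # Log instead of letting the exception propagate into the worker thread.
        logging.exception("callback failed for result %r", result)
```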

## Installation

Install the library and try it out:

```bash
pip install thread-executor
```

## Usage: interface list
```python3
send(job: Job) -> None  # push a job onto the queue
wait() -> None  # wait until all queued jobs are completed
scale_up(number_threads: int) -> None  # increase the number of worker threads
scale_down(number_threads: int) -> None  # decrease the number of worker threads
```

### Initialization
```python3
from executor.safe_queue import Executor, Job

engine = Executor(number_threads=10, max_queue_size=0)
```
### Send Simple Job
```python
import time

def test_exec(*args, **kwargs):
    time.sleep(3)
    print(args)
    return [1, 2, 3]


def test_exec1(*args, **kwargs):
    print(kwargs)
    time.sleep(2)
    return {"a": 1, "b": 2, "c": 3}

engine.send(Job(func=test_exec, args=(1, 2), kwargs={}, callback=None, block=False))
engine.send(Job(func=test_exec1, args=(), kwargs={"time": 1}, callback=None, block=False))
engine.send(Job(func=test_exec1, args=(), kwargs={}, callback=None, block=False))
engine.send(Job(func=test_exec1, args=(), kwargs={}, callback=None, block=False))
engine.send(Job(func=test_exec1, args=(), kwargs={}, callback=None, block=False))
engine.wait()
```

### Send Job with callback
```python3
def call_back(result):
    print(result)
    
for i in range(5):
    engine.send(Job(func=test_exec1, args=(), kwargs={"time": 1}, callback=call_back, block=False))
engine.wait()
```
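
Callbacks are presumably invoked on the worker threads, so if you aggregate results into shared state it is safest to guard it. A small sketch under that assumption, reusing `engine`, `Job`, and `test_exec1` from above:

```python3
import threading

results = []
results_lock = threading.Lock()

def collect(result):
    # Callbacks may run concurrently on different worker threads,
    # so protect the shared list with a lock.
    with results_lock:
        results.append(result)

for i in range(5):
    engine.send(Job(func=test_exec1, args=(), kwargs={"time": i}, callback=collect, block=False))
engine.wait()
print(results)  # five result dicts, in completion order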


### Thread scale up/down

```python3
engine.scale_up(3)
engine.scale_down(3)
```
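
One plausible way to use these calls, shown as a sketch rather than a prescribed pattern: grow the pool before a burst of jobs and shrink it back afterwards.

```python3
engine.scale_up(5)  # temporarily add 5 worker threads for a burst of work
for i in range(100):
    engine.send(Job(func=test_exec1, args=(), kwargs={"time": i}, callback=None, block=False))
engine.wait()
engine.scale_down(5)  # return the pool to its original size
```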

            
