![PyLiveDev](https://ouroboroscoding.s3.us-east-2.amazonaws.com/logos/PyLiveDev_128.png)
[![pypi version](https://img.shields.io/pypi/v/pylivedev.svg)](https://pypi.org/project/pylivedev) ![MIT License](https://img.shields.io/pypi/l/pylivedev.svg)
Python Live Development tool.
## Description
I created **PyLiveDev** because I work a lot in the microservices/REST space and found that constantly restarting services while developing, and keeping track of multiple logs in separate windows, was, quite frankly, a pain in my ass.
Inspired by the live updates of create-react-app during development, I wanted to see if I could make a Python program run multiple services and keep track of the files they import. That way, if anything changed, it could automatically restart those services and save time while developing. As a bonus, it pipes all stdout/stderr to one screen so I can immediately see if I wrote bad code or returned something unexpected.
It works by creating a JSON configuration file called `.pylivedev` in the root of your Python project, adding an Object member for each unique process, and then running `pylivedev` from the root of your project.
## Install
```console
foo@bar:~$ pip install pylivedev
```
## Warning
If you are using PyLiveDev on Linux (kernel 2.6+), you may at some point run into an issue where the program quits, either immediately or as more and more files are added to your project. This is due to a hard limit on inotify watches in an underlying library. When this happens, add/adjust the following setting in your `/etc/sysctl.conf` file (requires root access):
```
fs.inotify.max_user_watches=16384
```
The default value is 8192, so adjust accordingly, then reboot. Increase as necessary until PyLiveDev stops crashing.
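If you want to check the current limit before editing `sysctl.conf`, a small helper like the following works. It is purely illustrative and not part of pylivedev; the `/proc` path is the standard Linux location for this setting.

```python
from pathlib import Path


def max_user_watches(proc_file="/proc/sys/fs/inotify/max_user_watches"):
    """Return the current inotify watch limit as an int,
    or None when the file is absent (e.g. on non-Linux systems)."""
    path = Path(proc_file)
    if not path.exists():
        return None
    return int(path.read_text().strip())
```

On a default Linux install this returns 8192; on macOS or Windows it returns `None`.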
## Run
```console
foo@bar:~$ pylivedev
```
## Configuration
```json
{
	"rest": {
		"command": "rest",
		"mode": "module",
		"tracked": false,
		"python": "/venv/my_project/bin/python",
		"arguments": ["-v"],
		"additional_files": ["config.json"],
		"unbuffered": true,
		"verbose": false
	}
}
```
| Name | Type | Mandatory | Description |
| ------ | ------ | ------ | ------ |
| command | String | Yes | The name of the script or module to run as a process. e.g. "services.rest", "main.py" |
| mode | "module" \| "script" \| "exe" | No | Tells pylivedev whether you are trying to run a standalone script, a python module, or a binary (or non-parsable) application. Defaults to "script". |
| tracked | Boolean | No | When true, the process is tracked via file changes. Use false for static or external modules. Defaults to true. Ignored if mode is set to "exe". |
| python | String | No | The full path to the python interpreter used to run your process. Defaults to the python interpreter running pylivedev. |
| arguments | String[] | No | An array of additional arguments passed to the process. |
| additional_files | String[] | No | An array of additional files to be watched/observed for changes. |
| unbuffered | Boolean | No | Run the process unbuffered. Defaults to true. |
| verbose | Boolean | No | Runs pylivedev in verbose mode to give more information on what is happening, what imports were found, what files have changed, etc. Defaults to false. |
## Defaults
You can also use the special `__default__` member to store values shared across processes. Anything in the process config always overwrites the defaults.
```json
{
	"__default__": {
		"python": "/venv/my_project/bin/python",
		"mode": "module",
		"additional_files": ["config.json"]
	},

	"main": {
		"command": "nodes.rest.main"
	},

	"admin": {
		"command": "nodes.rest.admin"
	},

	"external": {
		"command": "nodes.external"
	},

	"websocket": {
		"command": "daemons.websocket"
	}
}
```
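The override semantics described above can be sketched in a few lines of Python. This is an illustrative merge, not pylivedev's actual implementation; the function name is made up for the example.

```python
def build_processes(config):
    """Merge the special __default__ member into every process entry.
    Per-process values always win over the defaults."""
    defaults = config.get("__default__", {})
    return {
        name: {**defaults, **proc}  # proc keys overwrite default keys
        for name, proc in config.items()
        if name != "__default__"
    }
```

So `"main"` above inherits `python`, `mode`, and `additional_files` from `__default__`, while anything it sets itself takes precedence.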
The above would work for a file structure like the following:

	my_project/
	|-- daemons/
		|-- __init__.py
		|-- websocket.py
	|-- nodes/
		|-- rest/
			|-- __init__.py
			|-- admin.py
			|-- main.py
		|-- __init__.py
		|-- external.py
	|-- records/
		|-- __init__.py
	|-- .pylivedev
	|-- config.json
If, for example, `nodes/rest/main.py` imported the following:
```python
from time import time
from . import Rest
from records import User
class Main(Rest):
	pass
```
**PyLiveDev** would end up with the following list of files to watch/observe for changes:
- config.json
- nodes/rest/main.py
- nodes/rest/\_\_init\_\_.py
- records/\_\_init\_\_.py
Any time any of these files is saved/changed on the system, **PyLiveDev** would shut down the "main" process, re-parse the module looking for imports, and then restart the process.
***Note*** system and pip imports, like `time` in the above example, will not be added to the list. In most cases system files don't change often, and it would waste resources to watch them. If you update a pip library, or update Python itself, it's best to shut down **PyLiveDev** [CTRL-C] and restart it.