# split_python4gpt

`split_python4gpt` is a Python tool that reorganizes large Python projects into minified files sized to fit a specified token limit. This is particularly useful when processing large Python codebases with GPT models, as it lets the models handle the code in manageable chunks.

_**Version 1.0.3** (2023-06-22)_

### NOT IMPLEMENTED YET

Warning: splitting and token counting are not implemented yet; the code currently performs only type inference and minification. Use at your own risk.
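For reference, the "token limit" refers to the subword tokens that OpenAI models operate on. Such counts are usually measured with the third-party `tiktoken` library (not a dependency of this package); a minimal sketch, with `example.py` as a placeholder file:

```python
import tiktoken

# Count tokens the way GPT-3.5 / GPT-4 models would see a source file.
enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
with open("example.py", encoding="utf-8") as f:
    source = f.read()
print(len(enc.encode(source)), "tokens")
```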

## Installation

You can install `split_python4gpt` via pip (version 1.0.3 requires Python 3.10, i.e. `>=3.10,<3.11`):

```bash
pip install split_python4gpt
```

## CLI Usage

After installation, you can run the tool as `pysplit4gpt` or via `python3.10 -m split_python4gpt`:

```
python3.10 -m split_python4gpt PATH_OR_FOLDER [FLAGS]

POSITIONAL ARGUMENTS
    PATH_OR_FOLDER
        Type: str | pathlib.Path
        Path to the input Python file or folder.

FLAGS
    -o, --out=OUT
        Type: Optional[str | pathlib...
        Default: None
        Output folder for the processed files. Defaults to input folder.
    -p, --pyis=PYIS
        Type: Optional[str | pathlib...
        Default: None
        Directory for storing generated .pyi files. Defaults to the output folder.
    -t, --types=TYPES
        Type: bool
        Default: True
        Infer types using PyType? Defaults to True.
    --mini=MINI
        Type: bool
        Default: True
        Minify the Python scripts? Defaults to True.
    --mini_docs=MINI_DOCS
        Type: bool
        Default: True
        Remove docstrings? Defaults to True.
    --mini_globs=MINI_GLOBS
        Type: bool
        Default: False
        Rename global names? Defaults to False.
    --mini_locs=MINI_LOCS
        Type: bool
        Default: False
        Rename local names? Defaults to False.
    --mini_lits=MINI_LITS
        Type: bool
        Default: True
        Hoist literal statements? Defaults to True.
    --mini_annotations=MINI_ANNOTATIONS
        Type: bool
        Default: True
        Remove annotations? Defaults to True.
    --mini_asserts=MINI_ASSERTS
        Type: bool
        Default: True
        Remove asserts? Defaults to True.
    --mini_debug=MINI_DEBUG
        Type: bool
        Default: True
        Remove debugging statements? Defaults to True.
    --mini_imports=MINI_IMPORTS
        Type: bool
        Default: True
        Combine imports? Defaults to True.
    --mini_obj=MINI_OBJ
        Type: bool
        Default: True
        Remove object base? Defaults to True.
    --mini_pass=MINI_PASS
        Type: bool
        Default: True
        Remove pass statements? Defaults to True.
    --mini_posargs=MINI_POSARGS
        Type: bool
        Default: True
        Convert positional to keyword args? Defaults to True.
    --mini_retnone=MINI_RETNONE
        Type: bool
        Default: True
        Remove explicit return None statements? Defaults to True.
    --mini_shebang=MINI_SHEBANG
        Type: bool
        Default: True
        Remove shebang? Defaults to True.
```
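For example, to process a whole project in place, or to write the results to a separate folder while keeping docstrings and skipping type inference (the paths below are placeholders):

```bash
# Minify every .py file in a project, writing results next to the originals
python3.10 -m split_python4gpt ./my_project

# Write to a separate folder, keep docstrings, and skip PyType inference
python3.10 -m split_python4gpt ./my_project --out ./my_project_min --mini_docs=False --types=False
```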

## Python usage

- **[See the API documentation](https://twardoch.github.io/split-python4gpt/API.html)** for more advanced usage
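The full Python API is covered there and is not reproduced in this README. If you only need the behaviour shown in the CLI section, one option that relies solely on the documented command-line interface is to drive it from Python with the standard library (the paths and the `--mini_docs=False` choice below are placeholders):

```python
import subprocess
import sys
from pathlib import Path


def minify_project(src: Path, dest: Path) -> None:
    """Run the documented split_python4gpt CLI on `src`, writing output to `dest`."""
    dest.mkdir(parents=True, exist_ok=True)
    subprocess.run(
        [
            sys.executable, "-m", "split_python4gpt",
            str(src),
            "--out", str(dest),
            "--mini_docs=False",  # keep docstrings in the minified output
        ],
        check=True,  # raise if the tool exits with a non-zero status
    )


if __name__ == "__main__":
    minify_project(Path("my_project"), Path("my_project_min"))
```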

## Changelog

- v1.0.0: Initial release

## Contributing

Contributions to `split_python4gpt` are welcome! Please open an issue or submit a pull request on the [GitHub repository](https://github.com/twardoch/split-python4gpt).

## License

- Copyright (c) 2023 [Adam Twardoch](./AUTHORS.md)
- Written with assistance from ChatGPT
- Licensed under the [Apache License 2.0](./LICENSE.txt)


            
