GrePT 1.3.0

- Summary: Talk to your code
- Author: Jack Barry <jack.barry@live.com>
- Homepage: https://github.com/jackbarry24/GrePT
- Requires Python: >=3.9
- Keywords: gpt, openai, gpt-3
- Uploaded: 2023-11-15 17:36:52
- Requirements: No requirements were recorded.
# GrePT
### Talk to your code

## Getting Started
1. Download:
> `pip install GrePT` or `git clone https://github.com/jackbarry24/GrePT` then `pip install -e .`
2. Set your OpenAI API key
> Get an [OpenAI](https://beta.openai.com/) key.
> Add the API key to your environment: `export OPENAI_API_KEY=<key>`, and consider adding it to your `.bashrc` so it persists between sessions; a minimal sketch follows.
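A minimal setup sketch combining the two steps above, assuming a bash-like shell; the key value shown is a placeholder, not a real key:

```bash
# Install GrePT from PyPI (or use the editable install from a clone)
pip install GrePT

# Make the key available to the current session (placeholder value)
export OPENAI_API_KEY=sk-your-key-here

# Optionally persist it for future sessions
echo 'export OPENAI_API_KEY=sk-your-key-here' >> ~/.bashrc
```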

## Usage
GrePT comes with two command line tools out of the box:
- `grept-embed`: calculates embeddings for files on the file system
- `grept`: used to query the filesystem or a pre-existing embedding using ChatGPT

### `grept-embed`
Uses chromadb to store embeddings. Pass in a list of file and directory paths and it embeds the contents of each file, persisting them to a parquet store. You can then use these embeddings with the `grept` command. By default, embeddings are stored in the `.chromadb` directory.
>Optional Flags: 
> - `-p`: path to store embeddings
> - `-l`: recursive depth for file system crawling
> - `-x`: list of file suffixes to filter by

Examples:

`grept-embed src/ test/ -l 2`: Embed all files in the `src/` and `test/` directories plus one further layer of subdirectories, saving the result in the default `.chromadb/` folder.
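Another sketch, combining the optional flags above; the paths and suffix are hypothetical, and the exact argument syntax for multiple suffixes is an assumption:

```bash
# Embed only the Python files under src/, storing the result in my_embeddings/
grept-embed src/ -x .py -p my_embeddings/
```

The resulting store can then be queried with `grept -e -p my_embeddings/`, as described in the next section.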
### `grept`
The main chat interface for working with files and embeddings. In chat mode it parses the input files and passes their contents to the API as chat messages. This works well for small codebases, but once the codebase plus chat history exceeds the model's token limit (4096 for gpt-3.5-turbo), performance degrades. In that case it is better to embed the files with `grept-embed` and query the embeddings instead.
>Required Flags (exactly one of the following):
>- `-c`: chat mode, pass each specified file to the API as a chat message (mutually exclusive with `-e`)
>- `-e`: embedding mode, query a pre-existing embedding (mutually exclusive with `-c`)
>
>Optional Flags:
>- `-l`: recursive depth for file system crawling
>- `-x`: list of file suffixes to filter by
>- `-t`: max response tokens
>- `-p`: path to load embeddings from if using `-e`

Examples:

`grept src/ -c`: Pass in all files in the `src/` directory as chats to the API. 

`grept -e -p embedding1/`: Load the embeddings from the `embedding1/` directory and query them.

`grept ./ -l 3 -x .py -c`: Query all Python files within three layers of subdirectories of the current directory in chat mode.
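One more sketch combining the flags documented above; the embedding path and the 256-token cap are illustrative assumptions:

```bash
# Query the default embedding store, limiting the reply to roughly 256 tokens
grept -e -p .chromadb/ -t 256
```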

            
