commandsgpt

Name: commandsgpt
Version: 1.6.0
Home page: https://github.com/AlexisAndradeDev/CommandsGPT
Summary: An implementation of GPT-4 that recognizes which commands it must run to fulfill an instruction, using a graph. Create new commands easily by describing them using natural language and coding the functions corresponding to the commands.
Upload time: 2024-12-22 09:10:50
Maintainer: None
Docs URL: None
Author: Martín Alexis Martínez Andrade
Requires Python: >=3.6
License: None
Keywords: None
Requirements: annotated-types, anyio, certifi, charset-normalizer, colorama, distro, exceptiongroup, h11, httpcore, httpx, idna, jiter, numpy, openai, pandas-stubs, pandas, pydantic-core, pydantic, python-dateutil, pytz, requests, six, sniffio, tqdm, types-pytz, typing-extensions, tzdata, urllib3
# CommandsGPT

An implementation of GPT-4 that recognizes instructions. It determines which commands it must run to fulfill the user's instruction, using a graph in which each node is a command and the data generated by each command can be passed to other commands.

Create new commands easily by describing them in natural language and coding the corresponding functions.

# Installation

* Install the `commandsgpt` module.

```
pip install commandsgpt
```

If you're using a Pipenv-managed virtual environment:
```
pipenv install commandsgpt
```

* You also have to install the OpenAI package:

```
pip install openai
```

or

```
pipenv install openai
```

* Set an environment variable named `OPENAI_API_KEY` to store your OpenAI API key (see the sketch below for setting it from a script).
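
If you prefer to set the key from within a script for quick local testing, here is a minimal sketch. The key value is a placeholder; the OpenAI client reads `OPENAI_API_KEY` from the environment, so configuring it in your shell or OS is generally preferable to hard-coding it.

```python
import os

# Minimal sketch for local testing only: the OpenAI client looks up
# OPENAI_API_KEY in the environment. Prefer configuring the variable in your
# shell or OS rather than hard-coding a key in source code.
os.environ.setdefault("OPENAI_API_KEY", "sk-your-key-here")  # placeholder value
```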

# Usage

Create a `commands` dictionary that will store the commands described in natural language. 

```python
commands = {
    "REQUEST_USER_INPUT": {
        "description": "Asks the user to input data through the interface.",
        "arguments": {
            "message": {"description": "Message displayed to the user related to the data that will be requested (example: 'Enter your age').", "type": "string"},
        },
        "generates_data": {
            "input": {"description": "Data entered by the user", "type": "string"},
        },
    },
    ...
}
```

Now, create the functions that will be called when the commands are executed.

* The name of the function is irrelevant.
* The first parameter must be the Config object; the second one must be the Graph object. Normally, you won't work directly with these objects, so you don't have to worry about them. Just use them as the first two parameters.
* The following parameters must match those described in the commands dictionary (name and data type).
* The return value must be a dictionary. Its keys and the data types of its values must match the `generates_data` entries described in the commands dictionary.

*Example of a command function*

```python
from typing import Any

from commands_gpt.config import Config
from commands_gpt.commands.graphs import Graph

def request_user_input_command(config: Config, graph: Graph, message: str) -> dict[str, Any]:
    input_ = input(f"{message}\n*: ")
    results = {
        "input": input_,
    }
    return results
```

Create a `command_name_to_func` dictionary that maps each command name to its corresponding function.

*Example of command_name_to_func dictionary*
```python
command_name_to_func = {
    "REQUEST_USER_INPUT": request_user_input_command,
    ...
}
```

Add the ***essential commands*** to your commands dictionaries.
* These are default commands that implement core logic for the model's reasoning, such as an IF command.
* If you have already defined your own core-logic commands (IF, THINK, CALCULATE, etc.), you are free to skip them.

```python
from commands_gpt.commands.commands_funcs import add_essential_commands
add_essential_commands(commands, command_name_to_func)
```

Create your `config` object:
```python
# keyword arguments are optional
config = Config("gpt-4o", verbosity=2, explain_graph=True, save_graph_as_file=False)
```

Create an instruction:

```python
instruction = input("Enter your instruction: ")
```

Create a recognizer:

```python
recognizer = ComplexRecognizer(config, commands, command_name_to_func)
```

Pass your instruction to the recognizer model:

```python
commands_data_str = recognizer.recognize(instruction)
```

Create the graph of commands and execute it:

```python
graph = Graph(recognizer, commands_data_str)
graph.execute_commands(config)
```
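
The snippets above assume the following imports, which appear together in the full example below:

```python
from commands_gpt.config import Config
from commands_gpt.commands.graphs import Graph
from commands_gpt.recognizers import ComplexRecognizer
```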

# Example

Create two files: `custom_commands.py` and `main.py`.

## custom_commands.py

```python
from typing import Any
from pathlib import Path

from commands_gpt.config import Config
from commands_gpt.commands.graphs import Graph

commands = {
    "WRITE_TO_USER": {
        "description": "Writes something to the interface to communicate with the user.",
        "arguments": {
            "content": {"description": "Content to write.", "type": "string"},
        },
        "generates_data": {},
    },
    "REQUEST_USER_INPUT": {
        "description": "Asks the user to input data through the interface.",
        "arguments": {
            "message": {"description": "Message displayed to the user related to the data that will be requested (example: 'Enter your age').", "type": "string"},
        },
        "generates_data": {
            "input": {"description": "Data entered by the user", "type": "string"},
        },
    },
    "WRITE_FILE": {
        "description": "Write a file.",
        "arguments": {
            "content": {"description": "Content that will be written.", "type": "string"},
            "file_path": {"description": "Complete path of the file that will be written.", "type": "string"},
        },
        "generates_data": {},
    },
}

# Command functions
# The name of the function is irrelevant
# The first argument must be the Config object, followed by the Graph object
# The arguments must match the arguments from the commands dictionary
# The return value must be a dictionary whose keys must match the "generates_data" keys
# The data types must match the ones declared in the commands dictionary

def write_to_user_command(config: Config, graph: Graph, content: str) -> dict[str, Any]:
    print(f">>> {content}")
    return {}

def request_user_input_command(config: Config, graph: Graph, message: str) -> dict[str, Any]:
    input_ = input(f"{message}\n*: ")
    results = {
        "input": input_,
    }
    return results

def write_file_command(config: Config, graph: Graph, content: str, file_path: str) -> dict[str, Any]:
    file_dir = Path(file_path).parent
    assert file_dir.exists(), f"Container directory '{file_dir}' does not exist."
    with open(file_path, "w+", encoding="utf-8") as f:
        f.write(content)
    return {}

# add your functions here
command_name_to_func = {
    "WRITE_TO_USER": write_to_user_command,
    "REQUEST_USER_INPUT": request_user_input_command,
    "WRITE_FILE": write_file_command,
}
```

## main.py
```python
from commands_gpt.recognizers import ComplexRecognizer
from commands_gpt.commands.graphs import Graph
from commands_gpt.config import Config
from custom_commands import commands, command_name_to_func

from commands_gpt.commands.commands_funcs import add_essential_commands
add_essential_commands(commands, command_name_to_func)

chat_model = "gpt-4o"

config = Config(chat_model, verbosity=2, explain_graph=False, save_graph_as_file=False)

instruction = input("Enter your prompt: ")

recognizer = ComplexRecognizer(config, commands, command_name_to_func)

commands_data_str = recognizer.recognize(instruction)
graph = Graph(recognizer, commands_data_str)
graph.execute_commands(config)
```
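
Run `main.py` and enter an instruction. For instance, with the three custom commands defined above, an instruction such as "Ask me for my favorite quote, save it to C:/notes/quote.txt, and then tell me it was saved" could plausibly be resolved by chaining `REQUEST_USER_INPUT`, `WRITE_FILE`, and `WRITE_TO_USER`; the exact graph depends on the model's output, and the file path here is only an illustration.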

Copyright (c) 2023, Martín Alexis Martínez Andrade.
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:

    * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.

    * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

    * Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

            
