# Pycodeless
> **Warning**
> This project is in a very early stage of development and may not always
> produce desired results. See the [limitations](#limitations) section for
> more information.
Pycodeless integrates the power of LLMs into the Python programming language
and allows you to use natural language as a universal interface for code
generation.
## Features
- [x] The `@codeless` decorator generates a function's code from its type
  annotations and docstring. All generated code is cached in a local
  `__pycodeless__` package and can be edited or committed to your version
  control system.
- [ ] Support for different language models to allow for better customization
and offline usage.
- [ ] Custom docstring formatting to allow for referencing objects across the
whole codebase.
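To picture how a decorator like `@codeless` can cache generated code as an editable local module, here is a simplified, illustrative sketch. This is not Pycodeless's actual implementation; `generate_source` is a hypothetical stand-in for the LLM call:

```python
import importlib.util
from pathlib import Path

CACHE_DIR = Path("__pycodeless__")


def codeless_sketch(generate_source):
    """Illustrative stand-in for @codeless.

    generate_source(func) -> str plays the role of the LLM call.
    """
    def decorator(func):
        CACHE_DIR.mkdir(exist_ok=True)
        module_path = CACHE_DIR / f"{func.__name__}.py"
        if not module_path.exists():
            # Cache the generated source so it can be edited or committed.
            module_path.write_text(generate_source(func))
        # Load the cached module and return its function instead of the stub.
        spec = importlib.util.spec_from_file_location(func.__name__, module_path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        return getattr(module, func.__name__)
    return decorator
```

Because the cache is an ordinary Python package on disk, edits to the generated files survive across runs.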
## Installation
You can install Pycodeless using pip:
```shell
pip install pycodeless
```
Since Pycodeless currently only works with the OpenAI API, you'll also need an
[OpenAI API key][openai-api-keys].
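For example, in a POSIX shell you can expose the key through the `OPENAI_API_KEY` environment variable instead of hard-coding it (the value below is a placeholder):

```shell
# Set the API key for the current shell session (placeholder value).
export OPENAI_API_KEY="sk-..."
```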
## Usage
The following is a code sample using Pycodeless.
```python
from pycodeless import codeless


# You can either specify your OpenAI API key in this way, or you can use the
# environment variable OPENAI_API_KEY.
# The same goes for specifying the model name, where the relevant environment
# variable is OPENAI_MODEL_NAME. If no model name is specified, Pycodeless
# will default to "gpt-3.5-turbo".
codeless.openai_api_key = "sk-fEaz..."
codeless.openai_model_name = "gpt-3.5-turbo"


@codeless
def greet(name: str) -> str:
    """
    Make a greeting for a person with the given name in pirate language
    """


@codeless
def spongebob_case(text: str) -> str:
    """
    Convert the passed text into spongebob case
    """


def main():
    print(spongebob_case(greet("Patrick")))


if __name__ == "__main__":
    main()
```
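For illustration, the code cached for `spongebob_case` might resemble the following (actual LLM output will vary from run to run):

```python
def spongebob_case(text: str) -> str:
    # Alternate lower/upper case, advancing the toggle only on letters.
    result = []
    upper = False
    for ch in text:
        if ch.isalpha():
            result.append(ch.upper() if upper else ch.lower())
            upper = not upper
        else:
            result.append(ch)
    return "".join(result)
```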
## Limitations
At this point, Pycodeless works *reasonably* well with native Python type
hints or type annotations from popular third-party libraries that the LLMs
are familiar with and know how to import. Custom type annotations will not
work unless you can write a very elaborate docstring prompt; more exploration
is needed to figure out how to make this work in all cases.
The following is a non-exhaustive list of known limitations:
- The current code generation and parsing features might be buggy, as there's
currently no mechanism for telling whether an output from an LLM is actually
runnable Python code.
- The `@codeless` decorator may not work in all contexts (REPL, classes,
methods, etc.).
- There's currently no dependency management in place. This means that we
  can't track changes across the codebase and regenerate functions when their
  dependencies change. It also means that we can't remove generated imports
  when removing a generated function from a generated module, because there's
  no way of telling whether the import in question is used by any other
  function. Finally, if a function definition contains a custom type
  annotation, we have no way of referencing it in the generated module.
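As one possible mitigation for the first point, a model's output could at least be checked for syntactic validity before being cached. This is a sketch of the idea, not a feature Pycodeless currently has:

```python
import ast


def looks_like_python(source: str) -> bool:
    # A cheap syntactic check: parse the source without executing it.
    # This catches malformed output but not runtime errors.
    try:
        ast.parse(source)
        return True
    except SyntaxError:
        return False
```

A syntactically valid module can still fail at import time, so this is only a first line of defense.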
## FAQ
- **How is this different from GitHub Copilot?**
  Pycodeless is indeed functionally similar to GitHub Copilot, but the goal
  of this project is very different. Firstly, Copilot autocompletes your code
  in place and does not differentiate between the generated part of the code
  and the original prompt. One of the main points of this project is to shift
  the development focus entirely onto the prompt itself and make natural
  language a universal interface for code generation while moving the
  generated code into the background. The development experience is really
  distinct, and I encourage you to try it out for yourself. Another important
  point is that Pycodeless makes code generation from natural language a
  first-class Python citizen. The code generation happens at runtime, which
  means we can leverage the whole Python infrastructure to enhance the prompt
  specification process (think dynamic docstrings with custom tags to
  reference various objects from the codebase and aid the LLMs, walking
  through the dependency graphs of type annotations to provide additional
  context, etc.). Also, I like the idea of not being dependent on any
  particular language model or text editor.
## Contribution
If you'd like to report a bug or have an idea for a cool feature, issues and
pull requests are highly appreciated.
[openai-api-keys]: https://platform.openai.com/account/api-keys