| Field | Value |
| --- | --- |
| Name | docullim |
| Version | 0.5.0 |
| home_page | None |
| Summary | auto-generate documentation for python code using llms |
| upload_time | 2025-02-10 22:33:54 |
| maintainer | None |
| docs_url | None |
| author | shrynx |
| requires_python | <4.0,>=3.10 |
| license | MIT |
| keywords | documentation, llm, openai, generative-ai |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# 📝 docullim
[release history](https://pypi.org/project/docullim/#history)
[PyPI project page](https://pypi.org/project/docullim/)
auto-generate documentation for python code using llms.
[demo on asciinema](https://asciinema.org/a/702627)
## ⚙️ installation
```sh
pip install docullim
```
```sh
poetry add docullim
```
## 💻 usage
add `@docullim` to the function or class you want to generate documentation for. you can also pass a tag to the decorator, like `@docullim("custom_tag")`.
```python
from docullim import docullim

@docullim
def add(a, b):
    return a + b

@docullim("custom_tag")
def sub(a, b):
    return a - b
```
```sh
docullim file1.py file2.py
docullim "src/**/*.py"
docullim --config docullim.json --model gpt-4 "src/**/*.py"
docullim --reset-cache --concurrency 3 --write file1.py "src/**/*.py"
```
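the flags above are not described elsewhere in this README; the annotated invocation below is a hedged reading based only on the flag names:

```sh
# assumed flag meanings (inferred from the names, not verified against docullim's CLI help):
#   --config       path to a docullim.json config file
#   --model        override the model on the command line
#   --concurrency  cap the number of concurrent llm requests
#   --reset-cache  discard previously cached results
#   --write        write the generated documentation back to the source files
docullim --config docullim.json --model gpt-4 --concurrency 3 --reset-cache --write "src/**/*.py"
```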
the config file is a json file that can contain `model` (a string), `max_concurrency` (a number), and `prompts` (a map of tag names to prompt strings).
`docullim.json`
```json
{
  "model": "gpt-4",
  "max_concurrency": 5,
  "prompts": {
    "default": "Generate short and simple documentation explaining the code and include sample usage.",
    "custom_tag": "this is a different prompt passed to the llm when @docullim('custom_tag') is used"
  }
}
```
provide your llm api key as an environment variable; by default `OPENAI_API_KEY` is required.
you can switch models and provide other llm api keys, see [supported llms](https://docs.litellm.ai/docs/providers).
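for example (the `ANTHROPIC_API_KEY` line is an assumption based on litellm's provider conventions, not something this README specifies):

```sh
# default: docullim expects an OpenAI key in the environment
export OPENAI_API_KEY="sk-..."

# other providers read their own variables through litellm, e.g. for Anthropic models:
export ANTHROPIC_API_KEY="sk-ant-..."
```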
## 🛠️ development
the following tools are needed to get this project running:
- [devbox](https://www.jetify.com/devbox) manages all the packages needed by the project
- [direnv](https://direnv.net/) loads env variables and runs devbox when in the project directory
also rename `.env.example` to `.env` and add your llm api key.
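a hypothetical `.env` along those lines (the actual contents of `.env.example` may differ):

```sh
# .env -- loaded via direnv for local development
OPENAI_API_KEY="sk-..."
```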
Raw data
{
"_id": null,
"home_page": null,
"name": "docullim",
"maintainer": null,
"docs_url": null,
"requires_python": "<4.0,>=3.10",
"maintainer_email": null,
"keywords": "documentation, llm, openai, generative-ai",
"author": "shrynx",
"author_email": "shriyansbhatnagar@gmail.com",
"download_url": "https://files.pythonhosted.org/packages/19/a1/9a3b40eb1a261b884d923d600a5a43c10bfc8ab68ef9b75d006f115f4423/docullim-0.5.0.tar.gz",
"platform": null,
"description": "# \ud83d\udcdd docullim\n\n[](https://pypi.org/project/docullim/#history)\n[](https://pypi.org/project/docullim/)\n\nauto-generate documentation for python code using llms.\n\n[](https://asciinema.org/a/702627)\n\n## \u2699\ufe0f installation\n\n```sh\npip install docullim\n```\n\n```sh\npoetry add docullim\n```\n\n## \ud83d\udcbb usage\n\nadd `@docullim` to the function or class you want to generate documention for. you can also pass a tag to the annotation like `@docullim(\"custom_tag\")`.\n\n```python\nfrom docullim import docullim\n\n@docullim\ndef add(a, b):\n return a + b\n\n@docullim(\"custom_tag\")\ndef sub(a, b):\n return a - b\n\n```\n\n```sh\ndocullim file1.py file2.py\ndocullim \"src/**/*.py\"\ndocullim --config docullim.json --model gpt-4 \"src/**/*.py\"\ndocullim --reset-cache --concurrency 3 --write file1.py \"src/**/*.py\"\n```\n\nconfig file is a json file that can have `model` as string, `max_concurrency` as number or `prompts` as string pairs.\n\n`docullim.json`\n\n```json\n{\n \"model\": \"gpt-4\",\n \"max_concurrency\": 5,\n \"prompts\": {\n \"default\": \"Generate short and simple documentation explaing the code and include sample usage.\",\n \"custom_tag\": \"this is a a diffrent propmt being passed to the llm when @docullim('custom_tag') is passed\"\n }\n}\n```\n\nyou should provide your llm api key provided as enviroment variable by default it requires `OPENAI_API_KEY`.\nyou can switch models and provide other llm api keys. See [supported llms](https://docs.litellm.ai/docs/providers)\n\n## \ud83d\udee0\ufe0f development\n\nInstall the following tools are needed to have this project running\n\n- [devbox](https://www.jetify.com/devbox) manages all the packages needed by the project\n- [direnv](https://direnv.net/) loads env variables and run devbox when in project directory\n\nalso rename `.env.example` to `.env` and add your llm api key.\n",
"bugtrack_url": null,
"license": "MIT",
"summary": "auto-generate documentation for python code using llms",
"version": "0.5.0",
"project_urls": null,
"split_keywords": [
"documentation",
" llm",
" openai",
" generative-ai"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "e4e7d4f22498428a15350db5df31e5ab3055eed9a6b2073cdb5797884ce0f802",
"md5": "146f7e23185b9097d3ed7c4c28924af1",
"sha256": "fd7ce6501b70947a40a92672de39402adde9283fe7d4031a29e18f28c6be9d86"
},
"downloads": -1,
"filename": "docullim-0.5.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "146f7e23185b9097d3ed7c4c28924af1",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": "<4.0,>=3.10",
"size": 9060,
"upload_time": "2025-02-10T22:33:52",
"upload_time_iso_8601": "2025-02-10T22:33:52.528807Z",
"url": "https://files.pythonhosted.org/packages/e4/e7/d4f22498428a15350db5df31e5ab3055eed9a6b2073cdb5797884ce0f802/docullim-0.5.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "19a19a3b40eb1a261b884d923d600a5a43c10bfc8ab68ef9b75d006f115f4423",
"md5": "3522a876ae5770ff8ff4d3bccded5b28",
"sha256": "5e1290c956fdf790d3ec624973f830af9c158640ef9d5a690edb0e8067b78600"
},
"downloads": -1,
"filename": "docullim-0.5.0.tar.gz",
"has_sig": false,
"md5_digest": "3522a876ae5770ff8ff4d3bccded5b28",
"packagetype": "sdist",
"python_version": "source",
"requires_python": "<4.0,>=3.10",
"size": 7028,
"upload_time": "2025-02-10T22:33:54",
"upload_time_iso_8601": "2025-02-10T22:33:54.493570Z",
"url": "https://files.pythonhosted.org/packages/19/a1/9a3b40eb1a261b884d923d600a5a43c10bfc8ab68ef9b75d006f115f4423/docullim-0.5.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-02-10 22:33:54",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "docullim"
}