django-llm

Name: django-llm
Version: 0.1.3
Home page: https://github.com/mikrl/django-llm
Summary: A LLM (Large Language Model) app for Django
Upload time: 2023-03-28 15:41:07
Author: Michael Lynch
Requires Python: >=3.11
License: MIT
Requirements: Django, openai, langchain, llama-index, pytest
Travis-CI: not configured
Coveralls test coverage: not configured

[![Python 3.11](https://img.shields.io/badge/python-3.11-blue.svg)](https://www.python.org/downloads/release/python-3112/)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)

# Django LLM
An app for Django to aid development of large language model (LLM) workflows.

Have you ever wanted to store ChatGPT queries in your database? Now you can.

Powered by [langchain](https://github.com/hwchase17/langchain)


* [Wiki](https://github.com/mikrl/django-llm/wiki)
* [Sample Project](https://github.com/mikrl/django-llm-sample)

# Information
Configure your LLM workflow from Django.

Build a business layer for your LLM application.

## Install latest binary
```bash
pip install django-llm
```
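
Once installed, the app has to be registered with your Django project before its models can be used. A minimal sketch of the relevant settings, assuming the installed app label is `llm` (inferred from the `llm.models` imports shown below, not from separate documentation):

```python
# settings.py (sketch; the "llm" app label is an assumption)
INSTALLED_APPS = [
    "django.contrib.admin",
    "django.contrib.auth",
    "django.contrib.contenttypes",
    "django.contrib.sessions",
    "django.contrib.messages",
    "django.contrib.staticfiles",
    "llm",  # django-llm's app, assuming this label
]
```

After adding the app, `python manage.py migrate` creates its tables, as with any Django app.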

## Build and install from source
```bash
./build.sh
pip install dist/*.whl
```

# Tests (Stabilizing)
```bash
pip install -r static.txt
./static.sh
pytest tests/
```

# Features
## Run ChatGPT queries through Django shell and model code
```bash
docker build -t django_llm .  
docker run -it django_llm 
>>> from llm.models.prompts import Prompt
>>> prompt = Prompt(template = "Give a bombastic and raucous 'Hello, {name}' to the user")
>>> prompt.save()
>>> Prompt.objects.all()
<QuerySet [<Prompt: Prompt object (1)>]>
>>> Prompt.objects.all()[0].template
"Give a bombastic and raucous 'Hello, {name}' to the user"
>>> from llm.models import ModelProviderAPI
>>> openai = ModelProviderAPI(service = 'OpenAI', api_key = '<<<YOUR OPENAI API KEY>>>')
>>> openai.save()
>>> openai.service
'OpenAI'
>>> from llm.models.queries import OpenAIChatQuery
>>> query = OpenAIChatQuery(prompt = prompt, api = openai)
>>> query.do_query(name="World")
"HELLO WORLD! WELCOME TO THE MIGHTY REALM OF TECHNOLOGY AND INNOVATION! PREPARE TO BE ASTOUNDED AND DAZZLED BY THE POWER OF CODE AND THE ENDLESS POSSIBILITIES OF THE DIGITAL AGE! LET'S ROCK AND ROLL!"
>>>
```
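
The heading above also mentions model code; a minimal sketch of driving the same models from ordinary Django code, assuming they behave exactly as in the shell session above (the field names and the `do_query` call are taken from that session, nothing more):

```python
# greetings.py -- hypothetical helper, not part of django-llm itself
from llm.models import ModelProviderAPI
from llm.models.prompts import Prompt
from llm.models.queries import OpenAIChatQuery


def bombastic_greeting(name: str) -> str:
    """Run the stored greeting prompt against OpenAI and return the completion."""
    prompt, _ = Prompt.objects.get_or_create(
        template="Give a bombastic and raucous 'Hello, {name}' to the user"
    )
    api = ModelProviderAPI.objects.get(service="OpenAI")  # key saved earlier
    query = OpenAIChatQuery(prompt=prompt, api=api)
    return query.do_query(name=name)
```

A view or management command can then call `bombastic_greeting(...)` like any other helper.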

## More
* Model to configure and execute ChatGPT query
* Model to hold a prompt and determine its prompt variables (see the sketch below)
* Model to store API keys for various services
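
How prompt variables are determined is not documented here; one plausible approach, sketched purely as an illustration using Python's standard `string.Formatter` (not necessarily how django-llm implements it):

```python
# Hypothetical illustration of extracting prompt variables from a template;
# not taken from django-llm's source.
from string import Formatter


def template_variables(template: str) -> list[str]:
    """Return the placeholder names in a str.format-style template."""
    return [field for _, field, _, _ in Formatter().parse(template) if field]


template_variables("Give a bombastic and raucous 'Hello, {name}' to the user")
# -> ['name']
```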

            

Raw data

{
    "_id": null,
    "home_page": "https://github.com/mikrl/django-llm",
    "name": "django-llm",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.11",
    "maintainer_email": "",
    "keywords": "",
    "author": "Michael Lynch",
    "author_email": "michael@flatlander.dev",
    "download_url": "",
    "platform": null,
    "description": "[![Python 3.11](https://img.shields.io/badge/python-3.11-blue.svg)](https://www.python.org/downloads/release/python-3112/)\n[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)\n\n# Django LLM\nAn app for Django to aid development of large language model (LLM) workflows.\n\nHave you ever wanted to store ChatGPT queries in your database? Now you can.\n\nPowered by [langchain](https://github.com/hwchase17/langchain)\n\n\n* [Wiki](https://github.com/mikrl/django-llm/wiki)\n* [Sample Project](https://github.com/mikrl/django-llm-sample)\n\n# Information\nConfigure your LLM workflow from Django.\n\nBuild a business layer for your LLM application.\n\n## Install latest binary\n```bash\npip install django-llm\n```\n\n## Build and install from source\n```bash\n./build.sh\npip install dist/*.whl\n```\n\n# Tests (Stabilizing)\n```bash\npip install -r static.txt\n./static.sh\npytest tests/\n```\n\n# Features\n## Run ChatGPT queries through Django shell and model code\n```bash\ndocker build -t django_llm .  \ndocker run -it django_llm \n>>> from llm.models.prompts import Prompt\n>>> prompt = Prompt(template = \"Give a bombastic and raucous 'Hello, {name}' to the user\")\n>>> prompt.save()\n>>> Prompt.objects.all()\n<QuerySet [<Prompt: Prompt object (1)>]>\n>>> Prompt.objects.all()[0].template\n\"Give a bombastic and raucous 'Hello, {name}' to the user\"\n>>> from llm.models import ModelProviderAPI\n>>> openai = ModelProviderAPI(service = 'OpenAI', api_key = '<<<YOUR OPENAI API KEY>>>')\n>>> openai.save()\n>>> openai.service\n'OpenAI'\n>>> from llm.models.queries import OpenAIChatQuery\n>>> query = OpenAIChatQuery(prompt = prompt, api = openai)\n>>> query.do_query(name=\"World\")\n\"HELLO WORLD! WELCOME TO THE MIGHTY REALM OF TECHNOLOGY AND INNOVATION! PREPARE TO BE ASTOUNDED AND DAZZLED BY THE POWER OF CODE AND THE ENDLESS POSSIBILITIES OF THE DIGITAL AGE! LET'S ROCK AND ROLL!\"\n>>>\n```\n\n## More\n* Model to configure and execute ChatGPT query\n* Model to hold prompt and determine prompt variables\n* Model to store API keys for various services\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "A LLM (Large Language Model) app for Django",
    "version": "0.1.3",
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "c135c0003d41dea3a74bbc170dc05a2090cc2e24aec167df41b38bbc45fd3bb4",
                "md5": "c8846291dafc240531390c820d2641c1",
                "sha256": "ffb82674a4607227154b06322fa8cf4dc91f0aa78b28c10be43fbf8085201bad"
            },
            "downloads": -1,
            "filename": "django_llm-0.1.3-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "c8846291dafc240531390c820d2641c1",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.11",
            "size": 13219,
            "upload_time": "2023-03-28T15:41:07",
            "upload_time_iso_8601": "2023-03-28T15:41:07.935925Z",
            "url": "https://files.pythonhosted.org/packages/c1/35/c0003d41dea3a74bbc170dc05a2090cc2e24aec167df41b38bbc45fd3bb4/django_llm-0.1.3-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2023-03-28 15:41:07",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "github_user": "mikrl",
    "github_project": "django-llm",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "requirements": [
        {
            "name": "Django",
            "specs": [
                [
                    "==",
                    "4.1.7"
                ]
            ]
        },
        {
            "name": "openai",
            "specs": [
                [
                    "==",
                    "0.27.2"
                ]
            ]
        },
        {
            "name": "langchain",
            "specs": [
                [
                    "==",
                    "0.0.121"
                ]
            ]
        },
        {
            "name": "llama-index",
            "specs": [
                [
                    "==",
                    "0.4.36"
                ]
            ]
        },
        {
            "name": "pytest",
            "specs": [
                [
                    "==",
                    "7.2.2"
                ]
            ]
        }
    ],
    "lcname": "django-llm"
}
        