# python-llm: An LLM API for Humans

The simplicity and elegance of python-requests, but for LLMs. The library currently supports models from OpenAI and Anthropic; I will add more providers as time allows, and pull requests are warmly accepted.
## Usage
```python
import llm
llm.set_api_key(openai="sk-...", anthropic="sk-...")
# Chat
llm.chat("what is 2+2") # 4. Uses GPT-3 by default if key is provided.
llm.chat("what is 2+2", engine="anthropic:claude-instant-v1") # 4.
# Completion
llm.complete("hello, I am") # A GPT model.
llm.complete("hello, I am", engine="openai:gpt-4") # A big GPT model.
llm.complete("hello, I am ", engine="anthropic:claude-instant-v1") # Claude.
# Back-and-forth chat [human, assistant, human]
llm.chat(["hi", "hi there, how are you?", "good, tell me a joke"]) # Why did chicken cross road?
# Streaming chat
llm.stream_chat(["what is 2+2"]) # 4.
llm.multi_stream_chat(["what is 2+2"],
                      engines=["anthropic:claude-instant-v1",
                               "openai:gpt-3.5-turbo"])
# Results will stream back to you from both models at the same time like this:
# ["anthropic:claude-instant-v1", "hi"], ["openai:gpt-3.5-turbo", "howdy"],
# ["anthropic:claude-instant-v1", " there"] ["openai:gpt-3.5-turbo", " my friend"]
# Engines are in the provider:model format, as in openai:gpt-4, or anthropic:claude-instant-v1.
```
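As the last comment notes, engines use the `provider:model` format. The helper below is an illustrative sketch of how such a string splits apart; `parse_engine` is a hypothetical name, not part of python-llm's API:

```python
def parse_engine(engine: str, default_provider: str = "openai") -> tuple[str, str]:
    """Split a "provider:model" engine string into its two parts.

    A string without a provider prefix falls back to default_provider.
    """
    provider, sep, model = engine.partition(":")
    if not sep:  # no colon: treat the whole string as a model name
        return default_provider, engine
    return provider, model

print(parse_engine("anthropic:claude-instant-v1"))  # ('anthropic', 'claude-instant-v1')
print(parse_engine("gpt-4"))                        # ('openai', 'gpt-4')
```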
## Multi Stream Chat In Action
This feature is best seen live, so I've included a video of it in action.
https://github.com/danielgross/python-llm/assets/279531/d68eb843-7a32-4ffe-8ac2-b06b81e764b0
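The interleaving behavior can also be illustrated with a self-contained asyncio sketch. Note that `fake_engine` and `multi_stream` are hypothetical stand-ins, not python-llm internals: two token streams are merged first-come first-served, much like the example output above.

```python
import asyncio

async def fake_engine(name: str, tokens: list[str], delay: float):
    """Stand-in for a streaming model: yields (engine, token) pairs with a delay."""
    for t in tokens:
        await asyncio.sleep(delay)
        yield name, t

async def multi_stream(*streams):
    """Merge several token streams first-come first-served via a shared queue."""
    queue: asyncio.Queue = asyncio.Queue()

    async def pump(stream):
        async for item in stream:
            await queue.put(item)
        await queue.put(None)  # sentinel: this stream is finished

    tasks = [asyncio.create_task(pump(s)) for s in streams]
    finished = 0
    while finished < len(tasks):
        item = await queue.get()
        if item is None:
            finished += 1
        else:
            yield item

async def main():
    merged = multi_stream(
        fake_engine("anthropic:claude-instant-v1", ["hi", " there"], 0.01),
        fake_engine("openai:gpt-3.5-turbo", ["howdy", " my friend"], 0.015),
    )
    return [item async for item in merged]

chunks = asyncio.run(main())
print(chunks)  # tokens from both engines, interleaved as they arrive
```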
## Installation
To install `python-llm`, use pip:
```bash
pip install python-llm
```
## Configuration
You can set API keys in a few ways:
1. Through environment variables (you can also set a `.env` file).
```bash
export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-...
```
2. By calling the method manually:
```python
import llm
llm.set_api_key(openai="sk-...", anthropic="sk-...")
```
3. By passing a JSON file like this:
```python
llm.set_api_key("path/to/api_keys.json")
```
The JSON should look like:
```json
{
  "openai": "sk-...",
  "anthropic": "sk-..."
}
```
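Loading such a file is plain JSON parsing. Here is a minimal self-contained sketch (illustrative only; `load_keys` is a hypothetical helper, not the library's own loader):

```python
import json
import tempfile
from pathlib import Path

def load_keys(path: str) -> dict:
    """Read provider API keys from a JSON file shaped like the example above."""
    return json.loads(Path(path).read_text())

# Write a sample key file and read it back.
with tempfile.TemporaryDirectory() as d:
    key_file = Path(d) / "api_keys.json"
    key_file.write_text(json.dumps({"openai": "sk-...", "anthropic": "sk-..."}))
    keys = load_keys(str(key_file))

print(keys)  # {'openai': 'sk-...', 'anthropic': 'sk-...'}
```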
## TODO
- [ ] Caching!
- [ ] More LLM vendors!
- [ ] More tests!