| Field | Value |
|-------|-------|
| Name | justai |
| Version | 3.11.3 |
| Summary | Makes working with LLMs like OpenAI GPT, Anthropic Claude, Google Gemini and open source models super easy |
| Home page | https://github.com/hpharmsen/justai |
| Author | HP Harmsen (hp@harmsen.nl) |
| Upload time | 2024-12-23 13:05:20 |
| Requires Python | >=3.11 |
| License | The Unlicense: "This is free and unencumbered software released into the public domain. Anyone is free to copy, modify, publish, use, compile, sell, or distribute this software, either in source code form or as a compiled binary, for any purpose, commercial or non-commercial, and by any means. In jurisdictions that recognize copyright laws, the author or authors of this software dedicate any and all copyright interest in the software to the public domain. [...] THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED." For more information, refer to <https://unlicense.org> |
| Keywords | chatgpt, gpt4o, gpt4, api, claude, anthropic, llama, gemini |
| Requirements | anthropic (>=0.34.0), google-generativeai, httpx, justdays, llama-cpp-python, lxml (==5.1.1), openai (>=1.40.0), packaging, pillow, pydantic, python-dateutil, python-dotenv, rich, tiktoken |
# JustAI
Package to make working with Large Language models in Python super easy.
Author: Hans-Peter Harmsen (hp@harmsen.nl) \
Current version: 3.11.3
## Installation
1. Install the package:
```bash
python -m pip install justai
```
2. Create an OpenAI account (for the GPT models) [here](https://platform.openai.com/), an Anthropic account (for Claude) [here](https://console.anthropic.com/), or a Google account (for Gemini)
3. Create an OpenAI API key [here](https://platform.openai.com/account/api-keys), an Anthropic API key [here](https://console.anthropic.com/settings/keys), or a Google API key [here](https://aistudio.google.com/app/apikey)
4. Create a .env file with the following content:
```bash
OPENAI_API_KEY=your-openai-api-key
OPENAI_ORGANIZATION=your-openai-organization-id
ANTHROPIC_API_KEY=your-anthropic-api-key
GOOGLE_API_KEY=your-google-api-key
```
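JustAI pulls these keys from the environment; the `python-dotenv` dependency handles loading the `.env` file. As a rough illustration of what that loading amounts to (a simplified sketch — the real python-dotenv also handles quoting, comments, and variable interpolation):

```python
import os

def load_dotenv_minimal(path: str = ".env") -> None:
    """Parse KEY=value lines from a .env file into os.environ.

    Simplified sketch of what python-dotenv does for this package.
    """
    try:
        with open(path) as f:
            for line in f:
                line = line.strip()
                # skip blanks, comments, and malformed lines
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                # don't overwrite variables already set in the environment
                os.environ.setdefault(key.strip(), value.strip())
    except FileNotFoundError:
        pass
```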
## Usage
```Python
from justai import Agent
if __name__ == "__main__":
    agent = Agent('gpt-4o')
    agent.system = "You are a movie critic. I feed you with movie titles and you give me a review in 50 words."

    message = agent.chat("Forrest Gump")
    print(message)
```
Output:
```
Forrest Gump is an American classic that tells the story of
a man with a kind heart and simple mind who experiences major
events in history. Tom Hanks gives an unforgettable performance,
making us both laugh and cry. A heartwarming and nostalgic
movie that still resonates with audiences today.
```
## Other models
Justai can use different types of models:
**OpenAI** models like GPT-3.5, GPT-4-turbo and GPT-4o\
**Anthropic** models like the Claude-3 models\
**Google** models like the Gemini models\
**Open source** models like Llama2-7b or Mixtral-8x7b-instruct as long as they are in the GGUF format.
The provider is chosen from the model name: for example, if the name starts with `gpt`, OpenAI is used as the provider.
To use an open source model, just pass the full path to the `.gguf` file as the model name.
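The prefix-based dispatch can be pictured roughly like this (an illustrative sketch only — JustAI's actual selection logic may differ):

```python
def pick_provider(model_name: str) -> str:
    """Guess the backend provider from a model name.

    Illustrative sketch of prefix-based dispatch; not JustAI's actual code.
    """
    name = model_name.lower()
    if name.endswith(".gguf"):
        return "llama-cpp"   # local open source model in GGUF format
    if name.startswith("gpt"):
        return "openai"
    if name.startswith("claude"):
        return "anthropic"
    if name.startswith("gemini"):
        return "google"
    raise ValueError(f"Cannot determine provider for model: {model_name}")
```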
## Using the examples
Install dependencies:
```bash
python -m pip install -r requirements.txt
```
### Basic
```bash
python examples/basic.py
```
Shows basic use of Justai
### Returning json or other types
```bash
python examples/return_types.py
```
You can specify a return type (like a list of dicts) for the completion.
This is useful when you want to extract structured data from the completion.

To return JSON, just pass return_json=True to agent.chat() and tell the model in the
prompt how you want your JSON to be structured.

To define a specific return type, use the return_type parameter.
Currently this works with the Google models (pass a Python type definition; returns JSON)
and with OpenAI (pass a Pydantic type definition; returns a Pydantic model).\
See the example code for more details.
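With return_json=True the model's reply still arrives as text; turning it into Python data amounts to something like the following (a self-contained sketch, not JustAI's implementation — it also strips the markdown code fences some models wrap around JSON output):

```python
import json

def parse_json_reply(text: str):
    """Parse a JSON reply, tolerating markdown code fences around it."""
    cleaned = text.strip()
    if cleaned.startswith("```"):
        # drop the opening ```json line and the closing ``` line
        lines = cleaned.splitlines()
        cleaned = "\n".join(lines[1:-1])
    return json.loads(cleaned)
```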
### Images
```bash
python examples/vision.py
```
Pass images to the model. An image can be any of:
* A URL to an image
* The raw image data
* A PIL image
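Internally, these three kinds of input have to be told apart before being sent to the model. A rough sketch of such dispatch (a hypothetical helper for illustration, not JustAI's API):

```python
def classify_image_input(image) -> str:
    """Classify an image argument as 'url', 'bytes', or 'pil'.

    Hypothetical helper illustrating the dispatch; JustAI's internals may differ.
    """
    if isinstance(image, str) and image.startswith(("http://", "https://")):
        return "url"
    if isinstance(image, (bytes, bytearray)):
        return "bytes"
    if hasattr(image, "save"):   # duck-typed check for a PIL.Image object
        return "pil"
    raise TypeError("Expected an image URL, raw bytes, or a PIL image")
```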
### Asynchronous use
```bash
python examples/async.py
```
Shows how to use Justai asynchronously.
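The asynchronous pattern boils down to awaiting several completions concurrently. A self-contained sketch with a stubbed chat coroutine standing in for the agent's async call (the real method name may differ — see the example file):

```python
import asyncio

async def fake_chat(prompt: str) -> str:
    """Stand-in for an async model call; replace with the agent's async chat."""
    await asyncio.sleep(0.01)    # simulate network latency
    return f"review of {prompt}"

async def review_all(titles):
    # fire all requests at once and collect the results in input order
    return await asyncio.gather(*(fake_chat(t) for t in titles))

results = asyncio.run(review_all(["Forrest Gump", "The Matrix"]))
```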
### Prompt caching
```bash
python examples/prompt_caching.py
```
Shows how to use prompt caching with Anthropic models.
### Interactive
```bash
python examples/interactive.py
```
Starts an interactive session. In the session you can chat with GPT-4 or another model.
#### Special commands in interactive mode
In interactive mode you can use these special commands, each of which starts with a colon:
| Syntax | Description |
|-----------------------------------|---------------------------------------------------------------------|
| :reset | resets the conversation |
| :load _name_ | loads the saved conversation with the specified name |
| :save _name_ | saves the conversation under the specified name |
| :input _filename_ | loads an input from the specified file |
| :model _gpt-4_ | Sets the AI model |
| :max_tokens _800_ | The maximum number of tokens to generate in the completion |
| :temperature _0.9_ | What sampling temperature to use, between 0 and 2 |
| :n _1_ | Specifies the number of answers given |
| :bye | quits but saves the conversation first |
| :exit or :quit | quits the program |
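Colon-commands like these are straightforward to parse; a minimal sketch of the dispatch (illustrative only, not the package's actual input loop):

```python
def parse_command(line: str):
    """Split a ':command arg' line into (command, argument-or-None).

    Returns None if the line is not a colon-command. Illustrative sketch.
    """
    if not line.startswith(":"):
        return None   # ordinary chat input, not a command
    cmd, _, arg = line[1:].partition(" ")
    return cmd, (arg.strip() or None)
```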