<p align="center">
<img width="300" src="https://github.com/dev-mush/jaims-py/assets/669003/5c53381f-25b5-4141-bcd2-7457863eafb9" >
</p>
# JAIms
_My name is Bot, JAIms Bot._ 🕶️
JAIms is a lightweight Python package that lets you build powerful LLM-based agents or LLM-powered applications with ease. It is platform agnostic, so you can focus on integrating AI into your software and let JAIms handle the boilerplate of communicating with the underlying LLM APIs.
The main goal of JAIms is to provide a simple, easy-to-use interface for leveraging the power of LLMs in your software without worrying about the specifics of the underlying provider, and to seamlessly integrate LLM functionality with your own codebase.
JAIms currently supports mainstream foundation LLMs such as OpenAI's GPT models, Google's Gemini models (also on Vertex), Mistral models, and Anthropic models (hosted on both Anthropic and Vertex endpoints). JAIms can easily be extended to connect to your own models and endpoints.
Check out the [getting started guide](docs/getting_started.md) to quickly get up and running with JAIms.
Also consider checking out the [examples](examples) folder for more advanced use cases.
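To make the platform-agnostic idea concrete, here is a minimal from-scratch sketch of the adapter pattern such a layer is built on: a provider interface plus an agent that owns the conversation history. The `LLMProvider`, `EchoProvider`, and `SimpleAgent` names are invented for this snippet and are **not** JAIms classes; the real API is covered in the getting started guide.

```python
# From-scratch illustration of the provider-abstraction idea described above.
# Nothing here is JAIms code; see docs/getting_started.md for the actual API.
from typing import Protocol


class LLMProvider(Protocol):
    """Anything that can turn a list of chat messages into a reply."""

    def complete(self, messages: list[dict]) -> str: ...


class EchoProvider:
    """Stand-in provider used only to keep the example runnable offline."""

    def complete(self, messages: list[dict]) -> str:
        return f"(echo) {messages[-1]['content']}"


class SimpleAgent:
    """Keeps the conversation history and delegates generation to any provider."""

    def __init__(self, provider: LLMProvider, system: str = "You are a helpful assistant."):
        self.provider = provider
        self.history: list[dict] = [{"role": "system", "content": system}]

    def ask(self, text: str) -> str:
        self.history.append({"role": "user", "content": text})
        reply = self.provider.complete(self.history)
        self.history.append({"role": "assistant", "content": reply})
        return reply


if __name__ == "__main__":
    agent = SimpleAgent(EchoProvider())
    print(agent.ask("Hello there"))  # -> "(echo) Hello there"
```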
### ✨ Main Features
- Built-in support for the most common foundation models.
- Built-in conversation history management for fast chatbot creation, easily extensible to support more advanced history-management strategies.
- Image support for multimodal LLMs 🖼️.
- Support for function calling, both streamed and non-streamed.
- Fast integration with dataclasses and Pydantic models (see the sketch after this list).
- Error handling and exponential backoff for the built-in providers (OpenAI, Google, Mistral).
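As context for the function-calling and Pydantic bullets, the sketch below shows the kind of plumbing this feature is meant to spare you: deriving an OpenAI-style tool definition from a Pydantic model by hand. It uses plain Pydantic only; `GetWeather` and `to_tool_definition` are invented for this example and are not part of the JAIms API.

```python
# Plain-Pydantic sketch of what a dataclass/Pydantic tool integration automates:
# turning a model into an OpenAI-style function/tool descriptor.
from pydantic import BaseModel, Field


class GetWeather(BaseModel):
    """Get the current weather for a given city."""

    city: str = Field(description="City name, e.g. 'Rome'")
    unit: str = Field(default="celsius", description="'celsius' or 'fahrenheit'")


def to_tool_definition(model: type[BaseModel]) -> dict:
    """Build a tool descriptor from the model's name, docstring, and JSON schema."""
    return {
        "type": "function",
        "function": {
            "name": model.__name__,
            "description": (model.__doc__ or "").strip(),
            "parameters": model.model_json_schema(),
        },
    }


if __name__ == "__main__":
    import json

    print(json.dumps(to_tool_definition(GetWeather), indent=2))
```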
### 🧠 Guiding Principles
JAIms grew out of the need for a lightweight, easy-to-use framework to create LLM agents or integrate LLM functionality into Python projects. With work on both foundation and open-source LLMs increasing, JAIms is designed as an abstraction layer that streamlines the creation of agentic business logic and its integration with your codebase.
If you'd like to contribute, please keep in mind that I try to keep the code:
- **Modular**: every component ships with a basic default implementation and an interface that can be easily extended for more complex use cases.
- **Type hinted and explicit**: I've done my best to type hint everything and document the codebase thoroughly, so you shouldn't need to dig into the code.
- **Tested**: well... let's just say I could have done better, but I'm planning to improve code coverage and test automation in the near future.
- **Application focused**: I'm not trying to build a library like LangChain or LlamaIndex for data-driven operations on LLMs. I'm building a very simple, lightweight framework that leverages LLM function calling so that LLMs can easily be integrated into software applications.
- **Extensible**: I'm planning to add more providers and more features.
## ⚠️ Project status
I'm using this library in many of my projects without problems. That said, I've just revamped it entirely to support multiple providers and refactored the codebase to streamline function calling. I've done my best to test it thoroughly, but I can't guarantee that nothing will break.
In the [roadmap](docs/roadmap.md) I'm tracking the next steps I'm planning to take to improve the library.
I'm actively working on this project and I'm open to contributions, so feel free to open an issue or a PR if you find something that needs fixing or improving.
Since I started developing JAIms, a few similar projects have sprung up. I admittedly haven't had time to check them out yet, and some may well be more advanced. Still, I've used this library extensively in my own projects and in those of the company I work for, I've been actively maintaining it, and I plan to keep it up to date and improve it as much as I can.
I've opted for an open-source-by-default approach so that others can benefit from the library and I'm forced to keep the code clean and well documented. Just remember that, for now, this is a side project developed by me alone (and I'm fairly new to Python), so you may run into some issues and shouldn't expect an immediate patch from me; any help is very much appreciated 🤗.
## 📝 License
Copyright (c) 2023 Marco Musella (aka Mush). This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## Raw data
{
"_id": null,
"home_page": "https://github.com/dev-mush/jaims-py",
"name": "jaims-py",
"maintainer": null,
"docs_url": null,
"requires_python": null,
"maintainer_email": null,
"keywords": "An extensible library to create LLM Agents and LLM based applications.",
"author": "Marco Musella",
"author_email": null,
"download_url": "https://files.pythonhosted.org/packages/11/e2/ad8afb6927071d5ac59be7111385f505b2eba1837835686fc922e088ef9c/jaims_py-2.0.0.tar.gz",
"platform": null,
"description": "<p align=\"center\">\n <img width=\"300\" src=\"https://github.com/dev-mush/jaims-py/assets/669003/5c53381f-25b5-4141-bcd2-7457863eafb9\" >\n</p>\n\n# JAIms\n\n_My name is Bot, JAIms Bot._ \ud83d\udd76\ufe0f\n\nJAIms is a lightweight Python package that lets you build powerful LLM-Based agents or LLM powered applications with ease. It is platform agnostic, so you can focus on integrating AI into your software and let JAIms handle the boilerplate of communicating with the LLM API.\nThe main goal of JAIms is to provide a simple and easy-to-use interface to leverage the power of LLMs in your software, without having to worry about the specifics of the underlying provider, and to seamlessly integrate LLM functionality with your own codebase.\nJAIms currently supports mainstream foundation LLMs such as OpenAI's GPT models, Google's gemini models (also on Vertex), Mistral models and Anthropic Models (both hosted on Anthropic and Vertex endpoints). JAIms can be easily extended to connect to your own model and endpoints.\n\nCheck out the [getting started guide](docs/getting_started.md) to quickly get up and running with JAIms.\n\nAlso consider checking out the [examples](examples) folder for more advanced use cases.\n\n### \u2728 Main Features\n\n- Built in support for most common foundational models.\n- Built in conversation history management to allow fast creation of chatbots, this can be easily extended to support more advanced history management strategies.\n- Image support for multimodal LLMs \ud83d\uddbc\ufe0f.\n- Support for function calling, both streamed and non-streamed.\n- Fast integration with dataclasses and pydantic models.\n- Error handling and exponential backoff for built in providers (openai, google, mistral)\n\n### \ud83e\udde0 Guiding Principles\n\nJAIms comes out of the necessity for a lightweight and easy-to-use framework to create LLM agents or integrate LLM functionality in python projects. Given the increasing work with both foundational and open source LLMs, JAIms has been designed as an abstraction layer to streamline fast creation of agentic business logic and seamless codebase integration.\n\nIn case you like to contribute, please keep in mind that I try to keep the code:\n\n- **Modular**: any component is provided with a default basic implementation and an interface that can be easily extended for more complex use cases.\n- **Type Hinted and Explicit**: I've done my best to type hint everything and document the codebase as much as possible to avoid digging into the code.\n- **Tested**: Well...Let's just say I could have done better, but am planning to improve code coverage and test automation in the near future.\n- **Application focused**: I'm not trying to build a library similar to langchain or llamaindex to perform data-driven operations on LLMs, I'm trying to build a very simple and lightweight framework that leverages the possibility of LLMs to perform function calling so that LLMs can easily be integrated in software applications.\n- **Extensible**: I'm planning to add more providers and more features.\n\n## \u26a0\ufe0f Project status\n\nI'm using this library in many of my projects without problems, that said I've just revamped it entirely to support multiple providers and entirely refactored the codebase to streamline function calling. 
I've done my best to test it thoroughly, but I can't guarantee something won't break.\n\nIn the [roadmap](docs/roadmap.md) I'm tracking the next steps I'm planning to take to improve the library.\n\nI'm actively working on this project and I'm open to contributions, so feel free to open an issue or a PR if you find something that needs fixing or improving.\n\nSince I've started the development of JAIms, a few similar projects have been started, and granted that I didn't have time to check them out yet, some might easily be more advanced, yet I've widely employed this library in my projects and those of the company I work for, and I've been actively maintaining it, so I'm planning to keep it up to date and to improve it as much as I can.\n\nI've opted for an open source by default approach to allow others to benefit from it and force myself to keep the code clean and well documented, just remember that since this is, for now, a side-project developed just by me (that am fairly new to python), expect the possibility of encountering some issues and don't expect an immediate patch from me, any help is very much appreciated \ud83e\udd17.\n\n## \ud83d\udcdd License\n\nCopyright (c) 2023 Marco Musella (aka Mush). This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.\n",
"bugtrack_url": null,
"license": "MIT",
"summary": "A Python package for creating LLM powered, agentic, platform agnostic software.",
"version": "2.0.0",
"project_urls": {
"Homepage": "https://github.com/dev-mush/jaims-py"
},
"split_keywords": [
"an",
"extensible",
"library",
"to",
"create",
"llm",
"agents",
"and",
"llm",
"based",
"applications."
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "3f1b5731d439a28f1b8d235cfd62d521c00a90709e805a84fa635b68925b4184",
"md5": "d9886846f7f6dfc882099c6f52fd626a",
"sha256": "2ba40fbbadb8bdf05a3c0432d47999ae2756734b6af1b7aaa420922822a304c9"
},
"downloads": -1,
"filename": "jaims_py-2.0.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "d9886846f7f6dfc882099c6f52fd626a",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": null,
"size": 46111,
"upload_time": "2024-09-23T18:32:53",
"upload_time_iso_8601": "2024-09-23T18:32:53.709781Z",
"url": "https://files.pythonhosted.org/packages/3f/1b/5731d439a28f1b8d235cfd62d521c00a90709e805a84fa635b68925b4184/jaims_py-2.0.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "11e2ad8afb6927071d5ac59be7111385f505b2eba1837835686fc922e088ef9c",
"md5": "3fad110288e782807da0cf73c876e051",
"sha256": "6746912f512c4e5ec7a98aa6df88a196347ed767015596be37c7ee8a2d66a440"
},
"downloads": -1,
"filename": "jaims_py-2.0.0.tar.gz",
"has_sig": false,
"md5_digest": "3fad110288e782807da0cf73c876e051",
"packagetype": "sdist",
"python_version": "source",
"requires_python": null,
"size": 42618,
"upload_time": "2024-09-23T18:32:55",
"upload_time_iso_8601": "2024-09-23T18:32:55.440976Z",
"url": "https://files.pythonhosted.org/packages/11/e2/ad8afb6927071d5ac59be7111385f505b2eba1837835686fc922e088ef9c/jaims_py-2.0.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-09-23 18:32:55",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "dev-mush",
"github_project": "jaims-py",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"requirements": [
{
"name": "Pillow",
"specs": [
[
">=",
"10.3.0"
]
]
},
{
"name": "pydantic",
"specs": [
[
">=",
"2.0.0"
]
]
},
{
"name": "jsonref",
"specs": [
[
">=",
"1.1.0"
]
]
}
],
"lcname": "jaims-py"
}