# LLMstudio by [TensorOps](http://tensorops.ai "TensorOps")
Prompt Engineering at your fingertips
![LLMstudio logo](https://imgur.com/Xqsj6V2.gif)
## 🌟 Features
![LLMstudio UI](https://imgur.com/wrwiIUs.png)
- **LLM Proxy Access**: Seamless access to all the latest LLMs from OpenAI, Anthropic, and Google.
- **Custom and Local LLM Support**: Use custom or local open-source LLMs through Ollama.
- **Prompt Playground UI**: A user-friendly interface for engineering and fine-tuning your prompts.
- **Python SDK**: Easily integrate LLMstudio into your existing workflows (see the sketch after this list).
- **Monitoring and Logging**: Keep track of your usage and performance for all requests.
- **LangChain Integration**: LLMstudio integrates with your existing LangChain projects.
- **Batch Calling**: Send multiple requests at once for improved efficiency.
- **Smart Routing and Fallback**: Ensure 24/7 availability by routing your requests to trusted LLMs.
- **Type Casting (soon)**: Convert data types as needed for your specific use case.
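As a quick illustration of the **Python SDK** feature above, here is a minimal sketch of a single chat call. The `LLM` class, the `"openai/gpt-4o"` model identifier, and the `chat` method are assumptions for illustration only; check the docs for the exact interface.

```python
# Hypothetical SDK sketch -- class and method names are assumptions, not the confirmed API.
from llmstudio import LLM

# Select a provider/model pair; the identifier format shown here is assumed.
llm = LLM("openai/gpt-4o")

# Send a single prompt and print the response.
response = llm.chat("Write a haiku about prompt engineering.")
print(response)
```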
## 🚀 Quickstart
Don't forget to check out the [docs](https://docs.llmstudio.ai) page.
## Installation
Install the latest version of **LLMstudio** using `pip`. We suggest that you first create and activate a new environment using `conda`.
```bash
pip install llmstudio
```
Install `bun` if you want to use the UI.
```bash
curl -fsSL https://bun.sh/install | bash
```
Create a `.env` file in the directory from which you'll run **LLMstudio**.
```bash
OPENAI_API_KEY="sk-api_key"
ANTHROPIC_API_KEY="sk-api_key"
```
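The `llmstudio server` command picks these keys up from your environment. If you plan to call the Python SDK directly instead, here is a minimal sketch of loading the same `.env` file with `python-dotenv`, assuming LLMstudio's providers read the standard `OPENAI_API_KEY` / `ANTHROPIC_API_KEY` variables:

```python
# Sketch: load provider keys from .env before using the SDK directly.
# Assumes LLMstudio reads standard environment variables such as OPENAI_API_KEY.
import os
from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads the .env file in the current working directory
assert os.getenv("OPENAI_API_KEY"), "OPENAI_API_KEY is not set"
```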
Now you should be able to run **LLMstudio** using the following command.
```bash
llmstudio server --ui
```
When the `--ui` flag is set, you'll be able to access the UI at [http://localhost:3000](http://localhost:3000).
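With the server running, existing LangChain code can route calls through LLMstudio, as described in the **LangChain Integration** feature above. The `ChatLLMstudio` wrapper and its import path below are assumptions for illustration; the notebook examples linked under Documentation show the supported pattern.

```python
# Hypothetical LangChain integration sketch -- the wrapper name and import path are assumptions.
from langchain_core.messages import HumanMessage
from llmstudio.langchain import ChatLLMstudio  # assumed import path

# Wrap an LLMstudio-managed model as a LangChain chat model.
chat = ChatLLMstudio(model="openai/gpt-4o")
print(chat.invoke([HumanMessage(content="Summarize what an LLM proxy does.")]))
```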
## 📖 Documentation
- [Visit our docs to learn how the SDK works](https://docs.LLMstudio.ai) (coming soon)
- Check out our [notebook examples](https://github.com/TensorOpsAI/LLMstudio/tree/main/examples) to follow along with interactive tutorials
## 👨‍💻 Contributing
- Head over to our [Contribution Guide](https://github.com/TensorOpsAI/LLMstudio/tree/main/CONTRIBUTING.md) to see how you can help LLMstudio.
- Join our [Discord](https://discord.gg/GkAfPZR9wy) to talk with other LLMstudio enthusiasts.
## Training
[![Banner](https://imgur.com/XTRFZ4m.png)](https://www.tensorops.ai/llm-studio-workshop)
---
Thank you for choosing LLMstudio. Your journey to perfecting AI interactions starts here.