llmstudio

Name: llmstudio
Version: 0.3.3
Home page: https://llmstudio.ai/
Summary: Prompt Perfection at Your Fingertips
Upload time: 2024-03-12 16:19:52
Author: Cláudio Lemos
Requires Python: >=3.9,<4.0
License: MIT
Keywords: ml, ai, llm, llmops, openai, langchain, chatgpt, llmstudio, tensorops
# LLMstudio by [TensorOps](http://tensorops.ai "TensorOps")

Prompt Engineering at your fingertips

![LLMstudio logo](https://imgur.com/Xqsj6V2.gif)

> [!IMPORTANT]
> LLMstudio now supports OpenAI v1.0+ and has just added support for Anthropic.

## 🌟 Features

![LLMstudio UI](https://imgur.com/wrwiIUs.png)

1.  **Python Client Gateway**:
    - Access models from well-known providers such as OpenAI, Vertex AI and Bedrock, all in one platform.
    - Speed up development with tracking and robustness features from LLMstudio.
    - Continue using popular libraries like LangChain through their LLMstudio-wrapped versions.
2.  **Prompt Editing UI**:
    - An intuitive interface designed for prompt engineering.
    - Quickly iterate between prompts until you reach your desired results.
    - Access the history of your previous prompts and their results.
3.  **History Management**:
    - Track past runs, available both in the UI and in the Client.
    - Log the cost, latency and output of each prompt.
    - Export the history to a CSV.
4.  **Context Limit Adaptability**:
    - Automatically switch to a larger-context fallback model when the current model's context limit is exceeded.
    - Always use the lowest-context model, switching to higher-context ones only when necessary, to keep costs down.
    - For instance, exceeding 4k tokens in gpt-3.5-turbo triggers a switch to gpt-3.5-turbo-16k (see the sketch after this list).
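
The fallback behaviour in item 4 can be pictured with a short, illustrative sketch. This is not LLMstudio's internal code; the model ladder, the token counting with `tiktoken`, and the `pick_model` helper are assumptions used purely to show when a switch would trigger.

```python
# Illustrative sketch only -- not LLMstudio's internal implementation.
# Counts prompt tokens with tiktoken and picks the cheapest model whose
# context window still fits the prompt.
import tiktoken

# Hypothetical ladder of (model, context window) pairs, cheapest first.
MODEL_LADDER = [
    ("gpt-3.5-turbo", 4_096),
    ("gpt-3.5-turbo-16k", 16_384),
]

def pick_model(prompt: str) -> str:
    enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
    n_tokens = len(enc.encode(prompt))
    for model, window in MODEL_LADDER:
        if n_tokens < window:
            return model
    raise ValueError(f"Prompt of {n_tokens} tokens exceeds every context window")

print(pick_model("Hello!"))          # small prompt -> gpt-3.5-turbo
print(pick_model("word " * 10_000))  # past 4k tokens -> gpt-3.5-turbo-16k
```

A real gateway would also reserve room for the system prompt and the model's response, but the switching logic follows the same shape.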

### 👀 Coming soon:

- Side-by-side comparison of multiple LLMs using the same prompt.
- Automated testing and validation for your LLMs. (Create unit tests for your LLMs that are evaluated automatically)
- API key administration. (Define quota limits)
- Projects and sessions. (Organize your History and API keys by project)
- Resilience against service provider rate limits.
- Organized tracking of groups of related prompts (Chains, Agents)

## 🚀 Quickstart

Don't forget to check out our [docs](https://docs.llmstudio.ai) page.

## Installation

Install the latest version of **LLMstudio** using `pip`. We suggest creating and activating a new environment with `conda` first.

```bash
pip install llmstudio
```

Install `bun` if you want to use the UI.

```bash
curl -fsSL https://bun.sh/install | bash
```

Create a `.env` file in the directory from which you'll run **LLMstudio**.

```bash
OPENAI_API_KEY="sk-api_key"
ANTHROPIC_API_KEY="sk-api_key"
```

Now you should be able to run **LLMstudio** using the following command.

```bash
llmstudio server --ui
```

When the `--ui` flag is set, you'll be able to access the UI at [http://localhost:3000](http://localhost:3000).
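
Once the server is running, you can also call models from Python. The snippet below is a rough sketch based on the project's notebook examples; the `LLM` class, the `openai/gpt-3.5-turbo` model string, and the `chat` method are assumptions that may differ in your installed version, so check the docs and examples first.

```python
# Rough sketch based on the project's notebook examples; the client API
# (LLM class, model string, chat method) may differ in your version.
from llmstudio import LLM

llm = LLM("openai/gpt-3.5-turbo")  # provider/model string is an assumption
response = llm.chat("Write a one-line haiku about prompt engineering.")
print(response)
```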

## 🤔 About LLMstudio

Powered by TensorOps, LLMstudio redefines your experience with OpenAI, Vertex AI and other language model providers. More than just a tool, it’s an evolving environment where teams can experiment, modify, and optimize their interactions with advanced language models.

Benefits include:

- **Streamlined Prompt Engineering**: Simplify and enhance your prompt design process.
- **Execution History**: Keep a detailed log of past executions, track progress, and make iterative improvements effortlessly.
- **Effortless Data Export**: Share your team's endeavors by exporting data to shareable CSV files.

Step into the future of AI with LLMstudio by watching our [introduction video](https://www.youtube.com/watch?v=I9h701fbD18).

## 📖 Documentation

- [Visit our docs to learn how the SDK works](https://docs.LLMstudio.ai) (coming soon)
- Check out our [notebook examples](https://github.com/TensorOpsAI/LLMstudio/tree/main/examples) to follow along with interactive tutorials
- Check out the [LLMstudio Architecture Roadmap](https://github.com/TensorOpsAI/LLMstudio/blob/main/docs/LLMstudio-architecture/LLMstudio-architecture-roadmap.md)

## 👨‍💻 Contributing

- Head over to our [Contribution Guide](https://github.com/TensorOpsAI/LLMstudio/tree/main/CONTRIBUTING.md) to see how you can help LLMstudio.
- Join our [Discord](https://discord.gg/GkAfPZR9wy) to talk with other LLMstudio enthusiasts.

## Training

[![Banner](https://imgur.com/XTRFZ4m.png)](https://www.tensorops.ai/llm-studio-workshop)

---

Thank you for choosing LLMstudio. Your journey to perfecting AI interactions starts here.

            
