llmstudio-core

Name: llmstudio-core
Version: 1.0.0
Summary: LLMStudio core capabilities for routing LLM calls for any vendor. No proxy server required; for that, use llmstudio[proxy].
Author: Cláudio Lemos
Requires-Python: <4.0,>=3.9
License: MIT
Upload time: 2024-10-28 09:43:34
# LLMstudio-core by [TensorOps](http://tensorops.ai "TensorOps")

Prompt Engineering at your fingertips

![LLMstudio logo](https://imgur.com/Xqsj6V2.gif)

## 🌟 Features
- **Custom and Local LLM Support**: Use custom or local open-source LLMs through Ollama.
- **Python SDK**: Easily integrate LLMstudio into your existing workflows.
- **LangChain Integration**: LLMstudio integrates with your already existing LangChain projects.

## 🚀 Quickstart

Don't forget to check out the [docs](https://docs.llmstudio.ai) page.

## Installation

Install the latest version of **LLMstudio** using `pip`. We suggest that you create and activate a new environment using `conda`:

```bash
pip install llmstudio-core
```

Create a `.env` file in the directory from which you'll run **LLMstudio**:

```bash
OPENAI_API_KEY="sk-api_key"
GOOGLE_API_KEY="sk-api_key"
```
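If you prefer to load the `.env` file yourself in a plain script, a minimal stdlib-only loader might look like the sketch below. This is an illustrative helper, not part of LLMstudio's API; in practice a library such as `python-dotenv` is the usual, more robust choice:

```python
import os

def load_env(path=".env"):
    """Minimal .env loader: copies KEY="value" lines into os.environ.
    Hypothetical helper for illustration; python-dotenv handles more edge cases."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blank lines, comments, and lines without an assignment
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Don't overwrite variables already set in the environment
            os.environ.setdefault(key.strip(), value.strip().strip('"'))
```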

Now you should be able to run **LLMstudio** Providers using the following code:

```python
# Set the API key as an environment variable, add it to your .env file,
# or pass it directly via the api_key argument
import os
from llmstudio_core.providers import LLMCore as LLM

llm = LLM("vertexai", api_key=os.environ["GOOGLE_API_KEY"])
response = llm.chat("How are you?", model="gemini-1.5-pro-latest")
print(response.chat_output, response.metrics)
```
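Because the same `LLMCore` interface routes to whichever vendor you name, a small helper can pick a provider based on which API key is configured. The provider-name/env-var pairs below are assumptions for illustration, not an official mapping:

```python
import os

# Assumed provider-name / env-var pairs; adjust to the vendors you actually use.
PROVIDERS = [("openai", "OPENAI_API_KEY"), ("vertexai", "GOOGLE_API_KEY")]

def pick_provider(env=None):
    """Return (provider_name, api_key) for the first vendor with a key set."""
    env = os.environ if env is None else env
    for provider, var in PROVIDERS:
        if env.get(var):
            return provider, env[var]
    raise RuntimeError("No provider API key found; add one to your .env")
```

You could then instantiate the client with `provider, key = pick_provider()` followed by `LLM(provider, api_key=key)`.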
## 📖 Documentation

- [Visit our docs to learn how the SDK works](https://docs.LLMstudio.ai) (coming soon)
- Check out our [notebook examples](https://github.com/TensorOpsAI/LLMstudio/tree/main/examples) to follow along with interactive tutorials

## 👨‍💻 Contributing

- Head over to our [Contribution Guide](https://github.com/TensorOpsAI/LLMstudio/tree/main/CONTRIBUTING.md) to see how you can help LLMstudio.
- Join our [Discord](https://discord.gg/GkAfPZR9wy) to talk with other LLMstudio enthusiasts.

## Training

[![Banner](https://imgur.com/XTRFZ4m.png)](https://www.tensorops.ai/llm-studio-workshop)

---

Thank you for choosing LLMstudio. Your journey to perfecting AI interactions starts here.

            
