| Field | Value |
| --- | --- |
| Name | llama-index-llms-text-generation-inference |
| Version | 0.2.2 |
| Summary | llama-index llms huggingface text generation inference integration |
| home_page | None |
| upload_time | 2024-09-13 20:03:20 |
| maintainer | None |
| docs_url | None |
| author | Your Name |
| requires_python | <4.0,>=3.8.1 |
| license | MIT |
| keywords | |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
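Metadata like the table above is served by PyPI's JSON API (`https://pypi.org/pypi/<name>/json`). A minimal sketch of pulling a few fields out of such a blob; the inline string is a trimmed, hypothetical excerpt, not the full response:

```python
import json

# Trimmed, hypothetical excerpt of a PyPI JSON metadata blob.
blob = """
{
  "name": "llama-index-llms-text-generation-inference",
  "version": "0.2.2",
  "requires_python": "<4.0,>=3.8.1",
  "license": "MIT"
}
"""

meta = json.loads(blob)
print(meta["name"], meta["version"])  # llama-index-llms-text-generation-inference 0.2.2
```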
# LlamaIndex Llms Integration: Text Generation Inference

Integration with [Text Generation Inference](https://huggingface.co/docs/text-generation-inference) from Hugging Face to generate text.

## Installation

```shell
pip install llama-index-llms-text-generation-inference
```

## Usage

```python
from llama_index.llms.text_generation_inference import TextGenerationInference

llm = TextGenerationInference(
    model_name="openai-community/gpt2",
    temperature=0.7,
    max_tokens=100,
    token="<your-token>",  # Optional
)

response = llm.complete("Hello, how are you?")
```
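This release declares `requires_python = <4.0,>=3.8.1`. A quick sketch of checking an interpreter version against that range by hand; this is a simplified tuple comparison, not a full PEP 440 specifier parse, and the function name is illustrative:

```python
import sys

# requires_python for this release: "<4.0,>=3.8.1"
def python_supported(version_tuple):
    """Manual range check; real tools parse the PEP 440 specifier instead."""
    return (3, 8, 1) <= version_tuple < (4, 0, 0)

print(python_supported(sys.version_info[:3]))
print(python_supported((3, 7, 0)))  # False
```

In practice, `pip` performs this check automatically against the `Requires-Python` metadata before installing.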
## Raw data

```json
{
    "_id": null,
    "home_page": null,
    "name": "llama-index-llms-text-generation-inference",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "<4.0,>=3.8.1",
    "maintainer_email": null,
    "keywords": null,
    "author": "Your Name",
    "author_email": "you@example.com",
    "download_url": "https://files.pythonhosted.org/packages/0f/f6/52a61bc15d4605e192cb24bbd7d7fa2e4daf3c77dad17430b93138f289b0/llama_index_llms_text_generation_inference-0.2.2.tar.gz",
    "platform": null,
    "description": "# LlamaIndex Llms Integration: Text Generation Inference\n\nIntegration with [Text Generation Inference](https://huggingface.co/docs/text-generation-inference) from Hugging Face to generate text.\n\n## Installation\n\n```shell\npip install llama-index-llms-text-generation-inference\n```\n\n## Usage\n\n```python\nfrom llama_index.llms.text_generation_inference import TextGenerationInference\n\nllm = TextGenerationInference(\n model_name=\"openai-community/gpt2\",\n temperature=0.7,\n max_tokens=100,\n token=\"<your-token>\", # Optional\n)\n\nresponse = llm.complete(\"Hello, how are you?\")\n```\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "llama-index llms huggingface text generation inference integration",
    "version": "0.2.2",
    "project_urls": null,
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "4a522d328d9cb7757d67dda861201d95cb657434604103f761c9838c8160999a",
                "md5": "ca1c5bb02e535e0aa69ae6eb74735b26",
                "sha256": "5ed21e1196089fcdccb6f16136f5fec8a20decdb909fb036f9deac20d1a51458"
            },
            "downloads": -1,
            "filename": "llama_index_llms_text_generation_inference-0.2.2-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "ca1c5bb02e535e0aa69ae6eb74735b26",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": "<4.0,>=3.8.1",
            "size": 6452,
            "upload_time": "2024-09-13T20:03:19",
            "upload_time_iso_8601": "2024-09-13T20:03:19.843729Z",
            "url": "https://files.pythonhosted.org/packages/4a/52/2d328d9cb7757d67dda861201d95cb657434604103f761c9838c8160999a/llama_index_llms_text_generation_inference-0.2.2-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "0ff652a61bc15d4605e192cb24bbd7d7fa2e4daf3c77dad17430b93138f289b0",
                "md5": "ed7a4bf25180d4eda331e8c1ef0ebee7",
                "sha256": "c69a718346f058e2514d55064afeb7431a0b4494f9abb4ce7fe328103584e1f2"
            },
            "downloads": -1,
            "filename": "llama_index_llms_text_generation_inference-0.2.2.tar.gz",
            "has_sig": false,
            "md5_digest": "ed7a4bf25180d4eda331e8c1ef0ebee7",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": "<4.0,>=3.8.1",
            "size": 5573,
            "upload_time": "2024-09-13T20:03:20",
            "upload_time_iso_8601": "2024-09-13T20:03:20.682691Z",
            "url": "https://files.pythonhosted.org/packages/0f/f6/52a61bc15d4605e192cb24bbd7d7fa2e4daf3c77dad17430b93138f289b0/llama_index_llms_text_generation_inference-0.2.2.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-09-13 20:03:20",
    "github": false,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "lcname": "llama-index-llms-text-generation-inference"
}
```
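The `sha256` digests above let you verify a downloaded artifact before installing it. A minimal sketch of the check, assuming the file bytes have already been fetched; the helper name and the `b"hello"` sample are illustrative, not part of the package:

```python
import hashlib

def verify_sha256(data: bytes, expected_hex: str) -> bool:
    """Return True if the SHA-256 digest of `data` matches `expected_hex`."""
    return hashlib.sha256(data).hexdigest() == expected_hex.lower()

# Illustrative check against a well-known digest (SHA-256 of b"hello").
print(verify_sha256(b"hello", "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824"))  # True
```

In practice you would compare against the `sha256` field for the matching `filename`; `pip` performs this check automatically when hashes are pinned with `--require-hashes`.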