llama-index-llms-text-generation-inference

Name: llama-index-llms-text-generation-inference
Version: 0.3.1
Summary: llama-index llms huggingface text generation inference integration
Upload time: 2024-12-11 23:54:56
Author: Your Name
Requires Python: <4.0,>=3.9
License: MIT
Requirements: none recorded
# LlamaIndex Llms Integration: Text Generation Inference

Integration with [Text Generation Inference](https://huggingface.co/docs/text-generation-inference) from Hugging Face to generate text.

## Installation

```shell
pip install llama-index-llms-text-generation-inference
```

## Usage

```python
from llama_index.llms.text_generation_inference import TextGenerationInference

llm = TextGenerationInference(
    model_name="openai-community/gpt2",
    temperature=0.7,
    max_tokens=100,
    token="<your-token>",  # Optional
)

response = llm.complete("Hello, how are you?")
```
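The client above assumes a Text Generation Inference endpoint is reachable. One common way to stand one up locally, sketched from the Hugging Face TGI documentation (the model ID, port mapping, and volume path here are illustrative, not prescribed by this package), is via Docker:

```shell
# Launch a local TGI server (requires Docker; --gpus all additionally
# requires the NVIDIA container toolkit — drop it for CPU-only runs).
docker run --gpus all --shm-size 1g -p 8080:80 \
    -v $PWD/data:/data \
    ghcr.io/huggingface/text-generation-inference:latest \
    --model-id openai-community/gpt2
```

Once the server is up, the Python client shown above can be pointed at it; consult the LlamaIndex documentation for the exact constructor parameter used to set the server URL.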


Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "llama-index-llms-text-generation-inference",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "<4.0,>=3.9",
    "maintainer_email": null,
    "keywords": null,
    "author": "Your Name",
    "author_email": "you@example.com",
    "download_url": "https://files.pythonhosted.org/packages/47/34/de40f2c71b97d8509717e1861eaa4b595e63c931b6fdec3eaa5bc588c53c/llama_index_llms_text_generation_inference-0.3.1.tar.gz",
    "platform": null,
    "description": "# LlamaIndex Llms Integration: Text Generation Inference\n\nIntegration with [Text Generation Inference](https://huggingface.co/docs/text-generation-inference) from Hugging Face to generate text.\n\n## Installation\n\n```shell\npip install llama-index-llms-text-generation-inference\n```\n\n## Usage\n\n```python\nfrom llama_index.llms.text_generation_inference import TextGenerationInference\n\nllm = TextGenerationInference(\n    model_name=\"openai-community/gpt2\",\n    temperature=0.7,\n    max_tokens=100,\n    token=\"<your-token>\",  # Optional\n)\n\nresponse = llm.complete(\"Hello, how are you?\")\n```\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "llama-index llms huggingface text generation inference integration",
    "version": "0.3.1",
    "project_urls": null,
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "5a198bca3b6068dcee5791054e9a006fa95578f8dae587cd6ad9a171431492c0",
                "md5": "344cb5f12ffbc21efe5ec6a26715137e",
                "sha256": "07dcc8c4fd176e05d1a3f4e4954866cf74f372151866df2699b23cc8d0c16b74"
            },
            "downloads": -1,
            "filename": "llama_index_llms_text_generation_inference-0.3.1-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "344cb5f12ffbc21efe5ec6a26715137e",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": "<4.0,>=3.9",
            "size": 6485,
            "upload_time": "2024-12-11T23:54:54",
            "upload_time_iso_8601": "2024-12-11T23:54:54.891619Z",
            "url": "https://files.pythonhosted.org/packages/5a/19/8bca3b6068dcee5791054e9a006fa95578f8dae587cd6ad9a171431492c0/llama_index_llms_text_generation_inference-0.3.1-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "4734de40f2c71b97d8509717e1861eaa4b595e63c931b6fdec3eaa5bc588c53c",
                "md5": "dbe3a38807d0d383078bfe4fc83b2445",
                "sha256": "4d2f8b9fdcb56013db8149fc7aee06d563b04e7f24c49b1a48529eb1e678057e"
            },
            "downloads": -1,
            "filename": "llama_index_llms_text_generation_inference-0.3.1.tar.gz",
            "has_sig": false,
            "md5_digest": "dbe3a38807d0d383078bfe4fc83b2445",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": "<4.0,>=3.9",
            "size": 5610,
            "upload_time": "2024-12-11T23:54:56",
            "upload_time_iso_8601": "2024-12-11T23:54:56.974997Z",
            "url": "https://files.pythonhosted.org/packages/47/34/de40f2c71b97d8509717e1861eaa4b595e63c931b6fdec3eaa5bc588c53c/llama_index_llms_text_generation_inference-0.3.1.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-12-11 23:54:56",
    "github": false,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "lcname": "llama-index-llms-text-generation-inference"
}
        