llama-index-llms-llama-api

Name: llama-index-llms-llama-api
Version: 0.3.0
Summary: llama-index llms llama api integration
Upload time: 2024-11-18 01:06:46
Author: Your Name
Requires Python: >=3.9,<4.0
License: MIT
# LlamaIndex LLMs Integration: Llama API

## Prerequisites

1. **API Key**: Obtain an API key from [Llama API](https://www.llama-api.com/).
2. **Python 3.9+**: The package requires Python `>=3.9,<4.0`.

## Installation

1. Install the required Python packages (the `%pip` prefix is a Jupyter notebook magic; in a regular shell, drop it and run plain `pip install`):

   ```bash
   %pip install llama-index-program-openai
   %pip install llama-index-llms-llama-api
   %pip install llama-index
   ```
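
Outside a notebook, the same installation can be done with a single shell command:

```bash
pip install llama-index-program-openai llama-index-llms-llama-api llama-index
```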

## Basic Usage

### Import Required Libraries

```python
from llama_index.llms.llama_api import LlamaAPI
from llama_index.core.llms import ChatMessage
```

### Initialize LlamaAPI

Set up the API key:

```python
api_key = "LL-your-key"
llm = LlamaAPI(api_key=api_key)
```
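
If you prefer not to hard-code the key, you can read it from an environment variable first; the `LLAMA_API_KEY` variable name below is just a convention for this sketch, not something the library requires:

```python
import os

from llama_index.llms.llama_api import LlamaAPI

# Assumes you exported LLAMA_API_KEY in your shell beforehand (hypothetical name).
api_key = os.environ["LLAMA_API_KEY"]
llm = LlamaAPI(api_key=api_key)
```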

### Complete with a Prompt

Generate a response using a prompt:

```python
resp = llm.complete("Paul Graham is ")
print(resp)
```
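
Printing `resp` shows the generated text, but the returned object also carries metadata. Assuming the standard LlamaIndex `CompletionResponse` shape, you can pull the pieces apart:

```python
# resp.text holds the generated string; additional_kwargs holds any
# provider-specific fields returned alongside it.
print(resp.text)
print(resp.additional_kwargs)
```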

### Chat with a List of Messages

Interact with the model using a chat interface:

```python
messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name"),
]
resp = llm.chat(messages)
print(resp)
```
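
If you only want the assistant's reply rather than the formatted transcript line, the chat response (assuming the standard LlamaIndex `ChatResponse` shape) exposes it through its `message` field:

```python
# resp.message is a ChatMessage; .content is the assistant's reply text.
print(resp.message.role)
print(resp.message.content)
```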

### Function Calling

Define a function using Pydantic and call it through LlamaAPI:

```python
from pydantic import BaseModel
from llama_index.core.llms.openai_utils import to_openai_function


class Song(BaseModel):
    """A song with name and artist"""

    name: str
    artist: str


song_fn = to_openai_function(Song)
response = llm.complete("Generate a song", functions=[song_fn])
function_call = response.additional_kwargs["function_call"]
print(function_call)
```
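
The exact shape of `function_call` depends on what the Llama API backend returns; the sketch below only assumes an OpenAI-style payload with `name` and `arguments` keys, so adjust it to the response you actually observe:

```python
import json

# Hypothetical OpenAI-style payload: {"name": ..., "arguments": ...}.
name = function_call.get("name")
arguments = function_call.get("arguments")
# Arguments may arrive as a JSON string or already parsed as a dict.
args = json.loads(arguments) if isinstance(arguments, str) else arguments
print(name, args)
```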

### Structured Data Extraction

Define schemas for structured output using Pydantic:

```python
from pydantic import BaseModel
from typing import List


class Song(BaseModel):
    """Data model for a song."""

    title: str
    length_mins: int


class Album(BaseModel):
    """Data model for an album."""

    name: str
    artist: str
    songs: List[Song]
```
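
These are ordinary Pydantic models, so it can help to construct one by hand first and confirm the schema validates the way you expect before wiring it into a program:

```python
# Manual construction purely as a sanity check of the schema.
album = Album(
    name="Test Album",
    artist="Test Artist",
    songs=[Song(title="Test Song", length_mins=3)],
)
print(album)
```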

Define the prompt template for extracting structured data:

```python
from llama_index.program.openai import OpenAIPydanticProgram

prompt_template_str = """\
Extract album and songs from the text provided.
For each song, make sure to specify the title and the length_mins.
{text}
"""

llm = LlamaAPI(api_key=api_key, temperature=0.0)

program = OpenAIPydanticProgram.from_defaults(
    output_cls=Album,
    llm=llm,
    prompt_template_str=prompt_template_str,
    verbose=True,
)
```

### Run Program to Get Structured Output

Execute the program to extract structured data from the provided text:

```python
output = program(
    text="""
    "Echoes of Eternity" is a compelling and thought-provoking album, skillfully crafted by the renowned artist, Seraphina Rivers. \
    This captivating musical collection takes listeners on an introspective journey, delving into the depths of the human experience \
    and the vastness of the universe. With her mesmerizing vocals and poignant songwriting, Seraphina Rivers infuses each track with \
    raw emotion and a sense of cosmic wonder. The album features several standout songs, including the hauntingly beautiful "Stardust \
    Serenade," a celestial ballad that lasts for six minutes, carrying listeners through a celestial dreamscape. "Eclipse of the Soul" \
    captivates with its enchanting melodies and spans over eight minutes, inviting introspection and contemplation. Another gem, "Infinity \
    Embrace," unfolds like a cosmic odyssey, lasting nearly ten minutes, drawing listeners deeper into its ethereal atmosphere. "Echoes of Eternity" \
    is a masterful testament to Seraphina Rivers' artistic prowess, leaving an enduring impact on all who embark on this musical voyage through \
    time and space.
    """
)
```

### Output Example

You can print the structured output like this:

```python
print(output)
```
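
Because the program was built with `output_cls=Album`, `output` should be an `Album` instance, so its fields can also be read directly:

```python
# Field values come from the LLM; this just shows how to access them.
print(output.name)
print(output.artist)
for song in output.songs:
    print(f"- {song.title}: {song.length_mins} min")
```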

### LLM Implementation Example

See the full example notebook in the LlamaIndex docs: https://docs.llamaindex.ai/en/stable/examples/llm/llama_api/

            
