llama-index-llms-llama-api

Name: llama-index-llms-llama-api
Version: 0.2.2
Summary: llama-index llms llama api integration
Home page: None
Upload time: 2024-10-08 22:30:05
Maintainer: None
Docs URL: None
Author: Your Name
Requires Python: <4.0,>=3.9
License: MIT
Keywords: None
Requirements: No requirements were recorded.
# LlamaIndex LLMs Integration: Llama API

## Prerequisites

1. **API Key**: Obtain an API key from [Llama API](https://www.llama-api.com/).
2. **Python 3.9+**: Ensure you have Python 3.9 or later (but below 4.0) installed on your system.

## Installation

1. Install the required Python packages:

   ```bash
   pip install llama-index-program-openai
   pip install llama-index-llms-llama-api
   pip install llama-index
   ```

## Basic Usage

### Import Required Libraries

```python
from llama_index.llms.llama_api import LlamaAPI
from llama_index.core.llms import ChatMessage
```

### Initialize LlamaAPI

Set up the API key:

```python
api_key = "LL-your-key"
llm = LlamaAPI(api_key=api_key)
```
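
If you prefer not to hard-code the key, it can be read from an environment variable instead. The variable name below is only an illustration; use whatever name you export in your shell:

```python
import os

# Assumes the key was exported beforehand, e.g. `export LLAMA_API_KEY=LL-your-key`.
# The variable name is an example, not something the package requires.
api_key = os.environ.get("LLAMA_API_KEY")
llm = LlamaAPI(api_key=api_key)
```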

### Complete with a Prompt

Generate a response using a prompt:

```python
resp = llm.complete("Paul Graham is ")
print(resp)
```
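
The returned object is a `CompletionResponse`; printing it shows the generated text, which is also available directly as an attribute. A minimal sketch:

```python
# resp.text holds the raw completion string;
# resp.additional_kwargs carries any extra fields returned by the API.
print(resp.text)
```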

### Chat with a List of Messages

Interact with the model using a chat interface:

```python
messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name?"),
]
resp = llm.chat(messages)
print(resp)
```
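
The chat call returns a `ChatResponse`; if you want to work with the reply programmatically rather than just print it, the assistant's message is carried on the `message` field:

```python
# resp.message is a ChatMessage; its content holds the assistant's reply text.
print(resp.message.role)
print(resp.message.content)
```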

### Function Calling

Define a function using Pydantic and call it through LlamaAPI:

```python
from pydantic import BaseModel
from llama_index.core.llms.openai_utils import to_openai_function


class Song(BaseModel):
    """A song with name and artist"""

    name: str
    artist: str


song_fn = to_openai_function(Song)
response = llm.complete("Generate a song", functions=[song_fn])
function_call = response.additional_kwargs["function_call"]
print(function_call)
```
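
The `function_call` dictionary typically mirrors the OpenAI convention of a function `name` plus JSON-encoded `arguments`, which can then be validated back into the Pydantic model. This is a sketch under that assumption; the exact shape of the response may vary:

```python
import json

# Assumption: function_call looks like {"name": "Song", "arguments": "<JSON string>"}.
args = function_call.get("arguments", {})
if isinstance(args, str):  # arguments may arrive as a JSON-encoded string
    args = json.loads(args)

song = Song(**args)
print(song.name, "-", song.artist)
```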

### Structured Data Extraction

Define schemas for structured output using Pydantic:

```python
from pydantic import BaseModel
from typing import List


class Song(BaseModel):
    """Data model for a song."""

    title: str
    length_mins: int


class Album(BaseModel):
    """Data model for an album."""

    name: str
    artist: str
    songs: List[Song]
```
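
These models can be instantiated directly, which is a quick way to sanity-check the schema before wiring it into a program (the values here are taken from the sample text used below):

```python
album = Album(
    name="Echoes of Eternity",
    artist="Seraphina Rivers",
    songs=[Song(title="Stardust Serenade", length_mins=6)],
)
print(album)
```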

Define the prompt template for extracting structured data:

```python
from llama_index.program.openai import OpenAIPydanticProgram

prompt_template_str = """\
Extract album and songs from the text provided.
For each song, make sure to specify the title and the length_mins.
{text}
"""

llm = LlamaAPI(api_key=api_key, temperature=0.0)

program = OpenAIPydanticProgram.from_defaults(
    output_cls=Album,
    llm=llm,
    prompt_template_str=prompt_template_str,
    verbose=True,
)
```

### Run Program to Get Structured Output

Execute the program to extract structured data from the provided text:

```python
output = program(
    text="""
    "Echoes of Eternity" is a compelling and thought-provoking album, skillfully crafted by the renowned artist, Seraphina Rivers. \
    This captivating musical collection takes listeners on an introspective journey, delving into the depths of the human experience \
    and the vastness of the universe. With her mesmerizing vocals and poignant songwriting, Seraphina Rivers infuses each track with \
    raw emotion and a sense of cosmic wonder. The album features several standout songs, including the hauntingly beautiful "Stardust \
    Serenade," a celestial ballad that lasts for six minutes, carrying listeners through a celestial dreamscape. "Eclipse of the Soul" \
    captivates with its enchanting melodies and spans over eight minutes, inviting introspection and contemplation. Another gem, "Infinity \
    Embrace," unfolds like a cosmic odyssey, lasting nearly ten minutes, drawing listeners deeper into its ethereal atmosphere. "Echoes of Eternity" \
    is a masterful testament to Seraphina Rivers' artistic prowess, leaving an enduring impact on all who embark on this musical voyage through \
    time and space.
    """
)
```

### Output Example

You can print the structured output like this:

```python
print(output)
```
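
Because `output` is an `Album` instance, its fields can also be accessed like any other Pydantic model:

```python
# Iterate over the extracted songs and print a short summary.
print(f"{output.name} by {output.artist}")
for song in output.songs:
    print(f"- {song.title} ({song.length_mins} mins)")
```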

### LLM Implementation Example

See the full notebook example in the LlamaIndex documentation:
https://docs.llamaindex.ai/en/stable/examples/llm/llama_api/

            
