llama-index-llms-anthropic


Name: llama-index-llms-anthropic
Version: 0.6.3
Summary: llama-index llms anthropic integration
Upload time: 2024-12-17 21:41:16
Author: Your Name
Requires Python: <4.0,>=3.9
License: MIT
# LlamaIndex LLM Integration: Anthropic

Anthropic is an AI research company focused on developing advanced language models, notably the Claude series. Their flagship model, Claude, is designed to generate human-like text while prioritizing safety and alignment with human intentions. Anthropic aims to create AI systems that are not only powerful but also responsible, addressing potential risks associated with artificial intelligence.

### Installation

```sh
pip install llama-index-llms-anthropic llama-index
```

### Set Tokenizer

First we want to set the tokenizer, which is slightly different from tiktoken. Note that the Claude 3 tokenizer has not been updated yet; using the existing Anthropic tokenizer leads to context overflow errors at 200k tokens, so the max tokens for Claude 3 has been temporarily set to 180k.

### Basic Usage

```py
import os

from llama_index.llms.anthropic import Anthropic
from llama_index.core import Settings

os.environ["ANTHROPIC_API_KEY"] = "YOUR ANTHROPIC API KEY"

# Set the global tokenizer to Anthropic's
tokenizer = Anthropic().tokenizer
Settings.tokenizer = tokenizer

# To customize your API key, pass it explicitly;
# otherwise it is looked up from the ANTHROPIC_API_KEY env variable
# llm = Anthropic(api_key="<api_key>")
llm = Anthropic(model="claude-3-opus-20240229")

# Call complete with a prompt
resp = llm.complete("Paul Graham is ")
print(resp)

# Sample response:
# Paul Graham is a well-known entrepreneur, programmer, venture capitalist, and essayist.
# He is best known for co-founding Viaweb, one of the first web application companies, which
# was later sold to Yahoo! in 1998 and became Yahoo! Store. Graham is also the co-founder of
# Y Combinator, a highly successful startup accelerator that has helped launch numerous
# successful companies, such as Dropbox, Airbnb, and Reddit.
```

### Using an Anthropic model through Vertex AI

```py
import os

os.environ["ANTHROPIC_PROJECT_ID"] = "YOUR PROJECT ID HERE"
os.environ["ANTHROPIC_REGION"] = "YOUR PROJECT REGION HERE"
# Set region and project_id to make Anthropic use the Vertex AI client

llm = Anthropic(
    model="claude-3-5-sonnet@20240620",
    region=os.getenv("ANTHROPIC_REGION"),
    project_id=os.getenv("ANTHROPIC_PROJECT_ID"),
)

resp = llm.complete("Paul Graham is ")
print(resp)
```

### Chat example with a list of messages

```py
from llama_index.core.llms import ChatMessage
from llama_index.llms.anthropic import Anthropic

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="Tell me a story"),
]
resp = Anthropic(model="claude-3-opus-20240229").chat(messages)
print(resp)
```

### Streaming example

```py
from llama_index.llms.anthropic import Anthropic

llm = Anthropic(model="claude-3-opus-20240229", max_tokens=100)
resp = llm.stream_complete("Paul Graham is ")
for r in resp:
    print(r.delta, end="")
```

### Chat streaming with pirate story

```py
from llama_index.core.llms import ChatMessage
from llama_index.llms.anthropic import Anthropic

llm = Anthropic(model="claude-3-opus-20240229")
messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="Tell me a story"),
]
resp = llm.stream_chat(messages)
for r in resp:
    print(r.delta, end="")
```

### Configure Model

```py
from llama_index.llms.anthropic import Anthropic

llm = Anthropic(model="claude-3-sonnet-20240229")
resp = llm.stream_complete("Paul Graham is ")
for r in resp:
    print(r.delta, end="")
```

### Async completion

```py
from llama_index.llms.anthropic import Anthropic

llm = Anthropic("claude-3-sonnet-20240229")
resp = await llm.acomplete("Paul Graham is ")
print(resp)
```
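Note that a bare `await` like the one above only works at the top level of a notebook or another async context. In a plain script, wrap the call in `asyncio.run`. A minimal sketch of that pattern follows, with a stand-in coroutine in place of the real `llm.acomplete` call (which requires an API key); in a real script the stub would be `await Anthropic(model="claude-3-sonnet-20240229").acomplete(prompt)`:

```python
import asyncio


# Stand-in for llm.acomplete, used here so the sketch runs without an API key
async def acomplete_stub(prompt: str) -> str:
    await asyncio.sleep(0)  # simulate an async round trip
    return prompt + "a programmer and essayist."


async def main() -> str:
    resp = await acomplete_stub("Paul Graham is ")
    print(resp)
    return resp


result = asyncio.run(main())
```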

### Structured Prediction Example

```py
from llama_index.llms.anthropic import Anthropic
from llama_index.core.prompts import PromptTemplate
from llama_index.core.bridge.pydantic import BaseModel
from typing import List


class MenuItem(BaseModel):
    """A menu item in a restaurant."""

    course_name: str
    is_vegetarian: bool


class Restaurant(BaseModel):
    """A restaurant with name, city, and cuisine."""

    name: str
    city: str
    cuisine: str
    menu_items: List[MenuItem]


llm = Anthropic("claude-3-5-sonnet-20240620")
prompt_tmpl = PromptTemplate(
    "Generate a restaurant in a given city {city_name}"
)

# Option 1: Use `as_structured_llm`
restaurant_obj = (
    llm.as_structured_llm(Restaurant)
    .complete(prompt_tmpl.format(city_name="Miami"))
    .raw
)
print(restaurant_obj)

# Option 2: Use `structured_predict`
# restaurant_obj = llm.structured_predict(Restaurant, prompt_tmpl, city_name="Miami")

# Streaming Structured Prediction
from llama_index.core.llms import ChatMessage
from IPython.display import clear_output
from pprint import pprint

input_msg = ChatMessage.from_str("Generate a restaurant in San Francisco")

sllm = llm.as_structured_llm(Restaurant)
stream_output = sllm.stream_chat([input_msg])
for partial_output in stream_output:
    clear_output(wait=True)
    pprint(partial_output.raw.dict())
```
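The objects returned by `as_structured_llm` and `structured_predict` are instances of the Pydantic class you pass in, so everything downstream is ordinary Pydantic. The sketch below illustrates this with plain Pydantic, validating a hypothetical payload (invented here for illustration, shaped like the model's structured output) into the same `Restaurant` class:

```python
from typing import List

from pydantic import BaseModel


class MenuItem(BaseModel):
    """A menu item in a restaurant."""

    course_name: str
    is_vegetarian: bool


class Restaurant(BaseModel):
    """A restaurant with name, city, and cuisine."""

    name: str
    city: str
    cuisine: str
    menu_items: List[MenuItem]


# A hypothetical payload, shaped like the LLM's structured output
payload = {
    "name": "Sunset Grill",
    "city": "Miami",
    "cuisine": "Cuban",
    "menu_items": [{"course_name": "Ropa Vieja", "is_vegetarian": False}],
}

# Nested dicts are coerced into MenuItem instances during validation
restaurant = Restaurant(**payload)
print(restaurant.name, "-", restaurant.menu_items[0].course_name)
```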

### LLM Implementation example

https://docs.llamaindex.ai/en/stable/examples/llm/anthropic/
