# msglm
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->
### Installation
Install the latest version from PyPI:
``` sh
$ pip install msglm
```
## Usage
To use an LLM we need to structure our messages in a particular format.
Here’s an example of a text chat from the OpenAI docs.
``` python
from openai import OpenAI
client = OpenAI()
completion = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "user", "content": "What's the Wild Atlantic Way?"}
    ]
)
```
Generating the correct format for a particular API can get tedious. The
goal of *msglm* is to make it easier.
The examples below will show you how to use *msglm* for text and image
chats with OpenAI and Anthropic.
### Text Chats
For a text chat, simply pass a list of strings and the API format
(e.g. “openai”) to **mk_msgs**, and it will generate the correct format
(strings alternate between the user and assistant roles).
``` python
mk_msgs(["Hello, world!", "some assistant response"], api="openai")
```
``` js
[
{"role": "user", "content": "Hello, world!"},
{"role": "assistant", "content": "Some assistant response"}
]
```
#### anthropic
``` python
from msglm import mk_msgs_anthropic as mk_msgs
from anthropic import Anthropic
client = Anthropic()
r = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    messages=mk_msgs(["Hello, world!", "some LLM response"])
)
print(r.content[0].text)
```
#### openai
``` python
from msglm import mk_msgs_openai as mk_msgs
from openai import OpenAI
client = OpenAI()
r = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=mk_msgs(["Hello, world!", "some LLM response"])
)
print(r.choices[0].message.content)
```
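Because *mk_msgs* assigns roles by position (even indexes become user
messages, odd indexes become assistant messages), you can continue a
conversation by growing a single history list. A minimal sketch reusing
the client above; the `history` variable is our own naming, not part of
*msglm*:

``` python
# Sketch: continue the chat by appending each reply and the next
# question to an alternating history list (even = user, odd = assistant).
history = ["Hello, world!"]
r = client.chat.completions.create(model="gpt-4o-mini", messages=mk_msgs(history))
history += [r.choices[0].message.content, "Tell me more"]
r = client.chat.completions.create(model="gpt-4o-mini", messages=mk_msgs(history))
print(r.choices[0].message.content)
```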
### Image Chats
For an image chat, simply pass the raw image bytes in a list with your
question to *mk_msg* and it will generate the correct format.
``` python
mk_msg([img, "What's in this image?"], api="anthropic")
```
``` js
[
    {
        "role": "user",
        "content": [
            {"type": "image", "source": {"type": "base64", "media_type": media_type, "data": img}},
            {"type": "text", "text": "What's in this image?"}
        ]
    }
]
```
#### anthropic
``` python
import httpx
from msglm import mk_msg_anthropic as mk_msg
from anthropic import Anthropic
client = Anthropic()
img_url = "https://www.atshq.org/wp-content/uploads/2022/07/shutterstock_1626122512.jpg"
img = httpx.get(img_url).content
r = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    messages=[mk_msg([img, "Describe the image"])]
)
print(r.content[0].text)
```
#### openai
``` python
import httpx
from msglm import mk_msg_openai as mk_msg
from openai import OpenAI
img_url = "https://www.atshq.org/wp-content/uploads/2022/07/shutterstock_1626122512.jpg"
img = httpx.get(img_url).content
client = OpenAI()
r = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[mk_msg([img, "Describe the image"])]
)
print(r.choices[0].message.content)
```
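Downloading isn’t required: *mk_msg* only needs the raw bytes, so a
local file works the same way. A minimal sketch (the file path is a
hypothetical placeholder):

``` python
# Sketch: read image bytes from disk instead of over HTTP.
# "photo.jpg" is a placeholder path, not part of msglm.
from pathlib import Path

img = Path("photo.jpg").read_bytes()
msg = mk_msg([img, "Describe the image"])
```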
### API Wrappers
To make life a little easier, *msglm* comes with API-specific wrappers for
[`mk_msg`](https://AnswerDotAI.github.io/msglm/core.html#mk_msg) and
[`mk_msgs`](https://AnswerDotAI.github.io/msglm/core.html#mk_msgs).
For Anthropic use
``` python
from msglm import mk_msg_anthropic as mk_msg, mk_msgs_anthropic as mk_msgs
```
For OpenAI use
``` python
from msglm import mk_msg_openai as mk_msg, mk_msgs_openai as mk_msgs
```
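The wrappers appear to simply pin the `api` argument of the generic
functions, so the two forms below should build the same message. A
sketch based on the generic `api=` usage shown earlier (the equality
check is ours, not from the msglm docs):

``` python
# Sketch: wrapper vs. generic call with api="anthropic".
from msglm import mk_msg, mk_msg_anthropic

assert mk_msg("Hello, world!", api="anthropic") == mk_msg_anthropic("Hello, world!")
```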
### Other use-cases
#### Prompt Caching
*msglm* supports [prompt
caching](https://docs.anthropic.com/en/docs/build-with-claude/prompt-caching)
for Anthropic models. Simply pass *cache=True* to *mk_msg* or *mk_msgs*.
``` python
from msglm import mk_msg_anthropic as mk_msg
mk_msg("please cache my message", cache=True)
```
This generates the cache block below:
``` js
{
"role": "user",
"content": [
{"type": "text", "text": "Please cache my message", "cache_control": {"type": "ephemeral"}}
]
}
```
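Since *cache=True* is accepted by *mk_msgs* as well, the plural form
works the same way. A short sketch:

``` python
# Sketch: cache=True with mk_msgs (Anthropic only).
from msglm import mk_msgs_anthropic as mk_msgs

mk_msgs(["please cache my message", "some assistant response"], cache=True)
```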
#### PDF chats
*msglm* offers PDF
[support](https://docs.anthropic.com/en/docs/build-with-claude/pdf-support)
for Anthropic. Just like an image chat, all you need to do is pass the
raw PDF bytes in a list with your question to *mk_msg*, and it will
generate the correct format, as shown in the example below.
``` python
import httpx
from msglm import mk_msg_anthropic as mk_msg
from anthropic import Anthropic
client = Anthropic(default_headers={'anthropic-beta': 'pdfs-2024-09-25'})
url = "https://assets.anthropic.com/m/1cd9d098ac3e6467/original/Claude-3-Model-Card-October-Addendum.pdf"
pdf = httpx.get(url).content
r = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[mk_msg([pdf, "Which model has the highest human preference win rates across each use-case?"])]
)
print(r.content[0].text)
```
Note: this feature is currently in beta, so you’ll need to:
- use the Anthropic beta client
(e.g. `anthropic.Anthropic(default_headers={'anthropic-beta': 'pdfs-2024-09-25'})`)
- use the `claude-3-5-sonnet-20241022` model
### Summary
We hope *msglm* makes your life a little easier when chatting with
LLMs. To learn more about the package, please read the
[docs](https://answerdotai.github.io/msglm/).