# claudette
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->
*Claudette* is a wrapper for Anthropic’s [Python
SDK](https://github.com/anthropics/anthropic-sdk-python).
TODO: This README is incomplete.
## Install
``` sh
pip install claudette
```
## Getting started
Anthropic’s Python SDK will automatically be installed with Claudette,
if you don’t already have it.
You’ll need to set the `ANTHROPIC_API_KEY` environment variable to the
key provided by Anthropic.
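If you prefer, the key can also be set from Python before using the library (the value shown is a placeholder, not a real key):

``` python
import os

# Placeholder key for illustration only -- replace with your real key,
# or export ANTHROPIC_API_KEY in your shell before starting Python
os.environ.setdefault('ANTHROPIC_API_KEY', 'sk-ant-placeholder')
```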
``` python
import os
# os.environ['ANTHROPIC_LOG'] = 'debug'
```
To print every HTTP request and response in full, uncomment the above
line.
``` python
from claudette import *
```
Claudette only exports the symbols that are needed to use the library,
so you can use `import *` to import them. Alternatively, just use:
``` python
import claudette
```
…and then add the prefix `claudette.` to any usages of the module.
``` python
models
```
('claude-3-opus-20240229',
'claude-3-sonnet-20240229',
'claude-3-haiku-20240307')
These are the models currently available from the SDK.
``` python
model = models[-1]
```
For examples, we’ll use Haiku, since it’s fast and cheap (and
surprisingly good!).
## Chat
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### Chat
> Chat (model:Optional[str]=None, cli:Optional[claudette.core.Client]=None,
> sp='', tools:Optional[list]=None)
*Anthropic chat client.*
| | **Type** | **Default** | **Details** |
|-------|----------|-------------|------------------------------------------------|
| model | Optional | None | Model to use (leave empty if passing `cli`) |
| cli | Optional | None | Client to use (leave empty if passing `model`) |
| sp | str | | Optional system prompt |
| tools | Optional | None | List of tools to make available to Claude |
``` python
chat = Chat(model, sp="You are a helpful assistant.")
```
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### Chat.\_\_call\_\_
> Chat.__call__ (pr, sp='', temp=0, maxtok=4096,
> stop:Optional[list[str]]=None,
> ns:Optional[collections.abc.Mapping]=None, prefill='',
> **kw)
*Add prompt `pr` to dialog and get a response from Claude*
| | **Type** | **Default** | **Details** |
|---------|----------|-------------|-------------------------------------------------------------|
| pr | | | Prompt / message |
| sp | str | | The system prompt |
| temp | int | 0 | Temperature |
| maxtok | int | 4096 | Maximum tokens |
| stop | Optional | None | Stop sequences |
| ns | Optional | None | Namespace to search for tools, defaults to `globals()` |
| prefill | str | | Optional prefill to pass to Claude as start of its response |
| kw | | | |
``` python
chat("I'm Jeremy")
contents(chat("What's my name?"))
```
'Your name is Jeremy, as you told me earlier.'
Claude supports adding an extra `assistant` message at the end, which
contains the *prefill* – i.e. the text we want Claude to assume the
response starts with.
Let’s try it out:
``` python
q = "Concisely, what is the meaning of life?"
pref = 'According to Douglas Adams,'
chat(q, prefill=pref)
```
According to Douglas Adams, “The answer to the ultimate question of
life, the universe, and everything is 42.”
<details>
- id: msg_011BL35YKAgwg8UR7nKjM1p2
- content: \[{‘text’: ‘According to Douglas Adams, “The answer to the
ultimate question of life, the universe, and everything is 42.”’,
‘type’: ‘text’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 109, ‘output_tokens’: 23}
</details>
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### Chat.stream
> Chat.stream (pr, sp='', temp=0, maxtok=4096,
> stop:Optional[list[str]]=None, prefill='', **kw)
*Add prompt `pr` to dialog and stream the response from Claude*
| | **Type** | **Default** | **Details** |
|---------|----------|-------------|-------------------------------------------------------------|
| pr | | | Prompt / message |
| sp | str | | The system prompt |
| temp | int | 0 | Temperature |
| maxtok | int | 4096 | Maximum tokens |
| stop | Optional | None | Stop sequences |
| prefill | str | | Optional prefill to pass to Claude as start of its response |
| kw | | | |
``` python
for o in chat.stream("And what is the question?"): print(o, end='')
```
Unfortunately, the book never explicitly states what the "ultimate question" is that corresponds to the answer of 42. That remains a mystery in the Hitchhiker's Guide to the Galaxy series. The meaning of life is left open to interpretation.
### Tool use
``` python
sp = "If asked to add things up, use the `sums` function instead of doing it yourself. Never mention what tools you use."
```
We automagically get streamlined tool use as well:
``` python
pr = f"What is {a}+{b}?"
pr
```
'What is 604542+6458932?'
``` python
chat = Chat(model, sp=sp, tools=[sums])
r = chat(pr)
r
```
ToolUseBlock(id=‘toolu_018m6yuZwQtn7xZozny37CrZ’, input={‘a’: 604542,
‘b’: 6458932}, name=‘sums’, type=‘tool_use’)
<details>
- id: msg_01MSiGKYedwdpr41VciqydB7
- content: \[{‘id’: ‘toolu_018m6yuZwQtn7xZozny37CrZ’, ‘input’: {‘a’:
604542, ‘b’: 6458932}, ‘name’: ‘sums’, ‘type’: ‘tool_use’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: tool_use
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 418, ‘output_tokens’: 72}
</details>
``` python
chat(r)
```
The sum of 604542 and 6458932 is 7063474.
<details>
- id: msg_016NBFCx5L3HMvY5kwVDdjDE
- content: \[{‘text’: ‘The sum of 604542 and 6458932 is 7063474.’,
‘type’: ‘text’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 505, ‘output_tokens’: 23}
</details>
It should be correct, because it actually used our Python function to do
the addition. Let’s check:
``` python
a+b
```
7063474
## Images
Claude can handle image data as well. As everyone knows, when testing
image APIs you have to use a cute puppy.
``` python
# Image is Cute_dog.jpg from Wikimedia
fn = Path('puppy.jpg')
display.Image(filename=fn, width=200)
```
![](index_files/figure-commonmark/cell-18-output-1.jpeg)
``` python
img = fn.read_bytes()
```
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### img_msg
> img_msg (data:bytes)
*Convert image `data` into an encoded `dict`*
Anthropic have documented the particular `dict` structure they expect
image data to be in, so we have a little function to create that for us.
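The structure is roughly as follows (a sketch assuming JPEG data; `img_msg_sketch` is a hypothetical stand-in for the real helper, which may detect the media type):

``` python
import base64

def img_msg_sketch(data: bytes) -> dict:
    # Anthropic's API takes base64-encoded image bytes inside a 'source' dict;
    # the media type is hardcoded here purely for illustration
    return {'type': 'image',
            'source': {'type': 'base64',
                       'media_type': 'image/jpeg',
                       'data': base64.b64encode(data).decode()}}
```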
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### text_msg
> text_msg (s:str)
*Convert `s` to a text message*
A Claude message can be a list of image and text parts. So we’ve also
created a helper for making the text parts.
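A text part is just a small typed dict, as the `_mk_content` example below shows. A minimal sketch (`text_msg_sketch` is a hypothetical stand-in for the real helper):

``` python
def text_msg_sketch(s: str) -> dict:
    # A text part pairs the literal text with its type tag
    return {'type': 'text', 'text': s}
```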
``` python
q = "In brief, what color flowers are in this image?"
msg = mk_msg([img_msg(img), text_msg(q)])
```
``` python
c([msg])
```
The image contains purple and yellow daisy-like flowers, which appear to
be daisies or a similar type of flower.
<details>
- id: msg_01GSzzitXbvkzEJtfJquzSXE
- content: \[{‘text’: ‘The image contains purple and yellow daisy-like
flowers, which appear to be daisies or a similar type of flower.’,
‘type’: ‘text’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 1665, ‘output_tokens’: 29}
</details>
There’s no need to manually choose the type of message, since we figure
that out from the type of the source data.
``` python
_mk_content('Hi')
```
{'type': 'text', 'text': 'Hi'}
``` python
c([[img, q]])
```
The image contains purple and yellow daisy-like flowers, which appear to
be daisies or a similar type of flower.
<details>
- id: msg_01ArrMvaZoXa1JTjULMentQJ
- content: \[{‘text’: ‘The image contains purple and yellow daisy-like
flowers, which appear to be daisies or a similar type of flower.’,
‘type’: ‘text’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 1665, ‘output_tokens’: 29}
</details>
Claude also supports uploading an image without any text, in which case
it’ll make a general comment about what it sees. You can then use
[`Chat`](https://AnswerDotAI.github.io/claudette/core.html#chat) to ask
questions:
``` python
chat = Chat(model, sp=sp)
chat(img)
```
The image shows a cute puppy, likely a Cavalier King Charles Spaniel,
sitting in a grassy area surrounded by purple daisy flowers. The puppy
has a friendly, curious expression on its face as it gazes directly at
the camera. The contrast between the puppy’s soft, fluffy fur and the
vibrant flowers creates a charming and picturesque scene.
<details>
- id: msg_01535kuKhiN6Do5PTcTmTst7
- content: \[{‘text’: “The image shows a cute puppy, likely a Cavalier
King Charles Spaniel, sitting in a grassy area surrounded by purple
daisy flowers. The puppy has a friendly, curious expression on its
face as it gazes directly at the camera. The contrast between the
puppy’s soft, fluffy fur and the vibrant flowers creates a charming
and picturesque scene.”, ‘type’: ‘text’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 1681, ‘output_tokens’: 83}
</details>
``` python
chat('What direction is the puppy facing?')
```
The puppy in the image is facing the camera directly, looking straight
ahead with a curious expression.
<details>
- id: msg_01Ge4M4Z4J6ywg9V8cCXy2aN
- content: \[{‘text’: ‘The puppy in the image is facing the camera
directly, looking straight ahead with a curious expression.’, ‘type’:
‘text’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 1775, ‘output_tokens’: 23}
</details>
``` python
chat('What color is it?')
```
The puppy in the image has a combination of colors - it has a white and
brown/tan coat. The head and ears appear to be a reddish-brown color,
while the body is mostly white with some tan/brown patches.
<details>
- id: msg_01JbUH6MvqWMvkF8UJVjo33z
- content: \[{‘text’: ‘The puppy in the image has a combination of
colors - it has a white and brown/tan coat. The head and ears appear
to be a reddish-brown color, while the body is mostly white with some
tan/brown patches.’, ‘type’: ‘text’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 1806, ‘output_tokens’: 53}
</details>
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### mk_msg
> mk_msg (content, role='user', **kw)
*Helper to create a `dict` appropriate for a Claude message. `kw` are
added as key/value pairs to the message*
| | **Type** | **Default** | **Details** |
|---------|----------|-------------|----------------------------------------------------------------|
| content | | | A string, list, or dict containing the contents of the message |
| role | str | user | Must be ‘user’ or ‘assistant’ |
| kw | | | |
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### mk_msgs
> mk_msgs (msgs:list, **kw)
*Helper to set ‘assistant’ role on alternate messages.*
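Conceptually it does something like this (a simplified sketch; `mk_msgs_sketch` is a hypothetical stand-in for the real helper):

``` python
def mk_msgs_sketch(msgs):
    # Even indices become 'user' messages, odd indices 'assistant',
    # so a flat list of strings turns into an alternating dialog
    if isinstance(msgs, str): msgs = [msgs]
    return [{'role': 'assistant' if i % 2 else 'user', 'content': m}
            for i, m in enumerate(msgs)]
```

So a list like `["I'm Jeremy", "Hi Jeremy!", "What's my name?"]` becomes user/assistant/user messages.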
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### Client
> Client (model, cli=None)
*Basic Anthropic messages client.*
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### Client.\_\_call\_\_
> Client.__call__ (msgs:list, sp='', temp=0, maxtok=4096,
> stop:Optional[list[str]]=None, **kw)
*Make a call to Claude without streaming.*
| | **Type** | **Default** | **Details** |
|--------|----------|-------------|--------------------------------|
| msgs | list | | List of messages in the dialog |
| sp | str | | The system prompt |
| temp | int | 0 | Temperature |
| maxtok | int | 4096 | Maximum tokens |
| stop | Optional | None | Stop sequences |
| kw | | | |
Defining `__call__` lets us use an object like a function (i.e. it’s
*callable*). We use it as a small wrapper over `messages.create`.
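As a toy illustration of the callable-object pattern (nothing to do with the actual `Client` internals):

``` python
class Adder:
    "An object that behaves like a function once `__call__` is defined."
    def __init__(self, n): self.n = n
    def __call__(self, x): return x + self.n

add2 = Adder(2)
```

Now `add2(3)` returns `5`, just as if `add2` were a plain function.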
``` python
c('Hi')
```
Hello! How can I assist you today?
<details>
- id: msg_01Vr6t6QdodntSMvHthnRDBc
- content: \[{‘text’: ‘Hello! How can I assist you today?’, ‘type’:
‘text’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 8, ‘output_tokens’: 12}
</details>
``` python
c.use
```
In: 18; Out: 64; Total: 82
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### Client.stream
> Client.stream (msgs:list, sp='', temp=0, maxtok=4096,
> stop:Optional[list[str]]=None, **kw)
*Make a call to Claude, streaming the result.*
| | **Type** | **Default** | **Details** |
|--------|----------|-------------|--------------------------------|
| msgs | list | | List of messages in the dialog |
| sp | str | | The system prompt |
| temp | int | 0 | Temperature |
| maxtok | int | 4096 | Maximum tokens |
| stop | Optional | None | Stop sequences |
| kw | | | |
We also define a wrapper over `messages.stream`, which is like
`messages.create`, but streams the response back incrementally.
``` python
for o in c.stream('Hi'): print(o, end='')
```
Hello! How can I assist you today?
## Tool use
[Tool use](https://docs.anthropic.com/claude/docs/tool-use) lets Claude
use external tools.
We’ll use [docments](https://fastcore.fast.ai/docments.html) to make
defining Python functions as ergonomic as possible. Each parameter (and
the return value) should have a type, and a docments comment with the
description of what it is. As an example we’ll write a simple function
that adds numbers together:
``` python
def sums(
    # First thing to sum
    a:int,
    # Second thing to sum
    b:int=1
    # The sum of the inputs
) -> int:
    "Adds a + b."
    return a + b
```
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### get_schema
> get_schema (f:callable)
*Convert function `f` into a JSON schema `dict` for tool use.*
``` python
a,b = 604542,6458932
pr = f"What is {a}+{b}?"
sp = "You must use the `sums` function instead of adding yourself, but don't mention what tools you use."
tools=[get_schema(sums)]
```
We’ll start a dialog with Claude now. We’ll store the messages of our
dialog in `msgs`. The first message will be our prompt `pr`, and we’ll
pass our `tools` schema.
``` python
msgs = mk_msgs(pr)
r = c(msgs, sp=sp, tools=tools)
r
```
ToolUseBlock(id=‘toolu_01CsuZfPAas75MkDABXAvjWD’, input={‘a’: 604542,
‘b’: 6458932}, name=‘sums’, type=‘tool_use’)
<details>
- id: msg_01StvQvvrnwaBtuUwHQLrpFt
- content: \[{‘id’: ‘toolu_01CsuZfPAas75MkDABXAvjWD’, ‘input’: {‘a’:
604542, ‘b’: 6458932}, ‘name’: ‘sums’, ‘type’: ‘tool_use’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: tool_use
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 414, ‘output_tokens’: 72}
</details>
When Claude decides that it should use a tool, it passes back a
`ToolUseBlock` with the name of the tool to call, and the params to use.
We need to append the response to the dialog so Claude knows what’s
happening (since it’s stateless).
``` python
msgs.append(mk_msg(r))
```
We don’t want to allow it to call just any possible function (that would
be a security disaster!) so we create a *namespace* – that is, a
dictionary of allowable function names to call.
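A plain mapping of names to functions will do; this is the `ns` used with `call_func` below (restating `sums` so the snippet stands alone):

``` python
# `sums` is the tool function defined in the previous section,
# restated here so this snippet runs on its own
def sums(a: int, b: int = 1) -> int: return a + b

# Only functions listed in the namespace may be called by tool-use requests
ns = {'sums': sums}
```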
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### call_func
> call_func (tr:collections.abc.Mapping,
> ns:Optional[collections.abc.Mapping]=None)
*Call the function in the tool response `tr`, using namespace `ns`.*
| | **Type** | **Default** | **Details** |
|-----|----------|-------------|--------------------------------------------------------|
| tr | Mapping | | Tool use request response from Claude |
| ns | Optional | None | Namespace to search for tools, defaults to `globals()` |
We can now use the function requested by Claude. We look it up in `ns`,
and pass in the provided parameters.
``` python
res = call_func(r, ns=ns)
res
```
7063474
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### mk_toolres
> mk_toolres (r:collections.abc.Mapping, res=None,
> ns:Optional[collections.abc.Mapping]=None)
*Create a `tool_result` message from response `r`.*
| | **Type** | **Default** | **Details** |
|-----|----------|-------------|----------------------------------------------------------------------------------------------------------------------------------------|
| r | Mapping | | Tool use request response from Claude |
| res | NoneType | None | The result of calling the tool (calculated with [`call_func`](https://AnswerDotAI.github.io/claudette/core.html#call_func) by default) |
| ns | Optional | None | Namespace to search for tools |
In order to tell Claude the result of the tool call, we pass back a
`tool_result` message, created with
[`mk_toolres`](https://AnswerDotAI.github.io/claudette/core.html#mk_toolres)
(which uses
[`call_func`](https://AnswerDotAI.github.io/claudette/core.html#call_func)
to compute the result if one isn’t provided).
``` python
tr = mk_toolres(r, res=res, ns=ns)
tr
```
{'role': 'user',
'content': [{'type': 'tool_result',
'tool_use_id': 'toolu_01CsuZfPAas75MkDABXAvjWD',
'content': '7063474'}]}
We add this to our dialog, and now Claude has all the information it
needs to answer our question.
``` python
msgs.append(tr)
contents(c(msgs, sp=sp, tools=tools))
```
'The sum of 604542 and 6458932 is 7063474.'
## XML helpers
Claude works well with XML inputs, but XML can be a bit clunky to work
with manually. Therefore, we create a couple of more streamlined
approaches for XML generation. You don’t need to use these if you don’t
find them useful – you can always just use plain strings for XML
directly.
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### xt
> xt (tag:str, c:Optional[list]=None, **kw)
*Helper to create appropriate data structure for
[`to_xml`](https://AnswerDotAI.github.io/claudette/core.html#to_xml).*
| | **Type** | **Default** | **Details** |
|-----|----------|-------------|--------------|
| tag | str | | XML tag name |
| c | Optional | None | Children |
| kw | | | |
An XML node contains a tag, optional children, and optional attributes.
[`xt`](https://AnswerDotAI.github.io/claudette/core.html#xt) creates a
tuple of these three things, which we will use to generate XML shortly.
Attributes are passed as kwargs; since these might conflict with
reserved words in Python, you can optionally add a `_` prefix and it’ll
be stripped off.
``` python
xt('x-custom', ['hi'], _class='bar')
```
('x-custom', ['hi'], {'class': 'bar'})
``` python
from claudette.core import div,img,h1,h2,p,hr,html
```
If you have to use a lot of tags of the same type, it’s convenient to
use `partial` to create specialised functions for them. Here, we’re
creating functions for some common HTML tags. Here’s an example of using
them:
``` python
a = html([
p('This is a paragraph'),
hr(),
img(src='http://example.prg'),
div([
h1('This is a header'),
h2('This is a sub-header', style='k:v'),
], _class='foo')
])
a
```
('html',
[('p', 'This is a paragraph', {}),
('hr', None, {}),
('img', None, {'src': 'http://example.prg'}),
('div',
[('h1', 'This is a header', {}),
('h2', 'This is a sub-header', {'style': 'k:v'})],
{'class': 'foo'})],
{})
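Such tag helpers can be built with `partial`; here’s a sketch (using a minimal `xt_sketch` stand-in, since the real functions ship in `claudette.core`):

``` python
from functools import partial

def xt_sketch(tag, c=None, **kw):
    # Minimal stand-in for `xt`: drop a single leading '_' from attribute names
    return tag, c, {(k[1:] if k.startswith('_') else k): v for k, v in kw.items()}

p = partial(xt_sketch, 'p')
div = partial(xt_sketch, 'div')
```

`p('This is a paragraph')` then yields `('p', 'This is a paragraph', {})`.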
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### hl_md
> hl_md (s, lang='xml')
*Syntax highlight `s` using `lang`.*
When we display XML in a notebook, it’s nice to highlight it, so we
create a function to simplify that:
``` python
hl_md('<test><xml foo="bar">a child</xml></test>')
```
``` xml
<test><xml foo="bar">a child</xml></test>
```
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### to_xml
> to_xml (node:tuple, hl=False)
*Convert `node` to an XML string.*
| | **Type** | **Default** | **Details** |
|------|----------|-------------|--------------------------------------------------------------------------------------|
| node | tuple | | XML structure in [`xt`](https://AnswerDotAI.github.io/claudette/core.html#xt) format |
| hl | bool | False | Syntax highlight response? |
Now we can convert that HTML data structure we created into XML:
``` python
to_xml(a, hl=True)
```
``` xml
<html>
<p>This is a paragraph</p>
<hr />
<img src="http://example.prg" />
<div class="foo">
<h1>This is a header</h1>
<h2 style="k:v">This is a sub-header</h2>
</div>
</html>
```
------------------------------------------------------------------------
<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>
### json_to_xml
> json_to_xml (d:dict, rnm:str)
*Convert `d` to XML.*
| | **Type** | **Details** |
|-------------|----------|----------------------------|
| d | dict | JSON dictionary to convert |
| rnm | str | Root name |
| **Returns** | **str** | |
JSON doesn’t map as nicely to XML as the data structure used in the
previous section, but for simple XML trees it can be convenient – for
example:
``` python
a = dict(surname='Howard', firstnames=['Jeremy','Peter'],
address=dict(state='Queensland',country='Australia'))
hl_md(json_to_xml(a, 'person'))
```
``` xml
<person>
<surname>Howard</surname>
<firstnames>
<item>Jeremy</item>
<item>Peter</item>
</firstnames>
<address>
<state>Queensland</state>
<country>Australia</country>
</address>
</person>
```
Raw data
{
"_id": null,
"home_page": "https://github.com/AnswerDotAI/claudette",
"name": "claudette",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.8",
"maintainer_email": null,
"keywords": "nbdev jupyter notebook python",
"author": "Jeremy Howard",
"author_email": "j@fast.ai",
"download_url": "https://files.pythonhosted.org/packages/d8/b8/bf57accf8a826793f96612be81ac7340d7661eb42da59e8f40df4ff40406/claudette-0.0.1.tar.gz",
"platform": null,
"description": "# claudette\n\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n*Claudette* is a wrapper for Anthropic\u2019s [Python\nSDK](https://github.com/anthropics/anthropic-sdk-python).\n\nTODO: This README is incomplete.\n\n## Install\n\n``` sh\npip install claudette\n```\n\n## Getting started\n\nAnthropic\u2019s Python SDK will automatically be installed with Claudette,\nif you don\u2019t already have it.\n\nYou\u2019ll need to set the `ANTHROPIC_API_KEY` environment variable to the\nkey provided by Anthropic.\n\n``` python\nimport os\n# os.environ['ANTHROPIC_LOG'] = 'debug'\n```\n\nTo print every HTTP request and response in full, uncomment the above\nline.\n\n``` python\nfrom claudette import *\n```\n\nClaudette only exports the symbols that are needed to use the library,\nso you can use `import *` to import them. Alternatively, just use:\n\n``` python\nimport claudette\n```\n\n\u2026and then add the prefix `claudette.` to any usages of the module.\n\n``` python\nmodels\n```\n\n ('claude-3-opus-20240229',\n 'claude-3-sonnet-20240229',\n 'claude-3-haiku-20240307')\n\nThese are the models currently available from the SDK.\n\n``` python\nmodel = models[-1]\n```\n\nFor examples, we\u2019ll use Haiku, since it\u2019s fast and cheap (and\nsurprisingly good!)\n\n## Chat\n\n------------------------------------------------------------------------\n\n<a\nhref=\"https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone\"\ntarget=\"_blank\" style=\"float:right; font-size:smaller\">source</a>\n\n### Chat\n\n> Chat (model:Optional[str]=None, cli:Optional[claudette.core.Client]=None,\n> sp='', tools:Optional[list]=None)\n\n*Anthropic chat client.*\n\n| | **Type** | **Default** | **Details** |\n|-------|----------|-------------|------------------------------------------------|\n| model | Optional | None | Model to use (leave empty if passing `cli`) |\n| cli | Optional | None | Client to use (leave empty if passing `model`) |\n| sp | str 
| | Optional system prompt |\n| tools | Optional | None | List of tools to make available to Claude |\n\n``` python\nchat = Chat(model, sp=\"You are a helpful assistant.\")\n```\n\n------------------------------------------------------------------------\n\n<a\nhref=\"https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone\"\ntarget=\"_blank\" style=\"float:right; font-size:smaller\">source</a>\n\n### Chat.\\_\\_call\\_\\_\n\n> Chat.__call__ (pr, sp='', temp=0, maxtok=4096,\n> stop:Optional[list[str]]=None,\n> ns:Optional[collections.abc.Mapping]=None, prefill='',\n> **kw)\n\n*Add prompt `pr` to dialog and get a response from Claude*\n\n| | **Type** | **Default** | **Details** |\n|---------|----------|-------------|-------------------------------------------------------------|\n| pr | | | Prompt / message |\n| sp | str | | The system prompt |\n| temp | int | 0 | Temperature |\n| maxtok | int | 4096 | Maximum tokens |\n| stop | Optional | None | Stop sequences |\n| ns | Optional | None | Namespace to search for tools, defaults to `globals()` |\n| prefill | str | | Optional prefill to pass to Claude as start of its response |\n| kw | | | |\n\n``` python\nchat(\"I'm Jeremy\")\ncontents(chat(\"What's my name?\"))\n```\n\n 'Your name is Jeremy, as you told me earlier.'\n\nClaude supports adding an extra `assistant` message at the end, which\ncontains the *prefill* \u2013 i.e.\u00a0the text we want Claude to assume the\nresponse starts with.\n\nLet\u2019s try it out:\n\n``` python\nq = \"Concisely, what is the meaning of life?\"\npref = 'According to Douglas Adams,'\nchat(q, prefill=pref)\n```\n\nAccording to Douglas Adams, \u201cThe answer to the ultimate question of\nlife, the universe, and everything is 42.\u201d\n\n<details>\n\n- id: msg_011BL35YKAgwg8UR7nKjM1p2\n- content: \\[{\u2018text\u2019: \u2018According to Douglas Adams, \u201cThe answer to the\n ultimate question of life, the universe, and everything is 42.\u201d\u2019,\n \u2018type\u2019: 
\u2018text\u2019}\\]\n- model: claude-3-haiku-20240307\n- role: assistant\n- stop_reason: end_turn\n- stop_sequence: None\n- type: message\n- usage: {\u2018input_tokens\u2019: 109, \u2018output_tokens\u2019: 23}\n\n</details>\n\n------------------------------------------------------------------------\n\n<a\nhref=\"https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone\"\ntarget=\"_blank\" style=\"float:right; font-size:smaller\">source</a>\n\n### Chat.stream\n\n> Chat.stream (pr, sp='', temp=0, maxtok=4096,\n> stop:Optional[list[str]]=None, prefill='', **kw)\n\n*Add prompt `pr` to dialog and stream the response from Claude*\n\n| | **Type** | **Default** | **Details** |\n|---------|----------|-------------|-------------------------------------------------------------|\n| pr | | | Prompt / message |\n| sp | str | | The system prompt |\n| temp | int | 0 | Temperature |\n| maxtok | int | 4096 | Maximum tokens |\n| stop | Optional | None | Stop sequences |\n| prefill | str | | Optional prefill to pass to Claude as start of its response |\n| kw | | | |\n\n``` python\nfor o in chat.stream(\"And what is the question?\"): print(o, end='')\n```\n\n Unfortunately, the book never explicitly states what the \"ultimate question\" is that corresponds to the answer of 42. That remains a mystery in the Hitchhiker's Guide to the Galaxy series. The meaning of life is left open to interpretation.\n\n### Tool use\n\n``` python\nsp = \"If asked to add things up, use the `sums` function instead of doing it yourself. 
Never mention what tools you use.\"\n```\n\nWe automagically get streamlined tool use as well:\n\n``` python\npr = f\"What is {a}+{b}?\"\npr\n```\n\n 'What is 604542+6458932?'\n\n``` python\nchat = Chat(model, sp=sp, tools=[sums])\nr = chat(pr)\nr\n```\n\nToolUseBlock(id=\u2018toolu_018m6yuZwQtn7xZozny37CrZ\u2019, input={\u2018a\u2019: 604542,\n\u2018b\u2019: 6458932}, name=\u2018sums\u2019, type=\u2018tool_use\u2019)\n\n<details>\n\n- id: msg_01MSiGKYedwdpr41VciqydB7\n- content: \\[{\u2018id\u2019: \u2018toolu_018m6yuZwQtn7xZozny37CrZ\u2019, \u2018input\u2019: {\u2018a\u2019:\n 604542, \u2018b\u2019: 6458932}, \u2018name\u2019: \u2018sums\u2019, \u2018type\u2019: \u2018tool_use\u2019}\\]\n- model: claude-3-haiku-20240307\n- role: assistant\n- stop_reason: tool_use\n- stop_sequence: None\n- type: message\n- usage: {\u2018input_tokens\u2019: 418, \u2018output_tokens\u2019: 72}\n\n</details>\n\n``` python\nchat(r)\n```\n\nThe sum of 604542 and 6458932 is 7063474.\n\n<details>\n\n- id: msg_016NBFCx5L3HMvY5kwVDdjDE\n- content: \\[{\u2018text\u2019: \u2018The sum of 604542 and 6458932 is 7063474.\u2019,\n \u2018type\u2019: \u2018text\u2019}\\]\n- model: claude-3-haiku-20240307\n- role: assistant\n- stop_reason: end_turn\n- stop_sequence: None\n- type: message\n- usage: {\u2018input_tokens\u2019: 505, \u2018output_tokens\u2019: 23}\n\n</details>\n\nIt should be correct, because it actually used our Python function to do\nthe addition. Let\u2019s check:\n\n``` python\na+b\n```\n\n 7063474\n\n## Images\n\nClaude can handle image data as well. 
As everyone knows, when testing\nimage APIs you have to use a cute puppy.\n\n``` python\n# Image is Cute_dog.jpg from Wikimedia\nfn = Path('puppy.jpg')\ndisplay.Image(filename=fn, width=200)\n```\n\n![](index_files/figure-commonmark/cell-18-output-1.jpeg)\n\n``` python\nimg = fn.read_bytes()\n```\n\n------------------------------------------------------------------------\n\n<a\nhref=\"https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone\"\ntarget=\"_blank\" style=\"float:right; font-size:smaller\">source</a>\n\n### img_msg\n\n> img_msg (data:bytes)\n\n*Convert image `data` into an encoded `dict`*\n\nAnthropic have documented the particular `dict` structure that expect\nimage data to be in, so we have a little function to create that for us.\n\n------------------------------------------------------------------------\n\n<a\nhref=\"https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone\"\ntarget=\"_blank\" style=\"float:right; font-size:smaller\">source</a>\n\n### text_msg\n\n> text_msg (s:str)\n\n*Convert `s` to a text message*\n\nA Claude message can be a list of image and text parts. 
So we’ve also created a helper for making the text parts.

``` python
q = "In brief, what color flowers are in this image?"
msg = mk_msg([img_msg(img), text_msg(q)])
```

``` python
c([msg])
```

The image contains purple and yellow daisy-like flowers, which appear to
be daisies or a similar type of flower.

<details>

- id: msg_01GSzzitXbvkzEJtfJquzSXE
- content: \[{‘text’: ‘The image contains purple and yellow daisy-like
  flowers, which appear to be daisies or a similar type of flower.’,
  ‘type’: ‘text’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 1665, ‘output_tokens’: 29}

</details>

There’s no need to manually choose the type of message, since we figure
that out from the source data.

``` python
_mk_content('Hi')
```

    {'type': 'text', 'text': 'Hi'}

``` python
c([[img, q]])
```

The image contains purple and yellow daisy-like flowers, which appear to
be daisies or a similar type of flower.

<details>

- id: msg_01ArrMvaZoXa1JTjULMentQJ
- content: \[{‘text’: ‘The image contains purple and yellow daisy-like
  flowers, which appear to be daisies or a similar type of flower.’,
  ‘type’: ‘text’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 1665, ‘output_tokens’: 29}

</details>

Claude also supports uploading an image without any text, in which case
it’ll make a general comment about what it sees.
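That kind of data-driven dispatch can be sketched as follows (illustrative only, not claudette’s actual `_mk_content`):

``` python
import base64

def mk_content_sketch(o):
    # Choose the content-block type from the data itself
    # (a sketch; claudette's real `_mk_content` differs in detail).
    if isinstance(o, dict):
        return o                                  # already a content block
    if isinstance(o, bytes):                      # raw image bytes
        return {"type": "image",
                "source": {"type": "base64", "media_type": "image/jpeg",
                           "data": base64.b64encode(o).decode()}}
    return {"type": "text", "text": str(o)}       # fall back to text

mk_content_sketch('Hi')  # {'type': 'text', 'text': 'Hi'}
```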
You can then use
[`Chat`](https://AnswerDotAI.github.io/claudette/core.html#chat) to ask
questions:

``` python
chat = Chat(model, sp=sp)
chat(img)
```

The image shows a cute puppy, likely a Cavalier King Charles Spaniel,
sitting in a grassy area surrounded by purple daisy flowers. The puppy
has a friendly, curious expression on its face as it gazes directly at
the camera. The contrast between the puppy’s soft, fluffy fur and the
vibrant flowers creates a charming and picturesque scene.

<details>

- id: msg_01535kuKhiN6Do5PTcTmTst7
- content: \[{‘text’: “The image shows a cute puppy, likely a Cavalier
  King Charles Spaniel, sitting in a grassy area surrounded by purple
  daisy flowers. The puppy has a friendly, curious expression on its
  face as it gazes directly at the camera. The contrast between the
  puppy’s soft, fluffy fur and the vibrant flowers creates a charming
  and picturesque scene.”, ‘type’: ‘text’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 1681, ‘output_tokens’: 83}

</details>

``` python
chat('What direction is the puppy facing?')
```

The puppy in the image is facing the camera directly, looking straight
ahead with a curious expression.

<details>

- id: msg_01Ge4M4Z4J6ywg9V8cCXy2aN
- content: \[{‘text’: ‘The puppy in the image is facing the camera
  directly, looking straight ahead with a curious expression.’, ‘type’:
  ‘text’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 1775, ‘output_tokens’: 23}

</details>

``` python
chat('What color is it?')
```

The puppy in the image has a combination of colors - it has a white and
brown/tan coat. The head and ears appear to be a reddish-brown color,
while the body is mostly white with some tan/brown patches.

<details>

- id: msg_01JbUH6MvqWMvkF8UJVjo33z
- content: \[{‘text’: ‘The puppy in the image has a combination of
  colors - it has a white and brown/tan coat. The head and ears appear
  to be a reddish-brown color, while the body is mostly white with some
  tan/brown patches.’, ‘type’: ‘text’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 1806, ‘output_tokens’: 53}

</details>

------------------------------------------------------------------------

<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>

### mk_msg

> mk_msg (content, role='user', **kw)

*Helper to create a `dict` appropriate for a Claude message. `kw` are
added as key/value pairs to the message*

|         | **Type** | **Default** | **Details**                                                    |
|---------|----------|-------------|----------------------------------------------------------------|
| content |          |             | A string, list, or dict containing the contents of the message |
| role    | str      | user        | Must be ‘user’ or ‘assistant’                                  |
| kw      |          |             |                                                                |

------------------------------------------------------------------------

<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>

### mk_msgs

> mk_msgs (msgs:list, **kw)

*Helper to set ‘assistant’ role on alternate messages.*

------------------------------------------------------------------------

<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>

### Client

> Client (model, cli=None)

*Basic Anthropic messages client.*

------------------------------------------------------------------------

<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>

### Client.\_\_call\_\_

> Client.__call__ (msgs:list, sp='', temp=0, maxtok=4096,
>                  stop:Optional[list[str]]=None, **kw)

*Make a call to Claude without streaming.*

|        | **Type** | **Default** | **Details**                    |
|--------|----------|-------------|--------------------------------|
| msgs   | list     |             | List of messages in the dialog |
| sp     | str      |             | The system prompt              |
| temp   | int      | 0           | Temperature                    |
| maxtok | int      | 4096        | Maximum tokens                 |
| stop   | Optional | None        | Stop sequences                 |
| kw     |          |             |                                |

Defining `__call__` lets us use an object like a function (i.e., it’s
*callable*). We use it as a small wrapper over `messages.create`.

``` python
c('Hi')
```

Hello! How can I assist you today?

<details>

- id: msg_01Vr6t6QdodntSMvHthnRDBc
- content: \[{‘text’: ‘Hello! How can I assist you today?’, ‘type’:
  ‘text’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 8, ‘output_tokens’: 12}

</details>

``` python
c.use
```

    In: 18; Out: 64; Total: 82

------------------------------------------------------------------------

<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>

### Client.stream

> Client.stream (msgs:list, sp='', temp=0, maxtok=4096,
>                stop:Optional[list[str]]=None, **kw)

*Make a call to Claude, streaming the result.*

|        | **Type** | **Default** | **Details**                    |
|--------|----------|-------------|--------------------------------|
| msgs   | list     |             | List of messages in the dialog |
| sp     | str      |             | The system prompt              |
| temp   | int      | 0           | Temperature                    |
| maxtok | int      | 4096        | Maximum tokens                 |
| stop   | Optional | None        | Stop sequences                 |
| kw     |          |             |                                |

We also define a wrapper over `messages.stream`, which is like
`messages.create`, but streams the response back incrementally.

``` python
for o in c.stream('Hi'): print(o, end='')
```

    Hello! How can I assist you today?

## Tool use

[Tool use](https://docs.anthropic.com/claude/docs/tool-use) lets Claude
use external tools.

We’ll use [docments](https://fastcore.fast.ai/docments.html) to make
defining Python functions as ergonomic as possible. Each parameter (and
the return value) should have a type, and a docments comment with the
description of what it is.
As an example we’ll write a simple function
that adds numbers together:

``` python
def sums(
    # First thing to sum
    a:int,
    # Second thing to sum
    b:int=1
# The sum of the inputs
) -> int:
    "Adds a + b."
    return a + b
```

------------------------------------------------------------------------

<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>

### get_schema

> get_schema (f:callable)

*Convert function `f` into a JSON schema `dict` for tool use.*

``` python
a,b = 604542,6458932
pr = f"What is {a}+{b}?"
sp = "You must use the `sums` function instead of adding yourself, but don't mention what tools you use."
tools=[get_schema(sums)]
```

We’ll start a dialog with Claude now. We’ll store the messages of our
dialog in `msgs`. The first message will be our prompt `pr`, and we’ll
pass our `tools` schema.

``` python
msgs = mk_msgs(pr)
r = c(msgs, sp=sp, tools=tools)
r
```

ToolUseBlock(id=‘toolu_01CsuZfPAas75MkDABXAvjWD’, input={‘a’: 604542,
‘b’: 6458932}, name=‘sums’, type=‘tool_use’)

<details>

- id: msg_01StvQvvrnwaBtuUwHQLrpFt
- content: \[{‘id’: ‘toolu_01CsuZfPAas75MkDABXAvjWD’, ‘input’: {‘a’:
  604542, ‘b’: 6458932}, ‘name’: ‘sums’, ‘type’: ‘tool_use’}\]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: tool_use
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 414, ‘output_tokens’: 72}

</details>

When Claude decides that it should use a tool, it passes back a
`ToolUseBlock` with the name of the tool to call, and the params to use.

We need to append the response to the dialog so Claude knows what’s
happening (since it’s stateless).

``` python
msgs.append(mk_msg(r))
```

We don’t want to allow it to call just any possible function (that would
be a security disaster!) so we create a *namespace* – that is, a
dictionary of allowable function names to call.

------------------------------------------------------------------------

<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>

### call_func

> call_func (tr:collections.abc.Mapping,
>            ns:Optional[collections.abc.Mapping]=None)

*Call the function in the tool response `tr`, using namespace `ns`.*

|     | **Type** | **Default** | **Details**                                            |
|-----|----------|-------------|--------------------------------------------------------|
| tr  | Mapping  |             | Tool use request response from Claude                  |
| ns  | Optional | None        | Namespace to search for tools, defaults to `globals()` |

We can now use the function requested by Claude. We look it up in `ns`,
and pass in the provided parameters.

``` python
res = call_func(r, ns=ns)
res
```

    7063474

------------------------------------------------------------------------

<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>

### mk_toolres

> mk_toolres (r:collections.abc.Mapping, res=None,
>             ns:Optional[collections.abc.Mapping]=None)

*Create a `tool_result` message from response `r`.*

|     | **Type** | **Default** | **Details**                                                                                                                            |
|-----|----------|-------------|----------------------------------------------------------------------------------------------------------------------------------------|
| r   | Mapping  |             | Tool use request response from Claude                                                                                                  |
| res | NoneType | None        | The result of calling the tool (calculated with [`call_func`](https://AnswerDotAI.github.io/claudette/core.html#call_func) by default) |
| ns  | Optional | None        | Namespace to search for tools                                                                                                          |

In order to tell Claude the result of the tool call, we pass back a
`tool_result` message, created with
[`mk_toolres`](https://AnswerDotAI.github.io/claudette/core.html#mk_toolres).

``` python
tr = mk_toolres(r, res=res, ns=ns)
tr
```

    {'role': 'user',
     'content': [{'type': 'tool_result',
       'tool_use_id': 'toolu_01CsuZfPAas75MkDABXAvjWD',
       'content': '7063474'}]}

We add this to our dialog, and now Claude has all the information it
needs to answer our question.

``` python
msgs.append(tr)
contents(c(msgs, sp=sp, tools=tools))
```

    'The sum of 604542 and 6458932 is 7063474.'

## XML helpers

Claude works well with XML inputs, but XML can be a bit clunky to work
with manually. Therefore, we create a couple of more streamlined
approaches for XML generation. You don’t need to use these if you don’t
find them useful – you can always just use plain strings for XML
directly.

------------------------------------------------------------------------

<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>

### xt

> xt (tag:str, c:Optional[list]=None, **kw)

*Helper to create appropriate data structure for
[`to_xml`](https://AnswerDotAI.github.io/claudette/core.html#to_xml).*

|     | **Type** | **Default** | **Details**  |
|-----|----------|-------------|--------------|
| tag | str      |             | XML tag name |
| c   | Optional | None        | Children     |
| kw  |          |             |              |

An XML node contains a tag, optional children, and optional attributes.
[`xt`](https://AnswerDotAI.github.io/claudette/core.html#xt) creates a
tuple of these three things, which we will use to generate XML shortly.
Attributes are passed as kwargs; since these might conflict with
reserved words in Python, you can optionally add a `_` prefix and it’ll
be stripped off.

``` python
xt('x-custom', ['hi'], _class='bar')
```
    ('x-custom', ['hi'], {'class': 'bar'})

``` python
from claudette.core import div,img,h1,h2,p,hr,html
```

If you have to use a lot of tags of the same type, it’s convenient to
use `partial` to create specialised functions for them. Here are
functions for some common HTML tags, and an example of using them:

``` python
a = html([
    p('This is a paragraph'),
    hr(),
    img(src='http://example.prg'),
    div([
        h1('This is a header'),
        h2('This is a sub-header', style='k:v'),
    ], _class='foo')
])
a
```

    ('html',
     [('p', 'This is a paragraph', {}),
      ('hr', None, {}),
      ('img', None, {'src': 'http://example.prg'}),
      ('div',
       [('h1', 'This is a header', {}),
        ('h2', 'This is a sub-header', {'style': 'k:v'})],
       {'class': 'foo'})],
     {})

------------------------------------------------------------------------

<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>

### hl_md

> hl_md (s, lang='xml')

*Syntax highlight `s` using `lang`.*

When we display XML in a notebook, it’s nice to highlight it, so we
create a function to simplify that:

``` python
hl_md('<test><xml foo="bar">a child</xml></test>')
```

``` xml
<test><xml foo="bar">a child</xml></test>
```

------------------------------------------------------------------------

<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>

### to_xml

> to_xml (node:tuple, hl=False)

*Convert `node` to an XML string.*

|      | **Type** | **Default** | **Details**                                                                          |
|------|----------|-------------|--------------------------------------------------------------------------------------|
| node | tuple    |             | XML structure in [`xt`](https://AnswerDotAI.github.io/claudette/core.html#xt) format |
| hl   | bool     | False       | Syntax highlight response?                                                           |

Now we can convert that HTML data structure we created into XML:

``` python
to_xml(a, hl=True)
```

``` xml
<html>
  <p>This is a paragraph</p>
  <hr />
  <img src="http://example.prg" />
  <div class="foo">
    <h1>This is a header</h1>
    <h2 style="k:v">This is a sub-header</h2>
  </div>
</html>
```

------------------------------------------------------------------------

<a
href="https://github.com/AnswerDotAI/claudette/blob/main/claudette/core.py#LNone"
target="_blank" style="float:right; font-size:smaller">source</a>

### json_to_xml

> json_to_xml (d:dict, rnm:str)

*Convert `d` to XML.*

|             | **Type** | **Details**                |
|-------------|----------|----------------------------|
| d           | dict     | JSON dictionary to convert |
| rnm         | str      | Root name                  |
| **Returns** | **str**  |                            |

JSON doesn’t map as nicely to XML as the data structure used in the
previous section, but for simple XML trees it can be convenient – for
example:

``` python
a = dict(surname='Howard', firstnames=['Jeremy','Peter'],
         address=dict(state='Queensland',country='Australia'))
hl_md(json_to_xml(a, 'person'))
```

``` xml
<person>
  <surname>Howard</surname>
  <firstnames>
    <item>Jeremy</item>
    <item>Peter</item>
  </firstnames>
  <address>
    <state>Queensland</state>
    <country>Australia</country>
  </address>
</person>
```
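For intuition about the `(tag, children, attrs)` tuples these helpers pass around, here is a minimal recursive renderer (a simplified sketch only; claudette’s actual `to_xml` also handles indentation and the `hl` option, and `render` is a hypothetical name):

``` python
def render(node):
    # `node` is a (tag, children, attrs) tuple as built by `xt`,
    # or a plain string leaf. No indentation or escaping here.
    if isinstance(node, str):
        return node
    tag, c, attrs = node
    a = ''.join(f' {k}="{v}"' for k, v in (attrs or {}).items())
    if c is None:
        return f"<{tag}{a} />"          # childless tag, self-closing
    kids = c if isinstance(c, list) else [c]
    return f"<{tag}{a}>" + ''.join(map(render, kids)) + f"</{tag}>"

render(('div', [('p', 'hi', {})], {'class': 'foo'}))
# '<div class="foo"><p>hi</p></div>'
```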