# schemafunc
Generates an OpenAI-compatible tool schema *automatically* from a Python function
signature.
The intended use case is for LLM tool usage and output constraints.
Future updates may add explicit support for other LLMs, though the current output
likely works with most as-is or can be adapted easily.
Supports Python 3.10+.
## Output constraints?
You don't have to *actually* want the LLM to "use a tool". You might just want it to
always return valid JSON in a specific format. "Function calling" or "tool usage"
turns out to be a great way to enforce that: create a function whose arguments match
the output you want. You never have to call the function; once you tell the LLM the
function is available, it will constrain its output to match the function's schema.
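For example, here is a minimal sketch of a constraint-only function (the name and
fields are purely illustrative):

```python
from schemafunc import add_schemafunc


@add_schemafunc
def report_sentiment(sentiment: str, confidence: float):
    """
    Record the sentiment of the user's message.

    :param sentiment: One of "positive", "negative", or "neutral".
    :param confidence: A confidence score between 0 and 1.
    """
    # Never actually called; it exists only so the LLM's output
    # must conform to this argument schema.
    ...
```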
## Why?
* Manually keeping the JSON description of a Python function up-to-date is
  error-prone. Even if you use something like `pydantic` to build and enforce the
  schema, you still have two sources of truth to keep in sync.
* It's tedious and irritating to have to write the same information twice.
* In my experience, writing a Python function is more ergonomic, natural, and less
  error-prone than writing a JSON schema by hand. Even if you use `pydantic` to
  create a model that mirrors the expected schema, mapping from a `BaseModel` to the
  kind of "tool call" that OpenAI and others expect is an awkward mental model.
## Key features
* **Automatic**: The schema is generated from the function.
* Add `@add_schemafunc` to your function and your schema is done.
* Tool schema available as a property of the function, so you can access it easily.
* `your_own_function.schemafunc.schema`
* Easy tool kwargs for the `openai` chat completions API.
* `your_own_function.schemafunc.openai_tool_kwargs`
* Use by unpacking the kwargs into the `openai` API call.
* Extracts the function description from the first line of the docstring.
* Extracts parameter descriptions from the docstring parameter list.
* Supports Numpy-style, Google-style, and RestructuredText-style docstrings.
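The examples below use reST-style `:param:` fields; a Google-style docstring should
be extracted the same way. A minimal sketch:

```python
from schemafunc import add_schemafunc


@add_schemafunc
def greet(name: str, excited: bool = False):
    """Build a greeting for the given name.

    Args:
        name: Who to greet.
        excited: Whether to end with an exclamation point.
    """
    return f"Hello, {name}{'!' if excited else '.'}"
```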
## Installation
```bash
pip install schemafunc
```
## Example
### Quick Example
```python
import openai
import json
from schemafunc import add_schemafunc


@add_schemafunc  # 🪄✨ MAGIC DECORATOR
def my_awesome_tool(foo: str, bar: int):
    """
    This is a really cool tool that does something amazing.

    :param foo: A string parameter.
    :param bar: An integer parameter.
    """
    return {"foo": foo, "bar": bar}


client = openai.Client()
messages = [{"role": "user", "content": "When baz happens, use my_awesome_tool."}]

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=messages,
    # 🪄✨ THE MAGIC HAPPENS HERE!
    **my_awesome_tool.schemafunc.openai_tool_kwargs
)
print(json.loads(response.choices[0].message.tool_calls[0].function.arguments))
# Example output:
# {
#     "foo": "baz",
#     "bar": 42
# }
```
## Detailed example
You want to add a Wikipedia-searching tool to your chatbot.
```python
from typing import List
from schemafunc import add_schemafunc


@add_schemafunc  # 🪄✨ MAGIC DECORATOR
def search_wikipedia(query: str, num_results: int = 5) -> List[str]:
    """
    Searches Wikipedia for the given query and returns the specified number of results.

    This will be a real function used in your code.

    :param query: The search query.
    :param num_results: The number of results to return (default: 5).
    :return: A list of search result summaries.
    """
    ...
```
Here's what the generated schema, available as `search_wikipedia.schemafunc.schema`, looks like:
```python
{
    "function": {
        "description": "Searches Wikipedia for the given query and returns the specified number of results.",
        "name": "search_wikipedia",
        "parameters": {
            "properties": {
                "num_results": {
                    "default": 5,
                    "description": "The number of results to return (default: 5).",
                    "type": "integer",
                },
                "query": {"description": "The search query.", "type": "string"},
            },
            "required": ["query"],
            "type": "object",
        },
    },
    "type": "function",
}
```
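Because this dict already has the shape of an entry in OpenAI's `tools` parameter,
unpacking `openai_tool_kwargs` should be roughly equivalent to passing the schema
yourself. A sketch under that assumption (the exact contents of `openai_tool_kwargs`
aren't documented here):

```python
# Roughly equivalent, passing the documented schema property by hand.
# Assumes `client` and `messages` are set up as in the quick example.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=messages,
    tools=[search_wikipedia.schemafunc.schema],
)
```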
However, there's not a lot of reason to see or interact with the schema. You only need
to pass it to the LLM. Here we use the `openai` package for interacting with GPT-3.5:
```python
from typing import Callable
import json
import openai

client = openai.Client()


def run_conversation(query: str, func: Callable):
    messages = [{"role": "user", "content": query}]

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=messages,
        # 🪄✨ THE MAGIC HAPPENS HERE!
        **func.schemafunc.openai_tool_kwargs
    )
    return json.loads(response.choices[0].message.tool_calls[0].function.arguments)
```
And then we can use it like this:
```python
arguments = run_conversation(
    "Search Wikipedia for that cool programming language with significant whitespace.",
    search_wikipedia
)
```
This gives you the arguments for the `search_wikipedia` call that the LLM decided to
make. Note how they line up with the `search_wikipedia` function signature:
```python
print(arguments)
# Example output:
# {
#     "query": "Python",
#     "num_results": 10
# }
```
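Because the returned arguments match the function signature, you can dispatch them
straight to the real implementation by unpacking (a sketch; `search_wikipedia` is the
stub defined above):

```python
# Call the actual function with the LLM-chosen arguments.
results = search_wikipedia(**arguments)
```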
## Contributing
### Quick Start
1. **Fork & Clone**: Fork the project, then clone your fork and switch to a new branch for your feature or fix.
   ```bash
   git clone https://github.com/your-username/schemafunc.git
   cd schemafunc
   git checkout -b your-feature-branch
   ```
2. **Set Up Environment**: Use Poetry to install dependencies and set up your development environment.
   ```bash
   poetry install
   ```
3. **Make Changes**: Implement your feature or fix. Remember to add or update tests and documentation as needed.
4. **Test Locally**: Run the tests to ensure everything works as expected.
   ```bash
   poetry run test
   ```
5. **Commit & Push**: Commit your changes with a clear message, then push them to your fork.
   ```bash
   git commit -am "Add a brief but descriptive commit message"
   git push origin your-feature-branch
   ```
6. **Pull Request**: Open a pull request from your branch to the main `schemafunc` repository. Describe your changes and their impact.
### Guidelines
- Keep commits concise and relevant.
- Include comments in your code where necessary.
- Follow the coding style and standards of the project.
For any questions or to discuss larger changes, please open an issue first.