# ai-exchange

- **Name**: ai-exchange
- **Version**: 0.9.8
- **Summary**: a uniform python SDK for message generation with LLMs
- **Upload time**: 2024-10-21 00:34:29
- **Requires Python**: >=3.10
<p align="center">
<a href="https://opensource.org/licenses/Apache-2.0"><img src="https://img.shields.io/badge/License-Apache_2.0-blue.svg"></a>
</p>

<p align="center">
  <a href="#example">Example</a> •
  <a href="#plugins">Plugins</a>
</p>

<p align="center"><strong>Exchange</strong> <em>- a uniform python SDK for message generation with LLMs</em></p>

- Provides a flexible layer for message handling and generation
- Directly integrates python functions into tool calling
- Persistently surfaces errors to the underlying models to support reflection

## Example

> [!NOTE]
> Before you can run this example, you need to set up an API key with
> `export OPENAI_API_KEY=your-key-here`

``` python
from exchange import Exchange, Message, Tool
from exchange.providers import OpenAiProvider

def word_count(text: str):
    """Get the count of words in text

    Args:
        text (str): The text with words to count
    """
    return len(text.split(" "))

ex = Exchange(
    provider=OpenAiProvider.from_env(),
    model="gpt-4o",
    system="You are a helpful assistant.",
    tools=[Tool.from_function(word_count)],
)
ex.add(Message.user("Count the number of words in this current message"))

# The model sees it has a word count tool, and should use it along the way to answer
# This will call all the tools as needed until the model replies with the final result
reply = ex.reply()
print(reply.text)

# you can see all the tool calls in the message history
print(ex.messages)
```
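Conceptually, `ex.reply()` loops: call the model, run any requested tools, feed the results back, and repeat until the model answers in plain text. The following is an illustrative stdlib-only sketch of that loop, not the actual *exchange* implementation; `FakeMessage`, `reply_loop`, and the scripted "model" are all stand-ins invented for this example:

``` python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class FakeMessage:
    text: str = ""
    tool_calls: list = field(default_factory=list)  # (tool_name, argument) pairs

def reply_loop(complete: Callable, tools: Dict[str, Callable],
               messages: List[FakeMessage]) -> FakeMessage:
    """Call the model, run any requested tools, repeat until a plain text reply."""
    while True:
        msg = complete(messages)
        messages.append(msg)
        if not msg.tool_calls:
            return msg
        for name, arg in msg.tool_calls:
            result = tools[name](arg)
            messages.append(FakeMessage(text=f"tool {name} returned {result}"))

# A scripted stand-in "model": first requests the tool, then answers with its result.
responses = iter([
    FakeMessage(tool_calls=[("word_count", "Count the number of words in this current message")]),
    FakeMessage(text="There are 9 words."),
])
history: List[FakeMessage] = []
final = reply_loop(lambda msgs: next(responses),
                   {"word_count": lambda t: len(t.split(" "))}, history)
print(final.text)    # There are 9 words.
print(len(history))  # 3: tool request, tool result, final reply
```

The real library adds schemas, serialization, and error reflection on top, but the control flow follows this shape.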

## Plugins

*exchange* has a plugin mechanism for adding support for additional providers and moderators.
If you need a provider that isn't supported here, we'd be happy to review contributions,
but you can also build and use your own plugin.

To create a `Provider` plugin, subclass `exchange.provider.Provider` and implement the
`complete` method. For example, this is the mock we use in our tests; you can see a full
implementation in the [OpenAiProvider][openaiprovider]. We also generally recommend
implementing a `from_env` classmethod to instantiate the provider.

``` python
from typing import List

from exchange import Message, Tool
from exchange.provider import Provider

class MockProvider(Provider):
    def __init__(self, sequence: List[Message]):
        # We'll use init to provide a preplanned reply sequence
        self.sequence = sequence
        self.call_count = 0

    def complete(
        self, model: str, system: str, messages: List[Message], tools: List[Tool]
    ) -> Message:
        # Return the next scripted message on each call
        output = self.sequence[self.call_count]
        self.call_count += 1
        return output
```
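For illustration, here is the same call pattern in isolation, with a trivial stand-in `Message` class so the snippet runs without *exchange* installed (in real tests you would use `exchange.Message` and subclass `Provider` as above):

``` python
from dataclasses import dataclass
from typing import List

@dataclass
class Message:  # stand-in for exchange.Message, just for this sketch
    text: str

class MockProvider:
    def __init__(self, sequence: List[Message]):
        self.sequence = sequence
        self.call_count = 0

    def complete(self, model: str, system: str,
                 messages: List[Message], tools: list) -> Message:
        # Each call returns the next preplanned reply
        output = self.sequence[self.call_count]
        self.call_count += 1
        return output

mock = MockProvider([Message("first reply"), Message("second reply")])
first = mock.complete("gpt-4o", "You are helpful.", [], [])
second = mock.complete("gpt-4o", "You are helpful.", [], [])
print(first.text, second.text, mock.call_count)  # first reply second reply 2
```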

Then use [Python packaging's entry points][plugins] to register your plugin.

``` toml
[project.entry-points.'exchange.provider']
example = 'path.to.plugin:ExampleProvider'
```

Your plugin will then be available in your application or other applications built on *exchange*
through:

``` python
from exchange.providers import get_provider

provider = get_provider('example').from_env()
```

[openaiprovider]: src/exchange/providers/openai.py
[plugins]: https://packaging.python.org/en/latest/guides/creating-and-discovering-plugins/

            
