interfaces-to

Name: interfaces-to
Version: 0.2.30
Summary: The quickest way to make Large Language Models do things.
Home page: https://interfaces.to
Author: Blair Hudson
Requires Python: <4.0,>=3.10
License: MIT
Keywords: llm-agent, openai
Upload time: 2024-08-10 06:37:26

<center><img src="https://github.com/interfaces-to/interfaces-to/raw/main/interfaces-to.png" alt="interfaces.to" width="250" /></center>


# Add a little action to your LLM adventure with 🐙 Interfaces

**🐙 Interfaces** (aka `into`) is the quickest way to make Large Language Models _do_ things. It's a simple, powerful and flexible way to build more useful, more engaging and more valuable agent-driven applications with LLMs.

## ✨ Key Features

⭐️ Built-in tools for common tasks and platforms ([see all](#-available-tools))<br/>
⭐️ Dynamic message sources for real-time interactions ([see all](#-experimental-dynamic-messages))<br/>
⭐️ Start building with ~~just 4~~ only 3(!) lines of code<br/>
⭐️ Create agents with system messages to control behaviour ([more info](#-setting-the-system-message))<br/>
⭐️ Beginner-friendly Python library, learn and teach coding with **🐙 Interfaces**!<br/>
⭐️ Simple and secure configuration<br/>
⭐️ Fully compatible with the OpenAI API SDK<br/>
⭐️ Works with gpt-4o, gpt-4o-mini and other OpenAI models<br/>
⭐️ Works with llama3.1, mistral-large and more via [Ollama](https://ollama.com/search?c=tools)<br/>
⭐️ Works with Azure OpenAI ([more info](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/switching-endpoints))<br/>
⭐️ Supports (thrives) on `parallel_tool_calls` ([more info](https://platform.openai.com/docs/guides/function-calling/parallel-function-calling)) <br/>
⭐️ Works with any other LLM applications and services that support the OpenAI API<br/>
⭐️ Runs on your local machine, in the cloud, or on the edge<br/>
⭐️ Run tools from the command line with the `into` CLI ([see all](#-experimental-cli-support))<br/>
⭐️ Extensible design for building custom tools ([example](https://github.com/interfaces-to/interfaces-to/blob/main/interfaces_to/tools/peopledatalabs.py)) and message sources ([example](https://github.com/interfaces-to/interfaces-to/blob/main/interfaces_to/messages/ngrok.py))<br/>
⭐️ Open-source, MIT licensed, and community-driven<br/>

## 🚀 Quick Start

### Installation

Install with pip:
```bash
pip install interfaces-to
```

or

Install with poetry:
```bash
poetry add interfaces-to
```

### Usage

Turn your OpenAI completion into a fully capable agent with 3 lines of code:

```python
# 1️⃣ import `into`
import interfaces_to as into

# 2️⃣ import the OpenAI client as normal
from openai import OpenAI
client = OpenAI()

# 3️⃣ add your favourite tools and set a message
agent = into.Agent().add_tools(['Slack','OpenAI']).add_messages("What was the last thing said in each slack channel? Write a 5 line poem to summarise and share it in an appropriate channel")

# 4️⃣ start the agent loop, with an OpenAI completion
while agent:
  agent.completion = client.chat.completions.create(
    model="gpt-4o",
    messages=agent.messages,
    tools=agent.tools,
    tool_choice="auto"
  )

# 5️⃣ watch the magic happen! 🎩✨
```

This prints the following output:
```
[user]		  What was the last thing said in each slack channel? Write a 5 line poem to summ
		    arise and share it in an appropriate channel

[assistant]     Calling 1 tool:
		    list_channels({})

[tool]	      Output of tool call list_channels({})
		      Channels: [{'id': 'C07EEUES770', 'name': 'general', 'is_channel':...

[assistant]	  Calling 7 tools:
                read_messages({"channel": "general"})
                read_messages({"channel": "hello"})
                read_messages({"channel": "fun-times"})
                read_messages({"channel": "poetry"})
                read_messages({"channel": "jokes"})
                read_messages({"channel": "welcome"})
                read_messages({"channel": "random"})

[tool]		  Output of tool call read_messages({"channel": "random"})
		    Messages: [{'subtype': 'channel_join', 'user': 'U07ET3LMDB7', ...

[tool]		  Output of tool call read_messages({"channel": "welcome"})
		    Error reading messages: not_in_channel

[tool]		  Output of tool call read_messages({"channel": "jokes"})
		    Messages: [{'user': 'U07EP9STUTU', 'type': 'message', 'ts': '1722477596.990159', ...

[tool]		  Output of tool call read_messages({"channel": "poetry"})
		    Messages: [{'user': 'U07EP9STUTU', 'type': 'message', 'ts':...

[tool]		  Output of tool call read_messages({"channel": "fun-times"})
		    Messages: [{'user': 'U07EP9STUTU', 'type': 'message', 'ts': ...

[tool]		  Output of tool call read_messages({"channel": "hello"})
		    Messages: [{'subtype': 'channel_join', 'user': 'U07EP9STUTU', 'text': '<@U07EP9STUTU> has joined the channel', 'type': 'message', 'ts': '1722421262.722359'}]

[tool]		  Output of tool call read_messages({"channel": "general"})
		    Messages: [{'user': 'U07EP9STUTU', 'type': 'message', 'ts': ...

[assistant]	  Here is a summary poem of the last messages shared in the Slack channels:
		
                In **#general**, they said with glee,
                "Atoms make up all that we see!"
                **#hello** greeted someone new,
                Joining the channel, the greetings flew.
                
                **#fun-times** loved a joke,
                On atoms that can't be broke.
                In the **#poetry** nook, atoms we praised,
                With playful rhymes, our spirits raised.
                
                **#jokes** was all about the spin,
                Electrons in a debate always win.
                And though **#welcome** remained hushed,
                **#random** laughed at atoms crushed.
                
                I'll share this poem in the **#poetry** channel.

[tool]		  Output of tool call send_slack_message({"channel":"poetry","message":"Here's a "...
                Message sent to poetry with timestamp 1722493789.651039: 
                Here's a summary poem of our last messages:
                
                In **#general**, they said with glee,
                "Atoms make up all that we see!"
                **#hello** greeted someone new,
                Joining the channel, the greetings flew.
                
                **#fun-times** loved a joke,
                On atoms that can't be broke.
                In the **#poetry** nook, atoms we praised,
                With playful rhymes, our spirits raised.
                
                **#jokes** was all about the spin,
                Electrons in a debate always win.
                And though **#welcome** remained hushed,
                **#random** laughed at atoms crushed.

[assistant]	  I have shared the poem summarizing the last messages in each channel
                to the **#poetry** channel.
```

`agent.messages` is also updated with the latest messages and retains the format needed by the OpenAI SDK, so you can continue the adventure and build more complex applications.
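Since `agent.messages` stays in the standard OpenAI chat format, continuing the conversation is just a matter of appending another message and re-entering the loop. A minimal plain-Python sketch of that idea (no library calls; the message contents are illustrative):

```python
# agent.messages keeps the standard OpenAI chat format: a list of
# dicts, each with "role" and "content" keys. After a run it might
# look something like this:
messages = [
    {"role": "user", "content": "What was the last thing said in each slack channel?"},
    {"role": "assistant", "content": "Here is a summary poem..."},
]

# To continue the adventure, append another user turn and pass the
# list back into the completion call, just as in the loop above.
messages.append({"role": "user", "content": "Now share it in another channel too"})
```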

You can run this example in [this Jupyter notebook](./quickstart.ipynb).

### Setting the system message

You can modify the behaviour of your agent by setting the system message.

```python
agent = into.Agent("Always talk like a pirate")
```

### Configuring tools

#### Using environment variables (Recommended for production)

Tools usually require a `token`. Tokens can always be configured by setting the relevant environment variables. e.g. for `Slack` you can set the `SLACK_BOT_TOKEN` environment variable.

If you are using environment variables, you can take advantage of `agent.add_tools` or the `into.import_tools` function to configure your tools automatically. Both will look for the relevant environment variables and configure the tools with default settings.

```python
agent.add_tools(['Slack'])
```

or

```python
tools = into.import_tools(['Slack'])
```

#### Using a `.env` file (Recommended for local development)

You can also configure your tools using a `.env` file. This is useful if you want to keep your tokens and other settings in a single file and helpful for local development.

Simply add a `.env` file in the root of your project with the following format:

```env
SLACK_BOT_TOKEN=xoxb-12345678-xxxxxxxxxx
```
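Conceptually, loading a `.env` file just means reading `KEY=VALUE` lines into the process environment before the tools look for their tokens. A self-contained sketch of that idea (`parse_env` is an illustrative helper written for this example, not part of `into`, which handles the loading for you):

```python
import os

def parse_env(text: str) -> dict:
    """Parse KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

# The .env file from the example above, loaded into the environment
env = parse_env("# Slack credentials\nSLACK_BOT_TOKEN=xoxb-12345678-xxxxxxxxxx\n")
os.environ.update(env)
```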

#### Setting tokens directly in code

If you prefer to set the token directly in your code or have more control over tool settings, you can do so by passing arguments to each tool. Tokens provided in code will override any environment variables.

You can optionally restrict `functions` to only those which you need.

Here's an example of configuring the Slack tool:

```python
tools = [*into.Slack(
    token="xoxb-12345678-xxxxxxxxxx",
    functions=["send_slack_message"]
)]
```

Note that each tool is preceded by an asterisk `*` to unpack the tool's functions into a list, which the OpenAI API SDK expects.
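To see why the `*` matters: each configured tool behaves like a sequence of function-calling schemas, and unpacking splices several tools into the single flat list the `tools` parameter expects. A self-contained illustration with simplified stand-in schemas (these dicts are placeholders, not the real schemas `into` produces):

```python
# Stand-ins for what two configured tools might yield: one
# function-calling schema per enabled function (simplified here).
slack_tool = [{"type": "function", "function": {"name": "send_slack_message"}}]
openai_tool = [{"type": "function", "function": {"name": "create_chat_completion"}}]

# Without *, you'd get a nested list of lists; with *, each tool's
# schemas are spliced into one flat list for the SDK's `tools` parameter.
tools = [*slack_tool, *openai_tool]
```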

## 📦 Available tools

`into` comes with loads of pre-built tools to help you get started quickly. These tools are designed to be simple, powerful and flexible, and can be used in any combination to create a wide range of applications.

| Tool | Description | Functions | Configuration |
| --- | --- | --- | --- |
| [Self](https://interfaces.to/tools/self) | Encourage self awareness, time awareness and logical evaluation. | `wait`, `plan`, `get_time`, `do_math` | None required. |
| [System](https://interfaces.to/tools/system) | Control the system behaviour. | `get_system_message`, `set_system_message`, `clear_system_message` | None required. |
| [OpenAI](https://interfaces.to/tools/openai) | Create completions and embeddings with the OpenAI API (Yes, that means self-prompting 🔥) | `create_chat_completion`, `create_embedding` | Uses `OPENAI_API_KEY` environment variable |
| [Slack](https://interfaces.to/tools/slack) | Send messages to Slack channels, create channels, list channels, and read messages | `send_slack_message`, `create_channel`, `list_channels`, `read_messages` | Uses `SLACK_BOT_TOKEN` environment variable |
| [Notion](https://interfaces.to/tools/notion) | Find, read and create pages in Notion | `search_notion`, `query_notion_database`, `read_notion_page`, `create_notion_page` | Uses `NOTION_TOKEN` environment variable. Databases must be explicitly shared with the integration. |
| [Airtable](https://interfaces.to/tools/airtable) | Find, read and create records in Airtable | `list_all_bases`, `get_base`, `list_base_records`, `create_base_records` | Uses `AIRTABLE_TOKEN` environment variable |
| [People Data Labs](https://interfaces.to/tools/people-data-labs) | Find information about people and companies | `find_person`, `find_company` | Uses `PDL_API_KEY` environment variable |

More tools are coming soon:

* Twilio
* GitHub
* Jira
* Discord
* and more!

See the [🛠️ Tools Project plan](https://github.com/orgs/interfaces-to/projects/1) for more information on upcoming tools.

## 🌐 Experimental: Dynamic messages

`into` supports reading messages dynamically. The `messages` variable required by the OpenAI SDK is a list of dictionaries, where each dictionary represents a message. Each message must have a `role` key (typically `user` or `assistant`) and a `content` key with the message content.
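As a concrete example, a valid `messages` list in this format might look like the following (contents are illustrative):

```python
# A minimal messages list in the OpenAI chat format: each entry
# carries a role and the message content.
messages = [
    {"role": "user", "content": "What is the time?"},
    {"role": "assistant", "content": "Let me check with the get_time tool."},
]
```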

You can use `agent.add_messages` or `into.read_messages` to configure dynamic messages.

```python
agent.add_messages(["What was the last thing said in each slack channel? Write a 5 line poem to summarise and share it in an appropriate channel"])
```

or

```python
messages = into.read_messages(["Slack"])
```

The following sources are currently supported:

| Source | Description | Configuration |
| --- | --- | --- |
| [Slack](https://interfaces.to/messages/slack) | Read messages from a Slack channel where your app is mentioned or in direct messages | Requires `SLACK_APP_TOKEN` and `SLACK_BOT_TOKEN` environment variables. Socket Mode must be enabled with the appropriate events. |
| [Ngrok](https://interfaces.to/messages/ngrok) | Receive POST /message body using Ngrok. Useful for testing webhooks locally. | Requires `NGROK_AUTHTOKEN` environment variable. |
| [FastAPI](https://interfaces.to/messages/fastapi) | Receive POST /message body on Port 8080 with FastAPI. | None required. |
| [Gradio](https://interfaces.to/messages/gradio) | Receive messages from Gradio's ChatInterface. | None required. |
| [CLI](https://interfaces.to/messages/cli) | Read messages from the command line. For use in scripts executed on the command line or with running `into` itself (see below). | None required. |

See the [💬 Messages Project plan](https://github.com/orgs/interfaces-to/projects/3) for more information on upcoming message sources.

### Limitations

* Currently only one source can be configured at a time.
* History is not retained between the resolution of messages, however `into` is able to simulate message history by calling the Slack `read_messages` tool if equipped with `into.import_tools(['Slack'])`. 

## 📟 Experimental: CLI Support

`into` supports running tools from the command line. This is useful when `CLI` is the message source, allowing you to run `into` as a standalone application.

By default this uses OpenAI and requires the `OPENAI_API_KEY` environment variable.

You can install `into` with the CLI support by running:

```bash
pipx ensurepath
pipx install interfaces_to
```

Since `pipx` installs packages in an isolated environment, you may need to add the dependencies for the tools you want to use. For the Slack tool, you can do this at install time by running:

```bash
pipx install interfaces_to[slack]
```

If you want to use additional tools after install, you can install the dependencies with:

```bash
pipx inject interfaces_to ngrok
```

Then you can run `into` with the following command:

```bash
into --tools=Slack "What was the last thing said in each slack channel? Write a 5 line poem to summarise and share it in an appropriate channel"
```

Alternatively, clone this repository and install it with poetry:

```bash
poetry install
```
  
Then you can run `into` with the following command:

```bash
poetry run into --tools=Slack "What was the last thing said in each slack channel? Write a 5 line poem to summarise and share it in an appropriate channel"
```

You can also pipe messages to `into`, which will output JSON for manipulation in other tools like `jq`:

```bash
echo "What was the last thing said in each slack channel? Write a 5 line poem to summarise and share it in an appropriate channel" | into --tools=Slack
```

### Usage

* `--help` - Show help message.
* `--tools` - A comma-separated list of tools to import. e.g. `--tools=Slack,OpenAI`
* `--messages` - A comma-separated list of message sources to import. e.g. `--messages=Slack,CLI`
* `--model` - The model to use for completions. e.g. `--model=gpt-4o`
* `--api-key` - The OpenAI API key to use for completions. e.g. `--api-key=sk-12345678`
* `--endpoint` - The endpoint to use for completions. e.g. `--endpoint=https://myendpoint`
* `--azure` - Use Azure OpenAI for completions. e.g. `--azure`
* `[message]` - The message to send to the tools when `--messages=CLI` is set. This can be passed in via stdin or as the last argument. When provided, `into` will run the tools and output the result as JSON to stdout.

### Use with Azure OpenAI

You can use `into` with Azure OpenAI by setting the flags below.

```bash
into --tools=Slack --azure --endpoint=https://azure-endpoint --model=gpt-4o --api-key=sk-12345678 "summarise the last messages in each slack channel"
```

### Use with open source models via Ollama

You can use `into` with open source models via Ollama by setting the flags below. **Important:** The model MUST support function calling. A full list of compatible models can be found [here](https://ollama.com/search?c=tools).

```bash
into --tools=Slack --endpoint=http://localhost:11434/v1 --model=llama3.1:8b "what is the time?"
```

## 📚 Documentation (coming soon!)

For more information, check out the [detailed documentation](https://interfaces.to).

## 💬 Community

Join the [🐙 Interfaces Slack](https://join.slack.com/t/interfacesto/shared_invite/zt-2nocjgn6q-SkrZJ9wppcJLz0Cn9Utw8A) to chat with other LLM adventurers, ask questions, and share your projects.

## 🤝 Contributors (coming soon!)

We welcome contributions from the community! Please see our [contributing guide](./contributing.md) for more information.

Notable contributors and acknowledgements:

[@blairhudson](https://github.com/blairhudson) &bull; 🐙

## 🫶 License

This project is licensed under the MIT License - see the [LICENSE](./LICENSE) file for details.


            

Raw data

            {
    "_id": null,
    "home_page": "https://interfaces.to",
    "name": "interfaces-to",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "<4.0,>=3.10",
    "maintainer_email": null,
    "keywords": "llm-agent, openai",
    "author": "Blair Hudson",
    "author_email": "blairhudson@users.noreply.github.com",
    "download_url": "https://files.pythonhosted.org/packages/fe/8a/a14a9a85df4e3ea43d8729a18b895f5ba6da7112c9eb2cccc4d70ca802f1/interfaces_to-0.2.30.tar.gz",
    "platform": null,
    "description": "\n<center><img src=\"https://github.com/interfaces-to/interfaces-to/raw/main/interfaces-to.png\" alt=\"interfaces.to\" width=\"250\" /></center>\n\n\n# Add a little action to your LLM adventure with \ud83d\udc19 Interfaces\n\n**\ud83d\udc19 Interfaces** (aka `into`) is the quickest way to make Large Language Models _do_ things. It's a simple, powerful and flexible way to build more useful, more engaging and more valuable agent-driven applications with LLMs.\n\n## \u2728 Key Features\n\n\u2b50\ufe0f Built-in tools for common tasks and platforms ([see all](#-available-tools))<br/>\n\u2b50\ufe0f Dynamic message sources for real-time interactions ([see all](#-experimental-dynamic-messages))<br/>\n\u2b50\ufe0f Start building with ~~just 4~~ only 3(!) lines of code<br/>\n\u2b50\ufe0f Create agents with system messages to control behaviour ([more info](#-setting-the-system-message))<br/>\n\u2b50\ufe0f Beginner-friendly Python library, learn and teach coding with **\ud83d\udc19 Interfaces**!<br/>\n\u2b50\ufe0f Simple and secure configuration<br/>\n\u2b50\ufe0f Fully compatible with the OpenAI API SDK<br/>\n\u2b50\ufe0f Works with gpt-4o, gpt-4o-mini and other OpenAI models<br/>\n\u2b50\ufe0f Works with llama3.1, mistral-large and more via [Ollama](https://ollama.com/search?c=tools)<br/>\n\u2b50\ufe0f Works with Azure OpenAI ([more info](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/switching-endpoints))<br/>\n\u2b50\ufe0f Supports (thrives) on `parallel_tool_calls` ([more info](https://platform.openai.com/docs/guides/function-calling/parallel-function-calling)) <br/>\n\u2b50\ufe0f Works with any other LLM applications and services that support the OpenAI API<br/>\n\u2b50\ufe0f Runs on your local machine, in the cloud, or on the edge<br/>\n\u2b50\ufe0f Run tools from the command line with the `into` CLI ([see all](#-experimental-cli-support))<br/>\n\u2b50\ufe0f Extensible design for building custom tools 
([example](https://github.com/interfaces-to/interfaces-to/blob/main/interfaces_to/tools/peopledatalabs.py)) and message sources ([example](https://github.com/interfaces-to/interfaces-to/blob/main/interfaces_to/messages/ngrok.py))<br/>\n\u2b50\ufe0f Open-source, MIT licensed, and community-driven<br/>\n\n## \ud83d\ude80 Quick Start\n\n### Installation\n\nInstall with pip:\n```bash\npip install interfaces-to\n```\n\nor\n\nInstall with poetry:\n```bash\npoetry add interfaces-to\n```\n\n### Usage\n\nTurn your OpenAI completion into a fully capable agent with 3 lines of code:\n\n```python\n# 1\ufe0f\u20e3 import `into`\nimport interfaces_to as into\n\n# 2\ufe0f\u20e3 import the OpenAI client as normal\nfrom openai import OpenAI\nclient = OpenAI()\n\n# 3\ufe0f\u20e3 add your favourite tools and set a message\nagent = into.Agent().add_tools(['Slack','OpenAI']).add_messages(\"What was the last thing said in each slack channel? Write a 5 line poem to summarise and share it in an appropriate channel\")\n\n# 4\ufe0f\u20e3 start the agent loop, with an OpenAI completion\nwhile agent:\n  agent.completion = client.chat.completions.create(\n    model=\"gpt-4o\",\n    messages=agent.messages,\n    tools=agent.tools,\n    tool_choice=\"auto\"\n  )\n\n# 5\ufe0f\u20e3 watch the magic happen! \ud83c\udfa9\u2728\n```\n\nThis prints the following output:\n```python\n[user]\t\t  What was the last thing said in each slack channel? 
Write a 5 line poem to summ\n\t\t    arise and share it in an appropriate channel\n\n[assistant]     Calling 1 tool:\n\t\t    list_channels({})\n\n[tool]\t      Output of tool call list_channels({})\n\t\t      Channels: [{'id': 'C07EEUES770', 'name': 'general', 'is_channel':...\n\n[assistant]\t  Calling 7 tools:\n                read_messages({\"channel\": \"general\"})\n                read_messages({\"channel\": \"hello\"})\n                read_messages({\"channel\": \"fun-times\"})\n                read_messages({\"channel\": \"poetry\"})\n                read_messages({\"channel\": \"jokes\"})\n                read_messages({\"channel\": \"welcome\"})\n                read_messages({\"channel\": \"random\"})\n\n[tool]\t\t  Output of tool call read_messages({\"channel\": \"random\"})\n\t\t    Messages: [{'subtype': 'channel_join', 'user': 'U07ET3LMDB7', ...\n\n[tool]\t\t  Output of tool call read_messages({\"channel\": \"welcome\"})\n\t\t    Error reading messages: not_in_channel\n\n[tool]\t\t  Output of tool call read_messages({\"channel\": \"jokes\"})\n\t\t    Messages: [{'user': 'U07EP9STUTU', 'type': 'message', 'ts': '1722477596.990159', ...\n\n[tool]\t\t  Output of tool call read_messages({\"channel\": \"poetry\"})\n\t\t    Messages: [{'user': 'U07EP9STUTU', 'type': 'message', 'ts':...\n\n[tool]\t\t  Output of tool call read_messages({\"channel\": \"fun-times\"})\n\t\t    Messages: [{'user': 'U07EP9STUTU', 'type': 'message', 'ts': ...\n\n[tool]\t\t  Output of tool call read_messages({\"channel\": \"hello\"})\n\t\t    Messages: [{'subtype': 'channel_join', 'user': 'U07EP9STUTU', 'text': '<@U07EP9STUTU> has joined the channel', 'type': 'message', 'ts': '1722421262.722359'}]\n\n[tool]\t\t  Output of tool call read_messages({\"channel\": \"general\"})\n\t\t    Messages: [{'user': 'U07EP9STUTU', 'type': 'message', 'ts': ...\n\n[assistant]\t  Here is a summary poem of the last messages shared in the Slack channels:\n\t\t\n                In **#general**, they 
said with glee,\n                \"Atoms make up all that we see!\"\n                **#hello** greeted someone new,\n                Joining the channel, the greetings flew.\n                \n                **#fun-times** loved a joke,\n                On atoms that can't be broke.\n                In the **#poetry** nook, atoms we praised,\n                With playful rhymes, our spirits raised.\n                \n                **#jokes** was all about the spin,\n                Electrons in a debate always win.\n                And though **#welcome** remained hushed,\n                **#random** laughed at atoms crushed.\n                \n                I'll share this poem in the **#poetry** channel.\n\n[tool]\t\t  Output of tool call send_slack_message({\"channel\":\"poetry\",\"message\":\"Here's a \"...\n                Message sent to poetry with timestamp 1722493789.651039: \n                Here's a summary poem of our last messages:\n                \n                In **#general**, they said with glee,\n                \"Atoms make up all that we see!\"\n                **#hello** greeted someone new,\n                Joining the channel, the greetings flew.\n                \n                **#fun-times** loved a joke,\n                On atoms that can't be broke.\n                In the **#poetry** nook, atoms we praised,\n                With playful rhymes, our spirits raised.\n                \n                **#jokes** was all about the spin,\n                Electrons in a debate always win.\n                And though **#welcome** remained hushed,\n                **#random** laughed at atoms crushed.\n\n[assistant]\t  I have shared the poem summarizing the last messages in each channel\n                to the **#poetry** channel.\n```\n\n`agent.messages` is also updated with the latest messages and retains the format needed by the OpenAI SDK, so you can continue the adventure and build more complex applications.\n\nYou can run this 
example in [this Jupyter notebook](./quickstart.ipynb).\n\n### Setting the system message\n\nYou can modify the behaviour of your agent by setting the system message.\n\n```python\nagent = into.Agent(\"Always talk like a pirate\")\n```\n\n### Configuring tools\n\n#### Using environment variables (Recommended for production)\n\nTools usually require a `token`. Tokens can always be configured by setting the relevant environment variables. e.g. for `Slack` you can set the `SLACK_BOT_TOKEN` environment variable.\n\nIf you are using environment variables, you can take advantage of `agent.add_tools` or the `into.import_tools` function to automatically configure your tools. This function will look for the relevant environment variables and configure the tools with default settings.\n\n```python\nagent.add_tools(['Slack'])\n```\n\nor\n\n```python\ntools = into.import_tools(['Slack'])\n```\n\n#### Using a `.env` file (Recommended for local development)\n\nYou can also configure your tools using a `.env` file. This is useful if you want to keep your tokens and other settings in a single file and helpful for local development.\n\nSimply add a `.env` file in the root of your project with the following format:\n\n```env\nSLACK_BOT_TOKEN=xoxb-12345678-xxxxxxxxxx\n```\n\n#### Setting tokens directly in code\n\nIf you prefer to set the token directly in your code or have more control over tool settings, you can do so by passing arguments to each tool. 
Tokens provided in code will override any environment variables.\n\nYou can optionally restrict `functions` to only those which you need.\n\nHere's an example of configuring the Slack tool:\n\n```python\ntools = [*into.Slack(\n    token=\"xoxb-12345678-xxxxxxxxxx\",\n    functions=[\"send_slack_message\"]\n)]\n```\n\nNote that each tool is preceded by an asterisk `*` to unpack the tool's functions into a list, which the OpenAI API SDK expects.\n\n## \ud83d\udce6 Available tools\n\n`into` comes with loads of pre-built tools to help you get started quickly. These tools are designed to be simple, powerful and flexible, and can be used in any combination to create a wide range of applications.\n\n| Tool | Description | Functions | Configuration |\n| --- | --- | --- | --- |\n| [Self](https://interfaces.to/tools/self) | Encourage self awareness, time awareness and logical evaluation. | `wait`, `plan`, `get_time`, `do_math` | None required. |\n| [System](https://interfaces.to/tools/system) | Control the system behaviour. | `get_system_message`, `set_system_message`, `clear_system_message` | None required. |\n| [OpenAI](https://interfaces.to/tools/openai) | Create completions and embeddings with the OpenAI API (Yes, that means self-prompting \ud83d\udd25) | `create_chat_completion`, `create_embedding` | Uses `OPENAI_API_KEY` environment variable |\n| [Slack](https://interfaces.to/tools/slack) | Send messages to Slack channels, create channels, list channels, and read messages | `send_slack_message`, `create_channel`, `list_channels`, `read_messages` | Uses `SLACK_BOT_TOKEN` environment variable |\n| [Notion](https://interfaces.to/tools/notion) | Find, read and create pages in Notion | `search_notion`, `query_notion_database`, `read_notion_page`, `create_notion_page` | Uses `NOTION_TOKEN` environment variable. Databases must be explicitly shared with the integration. 
|\n| [Airtable](https://interfaces.to/tools/airtable) | Find, read and create records in Airtable | `list_all_bases`, `get_base`, `list_base_records`, `create_base_records` | Uses `AIRTABLE_TOKEN` environment variable |\n| [People Data Labs](https://interfaces.to/tools/people-data-labs) | Find information about people and companies | `find_person`, `find_company` | Uses `PDL_API_KEY` environment variable |\n\nMore tools are coming soon:\n\n* Twilio\n* GitHub\n* Jira\n* Discord\n* and more!\n\nSee the [\ud83d\udee0\ufe0f Tools Project plan](https://github.com/orgs/interfaces-to/projects/1) for more information on upcoming tools.\n\n## \ud83c\udf10 Experimental: Dynamic messages\n\n`into` supports reading messages dynamically. The `messages` variable required by OpenAI SDK is a list of dictionaries, where each dictionary represents a message. Each message must have a `role` key with a value of either `user` or `assistant`, and a `content` key with the message content.\n\nYou can use `agent.add_messages` or `into.read_messages` to configure dynamic messages.\n\n```python\nagent.add_messages([\"What was the last thing said in each slack channel? Write a 5 line poem to summarise and share it in an appropriate channel\"])\n```\n\nor\n\n```python\nmessages = into.read_messages([\"Slack\"])\n```\n\nThe following sources are currently supported:\n\n| Source | Description | Configuration |\n| --- | --- | --- |\n| [Slack](https://interfaces.to/messages/slack) | Read messages from a Slack channel where your app is mentioned or in direct messages | Requires `SLACK_APP_TOKEN` and `SLACK_BOT_TOKEN` environment variable. Socket Mode must be enabled with the appropriate events. |\n| [Ngrok](https://interfaces.to/messages/ngrok) | Receive POST /message body using Ngrok. Useful for testing webhooks locally. | Requires `NGROK_AUTHTOKEN` environment variable. |\n| [FastAPI](https://interfaces.to/messages/fastapi) | Receive POST /message body on Port 8080 with FastAPI. | None required. 
|
| [Gradio](https://interfaces.to/messages/gradio) | Receive messages from Gradio's ChatInterface. | None required. |
| [CLI](https://interfaces.to/messages/cli) | Read messages from the command line, for use in scripts or when running `into` itself (see below). | None required. |

See the [💬 Messages Project plan](https://github.com/orgs/interfaces-to/projects/3) for more information on upcoming tools.

### Limitations

* Currently only one source can be configured at a time.
* History is not retained between the resolution of messages. However, `into` can simulate message history by calling the Slack `read_messages` tool if equipped with `into.import_tools(['Slack'])`.

## 📟 Experimental: CLI Support

`into` supports running tools from the command line. This is useful when `CLI` is the message source, allowing you to run `into` as a standalone application.

By default this uses OpenAI and requires the `OPENAI_API_KEY` environment variable to be set.

You can install `into` with CLI support by running:

```bash
pipx ensurepath
pipx install interfaces_to
```

Since `pipx` installs packages in an isolated environment, you may need to add the dependencies for the tools you want to use. For the Slack tool, you can do this at install time by running:

```bash
pipx install 'interfaces_to[slack]'
```

If you want to use additional tools after install, you can inject their dependencies with:

```bash
pipx inject interfaces_to ngrok
```

Then you can run `into` with the following command:

```bash
into --tools=Slack "What was the last thing said in each slack channel? Write a 5 line poem to summarise and share it in an appropriate channel"
```

or clone this repository and install it with Poetry:

```bash
poetry install
```

Then you can run `into` with the following command:

```bash
poetry run into --tools=Slack "What was the last thing said in each slack channel? Write a 5 line poem to summarise and share it in an appropriate channel"
```

You can also pipe messages to `into`, which will output JSON for manipulation in other tools like `jq`:

```bash
echo "What was the last thing said in each slack channel? Write a 5 line poem to summarise and share it in an appropriate channel" | into --tools=Slack
```

### Usage

* `--help` - Show the help message.
* `--tools` - A comma-separated list of tools to import, e.g. `--tools=Slack,OpenAI`
* `--messages` - A comma-separated list of message sources to import, e.g. `--messages=Slack,CLI`
* `--model` - The model to use for completions, e.g. `--model=gpt-4o`
* `--api-key` - The OpenAI API key to use for completions, e.g. `--api-key=sk-12345678`
* `--endpoint` - The endpoint to use for completions, e.g. `--endpoint=https://myendpoint`
* `--azure` - Use Azure OpenAI for completions, e.g. `--azure`
* `[message]` - The message to send to the tools when `--messages=CLI` is set. This can be passed in via stdin or as the last argument. When provided, `into` will run the tools and output the result as JSON to stdout.

### Use with Azure OpenAI

You can use `into` with Azure OpenAI by setting the flags below.

```bash
into --tools=Slack --azure --endpoint=https://azure-endpoint --model=gpt-4o --api-key=sk-12345678 "summarise the last messages in each slack channel"
```

### Use with open source models via Ollama

You can use `into` with open source models via Ollama by setting the flags below. **Important:** The model MUST support function calling. A full list of compatible models can be found [here](https://ollama.com/search?c=tools).

```bash
into --tools=Slack --endpoint=http://localhost:11434/v1 --model=llama3.1:8b "what is the time?"
```

## 📚 Documentation (coming soon!)

For more information, check out the [detailed documentation](https://interfaces.to).

## 💬 Community

Join the [🐙 Interfaces Slack](https://join.slack.com/t/interfacesto/shared_invite/zt-2nocjgn6q-SkrZJ9wppcJLz0Cn9Utw8A) to chat with other LLM adventurers, ask questions, and share your projects.

## 🤝 Contributors (coming soon!)

We welcome contributions from the community! Please see our [contributing guide](./contributing.md) for more information.

Notable contributors and acknowledgements:

[@blairhudson](https://github.com/blairhudson) &bull; 🐙

## 🫶 License

This project is licensed under the MIT License - see the [LICENSE](./LICENSE) file for details.
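Because `into` emits JSON to stdout when a message is piped in or passed as an argument, other programs can drive it as a subprocess. The sketch below shows one way to compose the CLI flags from Python; note that `build_into_command` is a hypothetical helper written for this example (it is not part of the library), and the exact shape of the JSON output is not specified here, so the subprocess call is left commented out.

```python
import json
import subprocess


def build_into_command(message, tools=None, model=None, endpoint=None, azure=False):
    """Compose an `into` CLI invocation from keyword arguments.

    Mirrors the flags in the Usage section: --tools, --model,
    --endpoint and --azure, with the message as the last argument.
    """
    cmd = ["into"]
    if tools:
        cmd.append(f"--tools={','.join(tools)}")
    if model:
        cmd.append(f"--model={model}")
    if endpoint:
        cmd.append(f"--endpoint={endpoint}")
    if azure:
        cmd.append("--azure")
    cmd.append(message)
    return cmd


cmd = build_into_command("what is the time?", tools=["Slack"], model="gpt-4o")

# Running this for real requires `into` on your PATH and OPENAI_API_KEY set:
# result = subprocess.run(cmd, capture_output=True, text=True, check=True)
# messages = json.loads(result.stdout)  # `into` writes its result as JSON
```

The same pattern covers the Azure and Ollama invocations above by passing `azure=True` or a local `endpoint` such as `http://localhost:11434/v1`.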