tool-calling-llm

Name: tool-calling-llm
Version: 0.1.2
Summary: Convert any LangChain Chat Model into a Tool Calling LLM
Author: Karim Lalani
Requires Python: <4.0,>=3.9
Upload time: 2024-09-19 19:12:13
Homepage: https://github.com/lalanikarim/tool_calling_llm
Tool Calling LLM
================

Tool Calling LLM is a Python mixin that lets you effortlessly add tool calling capabilities to [LangChain](https://langchain.com) chat models that don't yet support tool/function calling natively. Simply create a new chat model class that combines ToolCallingLLM with your favorite chat model to get started.

With ToolCallingLLM you also get access to the following methods:
1. `.bind_tools()` lets you bind tool definitions to an LLM.
2. `.with_structured_output()` lets you return structured data from your model. This is now provided by LangChain's `BaseChatModel` class; see the sketch below.
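
A minimal sketch of `.with_structured_output()`, using the `OllamaWithTools` class defined in the Usage section below and a hypothetical Pydantic schema (the schema and prompt are illustrative assumptions, not part of this package):

```python
# Assumes the OllamaWithTools class defined in the Usage section below.
from pydantic import BaseModel, Field


class Person(BaseModel):
    """Hypothetical schema describing the structured output we want back."""
    name: str = Field(description="The person's name")
    country: str = Field(description="The person's country of origin")


llm = OllamaWithTools(model="llama3.1", format="json")
structured_llm = llm.with_structured_output(Person)

# Returns a Person instance instead of a free-form chat message.
structured_llm.invoke("Name a famous scientist and their country of origin.")
```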

At this time, ToolCallingLLM has been tested to work with ChatOllama, ChatNVIDIA, and ChatLiteLLM with the Ollama provider.

[OllamaFunctions](https://python.langchain.com/v0.2/docs/integrations/chat/ollama_functions/) was the original inspiration for this effort. The code for ToolCallingLLM was abstracted out of `OllamaFunctions` so it can be reused with other chat models that lack native tool calling.

Installation
------------

```bash
pip install --upgrade tool_calling_llm
```

Usage
-----

Creating a Tool Calling LLM is as simple as creating a new subclass of the original chat model you wish to add tool calling features to.

The sample code below demonstrates how you might enhance the `ChatOllama` chat model from the `langchain-ollama` package with tool calling capabilities.

```python
from tool_calling_llm import ToolCallingLLM
from langchain_ollama import ChatOllama
from langchain_community.tools import DuckDuckGoSearchRun


class OllamaWithTools(ToolCallingLLM, ChatOllama):
    """ChatOllama enhanced with tool calling via the ToolCallingLLM mixin."""

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

    @property
    def _llm_type(self):
        return "ollama_with_tools"


# format="json" asks Ollama for JSON output, which the mixin parses for tool calls.
llm = OllamaWithTools(model="llama3.1", format="json")
tools = [DuckDuckGoSearchRun()]
llm_tools = llm.bind_tools(tools=tools)

llm_tools.invoke("Who won the silver medal in shooting in the Paris Olympics in 2024?")
```

This yields output as follows:
```
AIMessage(content='', id='run-9c3c7a78-97af-4d06-835e-aa81174fd7e8-0', tool_calls=[{'name': 'duckduckgo_search', 'args': {'query': 'Paris Olympics 2024 shooting silver medal winner'}, 'id': 'call_67b06088e208482497f6f8314e0f1a0e', 'type': 'tool_call'}])
```
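
The `AIMessage` above only describes the tool call; it does not run the tool. A minimal sketch of executing it yourself, continuing from the example above (the lookup-by-name loop is a common LangChain idiom, not something specific to this package):

```python
# Continuing from the example above: run the tool the model asked for.
result = llm_tools.invoke(
    "Who won the silver medal in shooting in the Paris Olympics in 2024?"
)

tools_by_name = {tool.name: tool for tool in tools}
for tool_call in result.tool_calls:
    tool = tools_by_name[tool_call["name"]]
    # DuckDuckGoSearchRun accepts the generated arguments and returns the
    # search results as a plain string.
    print(tool.invoke(tool_call["args"]))
```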
For more comprehensive examples, refer to the [ToolCallingLLM-Tutorial.ipynb](ToolCallingLLM-Tutorial.ipynb) Jupyter notebook.
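
The same mixin pattern should apply to the other chat models mentioned above. For example, here is an untested sketch wrapping `ChatLiteLLM` with the Ollama provider (the `ollama/llama3.1` model string follows LiteLLM's provider/model convention and is an assumption, not something verified here):

```python
from tool_calling_llm import ToolCallingLLM
from langchain_community.chat_models import ChatLiteLLM
from langchain_community.tools import DuckDuckGoSearchRun


class LiteLLMWithTools(ToolCallingLLM, ChatLiteLLM):
    """ChatLiteLLM enhanced with tool calling via the ToolCallingLLM mixin."""

    @property
    def _llm_type(self):
        return "litellm_with_tools"


# LiteLLM routes "ollama/<model>" requests to a local Ollama server.
llm = LiteLLMWithTools(model="ollama/llama3.1")
llm_tools = llm.bind_tools(tools=[DuckDuckGoSearchRun()])
```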


            
