langchain-prolog

- Name: langchain-prolog
- Version: 0.1.0
- Summary: An integration package connecting Prolog and LangChain
- Home page: https://github.com/apisani1/langchain-prolog
- Author: Antonio Pisani
- Requires Python: <4.0,>=3.10
- License: MIT
- Keywords: langchain, prolog, swi-prolog, llm, agent
- Uploaded: 2025-02-11 22:42:48

            [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![PyPI version](https://badge.fury.io/py/langchain-prolog.svg)](https://badge.fury.io/py/langchain-prolog)
[![Python 3.10+](https://img.shields.io/badge/python-3.10+-blue.svg)](https://www.python.org/downloads/release/python-3100/)
[![Documentation Status](https://readthedocs.org/projects/langchain-prolog/badge/?version=latest)](https://langchain-prolog.readthedocs.io/en/latest/?badge=latest)
![Version](https://img.shields.io/badge/version-0.1.0-blue)

# LangChain-Prolog

A Python library that integrates SWI-Prolog with LangChain, allowing seamless
blending of Prolog's logic programming capabilities into LangChain applications.


## Features

- Seamless integration between LangChain and SWI-Prolog
- Use Prolog queries as LangChain runnables and tools
- Invoke Prolog predicates from LangChain LLM models, chains and agents
- Support for both synchronous and asynchronous operations
- Comprehensive error handling and logging
- Cross-platform support (macOS, Linux, Windows)

## Installation

#### Prerequisites

- Python 3.10 or later
- SWI-Prolog installed on your system
- The following Python libraries installed:
    - langchain 0.3.0 or later
    - janus-swi 1.5.0 or later
    - pydantic 0.2.0 or later

langchain-prolog can be installed using pip:
```bash
pip install langchain-prolog
```
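
Note that pip installs only the Python side; SWI-Prolog itself must be installed separately. A quick sanity check before installing (assuming a POSIX shell):

```shell
# Check that the SWI-Prolog executable is on PATH; it is required by janus-swi.
if command -v swipl >/dev/null 2>&1; then
    swipl --version
else
    echo "SWI-Prolog not found; install it from https://www.swi-prolog.org first"
fi
```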

## Runnable Interface

The `PrologRunnable` class lets you create LangChain runnables that use Prolog rules to generate answers.

Let's assume that we have the following set of Prolog rules in the file family.pl:

```prolog
parent(john, bianca, mary).
parent(john, bianca, michael).
parent(peter, patricia, jennifer).
partner(X, Y) :- parent(X, Y, _).
```

There are three different ways to use a `PrologRunnable` to query Prolog:

#### 1) Using a Prolog runnable with a full predicate string

```python
from langchain_prolog import PrologConfig, PrologRunnable

config = PrologConfig(rules_file="family.pl")
prolog = PrologRunnable(prolog_config=config)
result = prolog.invoke("partner(X, Y)")
print(result)
```
We can pass a string representing a single predicate query. The `invoke` method returns `True`, `False`, or a list of dictionaries with all the solutions to the query:
```python
[{'X': 'john', 'Y': 'bianca'},
 {'X': 'john', 'Y': 'bianca'},
 {'X': 'peter', 'Y': 'patricia'}]
```

#### 2) Using a Prolog runnable with a default predicate

```python
from langchain_prolog import PrologConfig, PrologRunnable

config = PrologConfig(rules_file="family.pl", default_predicate="partner")
prolog = PrologRunnable(prolog_config=config)
result = prolog.invoke("peter, X")
print(result)
```
When using a default predicate, only the arguments for the predicate are passed to the Prolog runnable, as a single string. Following Prolog conventions, uppercase identifiers are variables and lowercase identifiers are values (atoms or strings):

```python
[{'X': 'patricia'}]
```
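
Conceptually, the runnable wraps the argument string in the configured default predicate before handing the goal to the interpreter. The helper below is a hypothetical illustration of that convention, not the library's actual implementation:

```python
# Hypothetical helper illustrating the default-predicate convention:
# "peter, X" with default predicate "partner" becomes the goal "partner(peter, X)".
def compose_goal(default_predicate: str, args: str) -> str:
    """Wrap a comma-separated argument string in the default predicate."""
    return f"{default_predicate}({args})"

print(compose_goal("partner", "peter, X"))  # partner(peter, X)
```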

#### 3) Using a Prolog runnable with a dictionary and schema validation

```python
from langchain_prolog import PrologConfig, PrologRunnable

schema = PrologRunnable.create_schema("partner", ["man", "woman"])
config = PrologConfig(rules_file="family.pl", query_schema=schema)
prolog = PrologRunnable(prolog_config=config)
result = prolog.invoke({"man": None, "woman": "bianca"})
print(result)
```
If a schema is defined, we can pass a dictionary whose keys are the parameter names from the schema. The values can represent Prolog variables (uppercase first letter) or strings (lowercase first letter). A `None` value is interpreted as a variable and replaced with the capitalized key:
```python
[{'Man': 'john'}, {'Man': 'john'}]
```
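
The dictionary-to-goal mapping can be sketched as follows. This is a simplified illustration of the convention described above, not the library's code:

```python
def dict_to_goal(predicate: str, args: dict) -> str:
    """Render a schema-style dict as a Prolog goal.

    None values become variables named after the capitalized key;
    everything else is passed through as written.
    """
    terms = [key.capitalize() if value is None else str(value)
             for key, value in args.items()]
    return f"{predicate}({', '.join(terms)})"

print(dict_to_goal("partner", {"man": None, "woman": "bianca"}))  # partner(Man, bianca)
```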

You can also pass a Pydantic object generated with the schema to the invoke method:
```python
args = schema(man='M', woman='W')
result = prolog.invoke(args)
print(result)
```
Uppercase values are treated as variables:
```python
[{'M': 'john', 'W': 'bianca'},
 {'M': 'john', 'W': 'bianca'},
 {'M': 'peter', 'W': 'patricia'}]
```

## Tool Interface

The `PrologTool` class lets you create LangChain tools that use Prolog rules to generate answers.

See the Runnable Interface section for the conventions on how to pass variables and values to the Prolog interpreter.

Let's assume that we have the following set of Prolog rules in the file family.pl:

```prolog
parent(john, bianca, mary).
parent(john, bianca, michael).
parent(peter, patricia, jennifer).
partner(X, Y) :- parent(X, Y, _).
```

There are two ways to use a Prolog tool:

### 1) Using a Prolog tool with an LLM and function calling

First create the Prolog tool:
```python
from langchain_prolog import PrologConfig, PrologRunnable, PrologTool

schema = PrologRunnable.create_schema('parent', ['men', 'women', 'child'])
config = PrologConfig(
            rules_file="family.pl",
            query_schema=schema,
        )
prolog_tool = PrologTool(
    prolog_config=config,
    name="family_query",
    description="""
        Query family relationships using Prolog.
        parent(X, Y, Z) implies only that Z is a child of X and Y.
        Input can be a query string like 'parent(john, X, Y)'
        or 'john, X, Y'.
        You have to specify 3 parameters: men, women, child.
        Do not use quotes.
    """
)
```

Then bind it to the LLM model and query the model:
```python
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage

llm = ChatOpenAI(model="gpt-4o-mini")
llm_with_tools = llm.bind_tools([prolog_tool])
messages = [HumanMessage("Who are John's children?")]
response = llm_with_tools.invoke(messages)
messages.append(response)
print(response.tool_calls[0])
```
The LLM will respond with a tool call request:
```python
{'name': 'family_query',
 'args': {'men': 'john', 'women': None, 'child': None},
 'id': 'call_V6NUsJwhF41G9G7q6TBmghR0',
 'type': 'tool_call'}
```
The tool takes this request and queries the Prolog database:
```python
tool_msg = prolog_tool.invoke(response.tool_calls[0])
messages.append(tool_msg)
print(tool_msg)
```
The tool returns a list with all the solutions for the query:
```python
content='[{"Women": "bianca", "Child": "mary"}, {"Women": "bianca", "Child": "michael"}]'
name='family_query'
tool_call_id='call_V6NUsJwhF41G9G7q6TBmghR0'
```
We then pass this to the LLM:
```python
answer = llm_with_tools.invoke(messages)
print(answer.content)
```
And the LLM answers the original query using the tool response:
```
John has two children: Mary and Michael. Their mother is Bianca.
```
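
Note that the tool message's `content` is a JSON-encoded string of the solution list, so if you want to post-process the solutions in Python rather than hand them back to the LLM, decode them first:

```python
import json

# content as returned by the tool in the example above
content = '[{"Women": "bianca", "Child": "mary"}, {"Women": "bianca", "Child": "michael"}]'
solutions = json.loads(content)          # list of solution dicts
children = [s["Child"] for s in solutions]
print(children)  # ['mary', 'michael']
```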

### 2) Using a Prolog tool with an agent

First create the Prolog tool:
```python
from langchain_prolog import PrologConfig, PrologRunnable, PrologTool

schema = PrologRunnable.create_schema('parent', ['men', 'women', 'child'])
config = PrologConfig(
            rules_file="family.pl",
            query_schema=schema,
        )
prolog_tool = PrologTool(
    prolog_config=config,
    name="family_query",
    description="""
        Query family relationships using Prolog.
        parent(X, Y, Z) implies only that Z is a child of X and Y.
        Input can be a query string like 'parent(john, X, Y)'
        or 'john, X, Y'.
        You have to specify 3 parameters: men, women, child.
        Do not use quotes.
    """
)
```
Then pass it to the agent's constructor:
```python
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain.agents import create_tool_calling_agent, AgentExecutor

llm = ChatOpenAI(model="gpt-4o-mini")
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant"),
        ("human", "{input}"),
        ("placeholder", "{agent_scratchpad}"),
    ]
)
tools = [prolog_tool]
agent = create_tool_calling_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools)
```
The agent takes the query and uses the Prolog tool if needed:
```python
answer = agent_executor.invoke({"input": "Who are John's children?"})
print(answer)
```
Then the agent receives the tool response as part of the `{agent_scratchpad}` placeholder and generates the answer:
```python
{'input': "Who are John's children?", 'output': 'John has two children: Mary and Michael, with Bianca as their mother.'}
```
