# GraphRAG
[![Dockerhub](https://img.shields.io/docker/pulls/falkordb/falkordb?label=Docker)](https://hub.docker.com/r/falkordb/falkordb/)
[![pypi](https://badge.fury.io/py/graphrag_sdk.svg)](https://pypi.org/project/graphrag_sdk/)
[![Discord](https://img.shields.io/discord/1146782921294884966?style=flat-square)](https://discord.gg/6M4QwDXn2w)
[![Contributor Covenant](https://img.shields.io/badge/Contributor%20Covenant-2.1-4baaaa.svg)](code_of_conduct.md)
<p align="center">
<img alt="FalkorDB GraphRAG-SDK README Banner" src="images/FalkorDB GraphRAG-SDK README Banner.png" width="1500">
</p>
### Build fast, accurate GenAI apps at scale with the GraphRAG SDK
Simplify the development of your next GenAI application with GraphRAG-SDK, a specialized toolkit for building Graph Retrieval-Augmented Generation (GraphRAG) systems. It integrates knowledge graphs, ontology management, and state-of-the-art LLMs to deliver accurate, efficient, and customizable RAG workflows.
# GraphRAG Setup
### Database Setup
[![Try Free](https://img.shields.io/badge/Try%20Free-FalkorDB%20Cloud-FF8101?labelColor=FDE900&style=for-the-badge&link=https://app.falkordb.cloud)](https://app.falkordb.cloud)
Or run it on-premises with Docker:
```sh
docker run -p 6379:6379 -p 3000:3000 -it --rm -v ./data:/data falkordb/falkordb:latest
```
### Dependencies
```sh
# For all LLM providers
pip install graphrag_sdk[all]
# For specific LLM provider (Example: LiteLLM)
pip install graphrag_sdk[litellm]
```
### Configure Credentials
See [.env](.env.template) for examples.
* [LiteLLM](https://docs.litellm.ai): A framework supporting inference of large language models, allowing flexibility in deployment and use cases.
* [OpenAI](https://openai.com/index/openai-api) Recommended model: `gpt-4o`
* [Google](https://makersuite.google.com/app/apikey) Recommended model: `gemini-2.0-flash-exp`
* [Azure-OpenAI](https://ai.azure.com) Recommended model: `gpt-4o`
* [Ollama](https://ollama.com/) Recommended model: `llama3`. Ollama models are suitable for the Q&A step only, after the knowledge graph (KG) has been created.
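For example, a minimal `.env` for the OpenAI and LiteLLM/Gemini setups might look like the fragment below. The variable names shown follow each provider's usual conventions and are illustrative; confirm the exact names this SDK expects in [.env.template](.env.template).

```sh
# Illustrative only - confirm the exact variable names in .env.template
OPENAI_API_KEY=sk-your-openai-key
GEMINI_API_KEY=your-gemini-api-key
```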
# How to use
### Step 1: Creating Ontologies
Automate ontology creation from unstructured data or define it manually.
```python
import json

from dotenv import load_dotenv

from graphrag_sdk.source import URL
from graphrag_sdk import KnowledgeGraph, Ontology
from graphrag_sdk.models.litellm import LiteModel
from graphrag_sdk.model_config import KnowledgeGraphModelConfig

load_dotenv()

# Import Data
urls = ["https://www.rottentomatoes.com/m/side_by_side_2012",
        "https://www.rottentomatoes.com/m/matrix",
        "https://www.rottentomatoes.com/m/matrix_revolutions",
        "https://www.rottentomatoes.com/m/matrix_reloaded",
        "https://www.rottentomatoes.com/m/speed_1994",
        "https://www.rottentomatoes.com/m/john_wick_chapter_4"]

sources = [URL(url) for url in urls]

# Model
model = LiteModel(model_name="gemini/gemini-2.0-flash-exp")

# Ontology Auto-Detection
ontology = Ontology.from_sources(
    sources=sources,
    model=model,
)

# Save the ontology to disk as a JSON file.
with open("ontology.json", "w", encoding="utf-8") as file:
    file.write(json.dumps(ontology.to_json(), indent=2))
```
### Step 2: Creating a knowledge graph agent
Build, query, and manage knowledge graphs optimized for retrieval and augmentation tasks.
Leverages FalkorDB for high-performance graph querying and multi-tenancy.
```python
# After approving the ontology, load it from disk.
ontology_file = "ontology.json"
with open(ontology_file, "r", encoding="utf-8") as file:
    ontology = Ontology.from_json(json.loads(file.read()))

kg = KnowledgeGraph(
    name="kg_name",
    model_config=KnowledgeGraphModelConfig.with_model(model),
    ontology=ontology,
    host="127.0.0.1",
    port=6379,
    # username=falkor_username, # optional
    # password=falkor_password  # optional
)

kg.process_sources(sources)
```
### Step 3: Query your Graph RAG
At this point, you have a Knowledge Graph that can be queried using this SDK. Use the `chat_session` method to start a conversation.
```python
# Conversation
chat = kg.chat_session()
response = chat.send_message("Who is the director of the movie The Matrix?")
print(response)
response = chat.send_message("How is this director connected to Keanu Reeves?")
print(response)
```
## Next Steps
With these three steps completed, you're ready to interact with and query your knowledge graph. Here are suggestions for use cases:
<p align="left">
<img alt="GraphRAG-SDK Use Cases Banner from FalkorDB" src="images/use-cases.png" width="800">
</p>
**Need help with your use case? Let's [talk](https://www.falkordb.com/get-demo/)**
<br />
# AI Agents with GraphRAG
### Orchestrator
The GraphRAG-SDK supports Knowledge Graph-based agents. Each agent is an expert in its own domain, and the orchestrator coordinates them.
Check out the example:
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/FalkorDB/GraphRAG-SDK/blob/main/examples/trip/demo_orchestrator_trip.ipynb)
### Agents
See the [Step 1](#how-to-use) section to understand how to create Knowledge Graph objects for the agents.
```python
from graphrag_sdk.agents.kg_agent import KGAgent

# Define the model
model = LiteModel(model_name="gemini/gemini-2.0-flash-exp")

# Create the Knowledge Graphs from the predefined ontologies.
# In this example, we will use the restaurants agent and the attractions agent.
restaurants_kg = KnowledgeGraph(
    name="restaurants",
    ontology=restaurants_ontology,
    model_config=KnowledgeGraphModelConfig.with_model(model),
    host="127.0.0.1",
    port=6379,
    # username=falkor_username, # optional
    # password=falkor_password  # optional
)
attractions_kg = KnowledgeGraph(
    name="attractions",
    ontology=attractions_ontology,
    model_config=KnowledgeGraphModelConfig.with_model(model),
    host="127.0.0.1",
    port=6379,
    # username=falkor_username, # optional
    # password=falkor_password  # optional
)

# The following agent is specialized in finding restaurants.
restaurants_agent = KGAgent(
    agent_id="restaurants_agent",
    kg=restaurants_kg,
    introduction="I'm a restaurant agent, specialized in finding the best restaurants for you.",
)

# The following agent is specialized in finding tourist attractions.
attractions_agent = KGAgent(
    agent_id="attractions_agent",
    kg=attractions_kg,
    introduction="I'm an attractions agent, specialized in finding the best tourist attractions for you.",
)
```
### Orchestrator - Multi-Agent System
The orchestrator manages the agents and routes questions between them.
```python
from graphrag_sdk.orchestrator import Orchestrator

# Initialize the orchestrator and give it a backstory.
orchestrator = Orchestrator(
    model,
    backstory="You are a trip planner, and you want to provide the best possible itinerary for your clients.",
)

# Register the agents that we created above.
orchestrator.register_agent(restaurants_agent)
orchestrator.register_agent(attractions_agent)

# Query the orchestrator.
runner = orchestrator.ask("Create a two-day itinerary for a trip to Rome. Please don't ask me any questions; just provide the best itinerary you can.")
print(runner.output)
```
## Community
Have questions or feedback? Reach out via:
- [GitHub Issues](https://github.com/FalkorDB/GraphRAG-SDK/issues)
- Join our [Discord](https://discord.com/invite/6M4QwDXn2w)
⭐️ If you find this repository helpful, please consider giving it a star!
## Additional Enhancement: Configuring your prompts
When creating your Knowledge Graph (KG) agent, you can customize the prompts to tailor its behavior.
💡 This step is optional but can enhance functionality.
There are five types of prompts:
1. **`cypher_system_instruction`**
- System instructions for the Cypher generation step.
- **Note:** Ensure your prompt includes `{ontology}`.
2. **`qa_system_instruction`**
- System instructions for the Q&A step.
3. **`cypher_gen_prompt`**
- The prompt used during the Cypher generation step.
- **Note:** Include `{question}` in your prompt.
4. **`cypher_gen_prompt_history`**
- The prompt for Cypher generation when history needs to be considered.
- **Note:** Include `{question}` and `{last_answer}` in your prompt.
5. **`qa_prompt`**
- The prompt used during the Q&A step.
- **Note:** Include `{question}`, `{context}`, and `{cypher}` in your prompt.
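Putting these requirements together, a set of templates might look like the sketch below. The wording is illustrative; only the placeholders noted above are mandated.

```python
# Hypothetical prompt templates; each must contain the placeholders noted above.
cypher_system_instruction = (
    "You generate Cypher queries for a knowledge graph.\n"
    "Use only the entities and relations defined in this ontology:\n{ontology}"
)
qa_system_instruction = "You answer questions using only the provided graph context."
cypher_gen_prompt = "Generate a Cypher query that answers: {question}"
cypher_gen_prompt_history = (
    "Given the previous answer: {last_answer}\n"
    "Generate a Cypher query that answers: {question}"
)
qa_prompt = (
    "Question: {question}\nCypher used: {cypher}\nGraph context: {context}\nAnswer:"
)
```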
Here’s an example configuration:
```python
kg = KnowledgeGraph(
    name="kg_name",
    model_config=KnowledgeGraphModelConfig.with_model(model),
    ontology=ontology,
    cypher_system_instruction=cypher_system_instruction,
    qa_system_instruction=qa_system_instruction,
    cypher_gen_prompt=cypher_gen_prompt,
    cypher_gen_prompt_history=cypher_gen_prompt_history,
    qa_prompt=qa_prompt,
    host="127.0.0.1",
    port=6379,
    # username=falkor_username, # optional
    # password=falkor_password  # optional
)
```
## FAQ
**Which databases are supported?**
GraphRAG-SDK is optimized for FalkorDB. Other backends may require adapters.
**How scalable is the SDK?**
GraphRAG-SDK is designed for multi-tenancy and large-scale applications. Performance depends on FalkorDB deployment configuration.
**How does this SDK improve retrieval-augmented generation?**
By leveraging knowledge graphs, GraphRAG-SDK enables semantic relationships and ontology-driven queries that go beyond standard vector similarity.
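As a toy illustration (plain Python, not SDK code): following typed edges answers multi-hop questions such as "how is this director connected to Keanu Reeves?", which similarity search over isolated text chunks cannot do directly.

```python
# Toy knowledge graph as (subject, relation, object) triples.
triples = [
    ("Lana Wachowski", "DIRECTED", "The Matrix"),
    ("Keanu Reeves", "ACTED_IN", "The Matrix"),
    ("Keanu Reeves", "ACTED_IN", "John Wick: Chapter 4"),
]

def neighbors(node):
    """Yield (relation, other_node) pairs touching `node`, in either direction."""
    for s, r, o in triples:
        if s == node:
            yield r, o
        if o == node:
            yield r, s

def two_hop_path(start, end):
    """Return a 2-hop connection start -> mid -> end, or None if there is none."""
    for r1, mid in neighbors(start):
        for r2, other in neighbors(mid):
            if other == end:
                return (start, r1, mid, r2, end)
    return None

path = two_hop_path("Lana Wachowski", "Keanu Reeves")
print(path)  # ('Lana Wachowski', 'DIRECTED', 'The Matrix', 'ACTED_IN', 'Keanu Reeves')
```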
**Which file formats does the SDK support?**
Supported formats include PDF, JSONL, CSV, HTML, plain text, and URLs.
**How does the SDK handle latency?**
The SDK is optimized for low-latency operations through FalkorDB, using techniques like query optimization and in-memory processing.
**Does the SDK support multi-graph querying?**
Yes. Multi-graph querying is supported with APIs designed for cross-domain and hierarchical graph exploration.
<br />
# License
This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.