# LLM Exo-Graph 🧠🕸️
[PyPI](https://pypi.org/project/llm-exo-graph/)
[Python 3.10+](https://www.python.org/downloads/)
[License: MIT](https://opensource.org/licenses/MIT)
An advanced knowledge graph engine that externalizes LLM memory into Neo4j, creating a persistent, searchable brain for AI systems.

## 🌟 Why Exo-Graph?
Traditional LLMs have ephemeral memory. **LLM Exo-Graph** creates an *exocortex* - an external brain that:
- 📝 **Persists** knowledge across sessions
- 🔍 **Searches** with both semantic and graph algorithms
- 🧩 **Connects** information through relationships
- ⚡ **Scales** beyond context window limitations
## 🎯 The Power of Graph Structure
### Subject → Relationship → Object = Triplet(metadata)
Our graph structure captures not just entities, but the rich context of their relationships:
```
God → CREATED → man = (summary: God created man in his own image) [conf: 0.90]
God → DIVIDED → waters = (summary: God divided the waters) [conf: 0.90]
light → EXISTS → light = (summary: there was light) [conf: 0.90]
```
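The triplet shape above can be modeled as a small value type. A minimal sketch, assuming a simple dataclass (the `Triplet` class and its fields are illustrative, not the library's actual data model):

```python
from dataclasses import dataclass

@dataclass
class Triplet:
    """One subject → relationship → object edge with its metadata."""
    subject: str
    relation: str
    obj: str
    summary: str
    confidence: float

    def __str__(self) -> str:
        return (f"{self.subject} → {self.relation} → {self.obj} "
                f"= (summary: {self.summary}) [conf: {self.confidence:.2f}]")

edge = Triplet("God", "CREATED", "man", "God created man in his own image", 0.90)
print(edge)
# God → CREATED → man = (summary: God created man in his own image) [conf: 0.90]
```

Keeping the summary and confidence on the edge itself is what lets the same triplet serve both graph traversal and vector search.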
### Benefits of This Approach
1. **Enhanced Graph Search**
- Traverse relationships with Cypher queries
- Find indirect connections (friend-of-friend)
- Discover patterns and clusters
2. **Superior Vector Search**
- Summaries provide rich semantic context
- Embeddings capture relationship meaning
- Hybrid search combines graph + semantic
3. **Temporal Intelligence**
- Track relationship changes over time
- Handle contradictions gracefully
- Maintain complete history
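To make the graph-search benefit concrete, here is a toy two-hop ("friend-of-friend") traversal over an in-memory edge list; in Neo4j the same question would be a Cypher pattern such as `MATCH (a)-->(b)-->(c)`. The `two_hop` helper and the sample edges are purely illustrative:

```python
from collections import defaultdict

# Toy in-memory edge list standing in for the Neo4j graph.
edges = [
    ("John", "WORKS_AT", "OpenAI"),
    ("OpenAI", "FOCUSES_ON", "AI Research"),
    ("Alice", "WORKS_AT", "Google"),
]

adjacency = defaultdict(list)
for subj, rel, obj in edges:
    adjacency[subj].append((rel, obj))

def two_hop(start):
    """Find indirect connections: start → mid → end."""
    results = []
    for rel1, mid in adjacency[start]:
        for rel2, end in adjacency[mid]:
            results.append((start, rel1, mid, rel2, end))
    return results

print(two_hop("John"))
# [('John', 'WORKS_AT', 'OpenAI', 'FOCUSES_ON', 'AI Research')]
```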
## 🏗️ How It Works
### Entity Extraction Pipeline
```mermaid
graph LR
A[Natural Language Input] --> B[LLM Processor]
B --> C{Entity Extraction}
C --> D[Subject Recognition]
C --> E[Relationship Detection]
C --> F[Object Identification]
D --> G[Graph Edge Creation]
E --> G
F --> G
G --> H[Neo4j Storage]
G --> I[Vector Embedding]
I --> J[Semantic Index]
```
### Entity Standardization Process
```mermaid
graph TD
A[Raw Entity/Relationship] --> B[BiEncoder Embedding]
B --> C[Category Classification]
C --> D{Similarity Check}
D -->|High Similarity| E[Use Existing Standard]
D -->|Low Similarity| F[CrossEncoder Verification]
F --> G{Cross-Validation Score}
G -->|Score > Threshold| H[Merge with Standard]
G -->|Score < Threshold| I[Create New Standard]
E --> J[Standardized Output]
H --> J
I --> J
K[Existing Categories] --> C
L[Cached Embeddings] --> D
style B fill:#e1f5fe,stroke:#01579b,stroke-width:2px
style F fill:#f3e5f5,stroke:#4a148c,stroke-width:2px
style J fill:#e8f5e8,stroke:#2e7d32,stroke-width:3px
```
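The similarity-check-then-fallback logic in the diagram can be sketched with stub vectors. In the real pipeline the embeddings come from a sentence-transformers BiEncoder and the low-similarity branch goes through a CrossEncoder; here a single cosine threshold stands in for both stages, and the `standardize` function and its threshold are hypothetical:

```python
import math

# Stub embeddings for existing standard relations (hand-made, for illustration).
standard_relations = {
    "WORKS_AT": [1.0, 0.0, 0.1],
    "LIVES_IN": [0.0, 1.0, 0.1],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def standardize(raw_relation, raw_vec, threshold=0.8):
    """Map a raw relation to the closest existing standard, or create a new one."""
    best, best_score = None, -1.0
    for name, vec in standard_relations.items():
        score = cosine(raw_vec, vec)
        if score > best_score:
            best, best_score = name, score
    if best_score >= threshold:
        return best                       # merge with existing standard
    standard_relations[raw_relation] = raw_vec
    return raw_relation                   # create new standard

print(standardize("IS_EMPLOYED_BY", [0.95, 0.05, 0.1]))  # → WORKS_AT
```

The cached-embeddings input in the diagram corresponds to reusing `standard_relations` across calls instead of re-embedding known standards.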
### Item Processing Workflow
```mermaid
graph TD
A[InputItem] --> B[LLM Entity Extraction]
B --> C[Standardization Process]
C --> D{Negation Detection}
D -->|Positive Statement| E[Duplicate Check]
D -->|Negation| F[Conflict Detection]
E -->|New Relationship| G[Create Edge]
E -->|Duplicate Found| H[Skip/Ignore]
F -->|Conflict Found| I[Temporal Resolution]
F -->|No Conflict| J[Log Error]
G --> K[Neo4j Storage]
I --> L[Obsolete Existing]
L --> M[Update Metadata]
K --> N[Vector Embedding]
M --> N
N --> O[Index Update]
P[Temporal Metadata] --> G
P --> I
Q[Confidence Scoring] --> G
Q --> I
style D fill:#fff3e0,stroke:#e65100,stroke-width:2px
style I fill:#f3e5f5,stroke:#4a148c,stroke-width:2px
style N fill:#e8f5e8,stroke:#2e7d32,stroke-width:2px
style J fill:#ffebee,stroke:#c62828,stroke-width:2px
```
**Key Processing Features:**
1. **🔍 Standardization**: Entities and relationships are normalized using BiEncoder + CrossEncoder
2. **⚠️ Negation Handling**: "Alice no longer works at Google" → obsoletes existing relationship
3. **⏰ Temporal Resolution**: Automatic conflict resolution with date-based transitions
4. **🎯 Confidence Scoring**: Each relationship has confidence metadata for reliability
5. **🔄 Duplicate Prevention**: Exact matches are detected and skipped
6. **📊 Vector Integration**: All changes immediately update semantic search indexes
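The negation and duplicate branches of the workflow can be sketched with an in-memory list standing in for Neo4j. The `process_statement` helper and its field names are illustrative, not the engine's actual internals:

```python
from datetime import date

# In-memory edge store standing in for Neo4j.
edges = [
    {"subject": "Alice", "relation": "WORKS_AT", "object": "Google",
     "obsolete": False, "end_date": None},
]

def process_statement(subject, relation, obj, negated, today=None):
    """Negations obsolete matching active edges (temporal resolution);
    positive statements create an edge unless a duplicate already exists."""
    today = today or date.today()
    if negated:
        for edge in edges:
            if (edge["subject"] == subject and edge["relation"] == relation
                    and edge["object"] == obj and not edge["obsolete"]):
                edge["obsolete"] = True      # obsolete existing
                edge["end_date"] = today     # update metadata
        return
    for edge in edges:                       # duplicate prevention
        if ((edge["subject"], edge["relation"], edge["object"])
                == (subject, relation, obj) and not edge["obsolete"]):
            return
    edges.append({"subject": subject, "relation": relation, "object": obj,
                  "obsolete": False, "end_date": None})

# "Alice no longer works at Google" → obsolete; then a new OpenAI edge.
process_statement("Alice", "WORKS_AT", "Google", negated=True)
process_statement("Alice", "WORKS_AT", "OpenAI", negated=False)
print([(e["object"], e["obsolete"]) for e in edges])
# [('Google', True), ('OpenAI', False)]
```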
## 🚀 Quick Start
### Prerequisites
```bash
# Using Docker (Recommended)
docker-compose up -d
# Or use Neo4j Cloud
# Set NEO4J_URI=neo4j+s://your-instance.neo4j.io
```
### Installation
**From PyPI (Recommended):**
```bash
pip install llm-exo-graph
```
**From Source:**
```bash
git clone https://github.com/your-org/llm-exo-graph
cd llm-exo-graph
pip install -e .
```
**With Optional Dependencies:**
```bash
# For document processing
pip install "llm-exo-graph[documents]"
# For development
pip install "llm-exo-graph[dev]"
# All features
pip install "llm-exo-graph[all]"
```
### Basic Usage
```python
from llm_exo_graph import ExoGraphEngine, InputItem

# Initialize with auto-configuration
engine = ExoGraphEngine()

# Or with custom encoder models
config = {
    "encoder_model": "all-mpnet-base-v2",
    "cross_encoder_model": "cross-encoder/ms-marco-MiniLM-L-12-v2"
}
engine = ExoGraphEngine(config=config)

# Feed knowledge
engine.process_input([
    InputItem("Marie Curie discovered radium in 1898"),
    InputItem("Radium glows green in the dark"),
    InputItem("Marie Curie won the Nobel Prize twice")
])

# Query naturally
response = engine.search("What did Marie Curie discover?")
print(response.answer)
# → "Marie Curie discovered radium in 1898."
```
## 🤖 MCP Integration (Model Context Protocol)
### What is MCP?
MCP enables AI assistants like Claude to directly interact with your knowledge graph via **Server-Sent Events (SSE)**, creating a persistent memory layer that survives across conversations.
### Quick Setup with Docker
1. **Start the MCP Server**
```bash
# Use the notebook docker-compose for MCP development
docker-compose -f docker-compose.notebook.yml up -d
# This starts:
# - Neo4j on port 7687/7474
# - MCP SSE server on port 3000
```
2. **Configure Claude Desktop**
```json
// ~/Library/Application Support/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "exo-graph": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "http://localhost:3000/sse",
        "--allow-http"
      ]
    }
  }
}
```
3. **Restart Claude Desktop** - The MCP server will connect automatically
### Graph Data Examples
After setup, Claude can work with rich graph relationships like these from our Biblical knowledge graph:
```
God → CREATED → man (God created man in his own image) [conf: 0.90]
God → DIVIDED → waters (God divided the waters) [conf: 0.90]
light → EXISTS → light (there was light) [conf: 0.90]
God → SAID → "Let there be light" (God spoke creation into existence) [conf: 0.95]
man → MADE_IN_IMAGE_OF → God (humanity reflects divine nature) [conf: 0.85]
waters → SEPARATED_BY → firmament (division of waters above and below) [conf: 0.88]
```
### Using MCP in Claude
Once configured, Claude gains persistent memory and can:
**💾 Store Knowledge Permanently**
```
Claude: "I'll remember that John works at OpenAI as a researcher"
→ Creates: John → WORKS_AT → OpenAI (researcher role) [conf: 0.95]
```
**🔍 Query Across Sessions**
```
User: "What did we discuss about John yesterday?"
Claude: "You told me John works at OpenAI as a researcher. I have that stored in the knowledge graph."
```
**🔗 Discover Connections**
```
User: "How is John connected to AI research?"
Claude: "Through the knowledge graph, I can see John → WORKS_AT → OpenAI → FOCUSES_ON → AI Research"
```
**📊 Analyze Patterns**
```
User: "Show me all employment relationships you know about"
Claude: "I found 15 employment relationships in the graph, including John at OpenAI, Alice at Google..."
```
**⏰ Track Changes Over Time**
```
User: "John left OpenAI and joined Google"
Claude: "I've updated the graph - obsoleted John's OpenAI relationship and created a new Google relationship with today's date."
```
## 🌐 REST API
### Quick API Usage
```bash
# Start API server
cd kg_api_server
python app/main.py

# Add knowledge
curl -X POST http://localhost:8080/api/v1/process \
  -H "Content-Type: application/json" \
  -d '{"items": [{"description": "Einstein developed E=mc²"}]}'

# Search (quote the URL so the shell does not interpret "?")
curl "http://localhost:8080/api/v1/search?query=Einstein"
```
### API Endpoints
- `POST /api/v1/process` - Add knowledge
- `GET /api/v1/search` - Natural language search
- `GET /api/v1/entities/{name}` - Get entity details
- `DELETE /api/v1/edges/{id}` - Remove relationships
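A small Python sketch of a client for these endpoints. It only builds the URLs and JSON payloads; the `BASE_URL` and helper names are assumptions based on the curl examples above, and sending the requests with an HTTP library is left to the caller:

```python
import json
from urllib.parse import urlencode

BASE_URL = "http://localhost:8080/api/v1"  # assumed default from the examples above

def process_payload(*descriptions):
    """Build the JSON body for POST /api/v1/process."""
    return json.dumps({"items": [{"description": d} for d in descriptions]})

def search_url(query):
    """Build the URL for GET /api/v1/search, with the query safely encoded."""
    return f"{BASE_URL}/search?{urlencode({'query': query})}"

print(search_url("Einstein"))
# http://localhost:8080/api/v1/search?query=Einstein
```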
## 📊 Visualization
Generate beautiful graph visualizations:
```bash
python visualize_graph.py
```
Creates three outputs in `/output`:
- 📄 `knowledge_graph_relationships.txt` - Human-readable relationships
- 🖼️ `knowledge_graph_static.png` - Publication-ready visualization
- 🌐 `knowledge_graph_interactive.html` - Interactive exploration
## 🔧 Configuration
### Engine Configuration
```python
from llm_exo_graph import ExoGraphEngine, Neo4jConfig, OllamaConfig

# Custom encoder configuration
config = {
    "encoder_model": "all-mpnet-base-v2",  # BiEncoder model
    "cross_encoder_model": "cross-encoder/ms-marco-MiniLM-L-12-v2"  # CrossEncoder model
}

# Initialize with all configurations
engine = ExoGraphEngine(
    llm_config=OllamaConfig(model="llama3.2"),
    neo4j_config=Neo4jConfig(),
    config=config
)
```
### Available Encoder Models
**BiEncoder Models** (for semantic embeddings):
- `all-MiniLM-L6-v2` (default) - Fast, good quality
- `all-mpnet-base-v2` - Higher quality, slower
- `sentence-transformers/all-MiniLM-L12-v2` - Balanced
**CrossEncoder Models** (for relationship validation):
- `cross-encoder/ms-marco-MiniLM-L-6-v2` (default) - Fast
- `cross-encoder/ms-marco-MiniLM-L-12-v2` - More accurate
- `cross-encoder/ms-marco-electra-base` - Highest accuracy
### Environment Variables
```bash
# LLM Configuration (auto-detected)
OPENAI_API_KEY=sk-... # For OpenAI
OLLAMA_BASE_URL=http://localhost:11434 # For Ollama
OLLAMA_MODEL=llama3
# Neo4j Configuration
NEO4J_URI=bolt://localhost:7687
NEO4J_USERNAME=neo4j
NEO4J_PASSWORD=password
NEO4J_DATABASE=neo4j
# Optional
LOG_LEVEL=INFO
```
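For illustration only, env-based auto-detection might look like the sketch below: prefer OpenAI when `OPENAI_API_KEY` is set, otherwise fall back to the Ollama defaults listed above. This is a sketch of the idea, not the library's actual detection code:

```python
import os

def llm_config_from_env(env=None):
    """Illustrative auto-detection: OpenAI if a key is present, else Ollama."""
    env = os.environ if env is None else env
    if env.get("OPENAI_API_KEY"):
        return {"provider": "openai", "api_key": env["OPENAI_API_KEY"]}
    return {
        "provider": "ollama",
        "base_url": env.get("OLLAMA_BASE_URL", "http://localhost:11434"),
        "model": env.get("OLLAMA_MODEL", "llama3"),
    }

print(llm_config_from_env({})["provider"])
# ollama
```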
## 📚 Advanced Features
### Custom Model Configuration
Choose encoder models based on your needs:
```python
# High Performance Setup (Fast processing)
fast_config = {
    "encoder_model": "all-MiniLM-L6-v2",
    "cross_encoder_model": "cross-encoder/ms-marco-MiniLM-L-6-v2"
}

# High Accuracy Setup (Better quality)
accurate_config = {
    "encoder_model": "all-mpnet-base-v2",
    "cross_encoder_model": "cross-encoder/ms-marco-MiniLM-L-12-v2"
}

# Domain-Specific Setup (for scientific/technical content)
domain_config = {
    "encoder_model": "sentence-transformers/allenai-specter",
    "cross_encoder_model": "cross-encoder/ms-marco-electra-base"
}

engine = ExoGraphEngine(config=accurate_config)
```
### Document Processing
```python
from llm_exo_graph import DocumentProcessor
processor = DocumentProcessor()
results = processor.process_directory("./research_papers/")
```
### Temporal Relationships & Negation Handling
```python
# Example: Career transitions with temporal intelligence
engine.process_input([
    InputItem("Alice works as a software engineer at Google"),
    InputItem("Alice no longer works at Google"),  # Negation - obsoletes previous
    InputItem("Alice started working at OpenAI in January 2024")  # New relationship
])

# The system automatically:
# 1. Detects "no longer" as negation
# 2. Finds conflicting relationships
# 3. Obsoletes old relationship with end date
# 4. Creates new relationship with start date
```
### Standardization in Action
```python
# These variations are automatically standardized:
engine.process_input([
    InputItem("John works at Microsoft"),
    InputItem("John is employed by Microsoft"),  # Standardized to "WORKS_AT"
    InputItem("John's employer is Microsoft"),   # Also standardized to "WORKS_AT"
])

# Result: All create the same standardized relationship
# John → WORKS_AT → Microsoft (with different summaries)
```
### Conflict Resolution
```python
# Handles contradictions intelligently
history = engine.get_entity_relationships("Alice")
# Shows both relationships with temporal metadata:
# - Alice → WORKS_AT → Google [obsolete: 2024-01-15]
# - Alice → WORKS_AT → OpenAI [active: 2024-01-16]
```
## 🧪 Examples
- 📖 [Bible Knowledge Graph](examples/bible_processing.ipynb)
- 🧬 [Bio Research Graph](examples/bio_example.py)
- 📄 [Document Processing](examples/document_processing_example.py)
- 🔗 [API Integration](kg_api_server/tests/)
## 🛠️ Development
### Running Tests
```bash
pytest tests/
cd kg_api_server && pytest tests/
```
### Contributing
See [CONTRIBUTING.md](docs/development/contributing.md)
## 📈 Performance
- ⚡ 50-74% faster queries with optimizations
- 🔄 Batch processing for large datasets
- 💾 Intelligent caching layers
- 🎯 Optimized Neo4j indexes
## 📦 Package Information
- **PyPI**: [https://pypi.org/project/llm-exo-graph/](https://pypi.org/project/llm-exo-graph/)
- **Install**: `pip install llm-exo-graph`
- **Version**: Check latest on PyPI
- **Extras**: `[documents]`, `[dev]`, `[all]`
## 🤝 Community
- 📖 [Documentation](docs/)
- 🐛 [Issues](https://github.com/your-org/llm-exo-graph/issues)
- 💬 [Discussions](https://github.com/your-org/llm-exo-graph/discussions)
- 📦 [PyPI Package](https://pypi.org/project/llm-exo-graph/)
## 📝 License
MIT License - see [LICENSE](LICENSE)
---
**LLM Exo-Graph** - Giving AI a persistent, searchable memory 🧠✨