# 🔗 daplug-cypher (da•plug)
> **Schema-Driven Cypher Normalization & Event Publishing for Python**
[CircleCI](https://circleci.com/gh/dual/daplug-cypher)
[Python](https://www.python.org/downloads/)
[License](LICENSE)
[Issues](https://github.com/paulcruse3/daplug-cypher/issues)
[PyPI](https://pypi.org/project/daplug-cypher/)
[SonarCloud](https://sonarcloud.io/summary/new_code?id=dual_daplug-cypher)
`daplug-cypher` brings the ergonomics of an adapter pattern to graph databases. It bundles Cypher-friendly schema mapping, optimistic concurrency, and SNS event fan-out so your graph services stay DRY, version-safe, and event-driven, whether you deploy to Neo4j or AWS Neptune (openCypher).
## ✨ Key Features
- **Unified factory** – `daplug_cypher.adapter(**kwargs)` returns a ready-to-go adapter with SNS support, just like `daplug_ddb`.
- **Schema mapping** – Reuse OpenAPI/JSON schemas to validate and normalize payloads before writing nodes or relationships.
- **Optimistic concurrency** – Guard updates with identifier + version keys; the adapter enforces atomic Cypher `SET` semantics.
- **Relationship helpers** – Convenience methods that enforce safe Cypher patterns for creating/deleting relationships.
- **Backend flexibility** – Supply `bolt={...}` for Neo4j, `neptune={...}` for Neptune, or both; the adapter chooses the right driver config automatically.
- **Per-operation targeting** – Pass `node`, `identifier`, and `idempotence_key` to each call so shared adapters can manage multiple labels safely.
- **Per-call SNS metadata** – Supply `sns_attributes` when writing to annotate events with request-specific context.
## 🚀 Quick Start
### Installation
```bash
pip install daplug-cypher
# pipenv install daplug-cypher
# poetry add daplug-cypher
# uv pip install daplug-cypher
```
### Basic Usage
```python
from daplug_cypher import adapter
graph = adapter(
    bolt={
        "url": "bolt://localhost:7687",
        "user": "neo4j",
        "password": "password",
    },
    schema_file="openapi.yml",
    schema="CustomerModel",
)

payload = {
    "customer_id": "abc123",
    "name": "Ada",
    "version": 1,
}

graph.create(data=payload, node="Customer")

result = graph.read(
    query="MATCH (c:Customer) WHERE c.customer_id = $id RETURN c",
    placeholder={"id": "abc123"},
    node="Customer",
)
print(result["Customer"][0]["name"])

graph.create(
    data=payload,
    node="Customer",
    sns_attributes={"source": "api"},
)
```
Because the adapter is schema-aware, every write can opt into mapping by passing `schema`. Skip it when you want to persist the payload exactly as provided. Select the node label (and identifiers) per call so a single adapter can service multiple models, and add `sns_attributes` when you want to decorate published events with request-specific context.
## 🔧 Advanced Configuration
### Public API Cheat Sheet
```python
from daplug_cypher import adapter
graph = adapter(bolt={"url": "bolt://localhost:7687", "user": "neo4j", "password": "password"})

# CREATE ---------------------------------------------------------------
graph.create(
    data={"customer_id": "abc123", "name": "Ada", "version": 1},
    node="Customer",
    sns_attributes={"event": "customer-created"},
)

# READ / MATCH ---------------------------------------------------------
graph.read(
    query="MATCH (c:Customer) WHERE c.customer_id = $id RETURN c",
    placeholder={"id": "abc123"},
    node="Customer",
)

# QUERY (raw parameterized Cypher) ------------------------------------
graph.query(
    query="MATCH (c:Customer) WHERE c.customer_id = $id RETURN c",
    placeholder={"id": "abc123"},
    sns_attributes={"source": "reporting"},
)

# UPDATE (optimistic) --------------------------------------------------
graph.update(
    data={"status": "vip"},
    query="MATCH (c:Customer) WHERE c.customer_id = $id RETURN c",
    placeholder={"id": "abc123"},
    original_idempotence_value=1,
    node="Customer",
    identifier="customer_id",
    idempotence_key="version",
    sns_attributes={"event": "customer-updated"},
)

# DELETE ---------------------------------------------------------------
graph.delete(
    delete_identifier="abc123",
    node="Customer",
    identifier="customer_id",
    sns_attributes={"event": "customer-deleted"},
)

# RELATIONSHIP HELPERS -------------------------------------------------
graph.create_relationship(
    query="""
    MATCH (c:Customer), (o:Order)
    WHERE c.customer_id = $customer AND o.order_id = $order
    CREATE (c)-[:PLACED]->(o)
    RETURN c, o
    """,
    placeholder={"customer": "abc123", "order": "o-789"},
    sns_attributes={"event": "relationship-created"},
)

graph.delete_relationship(
    query="""
    MATCH (c:Customer)-[r:PLACED]->(o:Order)
    WHERE c.customer_id = $customer AND o.order_id = $order
    DETACH DELETE r
    """,
    placeholder={"customer": "abc123", "order": "o-789"},
    sns_attributes={"event": "relationship-deleted"},
)
```
Each method mirrors the DynamoDB adapter API: provide per-call metadata, and the adapter handles schema normalization, optimistic locking, driver orchestration, and optional SNS fan-out.
### Neo4j & Neptune Targets
```python
graph = adapter(
    bolt={"url": "bolt://localhost:7687", "user": "neo4j", "password": "password"},
    neptune={"url": "bolt://neptune-endpoint:8182", "user": "user", "password": "secret"},
)
```
Provide both dictionaries to allow local Neo4j development with a production Neptune endpoint. When `neptune` is supplied it wins; otherwise `bolt` is used.
Use the same adapter instance for different node types by passing the appropriate label to each call (e.g., `graph.create(..., node="Order")`).
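One way to wire this up is to derive the adapter kwargs from the environment, so production picks up Neptune while local development falls back to Neo4j. The helper below is an illustrative sketch, not part of the library; the variable names mirror the test-suite environment variables documented later, and the Neptune-wins behavior follows the rule above.

```python
def build_graph_config(env):
    """Return adapter kwargs based on which endpoints are configured.

    `env` is any mapping (e.g. os.environ). When a Neptune endpoint is
    present it is added alongside bolt, and the adapter prefers it.
    """
    config = {
        "bolt": {
            "url": env.get("NEO4J_BOLT_URL", "bolt://localhost:7687"),
            "user": env.get("NEO4J_USER", "neo4j"),
            "password": env.get("NEO4J_PASSWORD", "password"),
        }
    }
    if env.get("NEPTUNE_BOLT_URL"):
        # Neptune credentials fall back to the Neo4j values when unset.
        config["neptune"] = {
            "url": env["NEPTUNE_BOLT_URL"],
            "user": env.get("NEPTUNE_USER", config["bolt"]["user"]),
            "password": env.get("NEPTUNE_PASSWORD", config["bolt"]["password"]),
        }
    return config
```

With that in place, `adapter(**build_graph_config(os.environ))` yields the right backend in each environment.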
### Optimistic Updates
```python
graph.update(
    data={"order_id": "abc123", "updated_at": 2, "status": "shipped"},
    query="MATCH (o:Order) WHERE o.order_id = $id RETURN o",
    placeholder={"id": "abc123"},
    original_idempotence_value=1,  # the previous value of updated_at
    node="Order",
    identifier="order_id",
    idempotence_key="updated_at",
    sns_attributes={"event": "status-change"},
)
```
If another session updates the node first, the adapter raises `ValueError("ATOMIC ERROR...")` rather than overwriting silently.
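A common pattern is to re-read the current version and retry when that guard trips. The sketch below assumes this behavior; `do_update` and `fetch_version` are hypothetical callables standing in for a `graph.update(...)` call and a read of the node's current idempotence value.

```python
def update_with_retry(do_update, fetch_version, attempts=3):
    """Re-read the version and retry when the optimistic guard trips."""
    version = fetch_version()
    for _ in range(attempts):
        try:
            return do_update(original_idempotence_value=version)
        except ValueError as exc:
            if "ATOMIC ERROR" not in str(exc):
                raise  # unrelated error: surface it
            version = fetch_version()  # another writer won; refresh and retry
    raise RuntimeError(f"optimistic update failed after {attempts} attempts")
```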
### Relationship Helpers
```python
graph.create(data={"customer_id": "abc123", "version": 1}, node="Customer")
graph.create(data={"order_id": "o-789", "version": 1}, node="Order")

graph.create_relationship(
    query="""
    MATCH (c:Customer), (o:Order)
    WHERE c.customer_id = $customer AND o.order_id = $order
    CREATE (c)-[:PLACED]->(o)
    RETURN c, o
    """,
    placeholder={"customer": "abc123", "order": "o-789"},
)

graph.delete_relationship(
    query="""
    MATCH (c:Customer)-[r:PLACED]->(o:Order)
    WHERE c.customer_id = $customer AND o.order_id = $order
    DETACH DELETE r
    """,
    placeholder={"customer": "abc123", "order": "o-789"},
)
```
Validation ensures that relationship queries include edge notation and that destructive operations actually delete nodes or relationships.
### SNS Event Publishing
```python
graph = adapter(
    bolt={...},
    sns_arn="arn:aws:sns:us-east-2:123456789012:customers",
    sns_attributes={"service": "crm"},
)

graph.delete(delete_identifier="abc123", node="Customer", identifier="customer_id")
```
Each write publishes an event with merged attributes (`operation`, default adapter attributes, custom attributes, and call-specific overrides).
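To make the layering concrete, the snippet below illustrates one plausible merge order with plain dictionaries. The precedence (call-specific values override adapter defaults, which override the implicit `operation` attribute) is an assumption for illustration, not a statement of the library's exact internals.

```python
# Three layers of SNS message attributes, merged left to right so that
# later layers override earlier ones.
operation_attr = {"operation": "delete"}                       # set by the adapter
adapter_defaults = {"service": "crm"}                          # adapter(sns_attributes=...)
call_specific = {"service": "billing", "request_id": "r-42"}   # per-call override

merged = {**operation_attr, **adapter_defaults, **call_specific}
```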
## π§ͺ Testing
We split fast unit tests from integration suites targeting Neo4j and Neptune-compatible endpoints.
```bash
# Unit tests (pure Python, heavy mocking)
pipenv run test
# Integration suites
pipenv run test_neo4j # requires Neo4j Bolt endpoint (defaults to bolt://localhost:7687)
pipenv run test_neptune # reuses Bolt settings, can point at Neptune or LocalStack
# Coverage (Neo4j suite under coverage)
pipenv run coverage
```
Environment variables to override defaults:
| Variable | Purpose | Default |
| ---------------------- | ---------------------------------- | --------------------- |
| `NEO4J_BOLT_URL` | Neo4j Bolt connection URI | `bolt://localhost:7687` |
| `NEO4J_USER` / `_PASSWORD` | Neo4j credentials | `neo4j` / `password` |
| `NEPTUNE_BOLT_URL` | Neptune Bolt-compatible endpoint | falls back to Neo4j |
| `NEPTUNE_USER` / `_PASSWORD` | Neptune credentials | falls back to Neo4j |
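For example, to point the suites at your own instances (the endpoints below are placeholders):

```shell
export NEO4J_BOLT_URL="bolt://graph.internal:7687"
export NEO4J_USER="neo4j"
export NEO4J_PASSWORD="s3cret"
export NEPTUNE_BOLT_URL="bolt://neptune.internal:8182"

pipenv run test_neo4j
pipenv run test_neptune
```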
## 🧰 Tooling & CI
`.circleci/config.yml` mirrors the DynamoDB project:
- `install-build` installs dependencies and persists the workspace.
- `lint` and `type-check` run `pipenv run lint` and `pipenv run mypy`.
- `test-neo4j` and `test-neptune` run pytest markers in parallel; the Neptune job provisions a LocalStack container for compatibility checks.
- `install-build-publish` retains the token-based PyPI workflow.
## 🛠️ Local Development
### Prerequisites
- Python **3.9+**
- [Pipenv](https://pipenv.pypa.io/)
- Docker (for running Neo4j or LocalStack locally)
### Environment Setup
```bash
git clone https://github.com/paulcruse3/daplug-cypher.git
cd daplug-cypher
pipenv install --dev
```
### Workflow
```bash
pipenv run lint # pylint (JSON + HTML report)
pipenv run mypy # static typing (post-phase polish)
pipenv run test # unit tests
pipenv run test_neo4j # integration suite (requires Bolt endpoint)
pipenv run test_neptune # integration suite (LocalStack/Neptune)
```
When running Neo4j via Docker, set `NEO4J_AUTH=neo4j/password` before `docker run` so the tests can authenticate automatically.
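A typical local invocation might look like the following (container name and image tag are illustrative; adjust to taste):

```shell
# Run Neo4j in Docker with the credentials the test suite expects
docker run -d --name neo4j-test \
  -p 7474:7474 -p 7687:7687 \
  -e NEO4J_AUTH=neo4j/password \
  neo4j:5
```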
## 🗂️ Project Structure
```txt
daplug-cypher/
├── daplug_cypher/
│   ├── adapter.py           # Cypher adapter implementation
│   ├── common/              # Shared schema, merge, logging, publisher helpers
│   ├── cypher/              # Parameter + serialization utilities
│   ├── types/               # Shared TypedDict/type aliases (reused by common)
│   └── __init__.py          # Public adapter factory & exports
├── tests/
│   ├── integration/         # Neo4j & Neptune pytest suites
│   └── unit/                # Mock-based unit coverage for every module
├── .circleci/config.yml     # CI pipeline
├── Pipfile                  # Runtime & dev dependencies + scripts
├── setup.py / setup.cfg     # Packaging metadata & pytest config
└── README.md                # You are here
```
## 🤝 Contributing
We welcome issues and pull requests! Please ensure linting, typing, and both integration suites pass before submitting.
```bash
git checkout -b feature/amazing-cypher
# make your changes
pipenv run lint
pipenv run mypy
pipenv run test
pipenv run test_neo4j
pipenv run test_neptune
git commit -am "feat: amazing cypher enhancement"
git push origin feature/amazing-cypher
```
## 📄 License
Apache License 2.0 – see [LICENSE](LICENSE) for the full text.
---
> Built to keep Cypher integrations clean and predictable.