Name | graph-notebook |
Version | 5.0.1 |
home_page | None |
Summary | Jupyter notebook extension to connect to graph databases |
upload_time | 2025-05-19 21:05:10 |
maintainer | None |
docs_url | None |
author | None |
requires_python | <3.12,>=3.9 |
license |
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS |
keywords | gremlin, jupyter, neptune, opencypher, sparql |
VCS | |
bugtrack_url | |
requirements | No requirements were recorded. |
Travis-CI | No Travis. |
coveralls test coverage | No coveralls. |
# Graph Notebook: easily query and visualize graphs
The graph notebook provides an easy way to interact with graph databases using Jupyter notebooks. Using this open-source Python package, you can connect to any graph database that supports the [Apache TinkerPop](https://tinkerpop.apache.org/), [openCypher](https://github.com/opencypher/openCypher) or the [RDF SPARQL](https://www.w3.org/TR/rdf-sparql-query/) graph models. These databases could be running locally on your desktop or in the cloud. Graph databases can be used to explore a variety of use cases including [knowledge graphs](https://aws.amazon.com/neptune/knowledge-graphs-on-aws/) and [identity graphs](https://aws.amazon.com/neptune/identity-graphs-on-aws/).

## Visualizing Gremlin queries

## Visualizing openCypher queries

## Visualizing SPARQL queries

Instructions for connecting to the following graph databases:
| Endpoint | Graph model | Query language |
| :-----------------------------: | :---------------------: | :-----------------: |
|[Gremlin Server](#gremlin-server)| property graph | Gremlin |
| [Blazegraph](#blazegraph) | RDF | SPARQL |
|[Amazon Neptune](#amazon-neptune)| property graph or RDF | Gremlin, openCypher, or SPARQL |
| [Neo4J](#neo4j) | property graph | Cypher |
We encourage others to contribute configurations they find useful. There is an [`additional-databases`](https://github.com/aws/graph-notebook/blob/main/additional-databases) folder where more information can be found.
## Features
### Notebook cell 'magic' extensions in the IPython 3 kernel
`%%sparql` - Executes a SPARQL query against your configured database endpoint. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/notebooks-magics.html#notebooks-cell-magics-sparql)
`%%gremlin` - Executes a Gremlin query against your database using web sockets. The results are similar to those a Gremlin console would return. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/notebooks-magics.html#notebooks-cell-magics-gremlin)
`%%opencypher` or `%%oc` - Executes an openCypher query against your database. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/notebooks-magics.html#notebooks-cell-magics-opencypher)
`%%graph_notebook_config` - Sets the executing notebook's database configuration to the JSON payload provided in the cell body.
`%%graph_notebook_vis_options` - Sets the executing notebook's [vis.js options](https://visjs.github.io/vis-network/docs/network/physics.html) to the JSON payload provided in the cell body.
`%%neptune_ml` - Set of commands to integrate with NeptuneML functionality, as described [here](https://docs.aws.amazon.com/neptune/latest/userguide/notebooks-magics.html#notebooks-line-magics-neptune_ml). [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning.html)
**TIP** :point_right: `%%sparql`, `%%gremlin`, and `%%oc` share a [suite of common arguments](https://docs.aws.amazon.com/neptune/latest/userguide/notebooks-magics.html#notebook-magics-query-args) that can be used to customize the appearance of rendered graphs. Example usage of these arguments can also be found in the sample notebooks under [02-Visualization](https://github.com/aws/graph-notebook/tree/main/src/graph_notebook/notebooks/02-Visualization).
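For example, a cell like the following runs a Gremlin query and uses the `-p` (path pattern) visualization hint so that the returned paths render as vertices and edges in the Graph tab. This is a sketch that assumes the air-routes sample data used in the example notebooks, where airports carry a `code` property and are connected by `route` edges:

``` python
%%gremlin -p v,oute,inv
g.V().has('code','AUS').outE('route').inV().path().limit(10)
```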
**TIP** :point_right: There is syntax highlighting for language query magic cells to help you structure your queries more easily.
#### Notebook line 'magic' extensions in the IPython 3 kernel
`%gremlin_status` - Obtain the status of Gremlin queries. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/gremlin-api-status.html)
`%sparql_status` - Obtain the status of SPARQL queries. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/sparql-api-status.html)
`%opencypher_status` or `%oc_status` - Obtain the status of openCypher queries. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/access-graph-opencypher-status.html)
`%load` - Generate a form to submit a bulk loader job. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/bulk-load.html)
`%load_ids` - Get ids of bulk load jobs. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/load-api-reference-status-examples.html)
`%load_status` - Get the status of a provided `load_id`. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/load-api-reference-status-examples.html)
`%cancel_load` - Cancels a bulk load job. You can either provide a single `load_id`, or specify `--all-in-queue` to cancel all queued (and not actively running) jobs. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/load-api-reference-cancel.html)
`%neptune_ml` - Set of commands to integrate with NeptuneML functionality, as described [here](https://docs.aws.amazon.com/neptune/latest/userguide/notebooks-magics.html#notebooks-cell-magics-neptune_ml). You can find a set of tutorial notebooks [here](https://github.com/aws/graph-notebook/tree/main/src/graph_notebook/notebooks/04-Machine-Learning).
[Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning.html)
`%status` - Check the Health Status of the configured host endpoint. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/access-graph-status.html)
`%seed` - Provides a form to add data to your graph, using sets of insert queries instead of a bulk loader. Sample RDF and Property Graph data models are provided with this command. Alternatively, you can select a language type and provide a file path (or a directory path containing one or more of these files) to load the queries from.
`%stream_viewer` - Interactively explore the Neptune CDC stream (if enabled).
`%graph_notebook_config` - Returns a JSON payload that contains connection information for your host.
`%graph_notebook_host` - Set the host endpoint to send queries to.
`%graph_notebook_version` - Print the version of the `graph-notebook` package.
`%graph_notebook_vis_options` - Print the Vis.js options being used for rendered graphs.
**TIP** :point_right: You can list all the magics installed in the Python 3 kernel using the `%lsmagic` command.
**TIP** :point_right: Many of the magic commands support a `--help` option in order to provide additional information.
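For instance, after connecting you might run a quick diagnostic cell such as the following (a minimal sketch; the `%status` magic applies to Amazon Neptune endpoints):

``` python
# print the installed graph-notebook version
%graph_notebook_version

# show the connection configuration currently in use
%graph_notebook_config

# check the health of the configured endpoint (Amazon Neptune)
%status
```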
## Example notebooks
This project includes many example Jupyter notebooks. It is recommended to explore them. All of the commands and features supported by `graph-notebook` are explained in detail with examples within the sample notebooks. You can find them [here](./src/graph_notebook/notebooks/). As this project has evolved, many new features have been added. If you are already familiar with graph-notebook but want a quick summary of new features added, a good place to start is the Air-Routes notebooks in the [02-Visualization](./src/graph_notebook/notebooks/02-Visualization) folder.
## Keeping track of new features
It is recommended to check the [ChangeLog.md](ChangeLog.md) file periodically to keep up to date as new features are added.
## Prerequisites
You will need:
* [Python](https://www.python.org/downloads/)
  * For the JupyterLab 4.x version: Python 3.9.x-3.11.x
  * For the JupyterLab 3.x version: Python 3.9.x-3.10.14
* A graph database that provides one or more of:
  * A SPARQL 1.1 endpoint
  * An Apache TinkerPop Gremlin Server compatible endpoint
  * An endpoint compatible with openCypher
## Installation
Follow the instructions for either Jupyter Classic Notebook or JupyterLab based on your requirements.
### Jupyter Classic Notebook
``` bash
pip install graph-notebook
# Enable the visualization widget
jupyter nbclassic-extension enable --py --sys-prefix graph_notebook.widgets
# copy static html resources
python -m graph_notebook.static_resources.install
python -m graph_notebook.nbextensions.install
# copy premade starter notebooks
python -m graph_notebook.notebooks.install --destination ~/notebook/destination/dir
# create nbconfig file and directory tree, if they do not already exist
mkdir -p ~/.jupyter/nbconfig
touch ~/.jupyter/nbconfig/notebook.json
# start jupyter notebook
python -m graph_notebook.start_notebook --notebooks-dir ~/notebook/destination/dir
```
### Jupyter Lab
Since version 5.0.0, graph-notebook supports JupyterLab 4.x, featuring a modernized widget architecture and improved compatibility.
Choose your installation based on your JupyterLab version requirements.
### JupyterLab 4.x (Recommended)
``` bash
# install jupyterlab
pip install "jupyterlab>=4.3.5,<5"
# Install the latest version with JupyterLab 4.x support
pip install graph-notebook
# copy premade starter notebooks
python -m graph_notebook.notebooks.install --destination ~/notebook/destination/dir
# start jupyterlab
python -m graph_notebook.start_jupyterlab --jupyter-dir ~/notebook/destination/dir
```
### JupyterLab 3.x (Legacy)
``` bash
# install jupyterlab
pip install "jupyterlab>=3,<4"
# Install legacy version for JupyterLab 3.x compatibility
pip install "graph-notebook<5.0.0"
# copy premade starter notebooks
python -m graph_notebook.notebooks.install --destination ~/notebook/destination/dir
# start jupyterlab
python -m graph_notebook.start_jupyterlab --jupyter-dir ~/notebook/destination/dir
```
#### Loading magic extensions in JupyterLab
When attempting to run a line or cell magic in a new notebook in JupyterLab, you may encounter the following error:
``` bash
UsageError: Cell magic `%%graph_notebook_config` not found.
```
To fix this, run the following command, then restart JupyterLab.
``` bash
python -m graph_notebook.ipython_profile.configure_ipython_profile
```
Alternatively, the magic extensions can be manually reloaded for a single notebook by running the following command in an empty cell.
``` bash
%load_ext graph_notebook.magics
```
## Upgrading an existing installation
``` bash
# upgrade graph-notebook
pip install graph-notebook --upgrade
```
After the above command completes, rerun the commands given in [Jupyter Classic Notebook](#jupyter-classic-notebook) or [JupyterLab 3.x](#jupyterlab-3x) based on which flavour is installed.
## Connecting to a graph database
Configuration options can be set using the `%%graph_notebook_config` cell magic, which accepts a JSON object in the cell body. The JSON object can contain any of the configuration options listed below. The magic can be run multiple times to change the configuration, and the most recently applied configuration is used for all subsequent queries.
| Configuration Option | Description | Default Value | Type |
| --- | --- | --- | --- |
| auth_mode | The authentication mode to use for Amazon Neptune connections | DEFAULT | string |
| aws_region | The AWS region to use for Amazon Neptune connections | your-region-1 | string |
| host | The host url to form a connection with | localhost | string |
| load_from_s3_arn | The ARN of the S3 bucket to load data from [Amazon Neptune only] | | string |
| neptune_service | The name of the Neptune service for the host url [Amazon Neptune only] | neptune-db | string |
| port | The port to use when creating a connection | 8182 | number |
| proxy_host | The proxy host url to route a connection through [Amazon Neptune only]| | string |
| proxy_port | The proxy port to use when creating proxy connection [Amazon Neptune only] | 8182 | number |
| ssl | Whether to make connections to the created endpoint with ssl or not [True/False] | False | boolean |
| ssl_verify | Whether to verify the server's TLS certificate or not [True/False] | True | boolean |
| sparql | SPARQL connection object | ``` { "path": "sparql" } ``` | string |
| gremlin | Gremlin connection object | ``` { "username": "", "password": "", "traversal_source": "g", "message_serializer": "graphsonv3" } ```| string |
| neo4j | Neo4J connection object |``` { "username": "neo4j", "password": "password", "auth": true, "database": null } ``` | string |
### Gremlin Server
In a new cell in the Jupyter notebook, change the configuration using `%%graph_notebook_config` and modify the fields for `host`, `port`, and `ssl`. Optionally, modify `traversal_source` if your graph traversal source name differs from the default value, `username` and `password` if required by the graph store, or `message_serializer` for a specific data transfer format. For a local Gremlin server (HTTP or WebSockets), you can use the following command:
``` python
%%graph_notebook_config
{
"host": "localhost",
"port": 8182,
"ssl": false,
"gremlin": {
"traversal_source": "g",
"username": "",
"password": "",
"message_serializer": "graphsonv3"
}
}
```
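Once this configuration has been applied, Gremlin queries can be submitted from any cell. For example (a generic query that assumes nothing about your data):

``` python
%%gremlin
g.V().limit(5)
```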
To set up a new local Gremlin Server for use with the graph notebook, check out [`additional-databases/gremlin-server`](additional-databases/gremlin-server).
### Blazegraph
Change the configuration using `%%graph_notebook_config` and modify the fields for `host`, `port`, and `ssl`. For a local Blazegraph database, you can use the following command:
``` python
%%graph_notebook_config
{
"host": "localhost",
"port": 9999,
"ssl": false,
"sparql": {
"path": "sparql"
}
}
```
You can also make use of Blazegraph namespaces by specifying the path that `graph-notebook` should use when issuing SPARQL queries, as shown below:
``` python
%%graph_notebook_config
{
"host": "localhost",
"port": 9999,
"ssl": false,
"sparql": {
"path": "blazegraph/namespace/foo/sparql"
}
}
```
This will result in the URL `localhost:9999/blazegraph/namespace/foo/sparql` being used when executing any `%%sparql` magic commands.
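With either path configured, a `%%sparql` cell is routed to that URL. A minimal example query (not tied to any particular dataset):

``` python
%%sparql
SELECT ?s ?p ?o
WHERE { ?s ?p ?o }
LIMIT 10
```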
To set up a new local Blazegraph database for use with the graph notebook, check out the [Quick Start](https://github.com/blazegraph/database/wiki/Quick_Start) from Blazegraph.
### Amazon Neptune
Change the configuration using `%%graph_notebook_config` and modify the defaults as they apply to your Neptune instance.
#### Neptune DB
``` python
%%graph_notebook_config
{
"host": "your-neptune-endpoint",
"neptune_service": "neptune-db",
"port": 8182,
"auth_mode": "DEFAULT",
"load_from_s3_arn": "",
"ssl": true,
"ssl_verify": true,
"aws_region": "your-neptune-region"
}
```
#### Neptune Analytics
``` python
%%graph_notebook_config
{
"host": "your-neptune-endpoint",
"neptune_service": "neptune-graph",
"port": 443,
"auth_mode": "IAM",
"ssl": true,
"ssl_verify": true,
"aws_region": "your-neptune-region"
}
```
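After applying either of these configurations, queries can be submitted with the language magics described above. For example, a simple openCypher cell (a sketch that assumes your Neptune engine supports openCypher and nothing about your data):

``` python
%%oc
MATCH (n) RETURN n LIMIT 5
```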
To set up a new Amazon Neptune cluster, check out the [Amazon Web Services documentation](https://docs.aws.amazon.com/neptune/latest/userguide/manage-console-launch.html).
When connecting the graph notebook to Neptune via a private endpoint, make sure your network is set up to communicate with the VPC that Neptune runs in. If not, you can follow [this guide](https://github.com/aws/graph-notebook/tree/main/additional-databases/neptune).
In addition to the above configuration options, you can also specify the following options:
### Amazon Neptune Proxy Connection
``` python
%%graph_notebook_config
{
"host": "clustername.cluster-ididididid.us-east-1.neptune.amazonaws.com",
"neptune_service": "neptune-db",
"port": 8182,
"ssl": true,
"proxy_port": 8182,
"proxy_host": "host.proxy.com",
"auth_mode": "IAM",
"aws_region": "us-east-1",
"load_from_s3_arn": ""
}
```
See also: [Connecting to Amazon Neptune from clients outside the Neptune VPC using an AWS Network Load Balancer](https://aws-samples.github.io/aws-dbs-refarch-graph/src/connecting-using-a-load-balancer/#connecting-to-amazon-neptune-from-clients-outside-the-neptune-vpc-using-aws-network-load-balancer).
## Authentication (Amazon Neptune)
If you are running a SigV4 authenticated endpoint, ensure that your configuration has `auth_mode` set to `IAM`:
``` python
%%graph_notebook_config
{
"host": "your-neptune-endpoint",
"neptune_service": "neptune-db",
"port": 8182,
"auth_mode": "IAM",
"load_from_s3_arn": "",
"ssl": true,
"ssl_verify": true,
"aws_region": "your-neptune-region"
}
```
Additionally, you should have the following Amazon Web Services credentials available in a location accessible to Boto3:
* Access Key ID
* Secret Access Key
* Default Region
* Session Token (OPTIONAL. Use if you are using temporary credentials)
These variables must follow a specific naming convention, as listed in the [Boto3 documentation](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html#using-environment-variables).
A list of all locations checked for Amazon Web Services credentials can also be found [here](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html#configuring-credentials).
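If you prefer to set these from within the notebook itself, one option is IPython's built-in `%env` line magic, which exports environment variables under the standard names Boto3 reads. This is only a sketch; the values below are placeholders:

``` python
# placeholders -- substitute your own credentials and region
%env AWS_ACCESS_KEY_ID=your-access-key-id
%env AWS_SECRET_ACCESS_KEY=your-secret-access-key
%env AWS_DEFAULT_REGION=us-east-1
# only required when using temporary credentials
%env AWS_SESSION_TOKEN=your-session-token
```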
### Neo4J
Change the configuration using `%%graph_notebook_config` and modify the fields for `host`, `port`, `ssl`, and `neo4j` authentication.
If your Neo4J instance supports [multiple databases](https://neo4j.com/developer/manage-multiple-databases/), you can specify a database name via the `database` field. Otherwise, leave the `database` field blank to query the default database.
For a local Neo4j Desktop database, you can use the following command:
``` python
%%graph_notebook_config
{
"host": "localhost",
"port": 7687,
"ssl": false,
"neo4j": {
"username": "neo4j",
"password": "password",
"auth": true,
"database": ""
}
}
```
Ensure that you also specify the `%%oc bolt` option when submitting queries to the Bolt endpoint.
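For example, a minimal query cell against the Bolt endpoint could look like this (a sketch; the query does not assume anything about your data):

``` python
%%oc bolt
MATCH (n) RETURN count(n) AS nodeCount
```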
To set up a new local Neo4J Desktop database for use with the graph notebook, check out the [Neo4J Desktop User Interface Guide](https://neo4j.com/developer/neo4j-desktop/).
## Building From Source
A pre-release distribution can be built from the graph-notebook repository via the following steps:
``` bash
# 1) Clone the repository and navigate into the clone directory
git clone https://github.com/aws/graph-notebook.git
cd graph-notebook
# 2) Create a new virtual environment
# 2a) Option 1 - pyenv
# install pyenv - https://github.com/pyenv/pyenv?tab=readme-ov-file#installation
# install pyenv-virtualenv - https://github.com/pyenv/pyenv-virtualenv?tab=readme-ov-file#installation
pyenv install 3.10.13 # Only if not already installed; this can be any supported Python 3 version in Prerequisites
pyenv virtualenv 3.10.13 build-graph-notebook
pyenv local build-graph-notebook
# 2b) Option 2 - venv
deactivate
conda deactivate
rm -rf /tmp/venv
python3 -m venv --clear /tmp/venv
source /tmp/venv/bin/activate
# 3) Install build dependencies
pip install --upgrade build hatch hatch-jupyter-builder
pip install "jupyterlab>=4.3.5,<5"
# 4) Build the distribution
python3 -m build .
```
You should now be able to find the built distribution at
`./dist/graph_notebook-5.0.1-py3-none-any.whl`
You can then use it by following the [installation](https://github.com/aws/graph-notebook#installation) steps, replacing
``` bash
pip install graph-notebook
```
with
``` bash
pip install ./dist/graph_notebook-5.0.1-py3-none-any.whl --force-reinstall
```
## Contributing Guidelines
See [CONTRIBUTING](https://github.com/aws/graph-notebook/blob/main/CONTRIBUTING.md) for more information.
## License
This project is licensed under the Apache-2.0 License.
Raw data
{
"_id": null,
"home_page": null,
"name": "graph-notebook",
"maintainer": null,
"docs_url": null,
"requires_python": "<3.12,>=3.9",
"maintainer_email": null,
"keywords": "gremlin, jupyter, neptune, opencypher, sparql",
"author": null,
"author_email": "amazon-neptune <amazon-neptune-pypi@amazon.com>",
"download_url": "https://files.pythonhosted.org/packages/f4/fa/f3baf8acd60744188e768c68fe3b3bcca03618a9580f4ffd5c4be082a949/graph_notebook-5.0.1.tar.gz",
"platform": null,
"description": "# Graph Notebook: easily query and visualize graphs\n\nThe graph notebook provides an easy way to interact with graph databases using Jupyter notebooks. Using this open-source Python package, you can connect to any graph database that supports the [Apache TinkerPop](https://tinkerpop.apache.org/), [openCypher](https://github.com/opencypher/openCypher) or the [RDF SPARQL](https://www.w3.org/TR/rdf-sparql-query/) graph models. These databases could be running locally on your desktop or in the cloud. Graph databases can be used to explore a variety of use cases including [knowledge graphs](https://aws.amazon.com/neptune/knowledge-graphs-on-aws/) and [identity graphs](https://aws.amazon.com/neptune/identity-graphs-on-aws/).\n\n\n\n## Visualizing Gremlin queries\n\n\n\n## Visualizing openCypher queries\n\n\n\n## Visualizing SPARQL queries\n\n\n\nInstructions for connecting to the following graph databases:\n\n| Endpoint | Graph model | Query language |\n| :-----------------------------: | :---------------------: | :-----------------: |\n|[Gremlin Server](#gremlin-server)| property graph | Gremlin |\n| [Blazegraph](#blazegraph) | RDF | SPARQL |\n|[Amazon Neptune](#amazon-neptune)| property graph or RDF | Gremlin, openCypher, or SPARQL |\n| [Neo4J](#neo4j) | property graph | Cypher |\n\nWe encourage others to contribute configurations they find useful. There is an [`additional-databases`](https://github.com/aws/graph-notebook/blob/main/additional-databases) folder where more information can be found.\n\n## Features\n\n### Notebook cell 'magic' extensions in the IPython 3 kernel\n\n`%%sparql` - Executes a SPARQL query against your configured database endpoint. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/notebooks-magics.html#notebooks-cell-magics-sparql)\n\n`%%gremlin` - Executes a Gremlin query against your database using web sockets. The results are similar to those a Gremlin console would return. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/notebooks-magics.html#notebooks-cell-magics-gremlin)\n\n`%%opencypher` or `%%oc` Executes an openCypher query against your database. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/notebooks-magics.html#notebooks-cell-magics-opencypher)\n\n`%%graph_notebook_config` - Sets the executing notebook's database configuration to the JSON payload provided in the cell body.\n\n`%%graph_notebook_vis_options` - Sets the executing notebook's [vis.js options](https://visjs.github.io/vis-network/docs/network/physics.html) to the JSON payload provided in the cell body.\n\n`%%neptune_ml` - Set of commands to integrate with NeptuneML functionality, as described [here](https://docs.aws.amazon.com/neptune/latest/userguide/notebooks-magics.html#notebooks-line-magics-neptune_ml). [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning.html)\n\n**TIP** :point_right: `%%sparql`, `%%gremlin`, and `%%oc` share a [suite of common arguments](https://docs.aws.amazon.com/neptune/latest/userguide/notebooks-magics.html#notebook-magics-query-args) that be used to customize the appearance of rendered graphs. 
Example usage of these arguments can also be found in the sample notebooks under [02-Visualization](https://github.com/aws/graph-notebook/tree/main/src/graph_notebook/notebooks/02-Visualization).\n\n**TIP** :point_right: There is syntax highlighting for language query magic cells to help you structure your queries more easily.\n\n#### Notebook line 'magic' extensions in the IPython 3 kernel\n\n`%gremlin_status` - Obtain the status of Gremlin queries. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/gremlin-api-status.html)\n\n`%sparql_status` - Obtain the status of SPARQL queries. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/sparql-api-status.html)\n\n`%opencypher_status` or `%oc_status` - Obtain the status of openCypher queries. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/access-graph-opencypher-status.html)\n\n`%load` - Generate a form to submit a bulk loader job. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/bulk-load.html)\n\n`%load_ids` - Get ids of bulk load jobs. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/load-api-reference-status-examples.html)\n\n`%load_status` - Get the status of a provided `load_id`. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/load-api-reference-status-examples.html)\n\n`%cancel_load` - Cancels a bulk load job. You can either provide a single `load_id`, or specify `--all-in-queue` to cancel all queued (and not actively running) jobs. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/load-api-reference-cancel.html)\n\n`%neptune_ml` - Set of commands to integrate with NeptuneML functionality, as described [here](https://docs.aws.amazon.com/neptune/latest/userguide/notebooks-magics.html#notebooks-cell-magics-neptune_ml). You can find a set of tutorial notebooks [here](https://github.com/aws/graph-notebook/tree/main/src/graph_notebook/notebooks/04-Machine-Learning).\n[Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning.html)\n\n`%status` - Check the Health Status of the configured host endpoint. [Documentation](https://docs.aws.amazon.com/neptune/latest/userguide/access-graph-status.html)\n\n`%seed` - Provides a form to add data to your graph, using sets of insert queries instead of a bulk loader. Sample RDF and Property Graph data models are provided with this command. Alternatively, you can select a language type and provide a file path(or a directory path containing one or more of these files) to load the queries from.\n\n`%stream_viewer` - Interactively explore the Neptune CDC stream (if enabled)\n\n`%graph_notebook_config` - Returns a JSON payload that contains connection information for your host.\n\n`%graph_notebook_host` - Set the host endpoint to send queries to.\n\n`%graph_notebook_version` - Print the version of the `graph-notebook` package\n\n`%graph_notebook_vis_options` - Print the Vis.js options being used for rendered graphs\n\n**TIP** :point_right: You can list all the magics installed in the Python 3 kernel using the `%lsmagic` command.\n\n**TIP** :point_right: Many of the magic commands support a `--help` option in order to provide additional information.\n\n## Example notebooks\n\nThis project includes many example Jupyter notebooks. It is recommended to explore them. All of the commands and features supported by `graph-notebook` are explained in detail with examples within the sample notebooks. You can find them [here](./src/graph_notebook/notebooks/). 
As this project has evolved, many new features have been added. If you are already familiar with graph-notebook but want a quick summary of new features added, a good place to start is the Air-Routes notebooks in the [02-Visualization](./src/graph_notebook/notebooks/02-Visualization) folder.\n\n## Keeping track of new features\n\nIt is recommended to check the [ChangeLog.md](ChangeLog.md) file periodically to keep up to date as new features are added.\n\n## Prerequisites\n\nYou will need:\n\n* [Python](https://www.python.org/downloads/) \n * For JupyterLab 4x Version: 3.9.x-3.11.x\n * For JupyterLab 3x Version: 3.9.x-3.10.14 \n* A graph database that provides one or more of:\n * A SPARQL 1.1 endpoint\n * An Apache TinkerPop Gremlin Server compatible endpoint\n * An endpoint compatible with openCypher\n \n## Installation\n\nFollow the instructions for either Jupyter Classic Notebook or JupyterLab based on your requirements.\n\n\n### Jupyter Classic Notebook\n\n``` bash\npip install graph-notebook\n\n# Enable the visualization widget\njupyter nbclassic-extension enable --py --sys-prefix graph_notebook.widgets\n\n# copy static html resources\npython -m graph_notebook.static_resources.install\npython -m graph_notebook.nbextensions.install\n\n# copy premade starter notebooks\npython -m graph_notebook.notebooks.install --destination ~/notebook/destination/dir\n\n# create nbconfig file and directory tree, if they do not already exist\nmkdir ~/.jupyter/nbconfig\ntouch ~/.jupyter/nbconfig/notebook.json\n\n# start jupyter notebook\npython -m graph_notebook.start_notebook --notebooks-dir ~/notebook/destination/dir\n```\n\n### Jupyter Lab\n\nGraph-notebook has been upgraded to support JupyterLab 4.x since version 5.0.0, featuring a modernized widget architecture and improved compatibility.\n\nChoose your installation based on your JupyterLab version requirements.\n\n\n### JupyterLab 4.x (Recommended)\n\n``` bash\n# install jupyterlab\npip install \"jupyterlab>=4.3.5,<5\"\n\n# Install the latest version with JupyterLab 4.x support\npip install graph-notebook\n\n# copy premade starter notebooks\npython -m graph_notebook.notebooks.install --destination ~/notebook/destination/dir\n\n# start jupyterlab\npython -m graph_notebook.start_jupyterlab --jupyter-dir ~/notebook/destination/dir\n```\n\n### JupyterLab 3.x (Legacy)\n\n``` bash\n\n# install jupyterlab\npip install \"jupyterlab>=3,<4\"\n\n# Install legacy version for JupyterLab 3.x compatibility\npip install \"graph-notebook<5.0.0\"\n\n# copy premade starter notebooks\npython -m graph_notebook.notebooks.install --destination ~/notebook/destination/dir\n\n# start jupyterlab\npython -m graph_notebook.start_jupyterlab --jupyter-dir ~/notebook/destination/dir\n```\n\n#### Loading magic extensions in JupyterLab\n\nWhen attempting to run a line/cell magic on a new notebook in JupyterLab, you may encounter the error:\n\n``` bash\nUsageError: Cell magic `%%graph_notebook_config` not found.\n```\n\nTo fix this, run the following command, then restart JupyterLab.\n\n``` bash\npython -m graph_notebook.ipython_profile.configure_ipython_profile\n```\n\nAlternatively, the magic extensions can be manually reloaded for a single notebook by running the following command in any empty cell.\n\n``` bash\n%load_ext graph_notebook.magics\n```\n\n## Upgrading an existing installation\n\n``` bash\n# upgrade graph-notebook\npip install graph-notebook --upgrade\n```\n\nAfter the above command completes, rerun the commands given at [Jupyter Classic 
Notebook](#jupyter-classic-notebook) or [JupyterLab 3.x](#jupyterlab-3x) based on which flavour is installed.\n\n## Connecting to a graph database\n\nConfiguration options can be set using the `%graph_notebook_config` magic command. The command accepts a JSON object as an argument. The JSON object can contain any of the configuration options listed below. The command can be run multiple times to change the configuration. The configuration is stored in the notebook's metadata and will be used for all subsequent queries.\n\n| Configuration Option | Description | Default Value | Type |\n| --- | --- | --- | --- |\n| auth_mode | The authentication mode to use for Amazon Neptune connections | DEFAULT | string |\n| aws_region | The AWS region to use for Amazon Neptune connections | your-region-1 | string |\n| host | The host url to form a connection with | localhost | string |\n| load_from_s3_arn | The ARN of the S3 bucket to load data from [Amazon Neptune only] | | string |\n| neptune_service | The name of the Neptune service for the host url [Amazon Neptune only] | neptune-db | string |\n| port | The port to use when creating a connection | 8182 | number |\n| proxy_host | The proxy host url to route a connection through [Amazon Neptune only]| | string |\n| proxy_port | The proxy port to use when creating proxy connection [Amazon Neptune only] | 8182 | number |\n| ssl | Whether to make connections to the created endpoint with ssl or not [True/False] | False | boolean |\n| ssl_verify | Whether to verify the server's TLS certificate or not [True/False] | True | boolean |\n| sparql | SPARQL connection object | ``` { \"path\": \"sparql\" } ``` | string |\n| gremlin | Gremlin connection object | ``` { \"username\": \"\", \"password\": \"\", \"traversal_source\": \"g\", \"message_serializer\": \"graphsonv3\" } ```| string |\n| neo4j | Neo4J connection object |``` { \"username\": \"neo4j\", \"password\": \"password\", \"auth\": true, \"database\": null } ``` | string |\n\n### Gremlin Server\n\nIn a new cell in the Jupyter notebook, change the configuration using `%%graph_notebook_config` and modify the fields for `host`, `port`, and `ssl`. Optionally, modify `traversal_source` if your graph traversal source name differs from the default value, `username` and `password` if required by the graph store, or `message_serializer` for a specific data transfer format. For a local Gremlin server (HTTP or WebSockets), you can use the following command:\n\n``` python\n%%graph_notebook_config\n{\n \"host\": \"localhost\",\n \"port\": 8182,\n \"ssl\": false,\n \"gremlin\": {\n \"traversal_source\": \"g\",\n \"username\": \"\",\n \"password\": \"\",\n \"message_serializer\": \"graphsonv3\"\n }\n}\n```\n\nTo setup a new local Gremlin Server for use with the graph notebook, check out [`additional-databases/gremlin server`](additional-databases/gremlin-server)\n\n### Blazegraph\n\nChange the configuration using `%%graph_notebook_config` and modify the fields for `host`, `port`, and `ssl`. 
For a local Blazegraph database, you can use the following command:\n\n``` python\n%%graph_notebook_config\n{\n \"host\": \"localhost\",\n \"port\": 9999,\n \"ssl\": false,\n \"sparql\": {\n \"path\": \"sparql\"\n }\n}\n```\n\nYou can also make use of namespaces for Blazegraph by specifying the path `graph-notebook` should use when querying your SPARQL like below:\n\n``` python\n%%graph_notebook_config\n\n{\n \"host\": \"localhost\",\n \"port\": 9999,\n \"ssl\": false,\n \"sparql\": {\n \"path\": \"blazegraph/namespace/foo/sparql\"\n }\n}\n```\n\nThis will result in the url `localhost:9999/blazegraph/namespace/foo/sparql` being used when executing any `%%sparql` magic commands.\n\nTo setup a new local Blazegraph database for use with the graph notebook, check out the [Quick Start](https://github.com/blazegraph/database/wiki/Quick_Start) from Blazegraph.\n\n### Amazon Neptune\n\nChange the configuration using `%%graph_notebook_config` and modify the defaults as they apply to your Neptune instance.\n\n#### Neptune DB\n\n``` python\n%%graph_notebook_config\n{\n \"host\": \"your-neptune-endpoint\",\n \"neptune_service\": \"neptune-db\",\n \"port\": 8182,\n \"auth_mode\": \"DEFAULT\",\n \"load_from_s3_arn\": \"\",\n \"ssl\": true,\n \"ssl_verify\": true,\n \"aws_region\": \"your-neptune-region\"\n}\n```\n\n#### Neptune Analytics\n\n``` python\n%%graph_notebook_config\n{\n \"host\": \"your-neptune-endpoint\",\n \"neptune_service\": \"neptune-graph\",\n \"port\": 443,\n \"auth_mode\": \"IAM\",\n \"ssl\": true,\n \"ssl_verify\": true,\n \"aws_region\": \"your-neptune-region\"\n}\n```\n\nTo setup a new Amazon Neptune cluster, check out the [Amazon Web Services documentation](https://docs.aws.amazon.com/neptune/latest/userguide/manage-console-launch.html).\n\nWhen connecting the graph notebook to Neptune via a private endpoint, make sure you have a network setup to communicate to the VPC that Neptune runs on. If not, you can follow [this guide](https://github.com/aws/graph-notebook/tree/main/additional-databases/neptune).\n\nIn addition to the above configuration options, you can also specify the following options:\n\n### Amazon Neptune Proxy Connection\n\n``` python\n%%graph_notebook_config\n{\n \"host\": \"clustername.cluster-ididididid.us-east-1.neptune.amazonaws.com\",\n \"neptune_service\": \"neptune-db\",\n \"port\": 8182,\n \"ssl\": true,\n \"proxy_port\": 8182,\n \"proxy_host\": \"host.proxy.com\",\n \"auth_mode\": \"IAM\",\n \"aws_region\": \"us-east-1\",\n \"load_from_s3_arn\": \"\"\n}\n```\n\nSee also: Connecting to Amazon Neptune from clients outside the Neptune VPC using AWS Network [Load Balancer](https://aws-samples.github.io/aws-dbs-refarch-graph/src/connecting-using-a-load-balancer/#connecting-to-amazon-neptune-from-clients-outside-the-neptune-vpc-using-aws-network-load-balancer)\n\n## Authentication (Amazon Neptune)\n\nIf you are running a SigV4 authenticated endpoint, ensure that your configuration has `auth_mode` set to `IAM`:\n\n``` python\n%%graph_notebook_config\n{\n \"host\": \"your-neptune-endpoint\",\n \"neptune_service\": \"neptune-db\",\n \"port\": 8182,\n \"auth_mode\": \"IAM\",\n \"load_from_s3_arn\": \"\",\n \"ssl\": true,\n \"ssl_verify\": true,\n \"aws_region\": \"your-neptune-region\"\n}\n```\n\nAdditionally, you should have the following Amazon Web Services credentials available in a location accessible to Boto3:\n\n* Access Key ID\n* Secret Access Key\n* Default Region\n* Session Token (OPTIONAL. 
Use if you are using temporary credentials)\n\nThese variables must follow a specific naming convention, as listed in the [Boto3 documentation](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html#using-environment-variables)\n\nA list of all locations checked for Amazon Web Services credentials can also be found [here](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html#configuring-credentials).\n\n### Neo4J\n\nChange the configuration using `%%graph_notebook_config` and modify the fields for `host`, `port`, `ssl`, and `neo4j` authentication.\n\nIf your Neo4J instance supports [multiple databases](https://neo4j.com/developer/manage-multiple-databases/), you can specify a database name via the `database` field. Otherwise, leave the `database` field blank to query the default database.\n\nFor a local Neo4j Desktop database, you can use the following command:\n\n``` python\n%%graph_notebook_config\n{\n \"host\": \"localhost\",\n \"port\": 7687,\n \"ssl\": false,\n \"neo4j\": {\n \"username\": \"neo4j\",\n \"password\": \"password\",\n \"auth\": true,\n \"database\": \"\"\n }\n}\n```\n\nEnsure that you also specify the `%%oc bolt` option when submitting queries to the Bolt endpoint.\n\nTo setup a new local Neo4J Desktop database for use with the graph notebook, check out the [Neo4J Desktop User Interface Guide](https://neo4j.com/developer/neo4j-desktop/).\n\n## Building From Source\n\nA pre-release distribution can be built from the graph-notebook repository via the following steps:\n\n``` bash\n# 1) Clone the repository and navigate into the clone directory\ngit clone https://github.com/aws/graph-notebook.git\ncd graph-notebook\n\n# 2) Create a new virtual environment\n\n# 2a) Option 1 - pyenv\n# install pyenv - https://github.com/pyenv/pyenv?tab=readme-ov-file#installation\n# install pyenv-virtualenv - https://github.com/pyenv/pyenv-virtualenv?tab=readme-ov-file#installation\npyenv install 3.10.13 # Only if not already installed; this can be any supported Python 3 version in Prerequisites\npyenv virtualenv 3.10.13 build-graph-notebook\npyenv local build-graph-notebook\n\n# 2b) Option 2 - venv\ndeactivate \nconda deactivate\nrm -rf /tmp/venv\npython3 -m venv --clear /tmp/venv\nsource /tmp/venv/bin/activate\n\n\n# 3) Install build dependencies\npip install --upgrade build hatch hatch-jupyter-builder\npip install \"jupyterlab>=4.3.5,<5\"\n\n# 4) Build the distribution\npython3 -m build .\n```\n\nYou should now be able to find the built distribution at\n\n`./dist/graph_notebook-5.0.1-py3-none-any.whl`\n\nAnd use it by following the [installation](https://github.com/aws/graph-notebook#installation) steps, replacing\n\n``` python\npip install graph-notebook\n```\n\nwith\n\n``` python\npip install ./dist/graph_notebook-5.0.1-py3-none-any.whl --force-reinstall\n```\n\n## Contributing Guidelines\n\nSee [CONTRIBUTING](https://github.com/aws/graph-notebook/blob/main/CONTRIBUTING.md) for more information.\n\n## License\n\nThis project is licensed under the Apache-2.0 License.\n",
"bugtrack_url": null,
"license": "\n Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n \n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n \n 1. Definitions.\n \n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n \n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n \n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n \n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n \n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n \n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n \n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n \n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n \n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n \n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n \n 2. Grant of Copyright License. 
Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n \n 3. Grant of Patent License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n \n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n \n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n \n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n \n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n \n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n \n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n \n 5. Submission of Contributions. 
Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n \n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n \n 7. Disclaimer of Warranty. Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n \n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n \n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n \n END OF TERMS AND CONDITIONS",
"summary": "Jupyter notebook extension to connect to graph databases",
"version": "5.0.1",
"project_urls": null,
"split_keywords": [
"gremlin",
" jupyter",
" neptune",
" opencypher",
" sparql"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "e22db2bd3151b57103bab1d842d32befd7a1d3d5fcb264dec6430569ba130ccc",
"md5": "421122e9f306ddb97bdd8c565653fff3",
"sha256": "091a8b0c74df1b59e9d617118a60903c6fb9e4d2fbff7027b5b5a448e9192b94"
},
"downloads": -1,
"filename": "graph_notebook-5.0.1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "421122e9f306ddb97bdd8c565653fff3",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": "<3.12,>=3.9",
"size": 21418983,
"upload_time": "2025-05-19T21:05:05",
"upload_time_iso_8601": "2025-05-19T21:05:05.073403Z",
"url": "https://files.pythonhosted.org/packages/e2/2d/b2bd3151b57103bab1d842d32befd7a1d3d5fcb264dec6430569ba130ccc/graph_notebook-5.0.1-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "f4faf3baf8acd60744188e768c68fe3b3bcca03618a9580f4ffd5c4be082a949",
"md5": "bdf43f6cf0d22a3d72dadb46c99ea1cc",
"sha256": "4ab0c557081fb05a057a2314c94429cea338cad21286b70cbd0771cc4f0b3113"
},
"downloads": -1,
"filename": "graph_notebook-5.0.1.tar.gz",
"has_sig": false,
"md5_digest": "bdf43f6cf0d22a3d72dadb46c99ea1cc",
"packagetype": "sdist",
"python_version": "source",
"requires_python": "<3.12,>=3.9",
"size": 20694941,
"upload_time": "2025-05-19T21:05:10",
"upload_time_iso_8601": "2025-05-19T21:05:10.131005Z",
"url": "https://files.pythonhosted.org/packages/f4/fa/f3baf8acd60744188e768c68fe3b3bcca03618a9580f4ffd5c4be082a949/graph_notebook-5.0.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-05-19 21:05:10",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "graph-notebook"
}