kedro-graphql

Name: kedro-graphql
Version: 0.4.0
Summary: A kedro plugin for serving any kedro project as a GraphQL api
Upload time: 2023-11-08 18:19:16
Author: opensean
Requires Python: <3.11,>=3.10
License: Apache Software License (Apache 2.0)
Keywords: pipelines, machine learning, data pipelines, data science, data engineering
# kedro-graphql

## Overview

Kedro-graphql is a [kedro-plugin](https://docs.kedro.org/en/stable/extend_kedro/plugins.html)
for serving kedro projects as a GraphQL API.  It leverages
[Strawberry](https://strawberry.rocks/), [FastAPI](https://fastapi.tiangolo.com/),
and [Celery](https://docs.celeryq.dev/en/stable/index.html) to turn any
[Kedro](https://docs.kedro.org/en/stable/) project into a GraphQL API
with features such as:

- a distributed task queue
- subscriptions to pipeline events and logs via GraphQL subscriptions
- storage
  - persist and track all pipelines executed via the API
- [additional features](#features)

```mermaid
flowchart  TB
  api[GraphQL API\n<i>strawberry + FastAPI</i>]
  mongodb[(db: 'pipelines'\ncollection: 'pipelines'\n<i>mongodb</i>)]
  redis[(task queue\n<i>redis</i>)]
  worker[worker\n<i>celery</i>]

  api<-->mongodb
  api<-->redis
  worker<-->mongodb
  worker<-->redis

```

Figure 1. Architecture



## Quickstart

Install kedro-graphql into your kedro project environment.

```
pip install kedro_graphql
```

Start the redis and mongo services using this [docker-compose.yaml](https://github.com/opensean/kedro-graphql/blob/main/docker-compose.yaml).

```
docker-compose up -d
```

Start the api server.

```
kedro gql
```

Start a worker (in another terminal).

```
kedro gql -w
```

Navigate to http://127.0.0.1:5000/graphql to access the graphql interface.

![strawberry-ui](docs/strawberry-ui.png)

The [docker-compose.yaml](./docker-compose.yaml) includes 
[mongo-express](https://github.com/mongo-express/mongo-express) and 
[redis-commander](https://github.com/joeferner/redis-commander) services 
to provide easy access to MongoDB and redis.


Navigate to http://127.0.0.1:8082 to access the mongo-express interface.

![mongo-express-ui](docs/mongo-express-ui.png)

Navigate to http://127.0.0.1:8081 to access the redis-commander interface.
One can access the task queues created and managed by 
[Celery](https://docs.celeryq.dev/en/stable/index.html).

![redis-commander-ui](docs/redis-commander-ui.png)

## Example

The kedro-graphql package contains a very simple example 
pipeline called "example00".

### Setup

Clone the kedro-graphql repository.

```
git clone git@github.com:opensean/kedro-graphql.git
```

Create a virtualenv and activate it.

```
cd kedro-graphql
python3.10 -m venv venv
source venv/bin/activate
```

Install dependencies.

```
pip install -r src/requirements.txt
```

Create a text file.

```
echo "Hello" > ./data/01_raw/text_in.txt
```

Start the redis and mongo services.

```
docker-compose up -d
```

Start the api server.

```
kedro gql
```

Start a worker (in another terminal).

```
kedro gql -w
```

### Start a pipeline

Navigate to http://127.0.0.1:5000/graphql to access the graphql interface 
and execute the following mutation:

```
mutation MyMutation {
  pipeline(
    pipeline: {name: "example00", parameters: [{name: "example", value: "hello"}, {name: "duration", value: "10"}], inputs: {name: "text_in", type: "text.TextDataSet", filepath: "./data/01_raw/text_in.txt"}, outputs: {name: "text_out", type: "text.TextDataSet", filepath: "./data/02_intermediate/text_out.txt"}}
  ) {
    id
    name
  }
}
```

Expected response:

```
{
  "data": {
    "pipeline": {
      "id": "6463991db98d7f8564ab15a0",
      "name": "example00"
    }
  }
}
```
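
Pipelines can also be started programmatically.  The interface above speaks
standard GraphQL over HTTP, so the same mutation can be POSTed with any HTTP
client.  A minimal sketch using only the Python standard library (the URL is
the quickstart default; adjust as needed):

```python
import json
import urllib.request

GRAPHQL_URL = "http://127.0.0.1:5000/graphql"  # default `kedro gql` address


def build_payload(query: str) -> bytes:
    """Encode a GraphQL document as a standard JSON request body."""
    return json.dumps({"query": query}).encode("utf-8")


def run_query(query: str) -> dict:
    """POST a GraphQL document and return the decoded JSON response."""
    request = urllib.request.Request(
        GRAPHQL_URL,
        data=build_payload(query),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)


# Example (requires the api server to be running):
# result = run_query('query { pipeline(id: "6463991db98d7f8564ab15a0") { name status } }')
```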

### Subscribe to pipeline events

Now execute the following subscription to track the progress:

```
subscription MySubscription {
  pipeline(id: "6463991db98d7f8564ab15a0") {
    id
    result
    status
    taskId
    timestamp
    traceback
  }
}
```

![subscription](docs/subscription.gif)


### Subscribe to pipeline logs

Execute the following subscription to receive log messages:


```
subscription {
   	pipelineLogs(id:"6463991db98d7f8564ab15a0") {
       id
       message
       messageId
       taskId
       time
     }
}
```


![logs subscription](docs/logs-subscription.gif)

### Get the pipeline result

Fetch the pipeline result with the following query:

```
query MyQuery {
  pipeline(id: "6463991db98d7f8564ab15a0") {
    describe
    id
    name
    outputs {
      filepath
      name
      type
    }
    inputs {
      filepath
      name
      type
    }
    parameters {
      name
      value
    }
    status
    taskEinfo
    taskException
    taskId
    taskKwargs
    taskRequest
    taskName
    taskResult
    taskTraceback
  }
}
```

Expected result:

```
{
  "data": {
    "pipeline": {
      "describe": "#### Pipeline execution order ####\nInputs: parameters, params:example, text_in\n\necho_node\n\nOutputs: text_out\n##################################",
      "id": "6463991db98d7f8564ab15a0",
      "name": "example00",
      "outputs": [
        {
          "filepath": "./data/02_intermediate/text_out.txt",
          "name": "text_out",
          "type": "text.TextDataSet"
        }
      ],
      "inputs": [
        {
          "filepath": "./data/01_raw/text_in.txt",
          "name": "text_in",
          "type": "text.TextDataSet"
        }
      ],
      "parameters": [
        {
          "name": "example",
          "value": "hello"
        },
        {
          "name": "duration",
          "value": "10"
        }
      ],
      "status": "SUCCESS",
      "taskEinfo": "None",
      "taskException": null,
      "taskId": "129b4441-6150-4c0b-90df-185c1ec692ea",
      "taskKwargs": "{'name': 'example00', 'inputs': {'text_in': {'type': 'text.TextDataSet', 'filepath': './data/01_raw/text_in.txt'}}, 'outputs': {'text_out': {'type': 'text.TextDataSet', 'filepath': './data/02_intermediate/text_out.txt'}}, 'parameters': {'example': 'hello', 'duration': '10'}}",
      "taskRequest": null,
      "taskName": "<@task: kedro_graphql.tasks.run_pipeline of kedro_graphql at 0x7f29e3e9e500>",
      "taskResult": null,
      "taskTraceback": null
    }
  }
}
```
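
Note that fields such as `taskKwargs` are returned as strings containing a
Python `dict` repr rather than nested JSON.  If a client needs the structured
values, the string can be parsed safely with `ast.literal_eval`; a sketch
using the `taskKwargs` value shown above:

```python
import ast

# taskKwargs as returned by the query above: a Python dict repr in a string.
task_kwargs = (
    "{'name': 'example00', "
    "'inputs': {'text_in': {'type': 'text.TextDataSet', "
    "'filepath': './data/01_raw/text_in.txt'}}, "
    "'outputs': {'text_out': {'type': 'text.TextDataSet', "
    "'filepath': './data/02_intermediate/text_out.txt'}}, "
    "'parameters': {'example': 'hello', 'duration': '10'}}"
)

# literal_eval only evaluates Python literals, so it is safe for this use.
parsed = ast.literal_eval(task_kwargs)
print(parsed["parameters"]["duration"])  # prints 10
```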
One can explore how the pipeline is persisted using the mongo-express 
interface at http://127.0.0.1:8082.  Pipelines are persisted in the
"pipelines" collection of the "pipelines" database.

![mongo-express-pipeline](./docs/mongo-express-pipeline.png)

![mongo-express-pipeline-doc](./docs/mongo-express-pipeline-doc.png)

## Features


### Extensible API

The api generated by this tool can be extended using decorators.

This example adds query, mutation, and subscription types.

```
## kedro_graphql.plugins.plugins
import asyncio
from kedro_graphql.decorators import gql_query, gql_mutation, gql_subscription
import strawberry
from typing import AsyncGenerator

@gql_query()
@strawberry.type
class ExampleQueryTypePlugin():
    @strawberry.field
    def hello_world(self) -> str:
        return "Hello World"

@gql_mutation()
@strawberry.type
class ExampleMutationTypePlugin():
    @strawberry.mutation
    def hello_world(self, message: str = "World") -> str:
        return "Hello " + message

@gql_subscription()
@strawberry.type
class ExampleSubscriptionTypePlugin():
    @strawberry.subscription
    async def hello_world(self, message: str = "World", target: int = 11) -> AsyncGenerator[str, None]:
        for i in range(target):
            yield str(i) + " Hello " + message
            await asyncio.sleep(0.5)
```

When starting the api server, specify the import path using the 
```--imports``` flag.

```
kedro gql --imports "kedro_graphql.plugins.plugins"
```
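
Once the server restarts with the plugins imported, the added fields appear in
the schema.  For example, the query plugin shown earlier should be reachable
with the following query (assuming Strawberry's default conversion of
`hello_world` to `helloWorld`):

```
query {
  helloWorld
}
```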

Multiple import paths can be specified using comma-separated values.

```
kedro gql --imports "kedro_graphql.plugins.plugins,example_pkg.example.my_types"
```

Alternatively, use a ```.env``` file as described in the [General Configuration](#general-configuration) section.


### Configurable Application

The base application is a Strawberry + FastAPI instance.  One can leverage the 
additional features FastAPI offers by defining a custom application class.


This example adds a [CORSMiddleware](https://fastapi.tiangolo.com/tutorial/cors/#use-corsmiddleware).

```
## src/kedro_graphql/example/app.py
from fastapi.middleware.cors import CORSMiddleware
from kedro_graphql import KedroGraphQL



class MyApp(KedroGraphQL):

    def __init__(self): 
        super(MyApp, self).__init__()

        origins = [
            "http://localhost",
            "http://localhost:8080",
        ]
        
        self.add_middleware(
            CORSMiddleware,
            allow_origins=origins,
            allow_credentials=True,
            allow_methods=["*"],
            allow_headers=["*"],
        )
        print("added CORSMiddleware")

```

When starting the api server, specify the import path using the 
```--app``` flag.

```
kedro gql --app "my_kedro_project.app.MyApp"
## example output
added CORSMiddleware
INFO:     Started server process [7032]
INFO:     Waiting for application startup.
Connected to the MongoDB database!
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:5000 (Press CTRL+C to quit)
```


Alternatively, use a ```.env``` file as described in the [General Configuration](#general-configuration) section.

### Auto-reload

The cli interface supports "auto-reloading" to make development easier.
When starting the api server or worker, specify the `-r` or `--reload` option
to turn on auto-reloading.  Any changes to the "src" directory of your kedro
project will trigger a reload.

Start the api server with auto-reload enabled.

```
kedro gql --reload
```

Start a worker (in another terminal) with auto-reload enabled.

```
kedro gql -w --reload
```

The path to watch can be further refined using the `--reload-path` option.
In the following examples a reload will be triggered when changes are 
made to files in the `src/kedro_graphql/runners` directory.

Start the api server with auto-reload enabled.

```
kedro gql --reload --reload-path ./src/kedro_graphql/runners
```

Start a worker (in another terminal) with auto-reload enabled.

```
kedro gql -w --reload --reload-path ./src/kedro_graphql/runners
```

## General Configuration

Configuration can be supplied via environment variables or a ```.env``` file.

```
## example .env file
MONGO_URI=mongodb://root:example@localhost:27017/
MONGO_DB_NAME=pipelines
KEDRO_GRAPHQL_IMPORTS=kedro_graphql.plugins.plugins
KEDRO_GRAPHQL_APP=kedro_graphql.asgi.KedroGraphQL
KEDRO_GRAPHQL_BACKEND=kedro_graphql.backends.mongodb.MongoBackend
KEDRO_GRAPHQL_BROKER=redis://localhost
KEDRO_GRAPHQL_CELERY_RESULT_BACKEND=redis://localhost
KEDRO_GRAPHQL_RUNNER=kedro.runner.SequentialRunner
KEDRO_GRAPHQL_ENV=local
KEDRO_GRAPHQL_CONF_SOURCE=None
```

The configuration can also be provided at startup through the cli interface.
Configuration values can be mapped to the appropriate cli option by 
removing the "KEDRO_GRAPHQL" prefix and converting the remaining string
to a lower case, hyphen-separated format.  For example:

| configuration variable | cli option | example   |
|------------------------|------------|-----------|
|MONGO_URI |   --mongo-uri    | mongodb://root:example@localhost:27017/ |
|MONGO_DB_NAME |   --mongo-db-name    | pipelines |
|KEDRO_GRAPHQL_IMPORTS |   --imports    | kedro_graphql.plugins.plugins |
|KEDRO_GRAPHQL_APP |   --app    | kedro_graphql.asgi.KedroGraphQL |
|KEDRO_GRAPHQL_BACKEND |   --backend    | kedro_graphql.backends.mongodb.MongoBackend |
|KEDRO_GRAPHQL_BROKER |    --broker   | redis://localhost |
|KEDRO_GRAPHQL_CELERY_RESULT_BACKEND |  --celery-result-backend    | redis://localhost |
|KEDRO_GRAPHQL_RUNNER |    --runner   | kedro.runner.SequentialRunner |
|KEDRO_GRAPHQL_ENV |    --env   | local |
|KEDRO_GRAPHQL_CONF_SOURCE |   --conf-source    | $HOME/myproject/conf |
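
As an illustration only (this helper is not part of the package), the mapping
rule can be sketched in Python:

```python
def to_cli_option(config_var: str) -> str:
    """Map a configuration variable name to its cli option.

    Drops the optional "KEDRO_GRAPHQL_" prefix, then lower-cases the
    rest and replaces underscores with hyphens.
    """
    name = config_var.removeprefix("KEDRO_GRAPHQL_")
    return "--" + name.lower().replace("_", "-")


print(to_cli_option("KEDRO_GRAPHQL_CELERY_RESULT_BACKEND"))  # --celery-result-backend
print(to_cli_option("MONGO_URI"))                            # --mongo-uri
```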


## How to install dependencies

To install the project dependencies, run:

```
pip install -r src/requirements.txt
```


## How to test

```
pytest src/tests
```

To configure the coverage threshold, go to the `.coveragerc` file.

## Project dependencies

To generate or update the dependency requirements for your project:

```
kedro build-reqs
```

This will `pip-compile` the contents of `src/requirements.txt` into a new file `src/requirements.lock`. You can see the output of the resolution by opening `src/requirements.lock`.

After this, if you'd like to update your project requirements, please update `src/requirements.txt` and re-run `kedro build-reqs`.

[Further information about project dependencies](https://kedro.readthedocs.io/en/stable/kedro_project_setup/dependencies.html#project-specific-dependencies)


### TO DO

- support custom runners e.g. Argo Workflows, AWS Batch, etc.
- document plan for supporting custom IOResolverPlugins
- document pipeline tags and implement "search" via tags and/or other fields
- API pagination e.g. list pipelines and/or search results
- support passing credentials via api

