sap-ai-core-metaflow

* Name: sap-ai-core-metaflow
* Version: 1.1.14
* Summary: A Metaflow plugin to execute flows on Argo Workflows
* Home page: https://www.sap.com/
* Author: SAP SE
* License: SAP DEVELOPER LICENSE AGREEMENT
* Requires Python: >=3.7
* Keywords: sap, ai, core, metaflow, argo, kubernetes
* Upload time: 2023-03-29 12:51:12

# Metaflow library for SAP AI Core

The `sap-ai-core-metaflow` package is a [Metaflow plugin](https://metaflow.org) that generates [Argo Workflows](https://argoproj.github.io/argo-workflows).

It adds Argo Workflows capabilities using Metaflow's `@kubernetes` decorator.

## Table of Contents

* [Plugin Installation](#plugin-installation)
* [Install and configure Argo Workflows](#install-and-configure-argo-workflows)
* [Configure Metaflow](#configure-metaflow)
* [Run the Argo WorkflowTemplate](#run-the-argo-workflowtemplate)
* [Metaflow-Argo Decorators](#metaflow-argo-decorators)
* [Command-line Parameters and Environment Variables](#command-line-parameters-and-environment-variables)
* [Other Supported Features](#other-supported-features)
* [Use with SAP AI Core](#use-with-sap-ai-core)

## Plugin Installation

```sh
pip install sap-ai-core-metaflow
```

## Install and configure Argo Workflows

With the Metaflow-Argo plugin you can execute flows in a K8S cluster.
> Note: you can also install a local K8S cluster using Docker Desktop (Mac, Windows) or k3d/minikube (Linux).

* Install [Argo Workflows](https://argoproj.github.io/argo-workflows/installation/) in the K8S cluster (a minimal installation sketch follows this list).
* Optionally expose the [argo-server](https://argoproj.github.io/argo-workflows/argo-server/) to access the Argo Workflows UI.
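
For example, a basic installation could look like the following (the release version is a placeholder; consult the linked installation guide for the current manifest):

```bash
# create a dedicated namespace and apply the official install manifest
kubectl create namespace argo
kubectl apply -n argo -f https://github.com/argoproj/argo-workflows/releases/download/<version>/install.yaml
```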

## Configure Metaflow

For a quick-start follow the steps below.\
(Find complete documentation here: [Configuring Metaflow](https://outerbounds.com/docs/configure-metaflow).)

## The Metaflow _config_ File

An S3 bucket can be configured to store training data and Metaflow code packages.

* Create the config file: `~/.metaflowconfig/config.json` containing your S3 _bucket_ and _prefix_:

``` json
{
    "METAFLOW_DEFAULT_DATASTORE": "s3",
    "METAFLOW_DATASTORE_SYSROOT_S3": "s3://<bucket>/<prefix>"
}
```

In addition, your local [aws cli](https://aws.amazon.com/cli/) has to be configured with credentials for the same bucket.
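
For example, credentials with access to the bucket can be provided via `aws configure` or via environment variables (placeholder values shown):

```bash
# either configure a profile interactively ...
aws configure
# ... or export the credentials used for the S3 datastore
export AWS_ACCESS_KEY_ID=XXX
export AWS_SECRET_ACCESS_KEY=YYY
```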

## Create the K8S Secret

This secret allows an Argo Workflow to access the S3 bucket when running in the cluster:

```bash
    kubectl create secret generic default-object-store-secret --from-literal=AWS_ACCESS_KEY_ID=XXX --from-literal=AWS_SECRET_ACCESS_KEY=YYY
```

## Run the Argo WorkflowTemplate

Here we use [helloworld.py](https://github.com/Netflix/metaflow/blob/master/metaflow/tutorials/00-helloworld/helloworld.py) from the [Metaflow Tutorials](https://docs.metaflow.org/getting-started/tutorials).

### Deploy the Argo WorkflowTemplate

```bash
    python helloworld.py --with kubernetes:secrets=default-object-store-secret argo create
```

You should see the following output:

```bash
    Metaflow 2.5.0 executing HelloFlow for user:<xyz>
    Validating your flow...
        The graph looks good!
    Running pylint...
        Pylint is happy!
    Deploying helloflow to Argo Workflow Templates...
    WorkflowTemplate helloflow is pushed to Argo Workflows successfully.
```

### Trigger the Execution of the WorkflowTemplate

```bash
    python helloworld.py argo trigger
```

You should see a `run-id` after a successful start of a workflow:

```bash
    Metaflow 2.5.0 executing HelloFlow for user:<xyz>
    Validating your flow...
        The graph looks good!
    Running pylint...
        Pylint is happy!
    Workflow helloflow is triggered on Argo Workflows (run-id helloflow-92vbn).
```

### Check the Execution Status

```bash
    python helloworld.py argo list-runs
```

The flow will display the state `Running` for some time and finally show
`Succeeded`:

```bash
    Metaflow 2.5.0 executing HelloFlow for user:<xyz>
    Validating your flow...
        The graph looks good!
    Running pylint...
        Pylint is happy!
    helloflow-92vbn startedAt:'2022-06-15T14:00:27Z' Succeeded
```

## Metaflow-Argo Decorators

In addition to the general Metaflow decorators (e.g. `@kubernetes`,
`@resources`, `@environment`), the Metaflow-Argo plugin supports
additional Argo-specific customization for steps via
`input_artifacts`, `output_artifacts`, `labels` and `shared_memory`.

## The `@argo` Decorator on Step Level

A Metaflow _step_ can override the default values specified on the
flow level or add extra configurations.

### Labels

Labels are attached as metadata to the pod executing a _step_:

``` python
...

    @argo(labels = {'plan': 'basic'})
    @step
    def training(self):
        ...
```

The _step_ labels override the flow labels.

### Input Artifacts

The artifacts configuration is inserted into the generated
WorkflowTemplate as-is.

``` python
@argo(input_artifacts=[{
        'name': 'input_dataset',
        'path': '/tmp/train.csv',
        's3': {
            'endpoint': 's3.amazonaws.com',
            'bucket': 'my-bucket-name',
            'key': 'path/in/bucket',
            'accessKeySecret': {
                'name': 'my-s3-credentials',
                'key': 'accessKey'
            },
            'secretKeySecret': {
                'name': 'my-s3-credentials',
                'key': 'secretKey'
            }
        }}])
@step
def start(self):
    ds = pd.read_csv('/tmp/train.csv')
```

If the [Default Artifact Repository](https://argoproj.github.io/argo-workflows/key-only-artifacts) is configured, the credentials can be omitted:

``` python
@argo(input_artifacts=[{
        'name': 'input_dataset',
        'path': '/tmp/spam.csv',
        's3': {
            'key': 'path/in/bucket'
          }
        }])
```

### Output Artifacts

> Prerequisite: The [Default Artifact Repository](https://argoproj.github.io/argo-workflows/configure-artifact-repository/#configure-the-default-artifact-repository) is configured

If the configuration below is added to a _step_, Argo will upload the output artifact to the specified S3 bucket.

``` python
@argo(output_artifacts=[{
    'name': 'model',
    'globalName': 'model',
    'path': '/tmp/model',
    'archive': {
        'none': {}
    }}])
@step
def train_model(self):
    ...
    with open('/tmp/model', 'wb') as f:
        pickle.dump(self.model, f)
```

### Configure additional Disk Space using Persistent Volume Claim (PVC)

If the pod's disk space is not sufficient, one can mount extra storage using a [Persistent Volume Claim](https://kubernetes.io/docs/concepts/storage/persistent-volumes/).

The example below creates a PVC in the WorkflowTemplate with a default size of 50 GiB:

``` python
@argo(mount_pvc="/mnt/data")
@step
def start(self):
    # use /mnt/data for huge files
    ...
```

To set the size of the PVC, pass a value (in MiB) for the command-line parameter `--volume-claim`:

```bash
    python helloflow.py argo create --volume-claim=100000
```

### Shared Memory Allocation

To allocate additional shared memory (in MiB), use the following configuration:

``` python
@argo(shared_memory=100)
@step
def train_model(self):
    ...
```

## Command-line Parameters and Environment Variables

The Metaflow-Argo plugin allows overriding default `flow` values. This is done through environment variables or command-line parameters. Command-line parameters take precedence over environment variables.

## Command-line Parameters

### List Command-line Parameters

```bash
Usage: helloworld.py argo create [OPTIONS]

  Deploy a new version of this workflow to Argo Workflow Templates.

Options:
  --image-pull-secret TEXT    Docker image pull secret.
  --label TEXT                Label to attach to the workflow template.
  --annotation TEXT           Annotation to attach to the workflow template.
  --volume-claim INTEGER      Size of the volumeClaim in MiB
  --k8s-namespace TEXT        Deploy into the specified kubernetes namespace.
  --embedded                  Don't download code package into step
                              containers.Docker images should have a flow and
                              all dependencies embedded.

  --max-workers INTEGER       Maximum number of parallel pods.  [default: 100]
  --workflow-timeout INTEGER  Workflow timeout in seconds.
  --only-json                 Only print out JSON sent to Argo. Do not deploy
                              anything.

  --help                      Show this message and exit.    
```

### Set the Flow Name

By default, the Argo WorkflowTemplate name is taken from the class name in the Metaflow script. If you want to change the flow name, use:

```bash
    python helloworld.py argo --name=hello create
```

### `@kubernetes` Decorator Command Line Parameter

The Metaflow command-line parameter `--with=kubernetes` allows for setting configurations when using a Kubernetes cluster:

* `secrets` to access the object store; multiple secrets can be passed, separated by `,`

```bash
    python helloworld.py --with=kubernetes:secrets=default-object-store-secret,s3-secret argo create
```

* `image` to specify a Docker image

```bash
    python helloworld.py --with=kubernetes:secrets=default-object-store-secret,image=demisto/sklearn:1.0.0.29944 argo create
```

* `disk` is used to reserve ephemeral storage in MiB for the node.

```bash
    python helloworld.py --with=kubernetes:secrets=default-object-store-secret,disk=1024 argo create
```

### Embedded code package

The flag `--embedded` generates the steps' commands without a code-package download; it expects the code package to be already *embedded*
into the Docker image.
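
For example (the image name is a placeholder; the image must already contain the flow code and its dependencies):

```bash
    python helloworld.py --with=kubernetes:secrets=default-object-store-secret,image=docker.io/<my-repo>/helloworld argo create --embedded
```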

## Environment Variables

### Prefix for the Flow Name

To add a prefix to the flow name, set the environment variable `ARGO_WORKFLOW_PREFIX`.
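
For example (the prefix value is illustrative):

```bash
    export ARGO_WORKFLOW_PREFIX=mlops
```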

### Default K8S Namespace

The Kubernetes namespace used to store templates and execute workflows can be
set with the envvar `METAFLOW_KUBERNETES_NAMESPACE`.
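
For example, assuming Argo Workflows is installed in the `argo` namespace:

```bash
    export METAFLOW_KUBERNETES_NAMESPACE=argo
```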

### Default Environment Variables for Command-line Parameters

Metaflow provides envvars for the corresponding command-line parameters using the prefix `METAFLOW_ARGO_CREATE_`, e.g.:

```bash
    export METAFLOW_ARGO_CREATE_MAX_WORKERS=4
```

## Other Supported Features

The Metaflow-Argo plugin also supports many other Metaflow features.

## `@kubernetes` Decorator

The Metaflow-Argo plugin makes use of the specifications defined in the
`@kubernetes` decorator to configure container resources.

``` python

    @kubernetes(image='tensorflow/tensorflow:2.6.0', cpu=8, memory=60000)
    @step
    def training(self):
        ...
```

## `@resources` Decorator

Full support for [Requesting resources with @resources
decorator](https://docs.metaflow.org/metaflow/scaling#requesting-resources-with-resources-decorator).
If similar resources are specified in both `@kubernetes` and
`@resources`, the maximum is taken.

``` python

    @kubernetes(image='tensorflow/tensorflow:2.4.2-gpu',
                node_selector='gpu=nvidia-tesla-k80')
    @resources(memory=60000, cpu=2)
    @step
    def training(self):
        ...
```

## `@environment` Decorator

In addition to `env` and `envFrom`, environment variables can be specified in the decorator:

``` python
@environment(vars={'var':os.getenv('var_1')})
@step
def start(self):
    print(os.getenv('var'))
```

## Display Logs for a Step

Metaflow stores the stdout and stderr of every step/task and provides a
convenient way to access them by passing the run-id and step name:

```bash
    python 00-helloworld/helloworld.py logs helloflow-bdmmw/start
```

Optionally specify a task-id, which is useful when there are many instances of the
same step, e.g. when using `foreach`:

```bash
    python 00-helloworld/helloworld.py logs helloflow-bdmmw/hello/helloflow-bdmmw-3938807173
```

## Resume flows

See [How to debug failed
flows](https://docs.metaflow.org/metaflow/debugging#how-to-debug-failed-flows)
for details.

```bash
    python 00-helloworld/helloworld.py resume --origin-run-id=helloflow-bdmmw
```

## Schedule flows

Metaflow's
[@schedule](https://docs.metaflow.org/going-to-production-with-metaflow/scheduling-metaflow-flows#scheduling-a-flow)
decorator is fully supported. E.g. a workflow template created with
`@schedule(hourly=True)` will automatically trigger workflow execution
at the start of every hour.
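
A minimal sketch of a scheduled flow (the flow and step names are illustrative):

``` python
from metaflow import FlowSpec, schedule, step

@schedule(hourly=True)
class HourlyFlow(FlowSpec):

    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == '__main__':
    HourlyFlow()
```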

## Automatically rerun failed steps

The `@retry` decorator helps to deal with transient failures. See [Retrying
Tasks with the retry
Decorator](https://docs.metaflow.org/metaflow/failures#retrying-tasks-with-the-retry-decorator).
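
For example (the step name is illustrative; `times` sets the number of retries):

``` python
@retry(times=3)
@step
def fetch_data(self):
    # a step hit by transient failures is re-executed up to 3 times
    ...
```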

## Handle failures programmatically

What if the default behavior of stopping the whole flow when a particular step
fails is too limiting? With the help of the `@catch` decorator, it's possible
to handle a step's failure in a subsequent step. See [Catching Exceptions with the
catch
Decorator](https://docs.metaflow.org/metaflow/failures#catching-exceptions-with-the-catch-decorator).
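
A minimal sketch (step names are illustrative); the caught exception is stored in the artifact named by `var` and the flow continues:

``` python
@catch(var='compute_failed')
@step
def compute(self):
    self.result = 1 / 0   # raises; the exception ends up in self.compute_failed

@step
def report(self):
    if self.compute_failed:
        print('compute step failed:', self.compute_failed)
```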

## Limit the Step Execution Time

Use the `@timeout` decorator to guard against accidental resource
consumption. Read [Timing out with the timeout
Decorator](https://docs.metaflow.org/metaflow/failures#timing-out-with-the-timeout-decorator)
for more details.
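
For example (the limit value is illustrative):

``` python
@timeout(hours=2)
@step
def train_model(self):
    # the task fails (or is retried, if @retry is also set) after 2 hours
    ...
```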

## Use with SAP AI Core

SAP AI Core does not allow direct access to its K8s cluster or argo-server. WorkflowTemplates are instead generated in JSON/YAML format and synced to K8s via the registered Git repo. The workflows can then be triggered via the `SAP AI API`.

The mandatory information for SAP AI Core is provided through labels and annotations.

```bash
python helloworld.py --with=kubernetes:secrets=default-object-store-secret,image=docker.io/<my-repo>/helloworld argo create \
  --image-pull-secret=<docker-secret> \
  --label="scenarios.ai.sap.com/id:ml_usecase_name" \
  --label="ai.sap.com/version:0.1.0" \
  --annotation="scenarios.ai.sap.com/name:Hello-World" \
  --annotation="executables.ai.sap.com/name:helloworld" \
  --only-json > hello-world.json
```

[Here](https://blogs.sap.com/2022/04/20/train-your-model-in-sap-ai-core-using-the-metaflow-argo-plugin/) you can find a step-by-step example for SAP AI Core.



            
