ai-core-sdk

Name: ai-core-sdk
Version: 2.4.12
Home page: https://www.sap.com/
Summary: SAP AI Core SDK
Upload time: 2024-11-04 14:34:31
Author: SAP SE
Requires Python: >=3.7
License: SAP DEVELOPER LICENSE AGREEMENT
Keywords: sap ai core sdk, sap ai core api, sap ai core
# SAP AI Core SDK

The SAP AI Core SDK is a Python-based SDK that lets you access SAP AI Core using Python methods and data structures. It provides tools that help you to manage your scenarios and workflows in SAP AI Core.


The SAP AI Core SDK can be used to interact with SAP AI Core. It provides access to all public lifecycle and administration APIs.

For example:

* You can execute pipelines as a batch job to preprocess or train your models, or perform batch inference.

* You can deploy a trained machine learning model as a web service to serve inference requests with high performance.

* You can register your own Docker registry, synchronize your AI content from your own git repository, and register your own object store for training data and trained models.

* You can log metrics within a workflow execution using the SDK. You can use the same code for tracking metrics in both your local environment and in the workflow execution (production).

> **Note**
>
> Executing online inference is not part of the SAP AI Core SDK.
>
> Metrics persistence is not currently available in your local environment using the SDK. However, it is available in your productive workflow execution.

## Example Usage

The SDK can, for instance, be used in a Jupyter notebook for convenient interaction with SAP AI Core in a test or development context.

Here are a few examples of how to use the SDK. For details on the methods, please refer to the HTML documentation provided in the `/docs` folder of the wheel file.

### Import Definitions

```python
from ai_core_sdk.ai_core_v2_client import AICoreV2Client
```

### Create Client

The SDK requires credentials from your tenant's subaccount Service Key:
```python
client = AICoreV2Client(base_url=AI_API_BASE,
                        auth_url=AUTH_URL,
                        client_id=CLIENT_ID,
                        client_secret=CLIENT_SECRET,
                        resource_group=resource_group_id)
```
(For persistent client configuration see below.)

### Create New Resource Group

```python
resource_group_create = client.resource_groups.create(resource_group_id=resource_group_id)
print(resource_group_create.resource_group_id)
resource_group_details = client.resource_groups.get(resource_group_id=resource_group_id)
print(f"{resource_group_details.status_message} \n{resource_group_details.resource_group_id}")
```

### Create Object Store Secret

```python
import os

# access key and secret are assumed to reside in environment variables OSS_KEY and OSS_SECRET
object_store_secret_create = client.object_store_secrets.create(
    name="default",
    type="S3",
    bucket="<your S3 bucket>",
    endpoint="<your S3 host>",
    path_prefix="<your path prefix in S3>",
    region="<your S3 region>",
    data={"AWS_ACCESS_KEY_ID": os.environ.get("OSS_KEY"),
          "AWS_SECRET_ACCESS_KEY": os.environ.get("OSS_SECRET")})

secret_get = client.object_store_secrets.get(name="default")
print(f"{secret_get.metadata}")
```

### List Scenarios

```python
scenarios = client.scenario.query()
for scenario in scenarios.resources:
    print(f"{scenario.name} {scenario.id}")
```
## Client Configuration

There are different options to persist the client credentials
(in this order of precedence):
 - in code via keyword arguments (see above),
 - environment variables,
 - a profile configuration file,
 - the `VCAP_SERVICES` environment variable, if present.

A **profile** is a JSON file residing in a config directory,
which can be set via the environment variable `AICORE_HOME` (the default being `~/.aicore/config.json`).

The command `ai-core-sdk configure --help` can be used to generate a profile.

Profile names make it easy to switch between profiles, e.g. for different (sub)accounts.
The profile name can also be passed as a keyword argument. If no profile is specified, the default profile is used.
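
As an illustration of the environment-variable option, the credentials can simply be exported and read when constructing the client. The variable names below are an assumption about how you store your service key, not names mandated by the SDK:

```python
import os
from ai_core_sdk.ai_core_v2_client import AICoreV2Client

# Variable names are illustrative; use whatever names you exported your service key under.
client = AICoreV2Client(base_url=os.environ["AICORE_BASE_URL"],
                        auth_url=os.environ["AICORE_AUTH_URL"],
                        client_id=os.environ["AICORE_CLIENT_ID"],
                        client_secret=os.environ["AICORE_CLIENT_SECRET"],
                        resource_group=os.environ.get("AICORE_RESOURCE_GROUP", "default"))
```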

## AICore Content Packages

ai-core-sdk provides a command-line utility as well as a Python library to use AICore content packages.

### Installation

The content functionality should be installed as an optional dependency of ai-core-sdk:

```bash
pip install ai-core-sdk[aicore-content]
```

Content packages use Metaflow's Argo Workflows plugin to generate templates for AICore, and hence Metaflow must also be configured.
Run the configuration wizard:

```bash
metaflow configure kubernetes
```

or create a simple configuration file `~/.metaflowconfig/config.json` with a link to the object store configured in AICore:

```json
{
    "METAFLOW_DEFAULT_DATASTORE": "s3",
    "METAFLOW_DATASTORE_SYSROOT_S3": "s3://<bucket>/<prefix>"
}
```

See [Configuring Metaflow](https://outerbounds.com/docs/configure-metaflow) for more details.
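
The same configuration file can also be written from a short Python snippet; a minimal sketch (replace the bucket and prefix placeholders with your object store values):

```python
import json
import pathlib

# Write the minimal Metaflow configuration shown above to ~/.metaflowconfig/config.json
config = {
    "METAFLOW_DEFAULT_DATASTORE": "s3",
    "METAFLOW_DATASTORE_SYSROOT_S3": "s3://<bucket>/<prefix>",  # your bucket and prefix
}
config_path = pathlib.Path.home() / ".metaflowconfig" / "config.json"
config_path.parent.mkdir(parents=True, exist_ok=True)
config_path.write_text(json.dumps(config, indent=4))
```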

### Discover Content Packages and their Content

**CLI**

Echo descriptions of all packages from all registries:

```bash
aicore-content list
```

To add a custom content package spec, set the environment variable `AI_CORE_CONTENT_SPECS` (a [`.env`](https://pypi.org/project/python-dotenv/) file is also supported) or use the `-c <path to content spec py/yaml>` option:

```bash
aicore-content -c ai_core_content_spec.py list
export AI_CORE_CONTENT_SPECS=$PWD/ai_core_content_spec.py && aicore-content show
echo "AI_CORE_CONTENT_SPECS=$PWD/ai_core_content_spec.py" >> .env && aicore-content show
```

**Python**

All packages:

```python
from ai_core_sdk.content import get_content_packages # function to fetch a list of all packages from all registries

for content_pkg in get_content_packages().values():
    content_pkg.print() # print more details for each package
```

The content specs can also be used directly:

```python
from content_package.ai_core_content_spec import package_spec_obj
package_spec_obj.print()
```

A content package consists of multiple *workflows*. **Workflows** can be AI Core **Executions** or **Deployments**.

**CLI**

List of all workflows available:

```bash
aicore-content list <name of the package>
```

Details of a specific workflow:

```bash
aicore-content show <name of the package> <name of the workflow>
```

**Python**

All workflows of a package:

```python
from ai_core_sdk.content import get_content_packages # function to fetch a list of all packages from all registries

package = [*get_content_packages().values()][0] # Get a ContentPackage object
workflows = package.workflows # get all workflows from a package
for flow in workflows:
    flow.print(compact=True) # Print a compact description of the workflow
for flow in workflows:
    flow.print(compact=False) # Print a detailed description of the workflow.
```

When using Python, a package can also be used directly from the `ContentPackage` file without registering it:

```python
# Load the object from the submodule
from content_package.ai_core_content_spec import package_spec_obj
[*package_spec_obj.workflows.values()][0].print(compact=True)

# create the object from content package spec yaml
from ai_core_sdk.content import ContentPackage

package_spec_obj = ContentPackage.from_yaml('<path to package spec yaml>')
[*package_spec_obj.workflows.values()][0].print(compact=True)

# load content package specs from .py file
from ai_core_sdk.content import get_content_packages_from_py_file
list_of_package_specs = get_content_packages_from_py_file('<path to package spec py>') # a .py file can contain multiple package specs
for package_spec_obj in list_of_package_specs:
    [*package_spec_obj.workflows.values()][0].print(compact=True)
```

### User's workflow configuration

Every AICore template has a section with user-specific information, e.g. scenario, Docker repository, and secret names. To generate a template from a content package with your settings, create a *workflow configuration*. This is a YAML file with the following structure:

```yaml
.contentPackage: my-package
.workflow: my-package-workflow
.dockerType: gpu

name: "my-executable-id"
labels:
  scenarios.ai.sap.com/id: "my-scenario-id"
  ai.sap.com/version: "0.0.1"
annotations:
  scenarios.ai.sap.com/name: "my-scenario-name"
  executables.ai.sap.com/description: "My description"
  executables.ai.sap.com/name: "my-executable-name"
image: "my-docker-image"
imagePullSecret: "my-docker-registry-secret"
objectStoreSecret: "default-object-store-secret"
```

In this config the target content package and workflow can be referenced using the keys `.contentPackage` and `.workflow` shown above. If provided, those references are used to create the image and template. If not provided, the package and the workflow have to be specified with the `--package` and `--workflow` options (see the following sections for details).

### Create Docker images

To run an execution or start a deployment, at least one Docker image is needed. The required Docker images can be generated through CLI or Python calls. Both run a `docker build` internally.

**CLI**

To create docker images:

```bash
aicore-content create-image [--package=<package name>] [--workflow=<workflow name>] <workflow config>
```
The command-line options `--package` and `--workflow` override values from the workflow config.

The build has to be followed by a `docker push <generated-image>` to push the container to the registry.

**Python**

The workflow objects provide a `.create_image(...)` method:

```python
import os
import yaml

workflow = content_package_spec_obj['batch_processing']  # fetch the workflow object using its name
docker_tag = 'my-very-own-docker.repo/sap-cv-batch-processing:0.0.1'  # tag referenced by the 'image' field of the config
with open('my-config.yaml') as stream:
    workflow_config = yaml.safe_load(stream)
cmd = workflow.create_image(workflow_config, return_cmd=True)  # perform a dry run and return the cmd
print(cmd)
workflow.create_image(workflow_config)  # actually build the docker container
os.system(f'docker push {workflow_config["image"]}')  # push the container
```

### Create AI Core Templates

The template creation is similar for Execution and Deployment templates.

#### Create Execution Templates

To create execution templates the [metaflow argo plugin](https://pypi.org/project/sap-ai-core-metaflow/) is used.

**CLI**

Workflow templates need a workflow config to be provided to the `create-template` subcommand.

```bash
aicore-content create-template [--package=<package name>] [--workflow=<workflow name>] <workflow config> -o <path to output template.json>
```

**Python**

The workflow config for execution templates has to be provided to the workflow's `.create_template(...)` function. The output file can be specified through the keyword parameter `target_file`:

```python
workflow.create_template('my-workflow-config.yaml', target_file='workflow-template.json')
```

**Additional Options**

Additional arguments can be defined in the workflow config under the key `additionalOptions`.

```yaml
...
additionalOptions:
    workflow: # list of strings passed as workflow options
        ...
    metaflow: # list of strings passed as metaflow options
        - --package-suffixes=.py,.sh
```

Strings in the `workflow`/`metaflow` lists are pasted into these positions:

```bash
python -m flow [metaflow] argo create [workflow]
```

To check the resulting call, a `--dry-run` (CLI) / `return_cmd=True` (Python) option is available. Alternatively, the subcommand `aicore-content <package name> metaflow-cli <workflow name>` or `workflow.raw_metaflow_cli(...)` can be used. Every input after the command is passed directly to the underlying `python -m <flow>` call.
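
In Python this could look as follows (a minimal sketch, assuming `return_cmd=True` is accepted by `.create_template(...)` in the same way as by `.create_image(...)`):

```python
# Dry run: return the generated metaflow/argo command instead of executing it
cmd = workflow.create_template('my-workflow-config.yaml',
                               target_file='workflow-template.json',
                               return_cmd=True)  # assumption: mirrors create_image's return_cmd
print(cmd)
```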

#### Create Serving Templates

**CLI**

Serving templates also need a *workflow config* to be passed to the `create-template` subcommand.

```bash
aicore-content create-template [--package=<package name>] [--workflow=<workflow name>] <workflow config> -o <path to output template.yaml>
```

**Python**

The workflow config has to be provided to the workflow's `.create_template(...)` function. The output file can be specified through the keyword parameter `target_file`:

```python
workflow.create_template('my-workflow-config.yaml', target_file='serving-template.yaml')
```

### Content Package Creation

A content package needs two additional files: the `ContentPackage` file and a workflows yaml.

#### `ContentPackage`

Content packages are defined in a `ContentPackage` instance. This instance can either be loaded from a `.py` file or created from a YAML file. A typical `.py` file looks like this:

```python
import pathlib
import yaml
from ai_core_sdk.content import ContentPackage

HERE = pathlib.Path(__file__).parent

workflow_yaml = HERE / 'workflows.yaml'
with workflow_yaml.open() as stream:
    workflows = yaml.safe_load(stream)

spec = ContentPackage(
    name='my-package name', # name of the package and the aicore-content cli subcommand
    workflows_base_path=workflow_yaml.parent, # Base paths for all relative paths in the workflow dict
    workflows=workflows,
    description='This is an epic content package.', # Package description
    examples=HERE / 'examples', # Folder to package related examples [optional]
    version='0.0.1' # Version of the package
)
```

If the package is to be distributed as a Python module via Nexus or PyPI, the content package spec Python file should be included in the package as `ai_core_content_spec.py`. This allows the CLI to find the package without additional configuration effort and creates a subcommand using the name from the `ContentPackage.name` attribute.
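
For illustration, a minimal `setup.py` for such a distribution might look like this (all names are hypothetical; the only fixed requirement stated above is the `ai_core_content_spec.py` module name):

```python
# setup.py -- illustrative sketch for shipping a content package as a Python module
from setuptools import setup, find_packages

setup(
    name='my-content-package',                      # hypothetical distribution name
    version='0.0.1',
    packages=find_packages(),                       # includes my_content_package/ai_core_content_spec.py
    include_package_data=True,                      # ship workflows.yaml, Dockerfiles and examples with the code
    install_requires=['ai-core-sdk[aicore-content]'],
)
```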

##### Examples

Examples for the content package can be copied by the consumer to a directory using the command `aicore-content examples <package name> <target folder>`. This command creates the target folder and copies all files from the paths set in the `ContentPackage`. If no path is set or the path does not exist, the `examples` subcommand is not created.

##### Version

Currently the version in the `ContentPackage` is passed to the docker build call as `--build-arg pkg_version==={version}`. In the `Dockerfile` this build argument can be used to build the docker container with the matching version of the package. This is useful for content packages distributed as a module via Nexus or PyPI:

```Dockerfile
FROM pytorch/pytorch
ARG pkg_version=
ENV pkg_version=$pkg_version
...
RUN python -m pip install sap-computer-vision-package${pkg_version}
```

#### Workflows File

The second mandatory file is a workflows YAML. This YAML is used to define workflows and is structured like a dictionary.
Entries of the dict look like this:

```Yaml
name-of-the-workflow:
    description: >
        Description text [optional]
    dockerfile: ...
    type: ExecutionMetaflow/DeploymentYaml
    [... more type specific fields]
```

It is important that all paths in the workflow YAML are specified relative to the workflow YAML's path. This also means that all files must be located in the same folder or in subfolders.

#### Dockerfiles

The dockerfile entry can either be a single path:

```Yaml
dockerfile: train_workflow/Dockerfile
```

or a dictionary of paths:

```Yaml
dockerfile:
    cpu: train_workflow/Dockerfile
    gpu: train_workflow/Dockerfile
```

This makes it possible to define different Docker containers for different node types, or multiple containers for cases where different steps use different containers. The dockerfile to use can be selected via the option/argument `docker_type` when running the docker build command/function.
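
For example, assuming `docker_type` is accepted as a keyword argument by `.create_image(...)` (mirroring the CLI option), the GPU Dockerfile could be selected like this:

```python
# Build the image from the workflow's 'gpu' Dockerfile; docker_type mirrors the CLI option
workflow.create_image(workflow_config, docker_type='gpu')
```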

#### Types

Currently two types of workflows are supported:

* `ExecutionMetaflow`: executions defined as Metaflow flow specs
* `DeploymentYaml`: deployments defined as YAML files

##### ExecutionMetaflow

Additional fields for an `ExecutionMetaflow` entry are:

* `py`: path to the python file containing the `metaflow.FlowSpec` class
* `className`: name of the `metaflow.FlowSpec` class
* `additionalOptions` (optional): The section is identical to the `additionalOptions` from the workflow configs.

```Yaml
train-workflow:
    description: >
        Description text [optional]
    dockerfile:
        cpu: train/Dockerfile
        gpu: train/DockerfileGPU
    type: ExecutionMetaflow
    py: train/flow.py
    className: TrainingFlow
    additionalOptions:
        annotations:
            artifacts.ai.sap.com/datain.kind: dataset
            artifacts.ai.sap.com/trainedmodel.kind: model
        labels:
            ...
        workflow: # list of strings passed as workflow options
            ...
        metaflow: # list of strings passed as metaflow options (only valid for ExecutionMetaflow)
            - --package-suffixes=.py,.sh
```
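
The file referenced by `py` is a regular Metaflow flow. A minimal sketch of what `train/flow.py` could contain (illustrative only; the class name must match `className`):

```python
# train/flow.py -- minimal illustrative Metaflow flow referenced by `py`/`className`
from metaflow import FlowSpec, step


class TrainingFlow(FlowSpec):

    @step
    def start(self):
        # load the training data, e.g. from the object store configured for Metaflow
        self.next(self.train)

    @step
    def train(self):
        # train the model and store the resulting artifacts
        self.next(self.end)

    @step
    def end(self):
        pass


if __name__ == '__main__':
    TrainingFlow()
```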

##### DeploymentYaml

Additional fields:

* `yaml`: contains a path to a Deployment Template.

```Yaml
my_deployment:
  type: DeploymentYaml
  yaml: deploy/template.yaml
  dockerfile: deploy/Dockerfile
```

A Deployment Template is a regular Serving Template. Data from the `name`, `annotations`, and `labels` fields of a user-defined
workflow config replaces the corresponding attributes in the template.
In addition, the variables `$image` and `$imagePullSecret` are replaced with the values from the workflow config.

```Yaml
apiVersion: ai.sap.com/v1alpha1
kind: ServingTemplate
metadata:
  name: my-serving
  annotations:
    executables.ai.sap.com/name: my-serving
    executables.ai.sap.com/description: "My model serving"
  labels:
    ai.sap.com/version: "0.0.1"
spec:
  template:
    apiVersion: serving.kserve.io/v1beta1
    metadata:
      labels: |
        ai.sap.com/resourcePlan: "infer.s"
    spec: |
      predictor:
         imagePullSecrets:
         - name: $imagePullSecret
         containers:
         - name: kserve-container
           image: $image
```

##### `ContentPackage` from yaml

The specification of a content package and of the workflows can also be merged into a single yaml file:

```yaml
name: test-spec-yaml
examples: examples
description: |
  this is a test.
show_files:
  test: SHOW_FILE
workflows:
  test_execution:
    type: ExecutionMetaflow
    className: TestPipeline
    py: train/flow.py
    dockerfile:
      cpu: train/Dockerfile
      gpu: train/DockerfileGPU
    annotations:
      artifacts.ai.sap.com/datain.kind: dataset
      artifacts.ai.sap.com/trainedmodel.kind: model
    metaflow:
      - --package-suffixes=.py,.s
```

Currently there is no discovery mechanism for YAML specs.

They either have to be consumed in Python:

```python
from ai_core_sdk.content import ContentPackage
content_package = ContentPackage.from_yaml(spec_yaml)
```

or made available to the CLI through the `aicore-content -c <path to the spec yaml> ...` option or through the `AI_CORE_CONTENT_SPECS` environment variable.

#### Embedded code packages

During the experimentation phase, Metaflow supports the data scientist by packing all Python files into an archive called a "code package" and storing it in the configured S3 bucket. Later, during step execution, this code package is downloaded into the Docker container, unpacked there, and the respective step function from the Metaflow Python file is executed. This approach helps to iterate faster since there is no need to rebuild the Docker image after every code change.

In production, however, the ML code will not change often. Therefore, it can be preferable to pack the code into the Docker image instead of injecting it for every run. In other words, we want to embed the ML code into the Docker image.

When running **aicore-content create-image ...**, a code package is always generated and placed next to the corresponding Dockerfile under the name `codepkg.tgz`. This file can be used in the Dockerfile with:
```Dockerfile
ADD codepkg.tgz .
```

In addition, it is necessary to tell sap-ai-core-metaflow to use the embedded code package instead of downloading it.
Add this to the workflow's config file:
```yaml
  additionalOptions:
    workflow:
      - --embedded
```


## Tracking

The tracking module of the SAP AI Core SDK can be used to log metrics in both your local environment and in productive workflow executions. Metrics persistence is currently available only in your productive environment.

 Here are a few code samples demonstrating how to use the SDK for metrics tracking.


### Modify Metrics

```python
import numpy as np
from datetime import datetime

from ai_core_sdk.tracking import Tracking
from ai_core_sdk.models import Metric, MetricTag, MetricCustomInfo

tracking_client = Tracking()

tracking_client.modify(
    tags=[
        # list of tags
        MetricTag(name="Our Team Tag", value="Tutorial Team"),
        MetricTag(name="Stage", value="Development")
    ],
    metrics=[
        Metric(
            name="Training Loss",
            value=np.finfo(np.float64).max,
            timestamp=datetime.utcnow(),
            step=1,  # denotes epoch 1
            labels=[]
        )
    ],
    custom_info=[
        # list of custom information
        MetricCustomInfo(
            name="My Classification Report",
            # you may convert anything to string and store it
            value=str('''{
                "Cats": {
                    "Precision": 75,
                    "Recall": 74
                },
                "Dogs": {
                    "Precision": 85,
                    "Recall": 84
                }
            }''')
        )
    ]
)
```

 ### Log Metrics

```python
tracking_client.log_metrics(
    metrics=[
        Metric(
            name="Training Loss",
            value=float(86.99),
            timestamp=datetime.utcnow(),
            step=1,  # denotes epoch 1
            labels=[]
        ),
    ],
)
```

 ### Set Tags

```python
tracking_client.set_tags(
    tags=[
        # list of tags
        MetricTag(name="Our Team Tag", value="Tutorial Team"),
        MetricTag(name="Stage", value="Development")
    ]
)
```

 ### Set Custom Info

```python
tracking_client.set_custom_info(
    custom_info=[
        # list of custom information
        MetricCustomInfo(
            name="My Classification Report",
            # you may convert anything to string and store it
            value=str('''
            {
                "Cats": {
                    "Precision": 75,
                    "Recall": 74
                },
                "Dogs": {
                    "Precision": 85,
                    "Recall": 84
                }
            }
            ''')
        ),
    ]
)
```
  ### Query Metrics

```python
metrics_response = tracking_client.query(execution_ids=[
    "test_execution_id"  # replace with the training execution id
])
```

  ### Delete Metrics

```python
metrics_response = tracking_client.delete(execution_id="test_execution_id")  # replace with the actual execution id
```


            
