teradatamodelops

Name: teradatamodelops
Version: 7.1.2
Summary: Python client for Teradata ModelOps (TMO)
Upload time: 2024-10-23 11:34:53
Requires Python: >=3.8
License: Copyright (c) 2024 Teradata. All rights reserved. LICENSE AGREEMENT Software: Teradata ModelOps IMPORTANT - READ THIS AGREEMENT CAREFULLY BEFORE INSTALLING, DOWNLOADING, OR USING THE SOFTWARE. TERADATA WILL LICENSE THE APPLICABLE SOFTWARE TO YOU (AS DEFINED BELOW) ONLY IF YOU ACCEPT THE TERMS AND CONDITIONS OF THIS AGREEMENT AND MEET THE CONDITIONS FOR USING THE SOFTWARE DESCRIBED BELOW. BY INSTALLING, DOWNLOADING, OR USING THE SOFTWARE, YOU (1) AGREE TO THE TERMS AND CONDITIONS OF THIS AGREEMENT, AND (2) REPRESENT AND WARRANT THAT YOU POSSESS THE AUTHORITY TO ENTER INTO THIS AGREEMENT ON BEHALF OF YOU, YOUR EMPLOYER (WHEN ACTING ON BEHALF OF YOUR EMPLOYER), AND/OR A TERADATA-AUTHORIZED LICENSEE (WHEN YOU AND YOUR EMPLOYER ARE ACTING ON BEHALF OF A TERADATA-AUTHORIZED LICENSEE). IF YOU DO NOT AGREE TO THE TERMS AND CONDITIONS OF THIS AGREEMENT, DO NOT INSTALL, DOWNLOAD, OR USE THE SOFTWARE. IMPORTANT – BY DOWNLOADING, INSTALLING OR USING THE SOFTWARE: • YOU ACKNOWLEDGE THAT THE SOFTWARE YOU ARE INSTALLING, DOWNLOADING OR USING FROM TERADATA IS SUBJECT TO THE RESTRICTIONS AND CONTROLS IMPOSED BY UNITED STATES EXPORT REGULATIONS. • YOU CERTIFY THAT: o YOU DO NOT INTEND TO USE THE SOFTWARE FOR ANY PURPOSE PROHIBITED BY UNITED STATES EXPORT REGULATIONS, INCLUDING, WITHOUT LIMITATION, TERRORISM, CYBER-ATTACKS, CYBER-CRIMES, MONEY-LAUNDERING, INDUSTRIAL ESPIONAGE, OR NUCLEAR, CHEMICAL OR BIOLOGICAL WEAPONS PROLIFERATION. o YOU ARE NOT LISTED AS A DENIED PARTY ON ANY LIST GOVERNING UNITED STATES EXPORTS. o YOU ARE NOT A NATIONAL OF ANY COUNTRY OR IN ANY COUNTRY THAT IS NOT APPROVED FOR EXPORT OF THE SOFTWARE. This License Agreement (“Agreement”) is a legal contract between you, or as applicable on behalf of your employer and any Teradata-authorized licensee for whom your employer is acting on its behalf ("You," "Your," and "Yours") and Teradata Operations, Inc. ("Teradata") for the Software. 
“Software” refers to the applicable software product identified either above or on the site where this Agreement is located, which consists of computer software code in object code form only, as well as associated media, printed materials, and online or electronic documentation. The term “Software” also includes any and all error corrections, bug fixes, updates, upgrades, or new versions or releases of the Software (collectively and individually, “Enhancements”) that Teradata may elect in its sole discretion to provide you. Teradata is under no obligation to provide you with Enhancements under this Agreement. By accepting this Agreement, you represent and warrant that you possess the authority to enter into this Agreement on behalf of your employer and any Teradata-authorized licensee for whom you and/or your employer are acting on their behalf. “Scenario (i)”: If you are downloading the Software on behalf of a Teradata-authorized licensee that has already purchased a license to the Software pursuant to an executed master agreement (“Master Agreement”) with Teradata or one of its affiliates, the terms of such master agreement prevail over this Agreement. “Scenario (ii)”: If Scenario (i) does not apply to you, then this Agreement and the terms of use for the site from which you downloaded the Software (“General Terms of Use”) constitute the entire understanding of the parties with respect to the Software and Services, and supersede all other prior agreements and understandings whether oral or written. For clarity, Section 2.a.(i) does not apply to you. For example, individuals that have downloaded a copy of the Teradata Express version of the Teradata Relational Database product fall within this Scenario (ii). 1. Term. This Agreement commences on the earliest date of the first download, first copying, first installation, or first use of the Software (the “Effective Date”). 
Unless terminated earlier as provided herein, this Agreement, including your license to the Software, will expire or terminate on the first to occur of: (i) the same date that your Teradata-authorized license to use the Teradata Relational Database product expires or terminates or (ii) in accordance with the terms of this Agreement. 2. License. (a) Teradata grants you a nonexclusive, nontransferable, paid up license: (i) If Scenario (i) applies – to the Software in accordance with all of the terms and conditions of the Master Agreement. (ii) If Scenario (ii) applies - subject to your compliance with all of the terms and conditions of this Agreement, to install and use the Software on your computer workstation in object code form for your internal use solely for purposes of facilitating your Teradata-authorized license to use the Teradata Relational Database product. You may make reasonable archival backup copies of the Software, but may only use an archival copy in lieu of your primary copy and subject to the same restrictions as your primary copy. (b) The term "Third Party Software" means computer programs or modules (including their documentation) that bear the logo, copyright and/or trademark of a third party (including open source software that are contained in files marked as “open source” or the like) or are otherwise subject to written license terms of a third party. Third Party Software does not constitute Software. Third Party Software is licensed to you subject to the applicable license terms accompanying it, included in/with it, referenced in it, or otherwise entered into by you with respect to it. (c) You will not sell, copy, rent, loan, modify, transfer, disclose, embed, sublicense, create derivative works of or distribute the Software, in whole or in part, without Teradata’s prior written consent. You are granted no rights to obtain or use the Software’s source code. 
You will not reverse-assemble, reverse compile or reverse-engineer the Software, except as expressly permitted by applicable law without the possibility of contractual waiver. Notwithstanding anything to the contrary, you do not have any license, right, or authority to subject the Software, in whole or in part or as part of a larger work, to any terms of any other agreement, including GNU Public Licenses. (d) No license rights to the Software will be implied. The Software, which includes all copies thereof (whether in whole or in part), is and remains the exclusive property of Teradata and its licensors. You will ensure that all copies of the Software contain Teradata's and its licensors' copyright notices, as well as all other proprietary legends. Teradata reserves the right to inspect your use of the Software for purposes of verifying your compliance with the terms and conditions of this Agreement. If applicable, such right will be conducted according to any audit provisions that might exist in an underlying agreement between you and Teradata. (e) The Software may contain a disabling device that will prevent the Software from being used on data sets larger than a certain size. You agree not to: (i) use the Software on data sets larger than such or (ii) disable or circumvent any disabling device contained in the Software. 
(f) Notwithstanding anything to the contrary, Teradata shall have the right to collect and analyze data and other information relating to the provision, use and performance of various aspects of the Software and related systems and technologies (including, without limitation, information concerning your data and data derived therefrom), and Teradata will be free (during and after the term hereof) to (i) use such information and data to improve and enhance the Software and for other development, diagnostic and corrective purposes in connection with the Software and other Teradata offerings, and (ii) disclose such data solely in aggregate or other de-identified form in connection with its business. 3. Responsibilities. You are responsible for the installation of the Software, as well as for providing data security and backup operations. This Agreement does not require Teradata to provide you with any Enhancements, consulting services, technical assistance, installation, training, support, or maintenance of any kind (collectively and individually, “Services”). To the extent that Teradata elects to provide you with any Services, such Services are provided to you at Teradata’s sole discretion and may be modified or discontinued at any time for any reason. 4. DISCLAIMER OF WARRANTY. TERADATA: (A) PROVIDES SERVICES (IF ANY), (B) LICENSES THE SOFTWARE, AND (C) PROVIDES THIRD PARTY SOFTWARE TO YOU HEREUNDER ON AN “AS-IS” BASIS WITHOUT WARRANTIES OF ANY KIND (ORAL OR WRITTEN, EXPRESS OR IMPLIED, OR STATUTORY). WITHOUT LIMITATION TO THE FOREGOING, THERE ARE NO IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, OR NON-INFRINGEMENT. TERADATA DOES NOT WARRANT THAT THE SOFTWARE, THIRD PARTY SOFTWARE, OR SERVICES WILL MEET YOUR REQUIREMENTS OR CONFORM TO ANY SPECIFICATIONS, OR THAT THE OPERATION OF THE SOFTWARE OR THIRD PARTY SOFTWARE WILL BE UNINTERRUPTED OR ERROR FREE. 
YOU BEAR THE ENTIRE RISK AS TO SATISFACTORY QUALITY, PERFORMANCE, ACCURACY, AND RESULTS OBTAINED FROM THE SOFTWARE, THIRD PARTY SOFTWARE, AND SERVICES. SOME JURISDICTIONS RESTRICT DISCLAIMERS OF WARRANTY, SO THE ABOVE DISCLAIMERS MAY NOT FULLY APPLY TO YOU. 5. LIMITATIONS ON LIABILITY: UNDER NO CIRCUMSTANCES WILL TERADATA’S AND ITS LICENSORS’ TOTAL CUMULATIVE LIABILITY FOR CLAIMS RELATING TO THIS AGREEMENT, THE SERVICES, THE SOFTWARE, AND/OR THIRD PARTY SOFTWARE (WHETHER BASED IN CONTRACT, STATUTE, OR TORT (INCLUDING NEGLIGENCE) OR OTHERWISE) EXCEED US$1,000; PROVIDED, HOWEVER, THAT THE FOREGOING WILL NOT APPLY TO CLAIMS FOR (I) PERSONAL INJURY, INCLUDING DEATH, TO THE EXTENT CAUSED BY TERADATA'S GROSS NEGLIGENCE OR WILLFUL MISCONDUCT; OR (II) PHYSICAL DAMAGE TO TANGIBLE REAL OR PERSONAL PROPERTY TO THE EXTENT CAUSED BY TERADATA'S GROSS NEGLIGENCE OR WILLFUL MISCONDUCT EQUAL TO THE AMOUNT OF DIRECT DAMAGES UP TO ONE MILLION DOLLARS PER OCCURRENCE. IN NO EVENT WILL TERADATA OR ITS LICENSORS BE LIABLE FOR ANY INDIRECT, SPECIAL, PUNITIVE, INCIDENTAL OR CONSEQUENTIAL DAMAGES, OR FOR LOSS OF PROFITS, REVENUE, TIME, OPPORTUNITY, OR DATA, EVEN IF INFORMED OF THE POSSIBILITY OF SUCH DAMAGES. SOME JURISDICTIONS RESTRICT LIMITATIONS OF LIABILITY, SO THE ABOVE LIMITATIONS MAY NOT FULLY APPLY TO YOU. 6. Government Restrictions. You agree that you will not, directly or indirectly, export or transmit any Software without obtaining Teradata’s prior written authorization, as well as appropriate governmental approvals, including those required by the U.S. Government. Use and or distribution of this software is subject to export laws and regulations of the United States and other jurisdictions. The links below connect you to applicable U.S. government agencies, and their regulations, that have jurisdiction over this transaction. 
http://www.bis.doc.gov/ http://www.treas.gov/offices/enforcement/ofac/ By installing, downloading, or using this product, you acknowledge that this transaction is subject to applicable export control laws and you certify that your installation, download, use and/or subsequent distribution of this product is not prohibited under applicable laws and regulations. The Government’s use, duplication, or disclosure of Teradata’s commercial computer software and commercial computer software documentation is subject to: (a) the Restricted Rights Notice set forth in 48 C.F.R. ¶ 52.227-14 (Rights In Data - General); (b) Teradata’s standard commercial license rights supplemented by 48 C.F.R. ¶ 52.227-19 (Commercial Computer Software - Restricted Rights); and/or (c) the limited rights and license set forth 48 CFR ¶ 252.227-7015 (Technical Data–Commercial Items), as applicable. 7. Termination and Expiration. A party may terminate this Agreement with or without cause, upon providing written notice to the other parties. When this Agreement terminates or expires, you will immediately cease all use of the Software, permanently remove the Software from all computers, destroy all copies of the Software, and (upon receipt of Teradata’s request) provide a signed written certification that the foregoing has occurred. Sections 4-11 will survive expiration or termination of this Agreement. 8. Choice of Law and Dispute Resolution. The parties will attempt in good faith to resolve any controversy or claim by negotiation or mediation. If they are unable to do so, and regardless of the causes of action alleged and whether they arise under this Agreement or otherwise, the claim will be resolved by arbitration before a sole arbitrator in San Diego, California pursuant to the then-current Commercial Rules of the American Arbitration Association and the federal substantive and procedural law of arbitration. 
The arbitrator’s award will be final and binding, and may be entered in any court having jurisdiction thereof, but may include only damages consistent with the limitations in this Agreement. Each party will bear its own attorney's fees and costs related to the arbitration. The obligations to negotiate, mediate and arbitrate shall not apply to claims for misuse or infringement of a party’s intellectual property rights. Any claim or action must be brought within two years after the claimant knows or should have known of the claim . New York law will govern the interpretation and enforcement of this Agreement, except that the Federal Arbitration Act will govern the interpretation and enforcement of the arbitrability of claims under this Section. 9. Feedback. Notwithstanding anything to the contrary: (a) Teradata will have no obligation of any kind with respect to any Software-related comments, suggestions, design changes or improvements that you elect to provide to Teradata in either verbal or written form (collectively, “Software Feedback”), and (b) Teradata and its affiliates are hereby free to use any ideas, concepts, know-how or techniques, in whole or in part, contained in Software Feedback: (i) for any purpose whatsoever, including developing, manufacturing, and/or marketing products and/or services incorporating Software Feedback in whole or in part, and (ii) without any restrictions or limitations, including requiring the payment of any license fees, royalties, or other consideration. 10. Confidentiality. You will not disclose the results of any testing or evaluations, including any benchmarks, insofar as it relates to the Software without Teradata’s prior written consent. 11. Entire Agreement. This Agreement constitutes the entire understanding of the parties with respect to the Software and supersedes all other prior agreements and understandings whether oral or written. 
No oral representation or change to this Agreement will be binding upon either party unless agreed to in writing and signed by authorized representatives of all parties. You will not assign this Agreement or your rights, nor will you delegate your obligations under this Agreement. Failure by either party to enforce any term or condition of this Agreement will not be deemed a waiver of future enforcement of that or any other term or condition. The provisions of this Agreement are severable. "Include", "includes", and "including" shall be interpreted as introducing a list of examples which do not limit the generality of any preceding words or any words in the list of examples.
Keywords: teradata vantage data science devops vantagecloud
# Teradata ModelOps Client

- [Installation](#installation)
- [CLI](#cli)
  - [configuration](#configuration)
  - [help](#help)
  - [list](#list)
  - [add](#add)
  - [run](#run)
  - [init](#init)
  - [clone](#clone)
  - [link](#link)
  - [task](#task)
    - [add](#task-add)
    - [run](#task-run)
  - [connection](#connection)
  - [feature](#feature)
    - [compute-stats](#feature-compute-stats)
    - [list-stats](#feature-list-stats)
    - [create-stats-table](#feature-create-stats-table)
    - [import-stats](#feature-import-stats)
  - [doctor](#doctor)
- [Authentication](#authentication)
- [SDK](#sdk)
  - [Create Client](#create-client)
  - [Read Entities](#read-entities)
  - [Deploy Model Version](#deploy-model-version)
  - [Import Model Version](#import-model-version)
- [Release Notes](#release-notes)


# Installation<a id="installation"></a>

The `teradatamodelops` package is available from [PyPI](https://pypi.org/project/teradatamodelops/):

```bash
pip install teradatamodelops
```

_Note: Teradata ModelOps was previously known as the AnalyticsOps Accelerator (AOA), so you might still encounter mentions of AOA or `aoa`; they refer to an older version of ModelOps._

# CLI<a id="cli"></a>

## configuration<a id="configuration"></a>

By default, the CLI looks for configuration stored in `~/.aoa/config.yaml`. Copy the config from the ModelOps UI -> Session Details -> CLI Config; this provides the command to create or update `config.yaml`. If required, this configuration can be overridden at runtime by specifying environment variables (see `api_client.py`).
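As a purely illustrative sketch, the `config.yaml` might look something like the fragment below. The key names here are assumptions (chosen to mirror the `AoaClient` parameters and the auth modes documented later); always copy the real command and values from the ModelOps UI rather than writing this file by hand.

```yaml
# Illustrative only -- obtain the real content from the
# ModelOps UI -> Session Details -> CLI Config.
aoa_url: https://modelops.example.com   # ModelOps service endpoint (assumed key name)
auth_mode: device_code                  # one of: device_code, client_credentials, bearer
```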

## help<a id="help"></a>

The CLI can be used to perform a number of actions and guides the user through them.

```bash
> tmo -h
usage: tmo [-h] [--debug] [--version] {list,add,run,init,clone,link,task,connection,feature,doctor} ...

TMO CLI

options:
  -h, --help            show this help message and exit
  --debug               Enable debug logging
  --version             Display the version of this tool

actions:
  valid actions

  {list,add,run,init,clone,link,task,connection,feature,doctor}
    list                List projects, models, local models or datasets
    add                 Add model to working dir
    run                 Train and Evaluate model locally
    init                Initialize model directory with basic structure
    clone               Clone Project Repository
    link                Link local repo to project
    task                Manage feature engineering tasks
    connection          Manage local connections
    feature             Manage feature statistics
    doctor              Diagnose configuration issues
```

## list<a id="list"></a>

Lists the ModelOps resources. When listing models (pushed/committed) and datasets, it prompts the user to select a project before showing the results. For local models, it lists both committed and non-committed models.
```bash
> tmo list -h
usage: tmo list [-h] [--debug] [-p] [-m] [-lm] [-t] [-d] [-c]

options:
  -h, --help           show this help message and exit
  --debug              Enable debug logging
  -p, --projects       List projects
  -m, --models         List registered models (committed / pushed)
  -lm, --local-models  List local models. Includes registered and non-registered (non-committed / non-pushed)
  -t, --templates      List dataset templates
  -d, --datasets       List datasets
  -c, --connections    List local connections
```
All results are shown in the format
```
[index] (id of the resource) name
```
for example:
```
List of models for project Demo:
--------------------------------
[0] (03c9a01f-bd46-4e7c-9a60-4282039094e6) Diabetes Prediction
[1] (74eca506-e967-48f1-92ad-fb217b07e181) IMDB Sentiment Analysis
```

## add<a id="add"></a>

Adds a new model to a given repository based on a model template. A template from any other existing ModelOps git repository (specified via `-t <giturl>`) can be used.

```bash
> tmo add -h
usage: tmo add [-h] [--debug] -t TEMPLATE_URL -b BRANCH

options:
  -h, --help            show this help message and exit
  --debug               Enable debug logging
  -t TEMPLATE_URL, --template-url TEMPLATE_URL
                        Git URL for template repository
  -b BRANCH, --branch BRANCH
                        Git branch to pull templates (default is main)
```

Example usage
```bash
> tmo add -t https://github.com/Teradata/modelops-demo-models -b master
```

## run<a id="run"></a>

The CLI can be used to validate the model training and evaluation logic locally before committing to git. This simplifies the development lifecycle and allows you to test and validate many options. It also lets you avoid creating the dataset definitions in the ModelOps UI until you are ready and have a finalised version.

```bash
> tmo run -h
usage: tmo run [-h] [--debug] [-id MODEL_ID] [-m MODE] [-d DATASET_ID] [-t DATASET_TEMPLATE_ID] [-ld LOCAL_DATASET] [-lt LOCAL_DATASET_TEMPLATE] [-c CONNECTION]

options:
  -h, --help            show this help message and exit
  --debug               Enable debug logging
  -id MODEL_ID, --model-id MODEL_ID
                        Id of model
  -m MODE, --mode MODE  Mode (train or evaluate)
  -d DATASET_ID, --dataset-id DATASET_ID
                        Remote datasetId
  -t DATASET_TEMPLATE_ID, --dataset-template-id DATASET_TEMPLATE_ID
                        Remote datasetTemplateId
  -ld LOCAL_DATASET, --local-dataset LOCAL_DATASET
                        Path to local dataset metadata file
  -lt LOCAL_DATASET_TEMPLATE, --local-dataset-template LOCAL_DATASET_TEMPLATE
                        Path to local dataset template metadata file
  -c CONNECTION, --connection CONNECTION
                        Local connection id
```

You can run this as a single command or interactively by omitting some (or all) of the optional arguments.

For example, to run the CLI interactively, just enter `tmo run`. To run it non-interactively to train a given model with a given dataset, provide the arguments explicitly:
```bash
> tmo run -id <modelId> -m <mode> -d <datasetId>
```

## init<a id="init"></a>

When you create a git repository, it is empty by default. The `init` command initializes the repository with the structure required by ModelOps. It also adds a default README.md and HOWTO.md.

```bash
> tmo init -h
usage: tmo init [-h] [--debug]

options:
  -h, --help  show this help message and exit
  --debug     Enable debug logging
```
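Example usage (run from inside the repository you want to initialize; the command takes no arguments beyond the flags shown above):

```bash
> tmo init
```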

## clone<a id="clone"></a>

The `clone` command provides a convenient way to perform a git clone of the repository associated with a given project. The command can be run interactively, allowing you to select the project you wish to clone. Note that by default it clones into the current working directory, so either create an empty folder and run it from within there or provide the `--path` argument.

```bash
> tmo clone -h
usage: tmo clone [-h] [--debug] [-id PROJECT_ID] [-p PATH]

options:
  -h, --help            show this help message and exit
  --debug               Enable debug logging
  -id PROJECT_ID, --project-id PROJECT_ID
                        Id of Project to clone
  -p PATH, --path PATH  Path to clone repository to
```
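Example usage (flags as shown in the help output above; the placeholders are values from your own environment):

```bash
> tmo clone -id <projectId> -p ./my-project
```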

## link<a id="link"></a>

Links the current local repository to a remote project.

```bash
> tmo link -h
usage: tmo link [-h] [--debug]

options:
  -h, --help  show this help message and exit
  --debug     Enable debug logging
```

## task<a id="task"></a>

Manage feature engineering task actions. Feature tasks are used to automate the feature engineering process. The tasks are stored in a git repository and can be shared across multiple projects.

```bash
> tmo task -h
usage: tmo task [-h] {add,run} ...

options:
  -h, --help  show this help message and exit

actions:
  valid actions

  {add,run}
    add       Add feature engineering task to working dir
    run       Run feature engineering tasks locally
```

### task add<a id="task-add"></a>

Adds a new feature engineering task to a given repository based on a task template. A template from any other existing ModelOps git repository (specified via `-t <giturl>`) can be used.

```bash
> tmo task add -h
usage: tmo task add [-h] [--debug] -t TEMPLATE_URL -b BRANCH

options:
  -h, --help            show this help message and exit
  --debug               Enable debug logging
  -t TEMPLATE_URL, --template-url TEMPLATE_URL
                        Git URL for template repository
  -b BRANCH, --branch BRANCH
                        Git branch to pull task (default is main)
```

Example usage 
```bash
> tmo task add -t https://github.com/Teradata/modelops-demo-models -b feature_engineering
```

### task run<a id="task-run"></a>

The CLI can be used to run feature engineering tasks locally before committing to git.

```bash
> tmo task run -h
usage: tmo task run [-h] [--debug] [-c CONNECTION] [-f FUNCTION_NAME]

options:
  -h, --help            show this help message and exit
  --debug               Enable debug logging
  -c CONNECTION, --connection CONNECTION
                        Local connection id
  -f FUNCTION_NAME, --function-name FUNCTION_NAME
                        Task function name
```

You can run this as a single command or interactively by omitting some (or all) of the optional arguments.

For example, to run the CLI interactively, just enter `tmo task run`. To run it non-interactively with a given task function, provide the arguments explicitly:
```bash
> tmo task run -f <functionName> -c <connectionId>
```

## connection<a id="connection"></a>

The connection credentials stored in the ModelOps service cannot be accessed remotely through the CLI for security reasons. Instead, users can manage connection information locally for the CLI. These connections are used by other CLI commands which access Vantage.

```bash
> tmo connection -h
usage: tmo connection [-h] {list,add,remove,export,test,create-byom-table} ...

options:
  -h, --help            show this help message and exit

actions:
  valid actions

  {list,add,remove,export,test,create-byom-table}
    list                List all local connections
    add                 Add a local connection
    remove              Remove a local connection
    export              Export a local connection to be used as a shell script
    test                Test a local connection
    create-byom-table   Create a table to store BYOM models
```
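Example usage (subcommands as listed in the help output above; `add` has no documented flags here, so it presumably collects the connection details interactively):

```bash
> tmo connection add
> tmo connection list
```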

## feature<a id="feature"></a>

Manage feature metadata by creating and populating feature metadata table(s). The feature metadata tables contain information required when computing statistics during training, scoring, etc. This metadata depends on the feature type (categorical or continuous).

As this metadata can contain sensitive profiling information (such as categories), it is recommended to treat it in the same manner as you treat the features for a given use case. That is, the feature metadata should live in a project- or use-case-level database.

```bash
> tmo feature -h
usage: tmo feature [-h] {compute-stats,list-stats,create-stats-table,import-stats} ...

options:
  -h, --help            show this help message and exit

action:
  valid actions

  {compute-stats,list-stats,create-stats-table,import-stats}
    compute-stats       Compute feature statistics
    list-stats          List available statistics
    create-stats-table  Create statistics table
    import-stats        Import column statistics from local JSON file
```

### feature compute-stats<a id="feature-compute-stats"></a>

Computes the feature metadata required when computing statistics during training, scoring, etc. This metadata depends on the feature type:

Continuous: the histogram edges  
Categorical: the categories
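Conceptually, the two kinds of statistics can be sketched locally with numpy/pandas. This is an illustration with made-up data, not the `tmo` implementation: continuous features yield histogram bin edges, categorical features yield the set of distinct categories.

```python
import numpy as np
import pandas as pd

# Continuous feature: the statistics are the histogram bin edges.
age = pd.Series([21, 35, 44, 52, 63, 29, 48])
edges = np.histogram_bin_edges(age, bins=4)  # 4 bins -> 5 edges

# Categorical feature: the statistics are the distinct categories.
colour = pd.Series(["red", "green", "red", "blue"])
categories = sorted(colour.unique())
print(categories)  # ['blue', 'green', 'red']
```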

```bash
> tmo feature compute-stats -h
usage: tmo feature compute-stats [-h] [--debug] -s SOURCE_TABLE -m METADATA_TABLE [-t {continuous,categorical}] -c COLUMNS

options:
  -h, --help            show this help message and exit
  --debug               Enable debug logging
  -s SOURCE_TABLE, --source-table SOURCE_TABLE
                        Feature source table/view
  -m METADATA_TABLE, --metadata-table METADATA_TABLE
                        Metadata table for feature stats, including database name
  -t {continuous,categorical}, --feature-type {continuous,categorical}
                        Feature type: continuous or categorical (default is continuous)
  -c COLUMNS, --columns COLUMNS
                        List of feature columns
```

Example usage

```bash
tmo feature compute-stats \
  -s <feature-db>.<feature-data> \
  -m <feature-metadata-db>.<feature-metadata-table> \
  -t continuous -c numtimesprg,plglcconc,bloodp,skinthick,twohourserins,bmi,dipedfunc,age
```

### feature list-stats<a id="feature-list-stats"></a>

Lists the name and type (categorical or continuous) for all the features stored in the specified metadata table.

```bash
> tmo feature list-stats -h
usage: tmo feature list-stats [-h] [--debug] -m METADATA_TABLE

options:
  -h, --help            show this help message and exit
  --debug               Enable debug logging
  -m METADATA_TABLE, --metadata-table METADATA_TABLE
                        Metadata table for feature stats, including database name
```

Example usage

```bash
tmo feature list-stats \
  -m <feature-metadata-db>.<feature-metadata-table>
```

### feature create-stats-table<a id="feature-create-stats-table"></a>

Creates or prints out the required SQL to create the metadata statistics table with the specified name.

```bash
> tmo feature create-stats-table -h
usage: tmo feature create-stats-table [-h] [--debug] -m METADATA_TABLE [-e]

options:
  -h, --help            show this help message and exit
  --debug               Enable debug logging
  -m METADATA_TABLE, --metadata-table METADATA_TABLE
                        Metadata table for feature stats, including database name
  -e, --execute-ddl     Execute CREATE TABLE DDL, not just generate it
```

Example usage

```bash
tmo feature create-stats-table \
  -m <feature-metadata-db>.<feature-metadata-table> \
  -e
```

### feature import-stats<a id="feature-import-stats"></a>

Imports feature metadata statistics from a local JSON file to the target table. It also supports showing an example of the file structure.

```bash
> tmo feature import-stats -h
usage: tmo feature import-stats [-h] [--debug] -m METADATA_TABLE -f STATISTICS_FILE [-s]

options:
  -h, --help            show this help message and exit
  --debug               Enable debug logging
  -m METADATA_TABLE, --metadata-table METADATA_TABLE
                        Metadata table for feature stats, including database name
  -f STATISTICS_FILE, --statistics-file STATISTICS_FILE
                        Name of statistics JSON file
  -s, --show-example    Show file structure example and exit
```

Example usage

```bash
tmo feature import-stats \
  -m <feature-metadata-db>.<feature-metadata-table> \
  -f <path-to-statistics-file>
```

## doctor<a id="doctor"></a>

Runs health checks on the current setup: first it checks the health of the remote ModelOps service configuration, and then the health of any locally created connections.

```bash
> tmo doctor -h
usage: tmo doctor [-h] [--debug]

options:
  -h, --help  show this help message and exit
  --debug     Enable debug logging
```

# Authentication<a id="authentication"></a>

A number of authentication methods are supported for both the CLI and SDK:

- `device_code` (interactive)
- `client_credentials` (service-to-service)
- `bearer` (raw bearer token)

When working interactively, the recommended auth method for the CLI is `device_code`; it will guide you through the authentication automatically. For the SDK, use `bearer` when working interactively.
For both the CLI and SDK, use `client_credentials` when working in an automated service-to-service manner.

# SDK<a id="sdk"></a>

The ModelOps SDK allows users to interact with the ModelOps APIs from anywhere they can execute Python, such as notebooks, IDEs, etc. It can also be used in DevOps to automate additional parts of the process and integrate with the wider organization.

## Create Client<a id="create-client"></a>

By default, creating an instance of the `AoaClient` looks for configuration stored in `~/.aoa/config.yaml`. When working with the SDK, we recommend that you specify (and override) all the necessary configuration as part of the `AoaClient` invocation. 

An example to create a client using a bearer token for a given project is

```python
from aoa import AoaClient

client = AoaClient(
    aoa_url="<modelops-endpoint>",
    auth_mode="bearer",
    auth_bearer="<bearer-token>",
    project_id="23e1df4b-b630-47a1-ab80-7ad5385fcd8d",
)
```

To get the values to use for bearer token and aoa_url, go to the ModelOps UI -> Session Details -> SDK Config. 

## Read Entities<a id="read-entities"></a>

We provide an extensive SDK implementation for interacting with the APIs. You can find, create, update, archive, etc. any entity that supports it via the SDK, and most (if not all) search endpoints are also implemented. Here are some examples:

```python
from aoa import AoaClient
import pprint

client = AoaClient(project_id="23e1df4b-b630-47a1-ab80-7ad5385fcd8d")

datasets = client.datasets().find_all()
pprint.pprint(datasets)

dataset = client.datasets().find_by_id("11e1df4b-b630-47a1-ab80-7ad5385fcd8c")
pprint.pprint(dataset)

jobs = client.jobs().find_by_id("21e1df4b-b630-47a1-ab80-7ad5385fcd1c")
pprint.pprint(jobs)
```
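The list endpoints return plain dictionaries. As the CLI listing output shows, each entity carries at least an `id` and a `name`, so simple client-side filtering works with ordinary list comprehensions. A minimal sketch (the sample records below are illustrative, not real API output):

```python
# Illustrative records shaped like the entities returned by find_all();
# real responses contain additional fields.
datasets = [
    {"id": "03c9a01f-bd46-4e7c-9a60-4282039094e6", "name": "Diabetes Training"},
    {"id": "74eca506-e967-48f1-92ad-fb217b07e181", "name": "Diabetes Evaluation"},
    {"id": "11e1df4b-b630-47a1-ab80-7ad5385fcd8c", "name": "IMDB Reviews"},
]

def find_by_name(entities, name_fragment):
    """Case-insensitive client-side filter over entity dicts."""
    fragment = name_fragment.lower()
    return [e for e in entities if fragment in e["name"].lower()]

matches = find_by_name(datasets, "diabetes")
print([m["name"] for m in matches])  # ['Diabetes Training', 'Diabetes Evaluation']
```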

## Deploy Model Version<a id="deploy-model-version"></a>

Let's assume we have a model version `4131df4b-b630-47a1-ab80-7ad5385fcd15` which we want to deploy In-Vantage, scheduled to execute at midnight on the first day of each month, using dataset connection `11e1df4b-b630-47a1-ab80-7ad5385fcd8c` and dataset template `d8a35d98-21ce-47d0-b9f2-00d355777de1`. We can use the SDK as follows:

```python
from aoa import AoaClient

client = AoaClient(project_id="23e1df4b-b630-47a1-ab80-7ad5385fcd8d")

trained_model_id = "4131df4b-b630-47a1-ab80-7ad5385fcd15"
deploy_request = {
   "engineType": "IN_VANTAGE",
   "publishOnly": False,
   "language": "PMML",
   "cron": "0 0 1 * *",
   "byomModelLocation": {
     "database": "<db-name>",
     "table": "<table-name>"
    },
    "datasetConnectionId": "11e1df4b-b630-47a1-ab80-7ad5385fcd8c",
    "datasetTemplateId": "d8a35d98-21ce-47d0-b9f2-00d355777de1",
    "engineTypeConfig": {
        "dockerImage": "",
        "engine": "byom",
        "resources": {
            "memory": "1G",
            "cpu": "1"
        }
    }
}

job = client.trained_models().deploy(trained_model_id, deploy_request)

# wait until the job completes (if the job fails it will raise an exception)
client.jobs().wait(job['id'])
```
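The `cron` field uses standard five-field cron syntax (minute, hour, day-of-month, month, day-of-week), so `"0 0 1 * *"` fires at 00:00 on day 1 of every month. As an illustration only (this is not what ModelOps runs internally), a simplified matcher for expressions with numeric fields and `*`:

```python
from datetime import datetime

def cron_matches(expr: str, dt: datetime) -> bool:
    """Check a datetime against a simplified five-field cron expression.

    Supports only plain numbers and '*' per field (no ranges, lists, or steps).
    """
    fields = expr.split()
    # cron day-of-week uses 0=Sunday; Python's weekday() uses 0=Monday.
    actual = [dt.minute, dt.hour, dt.day, dt.month, (dt.weekday() + 1) % 7]
    return all(f == "*" or int(f) == a for f, a in zip(fields, actual))

print(cron_matches("0 0 1 * *", datetime(2024, 7, 1, 0, 0)))  # True
```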

## Import Model Version<a id="import-model-version"></a>

Let's assume we have a PMML model trained in another data science platform. We want to import the artefacts for this version (`model.pmml` and `data_stats.json`) against a BYOM model `f937b5d8-02c6-5150-80c7-1e4ff07fea31`.

```python
from aoa import AoaClient

client = AoaClient(project_id="23e1df4b-b630-47a1-ab80-7ad5385fcd8d")

# set metadata for your import request
model_id = '<model-uuid>'
filename = './pima.pmml'
language = 'PMML'
dataset_connection_id = '<dataset-connection-uuid>'
train_dataset_id = '<train-dataset-uuid>'

# first, upload the artefacts which we want to associate with the BYOM model version
import_id = client.trained_model_artefacts().upload_byom_model(language, filename)

import_request = {
    "artefactImportId": import_id,
    "externalId": "my_model_external_id",
    "modelMonitoring": {
        "language": language,
        "useDefaultEvaluation": True,
        "evaluationEnabled": True,
        "modelType": "CLASSIFICATION",
        "byomColumnExpression": "CAST(CAST(json_report AS JSON).JSONExtractValue('$.predicted_HasDiabetes') AS INT)",
        "driftMonitoringEnabled": True,
        "datasetId": train_dataset_id,
        "datasetConnectionId": dataset_connection_id,
    },
}

job = client.models().import_byom(model_id, import_request)

# wait until the job completes (if the job fails it will raise an exception)
client.jobs().wait(job["id"])

# now you can list the artefacts which were uploaded and linked to the model version
trained_model_id = job["metadata"]["trainedModel"]["id"]
artefacts = client.trained_model_artefacts().list_artefacts(trained_model_id)
```
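The `byomColumnExpression` tells ModelOps how to extract the prediction from the JSON report produced by BYOM scoring. In Python terms, the SQL expression above is roughly equivalent to the following (the sample payload is illustrative, not real scoring output):

```python
import json

# Illustrative scoring output row; the real BYOM report depends on the model.
json_report = '{"predicted_HasDiabetes": "1", "probability_1": 0.82}'

# Rough Python equivalent of:
# CAST(CAST(json_report AS JSON).JSONExtractValue('$.predicted_HasDiabetes') AS INT)
predicted = int(json.loads(json_report)["predicted_HasDiabetes"])
print(predicted)  # 1
```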


## Release Notes<a id="release-notes"></a>

### 7.1.2

- Feature: Added shortcut methods to `AoaClient` to improve developer experience.
- Feature: Exposed more API methods, such as:
  - `deployments().find_by_deployment_job_id()`
  - `jobs().find_by_deployment_id()`
  - `dataset_connections().validate_connection()`
  - `describe_current_project()`
- Documentation: Added example for scheduling a Jupyter notebook.

### 7.1.1

- Bug: Fix for OAuth2 Device code grant type.
- Feature: Updated pypi.md with new release notes.
- Bug: Fix for REST API compliance after core upgrade.

### 7.1.0

- Feature: Explicitly ask for `logmech` when defining a local Vantage connection.
- Bug: Fixed `tmo connection create-byom-table` logging error.
- Feature: Added support for `access_token` when using OAuth2 Device code grant type.
- Feature: Updated `requests` dependency.
- Feature: Client hardening efforts.
- Feature: Added Feature Engineering Tasks functions `tmo task add` and `tmo task run`.

### 7.0.6

- Bug: Fixed compatibility issue with `teradataml>=20.0.0.0`.
- Feature: Minor formatting updates.
- Feature: Added user attributes API.
- Feature: Updated programmatic BYOM import examples.

### 7.0.5

**WARNING** Please recreate statistics metadata if you are using this version. It will produce more accurate results and correct behaviour.

- Bug: Fixed computing statistics metadata for continuous features with all NULLs.
- Bug: Fixed training/eval/scoring statistics on columns with all NULLs.
- Bug: Added a workaround for VAL Histograms failure due to SP argument being longer than 31000 characters.
- Feature: Forced rounding of all decimal and float bin edges to Teradata-supported FLOAT literals (15 digits max).
- Bug: Fixed CLI error computing continuous statistics metadata related to mixed case column names.

### 7.0.4

**WARNING** Please recreate statistics metadata if you are using this version. It will produce more accurate results and correct behaviour.

- Bug: Fixed STO helper functions to pick up correct python environment.
- Feature: Computing statistics metadata:
    - Allow fewer than 10 bins for continuous integer features.
    - Ignore continuous features that have a single value for all rows.
    - Allow different precision for decimal and float continuous features.
    - Fix a bug with repeating edges for decimal/float continuous features.
    - Use rounded edges for integer continuous features.
    - Ignore NULLs for categorical features.
    - Ignore categorical features that have NULL values for all rows.
- Feature: Allow collecting training statistics without providing a target (unsupervised learning).
- Feature: Report right and left outliers when computing statistics on continuous features.
- Feature: Allow collecting training/evaluation/scoring statistics when statistics metadata is missing for some columns (missing columns are reported in data_stats.json).
- Feature: For categorical features, report outlier categories in "unmonitored_frequency".
- Feature: Assume statistics metadata is empty when it cannot be read from the database.
- Refactor: Minor fixes and improvements.

### 7.0.3

- Bug: Fixed statistics computation for symmetric distributions.
- Feature: Added automatic retry for device_code expired session or refresh token.

### 7.0.2

- Refactor: Updated dependencies.

### 7.0.1

- Feature: Updates to new name.
- Feature: Changes to support both 1.x and 2.x version of SQLAlchemy.
- Feature: Added a command to create BYOM table.
- Refactor: Various quality of life and performance improvements.

### 7.0.0.0

- Refactor: Refactor data statistics API / internals to be simpler (breaking changes).
- Feature: Switch CLI authentication to use `device_code` grant flow.
- Feature: Add raw Bearer token support for authentication (SDK).


### 6.1.4

- Feature: Document user facing stats functions.
- Feature: Improve end user error messaging related to stats.
- Bug: Fix `aoa init` and `aoa add` not working due to refactor in 6.1.3.

### 6.1.3

- Feature: Improve error messages for statistics calculations and validation.
- Feature: Use [aia](https://pypi.org/project/aia/) for AIA chasing for missing intermediate certificates.
- Bug: No defaults set when BYOM and VAL DBs are not configured on connections.
- Bug: Fixed requirement versions to ensure more stability across python versions.
- Bug: Fixed slow CLI for some commands due to repeated server initialization.

### 6.1.2

- Bug: Work around problems with special characters in passwords for teradataml.

### 6.1.0

- Cleanup: Remove all non OAuth2 (JWT) authentication methods.
- Cleanup: Remove `aoa configure`.
- Feature: Improve error messages to user on CLI.
- Feature: Add `aoa link` for linking project to repo locally.
- Bug: Don't show archived datasets.
- Bug: Fix issue with `aoa feature create-table`.


### 6.0.0

- Feature: Support API changes on ModelOps 06.00.
- Feature: CLI DX improvements.
- Feature: Add Telemetry query bands to session.
- Feature: `aoa feature` support for managing feature metadata.
- Feature: `aoa add` uses reference git repository for model templates.
- Feature: Improve DX from Notebooks.


### 5.0.0

- Feature: Add simpler teradataml context creation via aoa_create_context.
- Feature: Add database to connections.
- Feature: Support for human-readable model folder names.
- Feature: Improve UX of aoa run.
- Feature: Improved error messages for users related to auth and configure.
- Refactor: Package refactor of aoa.sto.util to aoa.util.sto.
- Bug: cli listing not filtering archived entities.
- Cleanup: Remove pyspark support from CLI.

### 4.1.12

- Bug: aoa connection add now hides password symbols.
- Bug: sto.util.cleanup_cli() used hardcoded models table.
- Feature: sto.util.check_sto_version() checks In-Vantage Python version compatibility.
- Feature: sto.util.collect_sto_versions() fetches dict with Python and packages versions.

### 4.1.11

- Bug: aoa run (evaluation) for R now uses the correct scoring file.

### 4.1.10

- Bug: aoa init templates were out of date.
- Bug: aoa run (score) didn't read the dataset template correctly.
- Bug: aoa run (score) tried to publish to prometheus.
- Bug: aoa run (score) not passing model_table kwargs.

### 4.1.9

- Bug: Histogram buckets incorrectly offset by 1 for scoring metrics.

### 4.1.7

- Bug: Quoted and escaped exported connection environmental variables.
- Bug: aoa clone with `path` argument didn't create .aoa/config.yaml in correct directory.
- Feature: aoa clone without `path` now uses repository name by default.
- Feature: update BYOM import to upload artefacts before creating version.

### 4.1.6

- Feature: Added local connections feature with Stored Password Protection.
- Feature: Self creation of .aoa/config.yaml file when cloning a repo.
- Bug: Fix argparse to use common arguments.
- Feature: Support dataset templates for listing datasets and selecting dataset for train/eval.
- Bug: Fix aoa run for batch scoring, prompts for dataset template instead of dataset.
- Bug: Fix batch scoring histograms as cumulative.

### 4.1.5

- Bug: Fix computing stats.
- Feature: Autogenerate category labels and support for overriding them.
- Feature: Prompt for confirmation when retiring/archiving.

### 4.1.4

- Feature: Retiring deployments and archiving projects support.
- Feature: Added support for batch scoring monitoring.

### 4.1.2

- Bug: Fix computing stats.
- Bug: Fix local SQL model training and evaluation.

### 4.1

- Bug: CLI shows archived entities when listing datasets, projects, etc.
- Bug: Adapt histogram bins depending on range of integer types.

### 4.0

- Feature: Extract and record dataset statistics for Training, Evaluation.

### 3.1.1

- Feature: `aoa clone` respects project branch.
- Bug: support Source Model ID from the backend.

### 3.1

- Feature: ability to separate evaluation and scoring logic into separate files for Python/R.

### 3.0

- Feature: Add support for Batch Scoring in run command.
- Feature: Added STO utilities to extract metadata for micro-models.

### 2.7.2

- Feature: Add support for OAUTH2 token refresh flows.
- Feature: Add dataset connections api support.

### 2.7.1

- Feature: Add TrainedModelArtefactsApi.
- Bug: pyspark cli only accepted old resources format.
- Bug: Auth mode not picked up from environment variables.

### 2.7.0 

- Feature: Add support for dataset templates.
- Feature: Add support for listing models (local and remote), datasets, projects.
- Feature: Remove pykerberos dependency and update docs.
- Bug: Fix tests for new dataset template api structure.
- Bug: Unable to view/list more than 20 datasets / entities of any type in the cli.

### 2.6.2

- Bug: Added list resources command.
- Bug: Remove all kerberos dependencies from standard installation, as they can now be installed as an optional feature.
- Feature: Add cli support for new artefact path formats.

### 2.6.1

- Bug: Remove pykerberos as an installation dependency.

            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "teradatamodelops",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": "Anton Zadorozhniy <anton.zadorozhniy@teradata.com>, Sheheryar Ali Bhatti <sheheryar.bhatti@teradata.com>, Christian Axel Schmidt Dick <christianaxel.schmidt@teradata.com>, \"Mo, Mengyu\" <mengyu.mo@teradata.com>",
    "keywords": "teradata, vantage, data science, devops, vantagecloud",
    "author": null,
    "author_email": "Anton Zadorozhniy <anton.zadorozhniy@teradata.com>, Sheheryar Ali Bhatti <sheheryar.bhatti@teradata.com>, Christian Axel Schmidt Dick <christianaxel.schmidt@teradata.com>, Will Fleury <will.fleury@teradata.com>",
    "download_url": "https://files.pythonhosted.org/packages/4e/64/cc4dd45a5a56de7c32281643235672c01035b34464dc190f59066bdec774/teradatamodelops-7.1.2.tar.gz",
    "platform": null,
    "description": "# Teradata ModelOps Client\n\n- [Installation](#installation)\n- [CLI](#cli)\n  - [configuration](#configuration)\n  - [help](#help)\n  - [list](#list)\n  - [add](#add)\n  - [run](#run)\n  - [init](#init)\n  - [clone](#clone)\n  - [link](#link)\n  - [task](#task)\n    - [add](#task-add)\n    - [run](#task-run)\n  - [connection](#connection)\n  - [feature](#feature)\n    - [compute-stats](#feature-compute-stats)\n    - [list-stats](#feature-list-stats)\n    - [create-stats-table](#feature-create-stats-table)\n    - [import-stats](#feature-import-stats)\n  - [doctor](#doctor)\n- [Authentication](#authentication)\n- [SDK](#sdk)\n  - [Create Client](#create-client)\n  - [Read Entities](#read-entities)\n  - [Deploy Model Version](#deploy-model-version)\n  - [Import Model Version](#import-model-version)\n- [Release Notes](#release-notes)\n\n\n# Installation<a id=\"installation\"></a>\n\nThe teradatamodelops is available from [pypi](https://pypi.org/project/teradatamodelops/)\n\n```bash\npip install teradatamodelops\n```\n\n_Note: previously Teradata ModelOps was known as AnalyticsOps Accelerator (AOA), so you might encounter mentions of AOA or aoa, it's referring to an older version of ModelOps_\n\n# CLI<a id=\"cli\"></a>\n\n## configuration<a id=\"configuration\"></a>\n\nBy default, the CLI looks for configuration stored in `~/.aoa/config.yaml`. Copy the config from ModelOps UI -> Session Details -> CLI Config. This will provide the command to create or update the `config.yaml`. 
If required, one can override this configuration at runtime by specifying environment variables (see `api_client.py`)\n\n## help<a id=\"help\"></a>\n\nThe cli can be used to perform a number of interactions and guides the user to perform those actions.\n\n```bash\n> tmo -h\nusage: tmo [-h] [--debug] [--version] {list,add,run,init,clone,link,task,connection,feature,doctor} ...\n\nTMO CLI\n\noptions:\n  -h, --help            show this help message and exit\n  --debug               Enable debug logging\n  --version             Display the version of this tool\n\nactions:\n  valid actions\n\n  {list,add,run,init,clone,link,task,connection,feature,doctor}\n    list                List projects, models, local models or datasets\n    add                 Add model to working dir\n    run                 Train and Evaluate model locally\n    init                Initialize model directory with basic structure\n    clone               Clone Project Repository\n    link                Link local repo to project\n    task                Manage feature engineering tasks\n    connection          Manage local connections\n    feature             Manage feature statistics\n    doctor              Diagnose configuration issues\n```\n\n## list<a id=\"list\"></a>\n\nAllows to list the aoa resources. In the cases of listing models (pushed / committed) and datasets, it will prompt the user to select a project prior showing the results. In the case of local models, it lists both committed and non-committed models.\n```bash\n> tmo list -h\nusage: tmo list [-h] [--debug] [-p] [-m] [-lm] [-t] [-d] [-c]\n\noptions:\n  -h, --help           show this help message and exit\n  --debug              Enable debug logging\n  -p, --projects       List projects\n  -m, --models         List registered models (committed / pushed)\n  -lm, --local-models  List local models. 
Includes registered and non-registered (non-committed / non-pushed)\n  -t, --templates      List dataset templates\n  -d, --datasets       List datasets\n  -c, --connections    List local connections\n```\nAll results are shown in the format\n```\n[index] (id of the resource) name\n```\nfor example:\n```\nList of models for project Demo:\n--------------------------------\n[0] (03c9a01f-bd46-4e7c-9a60-4282039094e6) Diabetes Prediction\n[1] (74eca506-e967-48f1-92ad-fb217b07e181) IMDB Sentiment Analysis\n```\n\n## add<a id=\"add\"></a>\n\nAdd a new model to a given repository based on a model template. A model in any other existing ModelOps git repository (specified via the `-t <giturl>`) can be used.\n\n```bash\n> tmo add -h\nusage: tmo add [-h] [--debug] -t TEMPLATE_URL -b BRANCH\n\noptions:\n  -h, --help            show this help message and exit\n  --debug               Enable debug logging\n  -t TEMPLATE_URL, --template-url TEMPLATE_URL\n                        Git URL for template repository\n  -b BRANCH, --branch BRANCH\n                        Git branch to pull templates (default is main)\n```\n\nExample usage\n```bash\n> tmo add -t https://github.com/Teradata/modelops-demo-models -b master\n```\n\n## run<a id=\"run\"></a>\n\nThe cli can be used to validate the model training and evaluation logic locally before committing to git. This simplifies the development lifecycle and allows you to test and validate many options. 
It also enables you to avoid creating the dataset definitions in the AOA UI until you are ready and have a finalised version.\n\n```bash\n> tmo run -h\nusage: tmo run [-h] [--debug] [-id MODEL_ID] [-m MODE] [-d DATASET_ID] [-t DATASET_TEMPLATE_ID] [-ld LOCAL_DATASET] [-lt LOCAL_DATASET_TEMPLATE] [-c CONNECTION]\n\noptions:\n  -h, --help            show this help message and exit\n  --debug               Enable debug logging\n  -id MODEL_ID, --model-id MODEL_ID\n                        Id of model\n  -m MODE, --mode MODE  Mode (train or evaluate)\n  -d DATASET_ID, --dataset-id DATASET_ID\n                        Remote datasetId\n  -t DATASET_TEMPLATE_ID, --dataset-template-id DATASET_TEMPLATE_ID\n                        Remote datasetTemplateId\n  -ld LOCAL_DATASET, --local-dataset LOCAL_DATASET\n                        Path to local dataset metadata file\n  -lt LOCAL_DATASET_TEMPLATE, --local-dataset-template LOCAL_DATASET_TEMPLATE\n                        Path to local dataset template metadata file\n  -c CONNECTION, --connection CONNECTION\n                        Local connection id\n```\n\nYou can run all of this as a single command or interactively by selecting some optional arguments, or none of them.\n\nFor example, if you want to run the cli interactively you just select `tmo run` but if you wanted to run it non interactively to train a given model with a given datasetId you would expect\n```bash\n> tmo run -id <modelId> -m <mode> -d <datasetId>\n```\n\n## init<a id=\"init\"></a>\n\nWhen you create a git repository, its empty by default. The `init` command allows you to initialize the repository with the structure required by the AOA. 
It also adds a default README.md and HOWTO.md.\n\n```bash\n> tmo init -h\nusage: tmo init [-h] [--debug]\n\noptions:\n  -h, --help  show this help message and exit\n  --debug     Enable debug logging\n```\n\n## clone<a id=\"clone\"></a>\n\nThe `clone` command provides a convenient way to perform a git clone of the repository associated with a given project. The command can be run interactively and will allow you to select the project you wish to clone. Note that by default it clones to the current working directory so you either need to make sure you create an empty folder and run it from within there or else provide the `--path ` argument.\n\n```bash\n> tmo clone -h\nusage: tmo clone [-h] [--debug] [-id PROJECT_ID] [-p PATH]\n\noptions:\n  -h, --help            show this help message and exit\n  --debug               Enable debug logging\n  -id PROJECT_ID, --project-id PROJECT_ID\n                        Id of Project to clone\n  -p PATH, --path PATH  Path to clone repository to\n```\n\n## link<a id=\"link\"></a>\n\nLinks the current local repository to a remote project.\n\n```bash\n> tmo link -h\nusage: tmo link [-h] [--debug]\n\noptions:\n  -h, --help  show this help message and exit\n  --debug     Enable debug logging\n```\n\n## task<a id=\"task\"></a>\n\nManage feature tasks actions. The feature tasks are used to automate the feature engineering process. The tasks are stored in a git repository and can be shared across multiple projects.\n\n```bash\n> tmo task -h\nusage: tmo task [-h] {add,run} ...\n\noptions:\n  -h, --help  show this help message and exit\n\nactions:\n  valid actions\n\n  {add,run}\n    add       Add feature engineering task to working dir\n    run       Run feature engineering tasks locally\n```\n\n### task add<a id=\"task-add\"></a>\n\nAdd a new feature engineering task to a given repository based on a task template. 
A task in any other existing ModelOps git repository (specified via the `-t <giturl>`) can be used.\n\n```bash\n> tmo task add -h\nusage: tmo task add [-h] [--debug] -t TEMPLATE_URL -b BRANCH\n\noptions:\n  -h, --help            show this help message and exit\n  --debug               Enable debug logging\n  -t TEMPLATE_URL, --template-url TEMPLATE_URL\n                        Git URL for template repository\n  -b BRANCH, --branch BRANCH\n                        Git branch to pull task (default is main)\n```\n\nExample usage \n```bash\n> tmo task add -t https://github.com/Teradata/modelops-demo-models -b feature_engineering\n```\n\n### task run<a id=\"task-run\"></a>\n\nThe cli can be used to run feature engineering tasks locally before committing to git.\n\n```bash\n> tmo task run -h\nusage: tmo task run [-h] [--debug] [-c CONNECTION] [-f FUNCTION_NAME]\n\noptions:\n  -h, --help            show this help message and exit\n  --debug               Enable debug logging\n  -c CONNECTION, --connection CONNECTION\n                        Local connection id\n  -f FUNCTION_NAME, --function-name FUNCTION_NAME\n                        Task function name\n```\n\nYou can run all of this as a single command or interactively by selecting some optional arguments, or none of them.\n\nFor example, if you want to run the cli interactively you just select `tmo task run` but if you wanted to run it non interactively to run a given task function you would expect\n```bash\n> tmo task run -f <functionName> -c <connectionId>\n```\n\n## connection<a id=\"connection\"></a>\n\nThe connection credentials stored in the ModelOps service cannot be accessed remotely through the CLI for security reasons. Instead, users can manage connection information locally for the CLI. 
These connections are used by other CLI commands which access Vantage.\n\n```bash\n> tmo connection -h\nusage: tmo connection [-h] {list,add,remove,export,test,create-byom-table} ...\n\noptions:\n  -h, --help            show this help message and exit\n\nactions:\n  valid actions\n\n  {list,add,remove,export,test,create-byom-table}\n    list                List all local connections\n    add                 Add a local connection\n    remove              Remove a local connection\n    export              Export a local connection to be used as a shell script\n    test                Test a local connection\n    create-byom-table   Create a table to store BYOM models\n```\n\n## feature<a id=\"feature\"></a>\n\nManage feature metadata by creating and populating feature metadata table(s). The feature metadata tables contain information required when computing statistics during training, scoring etc. This metadata depends on the feature type (categorical or continuous). \n\nAs this metadata can contain sensitive profiling information (such as categories), it is recommended to treat this metadata in the same manner as you treat the features for a given use case. That is, the feature metadata should live in a project or use case level database. \n\n```bash\n> tmo feature -h\nusage: tmo feature [-h] {compute-stats,list-stats,create-stats-table,import-stats} ...\n\noptions:\n  -h, --help            show this help message and exit\n\naction:\n  valid actions\n\n  {compute-stats,list-stats,create-stats-table,import-stats}\n    compute-stats       Compute feature statistics\n    list-stats          List available statistics\n    create-stats-table  Create statistics table\n    import-stats        Import column statistics from local JSON file\n```\n\n### feature compute-stats<a id=\"feature-compute-stats\"></a>\n\nCompute the feature metadata information required when computing statistics during training, scoring etc. 
This metadata depends on the feature type (categorical or continuous).\n\nContinuous: the histograms edges  \nCategorical: the categories\n\n```bash\n> tmo feature compute-stats -h\nusage: tmo feature compute-stats [-h] [--debug] -s SOURCE_TABLE -m METADATA_TABLE [-t {continuous,categorical}] -c COLUMNS\n\noptions:\n  -h, --help            show this help message and exit\n  --debug               Enable debug logging\n  -s SOURCE_TABLE, --source-table SOURCE_TABLE\n                        Feature source table/view\n  -m METADATA_TABLE, --metadata-table METADATA_TABLE\n                        Metadata table for feature stats, including database name\n  -t {continuous,categorical}, --feature-type {continuous,categorical}\n                        Feature type: continuous or categorical (default is continuous)\n  -c COLUMNS, --columns COLUMNS\n                        List of feature columns\n```\n\nExample usage\n\n```bash\ntmo feature compute-stats \\\n  -s <feature-db>.<feature-data> \\\n  -m <feature-metadata-db>.<feature-metadata-table> \\\n  -t continuous -c numtimesprg,plglcconc,bloodp,skinthick,twohourserins,bmi,dipedfunc,age\n```\n\n### feature list-stats<a id=\"feature-list-stats\"></a>\n\nLists the name and type (categorical or continuous) for all the features stored in the specified metadata table.\n\n```bash\n> tmo feature list-stats -h\nusage: tmo feature list-stats [-h] [--debug] -m METADATA_TABLE\n\noptions:\n  -h, --help            show this help message and exit\n  --debug               Enable debug logging\n  -m METADATA_TABLE, --metadata-table METADATA_TABLE\n                        Metadata table for feature stats, including database name\n```\n\nExample usage\n\n```bash\ntmo feature list-stats \\\n  -m <feature-metadata-db>.<feature-metadata-table>\n```\n\n### feature create-stats-table<a id=\"feature-create-stats-table\"></a>\n\nCreates or prints out the required SQL to create the metadata statistics table with the specified name.\n\n```bash\n> tmo 
feature create-stats-table -h\nusage: tmo feature create-stats-table [-h] [--debug] -m METADATA_TABLE [-e]\n\noptions:\n  -h, --help            show this help message and exit\n  --debug               Enable debug logging\n  -m METADATA_TABLE, --metadata-table METADATA_TABLE\n                        Metadata table for feature stats, including database name\n  -e, --execute-ddl     Execute CREATE TABLE DDL, not just generate it\n```\n\nExample usage\n\n```bash\ntmo feature create-stats-table \\\n  -m <feature-metadata-db>.<feature-metadata-table> \\\n  -e\n```\n\n### feature import-stats<a id=\"feature-import-stats\"></a>\n\nImports feature metadata statistics from a local JSON file to the target table. It also supports showing an example of the file structure.\n\n```bash\n> tmo feature import-stats -h\nusage: tmo feature import-stats [-h] [--debug] -m METADATA_TABLE -f STATISTICS_FILE [-s]\n\noptions:\n  -h, --help            show this help message and exit\n  --debug               Enable debug logging\n  -m METADATA_TABLE, --metadata-table METADATA_TABLE\n                        Metadata table for feature stats, including database name\n  -f STATISTICS_FILE, --statistics-file STATISTICS_FILE\n                        Name of statistics JSON file\n  -s, --show-example    Show file structure example and exit\n```\n\nExample usage\n\n```bash\ntmo feature import-stats \\\n  -m <feature-metadata-db>.<feature-metadata-table> \\\n  -f <path-to-statistics-file>\n```\n\n## doctor<a id=\"doctor\"></a>\n\nRuns healtchecks on the current setup. First checks the health of the remote ModelOps service configuration and secondly checks the health of any of the locally created connections.\n\n```bash\n> tmo doctor -h\nusage: tmo doctor [-h] [--debug]\n\noptions:\n  -h, --help  show this help message and exit\n  --debug     Enable debug logging\n```\n\n# Authentication<a id=\"authentication\"></a>\n\nA number of authentication methods are supported for both the CLI and SDK. 
\n\n- device_code (interactive)\n- client_credentials (service-service)\n- bearer (raw bearer token)\n\nWhen working interactively, the recommended auth method for the CLI is `device_code`. It will guide you through the auth automatically. For the SDK, use  `bearer` if working interactively. \nFor both CLI and SDK, if working in an automated service-service manner, use `client_credentials`. \n\n# SDK<a id=\"sdk\"></a>\n\nThe SDK for ModelOps allows users to interact with ModelOps APIs from anywhere they can execute python such as notebooks, IDEs etc. It can also be used for devops to automate additional parts of the process and integrate into the wider organization.\n\n## Create Client<a id=\"create-client\"></a>\n\nBy default, creating an instance of the `AoaClient` looks for configuration stored in `~/.aoa/config.yaml`. When working with the SDK, we recommend that you specify (and override) all the necessary configuration as part of the `AoaClient` invocation. \n\nAn example to create a client using a bearer token for a given project is\n\n```python\nfrom aoa import AoaClient\n\nclient = AoaClient(\n    aoa_url=\"<modelops-endpoint>\",\n    auth_mode=\"bearer\",\n    auth_bearer=\"<bearer-token>\",\n    project_id=\"23e1df4b-b630-47a1-ab80-7ad5385fcd8d\",\n)\n```\n\nTo get the values to use for bearer token and aoa_url, go to the ModelOps UI -> Session Details -> SDK Config. \n\n## Read Entities<a id=\"read-entities\"></a>\n\nWe provide an extensive sdk implementation to interact with the APIs. You can find, create, update, archive, etc any entity that supports it via the SDK. In addition, most if not all search endpoints are also implemented in the sdk. 
Here are some examples:\n\n```python\nfrom aoa import AoaClient\nimport pprint\n\nclient = AoaClient(project_id=\"23e1df4b-b630-47a1-ab80-7ad5385fcd8d\")\n\ndatasets = client.datasets().find_all()\npprint.pprint(datasets)\n\ndataset = client.datasets().find_by_id(\"11e1df4b-b630-47a1-ab80-7ad5385fcd8c\")\npprint.pprint(dataset)\n\njobs = client.jobs().find_by_id(\"21e1df4b-b630-47a1-ab80-7ad5385fcd1c\")\npprint.pprint(jobs)\n```\n\n## Deploy Model Version<a id=\"deploy-model-version\"></a>\n\nLet's assume we have a model version `4131df4b-b630-47a1-ab80-7ad5385fcd15` which we want to deploy In-Vantage and schedule to execute once a month, at midnight on the first day of the month, using dataset connection `11e1df4b-b630-47a1-ab80-7ad5385fcd8c` and dataset template `d8a35d98-21ce-47d0-b9f2-00d355777de1`. We can use the SDK as follows.\n\n```python\nfrom aoa import AoaClient\n\nclient = AoaClient(project_id=\"23e1df4b-b630-47a1-ab80-7ad5385fcd8d\")\n\ntrained_model_id = \"4131df4b-b630-47a1-ab80-7ad5385fcd15\"\ndeploy_request = {\n    \"engineType\": \"IN_VANTAGE\",\n    \"publishOnly\": False,\n    \"language\": \"PMML\",\n    \"cron\": \"0 0 1 * *\",\n    \"byomModelLocation\": {\n        \"database\": \"<db-name>\",\n        \"table\": \"<table-name>\"\n    },\n    \"datasetConnectionId\": \"11e1df4b-b630-47a1-ab80-7ad5385fcd8c\",\n    \"datasetTemplateId\": \"d8a35d98-21ce-47d0-b9f2-00d355777de1\",\n    \"engineTypeConfig\": {\n        \"dockerImage\": \"\",\n        \"engine\": \"byom\",\n        \"resources\": {\n            \"memory\": \"1G\",\n            \"cpu\": \"1\"\n        }\n    }\n}\n\njob = client.trained_models().deploy(trained_model_id, deploy_request)\n\n# wait until the job completes (if the job fails it will raise an exception)\nclient.jobs().wait(job['id'])\n```\n\n## Import Model Version<a id=\"import-model-version\"></a>\n\nLet's assume we have a PMML model which we have trained on another data science platform. 
We want to import the artefacts for this version (model.pmml and data_stats.json) against a BYOM model `f937b5d8-02c6-5150-80c7-1e4ff07fea31`.\n\n```python\nfrom aoa import AoaClient\n\nclient = AoaClient(project_id=\"23e1df4b-b630-47a1-ab80-7ad5385fcd8d\")\n\n# set metadata for your import request\nmodel_id = '<model-uuid>'\nfilename = './pima.pmml'\nlanguage = 'PMML'\ndataset_connection_id = '<dataset-connection-uuid>'\ntrain_dataset_id = '<train-dataset-uuid>'\n\n# first, upload the artefacts which we want to associate with the BYOM model version\nimport_id = client.trained_model_artefacts().upload_byom_model(language, filename)\n\nimport_request = {\n    \"artefactImportId\": import_id,\n    \"externalId\": \"my_model_external_id\",\n    \"modelMonitoring\": {\n        \"language\": language,\n        \"useDefaultEvaluation\": True,\n        \"evaluationEnabled\": True,\n        \"modelType\": \"CLASSIFICATION\",\n        \"byomColumnExpression\": \"CAST(CAST(json_report AS JSON).JSONExtractValue('$.predicted_HasDiabetes') AS INT)\",\n        \"driftMonitoringEnabled\": True,\n        \"datasetId\": train_dataset_id,\n        \"datasetConnectionId\": dataset_connection_id,\n    },\n}\n\njob = client.models().import_byom(model_id, import_request)\n\n# wait until the job completes (if the job fails it will raise an exception)\nclient.jobs().wait(job[\"id\"])\n\n# now you can list the artefacts which were uploaded and linked to the model version\ntrained_model_id = job[\"metadata\"][\"trainedModel\"][\"id\"]\nartefacts = client.trained_model_artefacts().list_artefacts(trained_model_id)\n```\n\n## Release Notes<a id=\"release-notes\"></a>\n\n### 7.1.2\n\n- Feature: Added shortcut methods to `AoaClient` to improve developer experience.\n- Feature: Exposed more API methods, such as:\n  - `deployments().find_by_deployment_job_id()`\n  - `jobs().find_by_deployment_id()`\n  - `dataset_connections().validate_connection()`\n  - `describe_current_project()`\n- 
Documentation: Added example for scheduling a Jupyter notebook.\n\n### 7.1.1\n\n- Bug: Fix for OAuth2 Device code grant type.\n- Feature: Updated pypi.md with new release notes.\n- Bug: Fix for REST API compliance after core upgrade.\n\n### 7.1.0\n\n- Feature: Explicitly ask for `logmech` when defining a local Vantage connection.\n- Bug: Fixed `tmo connection create-byom-table` logging error.\n- Feature: Added support for `access_token` when using OAuth2 Device code grant type.\n- Feature: Updated `requests` dependency.\n- Feature: Client hardening efforts.\n- Feature: Added Feature Engineering Tasks functions `tmo task add` and `tmo task run`.\n\n### 7.0.6\n\n- Bug: Fixed compatibility issue with `teradataml>=20.0.0.0`.\n- Feature: Minor formatting updates.\n- Feature: Added user attributes API.\n- Feature: Updated programmatic BYOM import examples.\n\n### 7.0.5\n\n**WARNING** Please recreate statistics metadata if you are using this version. It will produce more accurate results and correct behaviour.\n\n- Bug: Fixed computing statistics metadata for continuous features with all NULLs.\n- Bug: Fixed training/eval/scoring statistics on columns with all NULLs.\n- Bug: Added a workaround for VAL Histograms failure due to SP argument being longer than 31000 characters.\n- Feature: Forced rounding of all decimal and float bin edges to Teradata-supported FLOAT literals (15 digits max).\n- Bug: Fixed CLI error computing continuous statistics metadata related to mixed-case column names.\n\n### 7.0.4\n\n**WARNING** Please recreate statistics metadata if you are using this version. 
It will produce more accurate results and correct behaviour.\n\n- Bug: Fixed STO helper functions to pick up the correct Python environment.\n- Feature: Computing statistics metadata:\n    - Allow fewer than 10 bins for continuous integer features.\n    - Ignore continuous features that have a single value for all rows.\n    - Allow different precision for decimal and float continuous features.\n    - Fix a bug with repeating edges for decimal/float continuous features.\n    - Use rounded edges for integer continuous features.\n    - Ignore NULLs for categorical features.\n    - Ignore categorical features that have a NULL value for all rows.\n- Feature: Allow collecting training statistics without providing a target (unsupervised learning).\n- Feature: Report right and left outliers when computing statistics on continuous features.\n- Feature: Allow collecting training/evaluation/scoring statistics when statistics metadata is missing for some columns (missing columns are reported in data_stats.json).\n- Feature: For categorical features, report outlier categories in \"unmonitored_frequency\".\n- Feature: Assume statistics metadata is empty when it cannot be read from the database.\n- Refactor: Minor fixes and improvements.\n\n### 7.0.3\n\n- Bug: Fixed statistics computation for symmetric distributions.\n- Feature: Added automatic retry for device_code expired session or refresh token.\n\n### 7.0.2\n\n- Refactor: Updated dependencies.\n\n### 7.0.1\n\n- Feature: Updates to new name.\n- Feature: Changes to support both 1.x and 2.x versions of SQLAlchemy.\n- Feature: Added a command to create a BYOM table.\n- Refactor: Various quality-of-life and performance improvements.\n\n### 7.0.0.0\n\n- Refactor: Refactor data statistics API / internals to be simpler (breaking changes).\n- Feature: Switch CLI authentication to use `device_code` grant flow.\n- Feature: Add raw Bearer token support for authentication (SDK).\n\n### 6.1.4\n\n- Feature: Document user-facing stats 
functions.\n- Feature: Improve end-user error messaging related to stats.\n- Bug: Fix `aoa init` and `aoa add` not working due to refactor in 6.1.3.\n\n### 6.1.3\n\n- Feature: Improve error messages for statistics calculations and validation.\n- Feature: Use [aia](https://pypi.org/project/aia/) for AIA chasing of missing intermediate certificates.\n- Bug: No defaults set when BYOM and VAL DBs are not configured on connections.\n- Bug: Fixed requirement versions to ensure more stability across Python versions.\n- Bug: Fixed slow CLI for some commands due to repeated server initialization.\n\n### 6.1.2\n\n- Bug: Work around problems with special characters in passwords for teradataml.\n\n### 6.1.0\n\n- Cleanup: Remove all non-OAuth2 (JWT) authentication methods.\n- Cleanup: Remove `aoa configure`.\n- Feature: Improve error messages to the user on the CLI.\n- Feature: Add `aoa link` for linking a project to a repo locally.\n- Bug: Don't show archived datasets.\n- Bug: Fix issue with `aoa feature create-table`.\n\n### 6.0.0\n\n- Feature: Support API changes on ModelOps 06.00.\n- Feature: CLI DX improvements.\n- Feature: Add Telemetry query bands to session.\n- Feature: `aoa feature` support for managing feature metadata.\n- Feature: `aoa add` uses reference git repository for model templates.\n- Feature: Improve DX from Notebooks.\n\n### 5.0.0\n\n- Feature: Add simpler teradataml context creation via `aoa_create_context`.\n- Feature: Add database to connections.\n- Feature: Support for human-readable model folder names.\n- Feature: Improve UX of `aoa run`.\n- Feature: Improved error messages for users related to auth and configure.\n- Refactor: Package refactor of `aoa.sto.util` to `aoa.util.sto`.\n- Bug: CLI listing not filtering archived entities.\n- Cleanup: Remove pyspark support from CLI.\n\n### 4.1.12\n\n- Bug: `aoa connection add` now hides password symbols.\n- Bug: `sto.util.cleanup_cli()` used hardcoded models table.\n- Feature: `sto.util.check_sto_version()` checks In-Vantage 
Python version compatibility.\n- Feature: `sto.util.collect_sto_versions()` fetches a dict with Python and package versions.\n\n### 4.1.11\n\n- Bug: `aoa run` (evaluation) for R now uses the correct scoring file.\n\n### 4.1.10\n\n- Bug: `aoa init` templates were out of date.\n- Bug: `aoa run` (score) didn't read the dataset template correctly.\n- Bug: `aoa run` (score) tried to publish to Prometheus.\n- Bug: `aoa run` (score) not passing `model_table` kwargs.\n\n### 4.1.9\n\n- Bug: Histogram buckets incorrectly offset by 1 for scoring metrics.\n\n### 4.1.7\n\n- Bug: Quoted and escaped exported connection environment variables.\n- Bug: `aoa clone` with `path` argument didn't create .aoa/config.yaml in the correct directory.\n- Feature: `aoa clone` without `path` now uses the repository name by default.\n- Feature: Update BYOM import to upload artefacts before creating the version.\n\n### 4.1.6\n\n- Feature: Added local connections feature with Stored Password Protection.\n- Feature: Self-creation of .aoa/config.yaml file when cloning a repo.\n- Bug: Fix argparse to use common arguments.\n- Feature: Support dataset templates for listing datasets and selecting the dataset for train/eval.\n- Bug: Fix `aoa run` for batch scoring: prompts for dataset template instead of dataset.\n- Bug: Fix batch scoring histograms being cumulative.\n\n### 4.1.5\n\n- Bug: Fix computing stats.\n- Feature: Autogenerate category labels and support for overriding them.\n- Feature: Prompt for confirmation when retiring/archiving.\n\n### 4.1.4\n\n- Feature: Retiring deployments and archiving projects support.\n- Feature: Added support for batch scoring monitoring.\n\n### 4.1.2\n\n- Bug: Fix computing stats.\n- Bug: Fix local SQL model training and evaluation.\n\n### 4.1\n\n- Bug: CLI shows archived entities when listing datasets, projects, etc.\n- Bug: Adapt histogram bins depending on range of integer types.\n\n### 4.0\n\n- Feature: Extract and record dataset statistics for Training, Evaluation.\n\n### 3.1.1\n\n- Feature: `aoa 
clone` respects project branch.\n- Bug: Support Source Model ID from the backend.\n\n### 3.1\n\n- Feature: Ability to separate evaluation and scoring logic into separate files for Python/R.\n\n### 3.0\n\n- Feature: Add support for Batch Scoring in run command.\n- Feature: Added STO utilities to extract metadata for micro-models.\n\n### 2.7.2\n\n- Feature: Add support for OAuth2 token refresh flows.\n- Feature: Add dataset connections API support.\n\n### 2.7.1\n\n- Feature: Add TrainedModelArtefactsApi.\n- Bug: pyspark CLI only accepted old resources format.\n- Bug: Auth mode not picked up from environment variables.\n\n### 2.7.0\n\n- Feature: Add support for dataset templates.\n- Feature: Add support for listing models (local and remote), datasets, projects.\n- Feature: Remove pykerberos dependency and update docs.\n- Bug: Fix tests for new dataset template API structure.\n- Bug: Unable to view/list more than 20 datasets / entities of any type in the CLI.\n\n### 2.6.2\n\n- Bug: Added list resources command.\n- Bug: Remove all kerberos dependencies from standard installation, as they can now be installed as an optional feature.\n- Feature: Add CLI support for new artefact path formats.\n\n### 2.6.1\n\n- Bug: Remove pykerberos as an installation dependency.\n",
    "bugtrack_url": null,
    "license": "Copyright (c) 2024 Teradata. All rights reserved.  LICENSE AGREEMENT  Software: Teradata ModelOps IMPORTANT - READ THIS AGREEMENT CAREFULLY BEFORE INSTALLING, DOWNLOADING, OR USING THE SOFTWARE. TERADATA WILL LICENSE THE APPLICABLE SOFTWARE TO YOU (AS DEFINED BELOW) ONLY IF YOU ACCEPT THE TERMS AND CONDITIONS OF THIS AGREEMENT AND MEET THE CONDITIONS FOR USING THE SOFTWARE DESCRIBED BELOW. BY INSTALLING, DOWNLOADING, OR USING THE SOFTWARE, YOU (1) AGREE TO THE TERMS AND CONDITIONS OF THIS AGREEMENT, AND (2) REPRESENT AND WARRANT THAT YOU POSSESS THE AUTHORITY TO ENTER INTO THIS AGREEMENT ON BEHALF OF YOU, YOUR EMPLOYER (WHEN ACTING ON BEHALF OF YOUR EMPLOYER), AND/OR A TERADATA-AUTHORIZED LICENSEE (WHEN YOU AND YOUR EMPLOYER ARE ACTING ON BEHALF OF A TERADATA-AUTHORIZED LICENSEE). IF YOU DO NOT AGREE TO THE TERMS AND CONDITIONS OF THIS AGREEMENT, DO NOT INSTALL, DOWNLOAD, OR USE THE SOFTWARE. IMPORTANT \u2013 BY DOWNLOADING, INSTALLING OR USING THE SOFTWARE: \u2022\tYOU ACKNOWLEDGE THAT THE SOFTWARE YOU ARE INSTALLING, DOWNLOADING OR USING FROM TERADATA IS SUBJECT TO THE RESTRICTIONS AND CONTROLS IMPOSED BY UNITED STATES EXPORT REGULATIONS. \u2022\tYOU CERTIFY THAT: o\tYOU DO NOT INTEND TO USE THE SOFTWARE FOR ANY PURPOSE PROHIBITED BY UNITED STATES EXPORT REGULATIONS, INCLUDING, WITHOUT LIMITATION, TERRORISM, CYBER-ATTACKS, CYBER-CRIMES, MONEY-LAUNDERING, INDUSTRIAL ESPIONAGE, OR NUCLEAR, CHEMICAL OR BIOLOGICAL WEAPONS PROLIFERATION. o\tYOU ARE NOT LISTED AS A DENIED PARTY ON ANY LIST GOVERNING UNITED STATES EXPORTS. o\tYOU ARE NOT A NATIONAL OF ANY COUNTRY OR IN ANY COUNTRY THAT IS NOT APPROVED FOR EXPORT OF THE SOFTWARE. This License Agreement (\u201cAgreement\u201d) is a legal contract between you, or as applicable on behalf of your employer and any Teradata-authorized licensee for whom your employer is acting on its behalf (\"You,\" \"Your,\" and \"Yours\") and Teradata Operations, Inc. (\"Teradata\") for the Software. 
\u201cSoftware\u201d refers to the applicable software product identified either above or on the site where this Agreement is located, which consists of computer software code in object code form only, as well as associated media, printed materials, and online or electronic documentation. The term \u201cSoftware\u201d also includes any and all error corrections, bug fixes, updates, upgrades, or new versions or releases of the Software (collectively and individually, \u201cEnhancements\u201d) that Teradata may elect in its sole discretion to provide you. Teradata is under no obligation to provide you with Enhancements under this Agreement. By accepting this Agreement, you represent and warrant that you possess the authority to enter into this Agreement on behalf of your employer and any Teradata-authorized licensee for whom you and/or your employer are acting on their behalf. \u201cScenario (i)\u201d: If you are downloading the Software on behalf of a Teradata-authorized licensee that has already purchased a license to the Software pursuant to an executed master agreement (\u201cMaster Agreement\u201d) with Teradata or one of its affiliates, the terms of such master agreement prevail over this Agreement. \u201cScenario (ii)\u201d: If Scenario (i) does not apply to you, then this Agreement and the terms of use for the site from which you downloaded the Software (\u201cGeneral Terms of Use\u201d) constitute the entire understanding of the parties with respect to the Software and Services, and supersede all other prior agreements and understandings whether oral or written. For clarity, Section 2.a.(i) does not apply to you. For example, individuals that have downloaded a copy of the Teradata Express version of the Teradata Relational Database product fall within this Scenario (ii). 1.  Term.  This Agreement commences on the earliest date of the first download, first copying, first installation, or first use of the Software (the \u201cEffective Date\u201d). 
Unless terminated earlier as provided herein, this Agreement, including your license to the Software, will expire or terminate on the first to occur of: (i) the same date that your Teradata-authorized license to use the Teradata Relational Database product expires or terminates or (ii) in accordance with the terms of this Agreement. 2.  License. (a)  Teradata grants you a nonexclusive, nontransferable, paid up license: (i) If Scenario (i) applies \u2013 to the Software in accordance with all of the terms and conditions of the Master Agreement. (ii) If Scenario (ii) applies - subject to your compliance with all of the terms and conditions of this Agreement, to install and use the Software on your computer workstation in object code form for your internal use solely for purposes of facilitating your Teradata-authorized license to use the Teradata Relational Database product. You may make reasonable archival backup copies of the Software, but may only use an archival copy in lieu of your primary copy and subject to the same restrictions as your primary copy. (b)  The term \"Third Party Software\" means computer programs or modules (including their documentation) that bear the logo, copyright and/or trademark of a third party (including open source software that are contained in files marked as \u201copen source\u201d or the like) or are otherwise subject to written license terms of a third party.  Third Party Software does not constitute Software.  Third Party Software is licensed to you subject to the applicable license terms accompanying it, included in/with it, referenced in it, or otherwise entered into by you with respect to it. (c)  You will not sell, copy, rent, loan, modify, transfer, disclose, embed, sublicense, create derivative works of or distribute the Software, in whole or in part, without Teradata\u2019s prior written consent.  You are granted no rights to obtain or use the Software\u2019s source code.  
You will not reverse-assemble, reverse compile or reverse-engineer the Software, except as expressly permitted by applicable law without the possibility of contractual waiver.  Notwithstanding anything to the contrary, you do not have any license, right, or authority to subject the Software, in whole or in part or as part of a larger work, to any terms of any other agreement, including GNU Public Licenses. (d)  No license rights to the Software will be implied. The Software, which includes all copies thereof (whether in whole or in part), is and remains the exclusive property of Teradata and its licensors.  You will ensure that all copies of the Software contain Teradata's and its licensors' copyright notices, as well as all other proprietary legends.  Teradata reserves the right to inspect your use of the Software for purposes of verifying your compliance with the terms and conditions of this Agreement. If applicable, such right will be conducted according to any audit provisions that might exist in an underlying agreement between you and Teradata. (e) The Software may contain a disabling device that will prevent the Software from being used on data sets larger than a certain size. You agree not to: (i) use the Software on data sets larger than such or (ii) disable or circumvent any disabling device contained in the Software. 
(f) Notwithstanding anything to the contrary, Teradata shall have the right to collect and analyze data and other information relating to the provision, use and performance of various aspects of the Software and related systems and technologies (including, without limitation, information concerning your data and data derived therefrom), and Teradata will be free (during and after the term hereof) to (i) use such information and data to improve and enhance the Software and for other development, diagnostic and corrective purposes in connection with the Software and other Teradata offerings, and (ii) disclose such data solely in aggregate or other de-identified form in connection with its business. 3.  Responsibilities.  You are responsible for the installation of the Software, as well as for providing data security and backup operations.  This Agreement does not require Teradata to provide you with any Enhancements, consulting services, technical assistance, installation, training, support, or maintenance of any kind (collectively and individually, \u201cServices\u201d). To the extent that Teradata elects to provide you with any Services, such Services are provided to you at Teradata\u2019s sole discretion and may be modified or discontinued at any time for any reason. 4.  DISCLAIMER OF WARRANTY.  TERADATA: (A) PROVIDES SERVICES (IF ANY), (B) LICENSES THE SOFTWARE, AND (C) PROVIDES THIRD PARTY SOFTWARE TO YOU HEREUNDER ON AN \u201cAS-IS\u201d BASIS WITHOUT WARRANTIES OF ANY KIND (ORAL OR WRITTEN, EXPRESS OR IMPLIED, OR STATUTORY). WITHOUT LIMITATION TO THE FOREGOING, THERE ARE NO IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, OR NON-INFRINGEMENT.  TERADATA DOES NOT WARRANT THAT THE SOFTWARE, THIRD PARTY SOFTWARE, OR SERVICES WILL MEET YOUR REQUIREMENTS OR CONFORM TO ANY SPECIFICATIONS, OR THAT THE OPERATION OF THE SOFTWARE OR THIRD PARTY SOFTWARE WILL BE UNINTERRUPTED OR ERROR FREE.  
YOU BEAR THE ENTIRE RISK AS TO SATISFACTORY QUALITY, PERFORMANCE, ACCURACY, AND RESULTS OBTAINED FROM THE SOFTWARE, THIRD PARTY SOFTWARE, AND SERVICES. SOME JURISDICTIONS RESTRICT DISCLAIMERS OF WARRANTY, SO THE ABOVE DISCLAIMERS MAY NOT FULLY APPLY TO YOU. 5.  LIMITATIONS ON LIABILITY: UNDER NO CIRCUMSTANCES WILL TERADATA\u2019S AND ITS LICENSORS\u2019 TOTAL CUMULATIVE LIABILITY FOR CLAIMS RELATING TO THIS AGREEMENT, THE SERVICES, THE SOFTWARE, AND/OR THIRD PARTY SOFTWARE (WHETHER BASED IN CONTRACT, STATUTE, OR TORT (INCLUDING NEGLIGENCE) OR OTHERWISE) EXCEED US$1,000; PROVIDED, HOWEVER, THAT THE FOREGOING WILL NOT APPLY TO CLAIMS FOR (I) PERSONAL INJURY, INCLUDING DEATH, TO THE EXTENT CAUSED BY TERADATA'S GROSS NEGLIGENCE OR WILLFUL MISCONDUCT; OR (II) PHYSICAL DAMAGE TO TANGIBLE REAL OR PERSONAL PROPERTY TO THE EXTENT CAUSED BY TERADATA'S GROSS NEGLIGENCE OR WILLFUL MISCONDUCT EQUAL TO THE AMOUNT OF DIRECT DAMAGES UP TO ONE MILLION DOLLARS PER OCCURRENCE.  IN NO EVENT WILL TERADATA OR ITS LICENSORS BE LIABLE FOR ANY INDIRECT, SPECIAL, PUNITIVE, INCIDENTAL OR CONSEQUENTIAL DAMAGES, OR FOR LOSS OF PROFITS, REVENUE, TIME, OPPORTUNITY, OR DATA, EVEN IF INFORMED OF THE POSSIBILITY OF SUCH DAMAGES. SOME JURISDICTIONS RESTRICT LIMITATIONS OF LIABILITY, SO THE ABOVE LIMITATIONS MAY NOT FULLY APPLY TO YOU. 6.  Government Restrictions. You agree that you will not, directly or indirectly, export or transmit any Software without obtaining Teradata\u2019s prior written authorization, as well as appropriate governmental approvals, including those required by the U.S. Government.  Use and or distribution of this software is subject to export laws and regulations of the United States and other jurisdictions. The links below connect you to applicable U.S. government agencies, and their regulations, that have jurisdiction over this transaction. 
http://www.bis.doc.gov/ http://www.treas.gov/offices/enforcement/ofac/  By installing, downloading, or using this product, you acknowledge that this transaction is subject to applicable export control laws and you certify that your installation, download, use and/or subsequent distribution of this product is not prohibited under applicable laws and regulations. The Government\u2019s use, duplication, or disclosure of Teradata\u2019s commercial computer software and commercial computer software documentation is subject to: (a) the Restricted Rights Notice set forth in 48 C.F.R. \u00b6 52.227-14 (Rights In Data - General); (b) Teradata\u2019s standard commercial license rights supplemented by 48 C.F.R. \u00b6 52.227-19 (Commercial Computer Software - Restricted Rights); and/or (c) the limited rights and license set forth 48 CFR \u00b6 252.227-7015 (Technical Data\u2013Commercial Items), as applicable. 7.  Termination and Expiration. A party may terminate this Agreement with or without cause, upon providing written notice to the other parties.  When this Agreement terminates or expires, you will immediately cease all use of the Software, permanently remove the Software from all computers, destroy all copies of the Software, and (upon receipt of Teradata\u2019s request) provide a signed written certification that the foregoing has occurred.  Sections 4-11 will survive expiration or termination of this Agreement. 8.  Choice of Law and Dispute Resolution. The parties will attempt in good faith to resolve any controversy or claim by negotiation or mediation.  If they are unable to do so, and regardless of the causes of action alleged and whether they arise under this Agreement or otherwise, the claim will be resolved by arbitration before a sole arbitrator in San Diego, California pursuant to the then-current Commercial Rules of the American Arbitration Association and the federal substantive and procedural law of arbitration.  
The arbitrator\u2019s award will be final and binding, and may be entered in any court having jurisdiction thereof, but may include only damages consistent with the limitations in this Agreement.  Each party will bear its own attorney's fees and costs related to the arbitration. The obligations to negotiate, mediate and arbitrate shall not apply to claims for misuse or infringement of a party\u2019s intellectual property rights.  Any claim or action must be brought within two years after the claimant knows or should have known of the claim .  New York law will govern the interpretation and enforcement of this Agreement, except that the Federal Arbitration Act will govern the interpretation and enforcement of the arbitrability of claims under this Section. 9.  Feedback.  Notwithstanding anything to the contrary: (a) Teradata will have no obligation of any kind with respect to any Software-related comments, suggestions, design changes or improvements that you elect to provide to Teradata in either verbal or written form (collectively, \u201cSoftware Feedback\u201d), and (b) Teradata and its affiliates are hereby free to use any ideas, concepts, know-how or techniques, in whole or in part, contained in Software Feedback: (i) for any purpose whatsoever, including developing, manufacturing, and/or marketing products and/or services incorporating  Software Feedback in whole or in part, and (ii) without any restrictions or limitations, including requiring the payment of any license fees, royalties, or other consideration. 10.  Confidentiality.  You will not disclose the results of any testing or evaluations, including any benchmarks, insofar as it relates to the Software without Teradata\u2019s prior written consent. 11.  Entire Agreement. This Agreement constitutes the entire understanding of the parties with respect to the Software and supersedes all other prior agreements and understandings whether oral or written. 
No oral representation or change to this Agreement will be binding upon either party unless agreed to in writing and signed by authorized representatives of all parties.  You will not assign this Agreement or your rights, nor will you delegate your obligations under this Agreement.  Failure by either party to enforce any term or condition of this Agreement will not be deemed a waiver of future enforcement of that or any other term or condition.  The provisions of this Agreement are severable. \"Include\", \"includes\", and \"including\" shall be interpreted as introducing a list of examples which do not limit the generality of any preceding words or any words in the list of examples. ",
    "summary": "Python client for Teradata ModelOps (TMO)",
    "version": "7.1.2",
    "project_urls": {
        "Documentation": "https://pypi.org/project/teradatamodelops/#cli",
        "Homepage": "https://pypi.org/project/teradatamodelops/",
        "Repository": "https://github.td.teradata.com/ModelOps/ModelOpsPythonSDK"
    },
    "split_keywords": [
        "teradata",
        " vantage",
        " data science",
        " devops",
        " vantagecloud"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "ea19343b546bd8ac05c93b0380de8f2e8b9207b96aaa33ceea7fa77de20e331a",
                "md5": "6b78654307404c33753c57e24be3f09f",
                "sha256": "711ec4092ce665b79cf2564ea31efa5671ca534f07406992536d223a4e65c9cc"
            },
            "downloads": -1,
            "filename": "teradatamodelops-7.1.2-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "6b78654307404c33753c57e24be3f09f",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8",
            "size": 176893,
            "upload_time": "2024-10-23T11:34:51",
            "upload_time_iso_8601": "2024-10-23T11:34:51.666878Z",
            "url": "https://files.pythonhosted.org/packages/ea/19/343b546bd8ac05c93b0380de8f2e8b9207b96aaa33ceea7fa77de20e331a/teradatamodelops-7.1.2-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "4e64cc4dd45a5a56de7c32281643235672c01035b34464dc190f59066bdec774",
                "md5": "806e5bbeb256dc0813c1586308149548",
                "sha256": "3b800f6eee24a2aabf0d6a86bf456a81e5bb0078a2afd72ab6a18a93ab151c26"
            },
            "downloads": -1,
            "filename": "teradatamodelops-7.1.2.tar.gz",
            "has_sig": false,
            "md5_digest": "806e5bbeb256dc0813c1586308149548",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8",
            "size": 157249,
            "upload_time": "2024-10-23T11:34:53",
            "upload_time_iso_8601": "2024-10-23T11:34:53.804023Z",
            "url": "https://files.pythonhosted.org/packages/4e/64/cc4dd45a5a56de7c32281643235672c01035b34464dc190f59066bdec774/teradatamodelops-7.1.2.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-10-23 11:34:53",
    "github": false,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "lcname": "teradatamodelops"
}