![Style Checks](https://github.com/intel-innersource/frameworks.ai.transfer-learning/actions/workflows/style-test.yaml/badge.svg)
![Doc Test](https://github.com/intel-innersource/frameworks.ai.transfer-learning/actions/workflows/docs-test.yaml/badge.svg)
![Unit Tests](https://github.com/intel-innersource/frameworks.ai.transfer-learning/actions/workflows/unittest.yaml/badge.svg)
![Integration Tests](https://github.com/intel-innersource/frameworks.ai.transfer-learning/actions/workflows/integration.yaml/badge.svg)
![Notebook Test](https://github.com/intel-innersource/frameworks.ai.transfer-learning/actions/workflows/notebook-test.yaml/badge.svg)
*Note: You may find it easier to read about the Intel Transfer Learning Tool, follow the Get
Started guide, and browse the API material on our published documentation site:
https://intelai.github.io/transfer-learning.*
<!-- SkipBadges -->
# Intel® Transfer Learning Tool
Transfer learning workflows use the knowledge learned by a pre-trained model on
a large dataset to improve the performance of a related problem with a smaller
dataset.
## What is the Intel® Transfer Learning Tool?
Intel® Transfer Learning Tool makes it easier and faster for you to
create transfer learning workflows across a variety of AI use cases. Its
open-source Python\* library leverages public pretrained model hubs,
Intel-optimized deep learning frameworks, and your custom dataset to efficiently
generate new models optimized for Intel hardware.
This project documentation provides information, resource links, and instructions for the Intel
Transfer Learning Tool as well as Jupyter\* notebooks and examples that
demonstrate its usage.
**Features:**
* Supports PyTorch\* and TensorFlow\*
* Select from over [100 computer vision and natural language processing models](Models.md) from
Torchvision, PyTorch Hub, TensorFlow Hub, Keras, and Hugging Face
* Use cases include:
* Image Classification
* Text Classification
* Text Generation
* Vision Anomaly Detection
* Use your own custom dataset or get started quickly with built-in datasets
* Automatically create a trainable classification layer customized for your dataset
* Pre-process your dataset using scaling, cropping, batching, and splitting
* Use APIs for prediction, evaluation, and benchmarking (see the sketch after this list)
* Export your model for deployment or resume training from checkpoints
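As a concrete example of these features, the low-code API follows a `dataset_factory`/`model_factory` pattern. Below is a minimal sketch modeled on the project's Get Started Guide; treat the exact argument names and defaults as assumptions and check the API reference for your release:
```
# Minimal low-code API sketch (illustrative only): fine-tune a TensorFlow
# image classification model on the built-in tf_flowers dataset.
# Factory and method names follow the published API docs, but argument names
# and defaults may differ between releases; check the API reference.
from tlt.datasets import dataset_factory
from tlt.models import model_factory

# Get a pretrained model and a built-in dataset
model = model_factory.get_model(model_name="resnet_v1_50", framework="tensorflow")
dataset = dataset_factory.get_dataset(dataset_dir="/tmp/data",
                                      use_case="image_classification",
                                      framework="tensorflow",
                                      dataset_name="tf_flowers",
                                      dataset_catalog="tf_datasets")

# Preprocess (resize and batch) and split the dataset
dataset.preprocess(image_size=model.image_size, batch_size=32)
dataset.shuffle_split(train_pct=0.75, val_pct=0.25)

# Train, evaluate, and export a saved model ready for inference on Intel CPUs
model.train(dataset, output_dir="/tmp/output", epochs=1)
metrics = model.evaluate(dataset)
saved_model_dir = model.export(output_dir="/tmp/output")
```
The same workflow is available as a single no-code CLI command, shown in the Get Started section below.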
**Intel Optimizations:**
* Boost performance with Intel® Optimization for TensorFlow and Intel® Extension for PyTorch
* Quantize to INT8 to reduce model size and speed up inference using Intel® Neural Compressor (see the sketch after this list)
* Optimize model for FP32 inference using Intel Neural Compressor
* Reduce training time with auto-mixed precision for select hardware platforms
* Further reduce training time with multinode training
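For instance, INT8 quantization is exposed as a call on the trained model object, with Intel Neural Compressor doing the work under the hood. This is a hedged sketch; the `quantize` method's signature has varied between releases, so verify it against the API reference:
```
# Hedged sketch of post-training INT8 quantization through the tool's API.
# Setup is repeated from the earlier sketch so this block runs on its own;
# the quantize() signature may differ in your release (check the API docs).
from tlt.datasets import dataset_factory
from tlt.models import model_factory

model = model_factory.get_model(model_name="resnet_v1_50", framework="tensorflow")
dataset = dataset_factory.get_dataset(dataset_dir="/tmp/data",
                                      use_case="image_classification",
                                      framework="tensorflow",
                                      dataset_name="tf_flowers",
                                      dataset_catalog="tf_datasets")
dataset.preprocess(image_size=model.image_size, batch_size=32)
model.train(dataset, output_dir="/tmp/output", epochs=1)

# Quantize the trained model to INT8; the dataset is used for calibration and
# the reduced-precision model is written to the given output directory.
model.quantize("/tmp/output/quantized", dataset)
```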
## How the Intel Transfer Learning Tool Works
The Intel Transfer Learning Tool lets you train AI models with TensorFlow or
PyTorch using either no-code command line interface (CLI) commands at a bash
prompt, or low-code application programming interface (API) calls from a Python
script.
Use your own dataset or select an existing image or text dataset listed in the
[public datasets](DATASETS.md) documentation. Construct your own CLI or API commands for training, evaluation,
and optimization using the TensorFlow or PyTorch framework, and finally export
your saved model optimized for inference on Intel CPUs.
An overview of the Intel Transfer Learning Tool flow is shown in this
figure:
<p align="center"><b>Intel Transfer Learning Tool Flow</b></p>
<img alt="Intel Transfer Learning Tool Flow" title="Intel Transfer Learning Tool Flow" src="images/TLT-tool_flow.svg" width="600">
## Get Started
Check out the [Get Started Guide](GetStarted.md), which walks you through the
steps to check system requirements, install the tool, and then run it with a couple of
examples showing the no-code CLI and low-code API approaches. After that, check out
these additional CLI and API [Examples](examples/README.md).
<!-- ExpandGetStarted-Start -->
As described in the [Get Started Guide](GetStarted.md), once you have a Python
environment set up, you can do a basic install of the Intel Transfer Learning
Tool. Here are some examples of the commands you will find there:
```
pip install intel-transfer-learning-tool
```
Then you can use the Intel Transfer Learning Tool CLI (`tlt`) to train a
TensorFlow image classification model (`resnet_v1_50`), download and use an
existing built-in dataset (`tf_flowers`), and save the trained model to
`/tmp/output`, all with this one command:
```
tlt train --framework tensorflow --model-name resnet_v1_50 --dataset-name tf_flowers \
          --output-dir /tmp/output --dataset-dir /tmp/data
```
Use `tlt --help` to see the list of CLI commands. More detailed help for each
command can be found using, for example, `tlt train --help`.
<!-- ExpandGetStarted-End -->
## Note on Evaluation and Bias
The Intel Transfer Learning Tool provides standard evaluation metrics such as accuracy and loss for the training, validation, and test sets. While important, these metrics may not explicitly capture biases. Users should be cautious and look for potential biases by analyzing disparities in the data and in model predictions. Techniques such as confusion matrices, PR curves, ROC curves, local attribution-based explanations, and `gradCAM` can all be good indicators of bias. Clear documentation of model behavior and performance is also crucial for iterative bias mitigation. [Intel® Explainable AI Tools](https://github.com/IntelAI/intel-xai-tools/tree/main) provides components that demonstrate the aforementioned techniques: [Explainer](https://github.com/IntelAI/intel-xai-tools/tree/main/explainer), a simple API providing post-hoc model distillation and visualization methods, and the [Model Card Generator](https://github.com/IntelAI/intel-xai-tools/tree/main/model_card_gen), which produces an interactive HTML report containing these workflows and demonstrations of model behavior.
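As a minimal, generic illustration of one of these techniques (using scikit-learn rather than any Intel Transfer Learning Tool API, with placeholder labels), a confusion matrix and per-class report can surface class-level disparities in model predictions:
```
# Self-contained example of a per-class bias check with scikit-learn;
# y_true/y_pred are placeholder labels standing in for real model output.
import numpy as np
from sklearn.metrics import confusion_matrix, classification_report

y_true = np.array([0, 0, 1, 1, 1, 2, 2, 2, 2, 0])   # ground-truth class ids
y_pred = np.array([0, 1, 1, 1, 0, 2, 2, 1, 2, 0])   # model predictions

# Rows are true classes, columns are predicted classes; large off-diagonal
# counts concentrated in one row suggest the model underperforms on that class.
print(confusion_matrix(y_true, y_pred))

# Per-class precision and recall make disparities between classes explicit.
print(classification_report(y_true, y_pred, digits=3))
```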
## Support
The Intel Transfer Learning Tool team tracks bugs and enhancement requests using
[GitHub issues](https://github.com/IntelAI/transfer-learning-tool/issues). Before submitting a
suggestion or bug report, search the existing GitHub issues to see if your issue has already been reported.
See [Legal Information](Legal.md) for Disclaimers, Trademark, and Licensing information.