mlbomdoc 0.1.1

- Home page: https://github.com/anthonyharrison/mlbomdoc
- Summary: MLBOM documentation tool
- Upload time: 2024-07-28 17:29:56
- Author/Maintainer: Anthony Harrison
- Requires Python: >=3.8
- Licence: Apache-2.0
- Keywords: documentation, tools, SBOM, MLBOM, DevSecOps, CycloneDX
- Requirements: lib4sbom, sbom2doc
# MLBOMDoc

MLBOMDoc is a human-readable document generator for an ML-BOM (Machine Learning Bill of Materials). MLBOMs document machine learning model components,
which are typically contained within an SBOM (Software Bill of Materials). MLBOMs are supported for [CycloneDX](https://www.cyclonedx.org).

## Installation

To install, use the following command:

`pip install mlbomdoc`

Alternatively, clone the repository and install the dependencies using the following command:

`pip install -U -r requirements.txt`

The tool requires Python 3 (3.8+). It is recommended to use a virtual Python environment, especially
if you work with multiple versions of Python. `virtualenv` is a tool for setting up virtual Python environments; it
allows you to keep all of the tool's dependencies in a single environment, or to maintain separate environments
for testing against different versions of Python.

## Usage

```
usage: mlbomdoc [-h] [-i INPUT_FILE] [--debug] [-f {console,json,markdown,pdf}] [-o OUTPUT_FILE] [-V]

MLBOMdoc generates documentation for a MLBOM.

options:
  -h, --help            show this help message and exit
  -V, --version         show program's version number and exit

Input:
  -i INPUT_FILE, --input-file INPUT_FILE
                        Name of MLBOM file

Output:
  --debug               add debug information
  -f {console,json,markdown,pdf}, --format {console,json,markdown,pdf}
                        Output format (default: output to console)
  -o OUTPUT_FILE, --output-file OUTPUT_FILE
                        output filename (default: output to stdout)
```
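The option groups above can be reproduced with a small `argparse` sketch. This is an illustrative reconstruction of the command line shown in the usage text, not the tool's actual parser, which may differ in detail:

```python
import argparse

# Illustrative reconstruction of the CLI shown above (assumed, not the
# real implementation): same options, groups, and defaults as the usage text.
parser = argparse.ArgumentParser(
    prog="mlbomdoc", description="MLBOMdoc generates documentation for a MLBOM."
)
parser.add_argument("-V", "--version", action="version", version="0.1.1")

input_group = parser.add_argument_group("Input")
input_group.add_argument("-i", "--input-file", help="Name of MLBOM file")

output_group = parser.add_argument_group("Output")
output_group.add_argument("--debug", action="store_true", help="add debug information")
output_group.add_argument(
    "-f", "--format",
    choices=["console", "json", "markdown", "pdf"], default="console",
    help="Output format (default: output to console)",
)
output_group.add_argument("-o", "--output-file", help="output filename (default: stdout)")

args = parser.parse_args(["-i", "test.json", "-f", "markdown"])
print(args.input_file, args.format)  # test.json markdown
```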
## Operation

The `--input-file` option specifies the MLBOM to be processed. The format of the MLBOM is determined according to
the following filename conventions.

| MLBOM     | Format    | Filename extension |
| --------- | --------- |--------------------|
| CycloneDX | JSON      | .json              |

The `--output-file` option controls the destination of the output generated by the tool. The
default is to report to the console, but the output can also be stored in a named file.
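The extension-based convention above can be sketched in a few lines of Python. This is a hypothetical illustration of the lookup, not mlbomdoc's own detection code (the tool uses lib4sbom to parse the file):

```python
from pathlib import Path

# Hypothetical mapping of filename extensions to (BOM type, format),
# mirroring the table above; the real tool's logic may differ.
FORMATS = {".json": ("CycloneDX", "JSON")}

def detect_format(filename: str) -> tuple[str, str]:
    """Return the (BOM type, format) implied by the file's extension."""
    ext = Path(filename).suffix.lower()
    if ext not in FORMATS:
        raise ValueError(f"Unsupported MLBOM file extension: {ext!r}")
    return FORMATS[ext]

print(detect_format("test.json"))  # ('CycloneDX', 'JSON')
```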

## Example

Given the following MLBOM (test.json), the output below is produced on the console.

**NOTE**: the data is purely fictitious and is intended only to demonstrate the capability of the tool.

```json
{
  "$schema": "http://cyclonedx.org/schema/bom-1.5.schema.json",
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "serialNumber": "urn:uuid:997191f5-6c2b-4572-9a73-5e0f2d03cedd",
  "version": 1,
  "metadata": {
    "timestamp": "2024-01-02T11:02:22Z",
    "tools": {
      "components": [
        {
          "name": "lib4sbom",
          "version": "0.6.0",
          "type": "application"
        }
      ]
    },
    "component": {
      "type": "application",
      "bom-ref": "CDXRef-DOCUMENT",
      "name": "MLApp"
    }
  },
  "components": [
    {
      "type": "library",
      "bom-ref": "1-glibc",
      "name": "glibc",
      "version": "2.15",
      "supplier": {
        "name": "gnu"
      },
      "cpe": "cpe:/a:gnu:glibc:2.15",
      "licenses": [
        {
          "license": {
            "id": "GPL-3.0-only",
            "url": "https://www.gnu.org/licenses/gpl-3.0-standalone.html"
          }
        }
      ]
    },
    {
      "type": "operating-system",
      "bom-ref": "2-almalinux",
      "name": "almalinux",
      "version": "9.0",
      "supplier": {
        "name": "alma"
      },
      "cpe": "cpe:/o:alma:almalinux:9.0",
      "licenses": [
        {
          "license": {
            "id": "Apache-2.0",
            "url": "https://www.apache.org/licenses/LICENSE-2.0"
          }
        }
      ]
    },
    {
      "type": "library",
      "bom-ref": "3-glibc",
      "name": "glibc",
      "version": "2.29",
      "supplier": {
        "name": "gnu"
      },
      "cpe": "cpe:/a:gnu:glibc:2.29",
      "licenses": [
        {
          "license": {
            "id": "GPL-3.0-only",
            "url": "https://www.gnu.org/licenses/gpl-3.0-standalone.html"
          }
        }
      ],
      "properties": [
        {
          "name": "language",
          "value": "C"
        }
      ]
    },
    {
      "type": "library",
      "bom-ref": "4-tomcat",
      "name": "tomcat",
      "version": "9.0.46",
      "supplier": {
        "name": "apache"
      },
      "cpe": "cpe:/a:apache:tomcat:9.0.46",
      "licenses": [
        {
          "license": {
            "id": "Apache-2.0",
            "url": "https://www.apache.org/licenses/LICENSE-2.0"
          }
        }
      ]
    },
    {
      "type": "machine-learning-model",
      "bom-ref": "5-resnet-50",
      "name": "resnet-50",
      "version": "1.5",
      "supplier": {
        "name": "microsoft"
      },
      "description": "ResNet (Residual Network) is a convolutional neural network that democratized the concepts of residual learning and skip connections. This enables to train much deeper models.",
      "licenses": [
        {
          "license": {
            "id": "Apache-2.0",
            "url": "https://www.apache.org/licenses/LICENSE-2.0"
          }
        }
      ],
      "modelCard": {
        "bom-ref": "5-resnet-50-model",
        "modelParameters": {
          "approach": {
            "type": "supervised"
          },
          "task": "classification",
          "architectureFamily": "Convolutional neural network",
          "modelArchitecture": "ResNet-50",
          "datasets": [
            {
              "type": "dataset",
              "name": "ImageNet",
              "contents": {
                "url": "https://huggingface.co/datasets/imagenet-1k"
              },
              "classification": "public",
              "sensitiveData": "no personal data",
              "description": "ILSVRC 2012, commonly known as \"ImageNet\" is an image dataset organized according to the WordNet hierarchy. Each meaningful concept in WordNet, possibly described by multiple words or word phrases, is called a \"synonym set\" or \"synset\". There are more than 100,000 synsets in WordNet, majority of them are nouns (80,000+). ImageNet aims to provide on average 1000 images to illustrate each synset. Images of each concept are quality-controlled and human-annotated.",
              "governance": {
                "owners": [
                  {
                    "organization": {
                      "name": "microsoft"
                    },
                    "contact": {
                      "email": "sales@microsoft.com"
                    }
                  },
                  {
                    "organization": {
                      "name": "microsoft"
                    },
                    "contact": {
                      "email": "consulting@microsoft.com"
                    }
                  }
                ]
              }
            }
          ],
          "inputs": [
            {
              "format": "image"
            }
          ],
          "outputs": [
            {
              "format": "image class"
            }
          ]
        },
        "quantitativeAnalysis": {
          "performanceMetrics": [
            {
              "type": "CPU",
              "value": "10%",
              "confidenceInterval": {
                "lowerBound": "8",
                "upperBound": "12"
              }
            }
          ],
          "graphics": {
            "description": "Test data",
            "collection": [
              {
                "name": "cat",
                "image": {
                  "contentType": "text/plain",
                  "encoding": "base64",
                  "content": "cat.jpg"
                }
              },
              {
                "name": "dog",
                "image": {
                  "contentType": "text/plain",
                  "encoding": "base64",
                  "content": "dog.jpg"
                }
              }
            ]
          }
        },
        "considerations": {
          "users": [
            "Researcher"
          ],
          "technicalLimitations": [
            "To be used in the EU.",
            "To be used in the UK."
          ],
          "ethicalConsiderations": [
            {
              "name": "User from prohibited location",
              "mitigationStrategy": "Use geolocation to validate source of request."
            }
          ]
        },
        "properties": [
          {
            "name": "num_channels",
            "value": "3"
          }
        ]
      }
    }
  ]
}

```

The following command generates a summary of the contents of the MLBOM on the console.

```bash
mlbomdoc --input test.json 

╭───────────────╮
│ MLBOM Summary │
╰───────────────╯
┏━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ Item       ┃ Details                                                      ┃
┡━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩
│ MLBOM File │ test.json                                                    │
│ MLBOM Type │ cyclonedx                                                    │
│ Version    │ 1.5                                                          │
│ Name       │ MLApp                                                        │
│ Creator    │ tool:lib4sbom#0.6.0                                          │
│ Created    │ 2024-01-02T11:02:22Z                                         │
└────────────┴──────────────────────────────────────────────────────────────┘

╭───────────────────────────╮
│ Model Details - resnet-50 │
╰───────────────────────────╯
┏━━━━━━━━━━┳━━━━━━━━━━━━┓
┃ Item     ┃ Value      ┃
┡━━━━━━━━━━╇━━━━━━━━━━━━┩
│ Version  │ 1.5        │
│ Supplier │ microsoft  │
│ License  │ Apache-2.0 │
└──────────┴────────────┘
╭──────────────────╮
│ Model Parameters │
╰──────────────────╯
┏━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ Parameter           ┃ Value                        ┃
┡━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩
│ Approach            │ supervised                   │
│ Task                │ classification               │
│ Architecture Family │ Convolutional neural network │
│ Model Architecture  │ ResNet-50                    │
│ Input               │ image                        │
│ Output              │ image class                  │
└─────────────────────┴──────────────────────────────┘
╭───────────────╮
│ Model Dataset │
╰───────────────╯
┏━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ Parameter      ┃ Value                                                                                                                                                                                     ┃
┡━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩
│ Type           │ dataset                                                                                                                                                                                   │
│ Contents URL   │ https://huggingface.co/datasets/imagenet-1k                                                                                                                                               │
│ Classification │ public                                                                                                                                                                                    │
│ Sensitive Data │ no personal data                                                                                                                                                                          │
│ Description    │ ILSVRC 2012, commonly known as "ImageNet" is an image dataset organized according to the WordNet hierarchy. Each meaningful concept in WordNet, possibly described by multiple words or   │
│                │ word phrases, is called a "synonym set" or "synset". There are more than 100,000 synsets in WordNet, majority of them are nouns (80,000+). ImageNet aims to provide on average 1000       │
│                │ images to illustrate each synset. Images of each concept are quality-controlled and human-annotated.                                                                                      │
└────────────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘
╭────────────────────╮
│ Dataset Governance │
╰────────────────────╯
┏━━━━━━━━━━┳━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ Category ┃ Organization ┃ Contact                  ┃
┡━━━━━━━━━━╇━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━┩
│ Owner    │ microsoft    │ sales@microsoft.com      │
│ Owner    │ microsoft    │ consulting@microsoft.com │
└──────────┴──────────────┴──────────────────────────┘
╭───────────────────────╮
│ Quantitative Analysis │
╰───────────────────────╯
╭─────────────────────╮
│ Performance Metrics │
╰─────────────────────╯
┏━━━━━━┳━━━━━━━┳━━━━━━━┳━━━━━━━━━━━━━┳━━━━━━━━━━━━━┓
┃ Type ┃ Value ┃ Slice ┃ Lower Bound ┃ Upper Bound ┃
┡━━━━━━╇━━━━━━━╇━━━━━━━╇━━━━━━━━━━━━━╇━━━━━━━━━━━━━┩
│ CPU  │ 10%   │       │ 8           │ 12          │
└──────┴───────┴───────┴─────────────┴─────────────┘
╭──────────────────────╮
│ Graphics - Test data │
╰──────────────────────╯
┏━━━━━━┳━━━━━━━━━┓
┃ Name ┃ Content ┃
┡━━━━━━╇━━━━━━━━━┩
│ cat  │ cat.jpg │
│ dog  │ dog.jpg │
└──────┴─────────┘
╭────────────────╮
│ Considerations │
╰────────────────╯
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ Category                                     ┃ Value                                          ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩
│ Users                                        │ Researcher                                     │
│ Technical Limitations                        │ To be used in the EU.                          │
│ Technical Limitations                        │ To be used in the UK.                          │
│ Ethical Considerations                       │ User from prohibited location                  │
│ Ethical Considerations - Mitigation Strategy │ Use geolocation to validate source of request. │
└──────────────────────────────────────────────┴────────────────────────────────────────────────┘
╭────────────╮
│ Properties │
╰────────────╯
┏━━━━━━━━━━━━━━┳━━━━━━━┓
┃ Name         ┃ Value ┃
┡━━━━━━━━━━━━━━╇━━━━━━━┩
│ num_channels │ 3     │
└──────────────┴───────┘
                                                                   
```
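The tool's core job, pulling machine-learning-model components out of a CycloneDX document, can be sketched directly in Python. This is a minimal illustration against an inlined fragment of the test.json structure above; mlbomdoc itself parses the file via lib4sbom rather than raw JSON:

```python
import json

# Inlined fragment of the example MLBOM above, trimmed to two components.
doc = json.loads("""
{
  "bomFormat": "CycloneDX",
  "components": [
    {"type": "library", "name": "glibc", "version": "2.15"},
    {"type": "machine-learning-model", "name": "resnet-50", "version": "1.5",
     "modelCard": {"modelParameters": {"task": "classification"}}}
  ]
}
""")

# Keep only components of type "machine-learning-model" and report each
# model's name alongside the task recorded in its model card.
models = [c for c in doc.get("components", [])
          if c.get("type") == "machine-learning-model"]
for model in models:
    task = model.get("modelCard", {}).get("modelParameters", {}).get("task")
    print(model["name"], task)  # resnet-50 classification
```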

## Licence

Licenced under the Apache 2.0 Licence.

## Limitations

The tool has the following limitations:

- Invalid SBOMs will result in unpredictable results.

## Feedback and Contributions

Bugs and feature requests can be made via GitHub Issues.

                                                                                                                                                               \u2502\n\u2502 Sensitive Data \u2502 no personal data                                                                                                                                                                          \u2502\n\u2502 Description    \u2502 ILSVRC 2012, commonly known as \"ImageNet\" is an image dataset organized according to the WordNet hierarchy. Each meaningful concept in WordNet, possibly described by multiple words or   \u2502\n\u2502                \u2502 word phrases, is called a \"synonym set\" or \"synset\". There are more than 100,000 synsets in WordNet, majority of them are nouns (80,000+). ImageNet aims to provide on average 1000       \u2502\n\u2502                \u2502 images to illustrate each synset. Images of each concept are quality-controlled and human-annotated.                                                                                      
\u2502\n\u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2518\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Dataset Governance \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n\u250f\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2533\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2533\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2513\n\u2503 Category \u2503 Organization \u2503 Contact                  
\u2503\n\u2521\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2547\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2547\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2529\n\u2502 Owner    \u2502 microsoft    \u2502 sales@microsoft.com      \u2502\n\u2502 Owner    \u2502 microsoft    \u2502 consulting@microsoft.com \u2502\n\u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2518\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Quantitative Analysis \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Performance Metrics \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n\u250f\u2501\u2501\u2501\u2501\u2501\u2501\u2533\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2533\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2533\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2533\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2513\n\u2503 Type \u2503 Value \u2503 Slice \u2503 Lower BOund \u2503 Upper Bound 
\u2503\n\u2521\u2501\u2501\u2501\u2501\u2501\u2501\u2547\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2547\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2547\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2547\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2529\n\u2502 CPU  \u2502 10%   \u2502       \u2502 8           \u2502 12          \u2502\n\u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2518\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Graphics - Test data \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n\u250f\u2501\u2501\u2501\u2501\u2501\u2501\u2533\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2513\n\u2503 Name \u2503 Content \u2503\n\u2521\u2501\u2501\u2501\u2501\u2501\u2501\u2547\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2529\n\u2502 cat  \u2502 cat.jpg \u2502\n\u2502 dog  \u2502 dog.jpg \u2502\n\u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2518\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Considerations 
\u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n\u250f\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2533\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2513\n\u2503 Category                                     \u2503 Value                                          \u2503\n\u2521\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2547\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2529\n\u2502 Users                                        \u2502 Researcher                                     \u2502\n\u2502 Technical Limitations                        \u2502 To be used in the EU.                          \u2502\n\u2502 Technical Limitations                        \u2502 To be used in the UK.                          \u2502\n\u2502 Ethical Considerations                       \u2502 User from prohibited location                  \u2502\n\u2502 Ethical Considerations - Mitigation Strategy \u2502 Use geolocation to validate source of request. 
\u2502\n\u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2518\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Properties \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n\u250f\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2533\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2513\n\u2503 Name         \u2503 Value \u2503\n\u2521\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2547\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2529\n\u2502 num_channels \u2502 3     \u2502\n\u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2518\n                                                                   \n```\n\n## Licence\n\nLicenced under the Apache 2.0 Licence.\n\n## Limitations\n\nThe tool has the following limitations\n\n- Invalid SBOMs will result in unpredictable results.\n\n## Feedback and Contributions\n\nBugs and feature requests can be made via GitHub Issues.\n",
    "bugtrack_url": null,
    "license": "Apache-2.0",
    "summary": "MLBOM documentation tool",
    "version": "0.1.1",
    "project_urls": {
        "Homepage": "https://github.com/anthonyharrison/mlbomdoc"
    },
    "split_keywords": [
        "documentation",
        " tools",
        " sbom",
        " mlbom",
        " devsecops",
        " cyclonedx"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "e1e829c872ef17f557a62ff4081eeb6e1f54f994f47ca58890fb4be6409c7a2a",
                "md5": "eedd0c6ab6ebbdcbdfcd4da3f2cee35f",
                "sha256": "51e526b458ae5140aa8cc84a7e992bf99c8b521f1e138f08bb70e07fce2b0269"
            },
            "downloads": -1,
            "filename": "mlbomdoc-0.1.1-py2.py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "eedd0c6ab6ebbdcbdfcd4da3f2cee35f",
            "packagetype": "bdist_wheel",
            "python_version": "py2.py3",
            "requires_python": ">=3.8",
            "size": 13419,
            "upload_time": "2024-07-28T17:29:56",
            "upload_time_iso_8601": "2024-07-28T17:29:56.990915Z",
            "url": "https://files.pythonhosted.org/packages/e1/e8/29c872ef17f557a62ff4081eeb6e1f54f994f47ca58890fb4be6409c7a2a/mlbomdoc-0.1.1-py2.py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-07-28 17:29:56",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "anthonyharrison",
    "github_project": "mlbomdoc",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "requirements": [
        {
            "name": "lib4sbom",
            "specs": [
                [
                    ">=",
                    "0.7.2"
                ]
            ]
        },
        {
            "name": "sbom2doc",
            "specs": [
                [
                    ">=",
                    "0.4.5"
                ]
            ]
        }
    ],
    "tox": true,
    "lcname": "mlbomdoc"
}
        
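The `digests` block in the release metadata above can be used to verify a downloaded wheel before installing it. A minimal sketch using only the standard library; the local filename is an assumption (adjust it to wherever you saved the wheel), while the expected hash is the `sha256` value published for `mlbomdoc-0.1.1-py2.py3-none-any.whl`:

```python
import hashlib

def sha256_matches(path: str, expected_hex: str) -> bool:
    """Hash a file in chunks and compare against a published hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in 8 KiB chunks so large files don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_hex

# sha256 value copied from the release metadata above.
EXPECTED = "51e526b458ae5140aa8cc84a7e992bf99c8b521f1e138f08bb70e07fce2b0269"

# Hypothetical local path; uncomment after downloading the wheel:
# assert sha256_matches("mlbomdoc-0.1.1-py2.py3-none-any.whl", EXPECTED)
```

`pip` performs this check automatically when installing from PyPI; an explicit comparison like this is mainly useful when mirroring wheels or vendoring them into an offline environment.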