PyTorchLayerViz

Name: PyTorchLayerViz
Version: 1.2.3
Summary: PyTorchLayerViz is a Python library that allows you to visualize the weights and feature maps of a PyTorch model.
Upload time: 2024-10-09 14:27:36
Author: Simone Panico
Keywords: python, pytorch, deep learning, model
![Version](https://img.shields.io/github/v/release/simone-panico/PyTorchLayerViz)
![License](https://img.shields.io/github/license/simone-panico/PyTorchLayerViz)
![Commit Activity](https://img.shields.io/github/commit-activity/m/simone-panico/PyTorchLayerViz)
![Last Commit](https://img.shields.io/github/last-commit/simone-panico/PyTorchLayerViz)
![Issues](https://img.shields.io/github/issues/simone-panico/PyTorchLayerViz)
![Platform](https://img.shields.io/badge/platform-PyTorch-blue)


# PyTorchLayerViz

**PyTorchLayerViz** is a Python library designed to assist developers and researchers in visualizing the weights and feature maps of PyTorch models. This tool provides easy-to-use functions to help understand and interpret deep learning models, making it an essential utility for anyone working with PyTorch.

## Table of Contents

- [PyTorchLayerViz](#pytorchlayerviz)
  - [Table of Contents](#table-of-contents)
  - [Installation](#installation)
  - [Usage](#usage)
    - [Parameters](#parameters)
  - [Features](#features)
  - [Examples](#examples)
    - [Example Picture](#example-picture)
    - [Code](#code)
    - [Output](#output)
  - [Contributing](#contributing)
  - [License](#license)
  - [Contact](#contact)

## Installation

To install PyTorchLayerViz, you can use pip:

```bash
pip install pytorchlayerviz
```

## Usage

Here is a basic example of how to use PyTorchLayerViz:

```python
from PyTorchLayerViz import get_feature_maps
from torch import nn
from torchvision import transforms, models  # also used in the examples below

# Define your model
model = nn.Sequential(
    nn.Conv2d(3, 20, 5),
    nn.ReLU(),
    nn.Conv2d(20, 64, 5),
    nn.ReLU(),
)

layers_to_check = [nn.Conv2d]  # Layer types whose feature maps you want to capture

input_image_path = 'pictures/hamburger.jpg'  # Path to your example image

# Capture (and optionally display) the feature maps of every matching layer
feature_maps = get_feature_maps(
    model=model,
    layers_to_check=layers_to_check,
    input_image_path=input_image_path,
    print_image=True,
)
```

### Parameters

- **model (nn.Module)** – The PyTorch model whose layers' feature maps you want to visualize. *Required*.
- **layers_to_check (list of nn.Module types)** – List of layer types (e.g., `nn.Conv2d`) whose feature maps should be captured. *Required*.
- **input_image_path (str)** – Path to the input image file. *Required*.
- **transform (transforms.Compose, optional)** – A function/transform that takes in an image and returns a transformed version. Default is None. *Optional*.
- **sequential_order (bool, optional)** – If True, the layers are visualized in the order they appear in the model. If False, the feature maps are grouped by layer type, following the order given in `layers_to_check`. Default is True. *Optional*.
- **print_image (bool, optional)** – If True, the images are displayed with matplotlib. Default is False. *Optional*.

**Return** – `get_feature_maps()` returns the captured feature maps as NumPy arrays.
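
Assuming the return value is a sequence of NumPy arrays (one per captured layer) and reusing the `feature_maps` variable from the usage example above, you can inspect or reuse the maps yourself, for example:

```python
import matplotlib.pyplot as plt

# Hedged sketch: assumes `feature_maps` (from the usage example above) is a
# sequence of NumPy arrays, one per captured layer.
for i, fmap in enumerate(feature_maps):
    print(f"Layer {i}: feature map shape {fmap.shape}")

# Display the first feature map as a grayscale image (assumes it is 2D).
plt.imshow(feature_maps[0], cmap="gray")
plt.axis("off")
plt.show()
```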

If `transform` is None, the following default is used:

```python
transform = transforms.Compose([
    transforms.Resize((224, 224)),  # Resize the image to 224x224 pixels
    transforms.ToTensor(),          # Convert the image to a PyTorch tensor
])
```

If you pass your own transform, make sure it resizes the image and converts it to a tensor with `transforms.ToTensor()`.
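
For example, reusing the model from the usage example above, a custom transform that also normalizes with ImageNet statistics (the normalization step is just an illustration, not something the library requires) could be passed like this:

```python
from torchvision import transforms

custom_transform = transforms.Compose([
    transforms.Resize((224, 224)),                    # required: resize the image
    transforms.ToTensor(),                            # required: convert to a tensor
    transforms.Normalize(mean=[0.485, 0.456, 0.406],  # optional: ImageNet normalization
                         std=[0.229, 0.224, 0.225]),
])

feature_maps = get_feature_maps(
    model=model,
    layers_to_check=[nn.Conv2d],
    input_image_path='pictures/hamburger.jpg',
    transform=custom_transform,
    print_image=True,
)
```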

## Features

* Visualize Weights: Easily visualize the weights of each layer in your PyTorch model.
* Visualize Feature Maps: Generate and visualize feature maps for given inputs.
* Customizable: Flexible options for customizing visualizations.

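For the "Visualize Weights" feature, the library's own helper is not shown in this README; as a point of comparison, a minimal sketch with plain PyTorch and matplotlib (independent of PyTorchLayerViz) could look like this:

```python
import matplotlib.pyplot as plt
from torch import nn

# Minimal sketch (plain PyTorch, not a PyTorchLayerViz API): plot the first
# 8 filters of the first Conv2d layer as grayscale images.
model = nn.Sequential(nn.Conv2d(3, 20, 5), nn.ReLU(), nn.Conv2d(20, 64, 5), nn.ReLU())
conv = next(m for m in model.modules() if isinstance(m, nn.Conv2d))

weights = conv.weight.detach()  # shape: (out_channels, in_channels, kH, kW)
fig, axes = plt.subplots(1, 8, figsize=(16, 2))
for i, ax in enumerate(axes):
    ax.imshow(weights[i, 0].numpy(), cmap="gray")  # first input channel of filter i
    ax.axis("off")
plt.show()
```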

## Examples

### Example Picture

![Example Picture](pictures/hamburger.jpg)

### Code

```python
pretrained_model = models.vgg16(pretrained=True)
input_image_path = 'hamburger.jpg'
layers_to_check = [nn.MaxPool2d]

feature_maps = get_feature_maps(model=pretrained_model, layers_to_check=layers_to_check, input_image_path=input_image_path, sequential_order=False, print_image=True)
```
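
Note: on recent torchvision releases (0.13 and later), `pretrained=True` is deprecated in favor of the `weights` argument, so the pretrained VGG16 can also be created like this:

```python
from torchvision import models

# Equivalent on torchvision >= 0.13, where weights enums replace pretrained=True.
pretrained_model = models.vgg16(weights=models.VGG16_Weights.DEFAULT)
```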

### Output

![Hamburger result Picture](pictures/hamburger_results.png)


## Contributing

I welcome contributions to PyTorchLayerViz! If you'd like to contribute, please follow these steps:

1. Fork the repository.
2. Create a new branch (*git checkout -b feature-branch*).
3. Make your changes.
4. Commit your changes (*git commit -m 'Add new feature'*).
5. Push to the branch (*git push origin feature-branch*).
6. Open a pull request.

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE.md) file for details.

## Contact

For any questions, suggestions, or issues, please open an issue on GitHub or contact me.

* Simone Panico: simone.panico@icloud.com
* GitHub Issues: https://github.com/simone-panico/PyTorchLayerViz/issues


            
