Name | frechet-coefficient |
Version | 2.0.7 |
home_page | None |
Summary | Frechet Coefficient |
upload_time | 2024-11-27 22:11:53 |
maintainer | Adrian Kucharski |
docs_url | None |
author | Adrian Kucharski |
requires_python | <3.13,>=3.9 |
license | MIT License Copyright (c) 2024 Adrian Kucharski Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. |
keywords | fid, frechet, frechet-coefficient |
VCS | |
bugtrack_url | |
requirements | No requirements were recorded. |
Travis-CI | No Travis. |
coveralls test coverage | No coveralls. |
# Frechet Coefficient
Frechet Coefficient is a Python package for calculating similarity metrics between two sets of images, including Frechet Distance, Frechet Coefficient, and Hellinger Distance. It uses pre-trained models from TensorFlow's Keras Applications or from Torchvision to extract features from images.
## Table of Contents
- [Installation](#installation)
- [Usage](#usage)
- [Features](#features)
- [Citation](#citation)
- [License](#license)
## Installation
To install the package, use one of the following commands:
```sh
pip install frechet-coefficient  # if you already have TensorFlow or PyTorch installed
```
```sh
pip install "frechet-coefficient[tensorflow]"  # with TensorFlow support
```
```sh
pip install "frechet-coefficient[torch]"  # with PyTorch support
```
Quoting the extras prevents shells such as zsh from interpreting the square brackets.
Requirements:
- Python 3.9-3.12
- TensorFlow >= 2.16.0 or PyTorch >= 2.0.0 with Torchvision >= 0.15.0 (older versions may also work)
- imageio >= 2.29.0
- numpy >= 1.21.0
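Since the package needs one of the two backends at runtime, a quick way to see which one your environment provides is a small probe. This is an illustrative sketch only; the package's own backend selection may work differently:

```python
import importlib.util

def available_backend() -> str:
    """Report which supported deep-learning backend is importable.

    Prefers TensorFlow when both are installed; returns "none" if neither is.
    """
    if importlib.util.find_spec("tensorflow") is not None:
        return "tensorflow"
    if importlib.util.find_spec("torch") is not None:
        return "torch"
    return "none"
```

Using `importlib.util.find_spec` avoids actually importing the (heavy) backend just to check for its presence.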
## Usage
You can use the command-line interface (CLI) to calculate similarity metrics between two directories of images.
```sh
frechet-coefficient <path_to_directory1> <path_to_directory2> --metric <metric> [options]
```
Remember to use enough images to get a meaningful result. If your datasets are small, consider using the `--random_patches` argument to calculate metrics on random patches of the images.
### Positional Arguments
- `dir1`: Path to the first directory of images.
- `dir2`: Path to the second directory of images.
### Options
- `--metric`: Metric to calculate (fd, fc, hd).
- `--batch_size`: Batch size for processing images.
- `--num_of_images`: Number of images to load from each directory.
- `--as_gray`: Load images as grayscale.
- `--random_patches`: Calculate metrics on random patches of images.
- `--patch_size`: Size of the random patches.
- `--num_of_patch`: Number of random patches to extract.
- `--model`: Pre-trained model to use as feature extractor (inceptionv3, resnet50v2, xception, densenet201, convnexttiny, efficientnetv2s).
- `--verbose`: Enable verbose output.
### Example CLI Commands
To calculate the Frechet Distance between two sets of images, use the following command:
```sh
frechet-coefficient images/set1 images/set2 --metric fd
```
To calculate the Frechet Coefficient between two sets of images using the InceptionV3 model, use the following command:
```sh
frechet-coefficient images/set1 images/set2 --metric fc --model inceptionv3
```
To calculate the Hellinger Distance between two sets of images using random patches, use the following command:
```sh
frechet-coefficient images/set1 images/set2 --metric hd --random_patches --patch_size 128 --num_of_patch 10000
```
### Python Code
You can also use Python code to calculate similarity metrics between two sets of images.
```python
import numpy as np
from typing import List
from frechet_coefficient import ImageSimilarityMetrics, load_images
# Initialize the ImageSimilarityMetrics class
ism = ImageSimilarityMetrics(model='inceptionv3', verbose=0)
images_1: List[np.ndarray] = load_images(path=...) # shape: (num_of_images, height, width, channels)
images_2: List[np.ndarray] = load_images(path=...) # shape: (num_of_images, height, width, channels)
# Calculate Frechet Distance
fd = ism.calculate_frechet_distance(images_1, images_2, batch_size=4)
# Calculate Frechet Coefficient
fc = ism.calculate_frechet_coefficient(images_1, images_2, batch_size=4)
# Calculate Hellinger Distance
hd = ism.calculate_hellinger_distance(images_1, images_2, batch_size=4)
# Calculate mean vectors and covariance matrices
mean_1, cov_1 = ism.derive_mean_cov(images_1, batch_size=4)
mean_2, cov_2 = ism.derive_mean_cov(images_2, batch_size=4)
# Calculate metrics using mean vectors and covariance matrices
fd = ism.calculate_fd_with_mean_cov(mean_1, cov_1, mean_2, cov_2)
fc = ism.calculate_fc_with_mean_cov(mean_1, cov_1, mean_2, cov_2)
hd = ism.calculate_hd_with_mean_cov(mean_1, cov_1, mean_2, cov_2)
```
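For two multivariate Gaussians, the Frechet distance has the closed form d² = ||μ₁ − μ₂||² + Tr(Σ₁ + Σ₂ − 2(Σ₁Σ₂)^½), which is what methods like `calculate_fd_with_mean_cov` evaluate. A numpy-only sketch of that formula (illustrative only, not the package's implementation):

```python
import numpy as np

def _sqrtm_psd(mat: np.ndarray) -> np.ndarray:
    """Matrix square root of a symmetric positive semi-definite matrix."""
    w, v = np.linalg.eigh(mat)
    w = np.clip(w, 0.0, None)  # guard against tiny negative eigenvalues
    return (v * np.sqrt(w)) @ v.T

def frechet_distance(mu1, cov1, mu2, cov2) -> float:
    """d^2 = ||mu1 - mu2||^2 + Tr(cov1 + cov2 - 2 * sqrtm(cov1 @ cov2))."""
    diff = mu1 - mu2
    s1 = _sqrtm_psd(cov1)
    # Tr(sqrtm(cov1 @ cov2)) equals Tr(sqrtm(s1 @ cov2 @ s1)),
    # and the latter matrix is symmetric PSD, so eigh applies.
    cross = np.trace(_sqrtm_psd(s1 @ cov2 @ s1))
    return float(diff @ diff + np.trace(cov1) + np.trace(cov2) - 2.0 * cross)
```

The symmetrized product trick keeps everything in real symmetric matrices, avoiding the complex round-off a general matrix square root can introduce.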
You can also calculate similarity metrics between two sets of images using random patches.
```python
import numpy as np
from typing import List
from frechet_coefficient import ImageSimilarityMetrics, crop_random_patches, load_images
# Initialize the ImageSimilarityMetrics class
ism = ImageSimilarityMetrics(model='inceptionv3', verbose=0)
images_1: List[np.ndarray] = load_images(path=...) # shape: (num_of_images, height, width, channels)
images_2: List[np.ndarray] = load_images(path=...) # shape: (num_of_images, height, width, channels)
# Crop random patches from images
images_1_patches = crop_random_patches(images_1, size=(128, 128), num_of_patch=10000)
images_2_patches = crop_random_patches(images_2, size=(128, 128), num_of_patch=10000)
# Calculate Frechet Distance
fd = ism.calculate_frechet_distance(images_1_patches, images_2_patches, batch_size=4)
# Calculate Frechet Coefficient
fc = ism.calculate_frechet_coefficient(images_1_patches, images_2_patches, batch_size=4)
# Calculate Hellinger Distance
hd = ism.calculate_hellinger_distance(images_1_patches, images_2_patches, batch_size=4)
```
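The idea behind `crop_random_patches` can be sketched in a few lines of numpy: sample an image uniformly, then sample a top-left corner so the patch fits. This is an illustrative stand-in, not the package's actual function:

```python
import numpy as np

def random_patches(images, size, num_of_patch, seed=None):
    """Sample `num_of_patch` random crops of shape `size` from a list of images."""
    rng = np.random.default_rng(seed)
    ph, pw = size
    patches = []
    for _ in range(num_of_patch):
        img = images[rng.integers(len(images))]  # pick a source image at random
        h, w = img.shape[:2]
        y = rng.integers(h - ph + 1)             # valid top-left row
        x = rng.integers(w - pw + 1)             # valid top-left column
        patches.append(img[y:y + ph, x:x + pw])
    return np.stack(patches)
```

Because every patch has the same shape, the result stacks into a single `(num_of_patch, ph, pw, channels)` array ready for batched feature extraction.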
### Metrics
- `fd`: Frechet Distance (with the InceptionV3 model this is equivalent to FID)
- `fc`: Frechet Coefficient
- `hd`: Hellinger Distance
The Hellinger Distance is numerically unstable for small datasets: the covariance matrices are poorly estimated, and computing the determinant of a large matrix (e.g., 768x768) can itself be numerically unstable.
To mitigate this, use the `--random_patches` argument with a very high number of patches (e.g., 50000).
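To illustrate why determinants of large covariance matrices are fragile, here is a numpy-only sketch of the closed-form Hellinger distance between two Gaussians using `np.linalg.slogdet`, which works in log-space and avoids the overflow/underflow that `np.linalg.det` can hit at this size. The package's internal implementation may differ:

```python
import numpy as np

def hellinger_distance(mu1, cov1, mu2, cov2) -> float:
    """Hellinger distance between N(mu1, cov1) and N(mu2, cov2)."""
    c = (cov1 + cov2) / 2.0
    # slogdet returns (sign, log|det|); staying in log-space keeps
    # products of hundreds of eigenvalues from over/underflowing.
    _, ld1 = np.linalg.slogdet(cov1)
    _, ld2 = np.linalg.slogdet(cov2)
    _, ldc = np.linalg.slogdet(c)
    diff = mu1 - mu2
    # log of the Bhattacharyya coefficient
    log_bc = 0.25 * (ld1 + ld2) - 0.5 * ldc - 0.125 * diff @ np.linalg.solve(c, diff)
    return float(np.sqrt(max(0.0, 1.0 - np.exp(log_bc))))
```

The `max(0.0, ...)` clamp guards against a slightly negative argument from round-off when the two distributions are nearly identical.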
### Models
| Model | Input size | Output size | Parameters | Keras Applications | Torchvision |
|:---------------:|:----------:|:-----------:|:----------:|:---------------------------------------------------------------:|:----------------------------------------------------------------------------:|
| InceptionV3 | 299x299 | 2048 | 23.9M | [inceptionv3](https://keras.io/api/applications/inceptionv3/) | [inception](https://pytorch.org/vision/0.20/models/inception.html) |
| ResNet50v2 | 224x224 | 2048 | 25.6M | [resnet](https://keras.io/api/applications/resnet/) | not available in PyTorch |
| Xception | 224x224 | 2048 | 22.9M | [xception](https://keras.io/api/applications/xception/) | not available in PyTorch |
| DenseNet201 | 224x224 | 1920 | 20.2M | [densenet](https://keras.io/api/applications/densenet/) | [densenet](https://pytorch.org/vision/0.20/models/densenet.html) |
| ConvNeXtTiny | 224x224 | 768 | 28.6M | [convnext](https://keras.io/api/applications/convnext/) | [convnext](https://pytorch.org/vision/0.20/models/convnext.html) |
| EfficientNetV2S | 384x384 | 1280 | 21.6M | [efficientnet](https://keras.io/api/applications/efficientnet/) | [efficientnetv2](https://pytorch.org/vision/0.20/models/efficientnetv2.html) |
### PyTorch
To select the device PyTorch uses, set the following environment variable before importing the package:
```python
import os
os.environ["FRECHET_COEFFICIENT_DEVICE_TORCH"] = "cuda" # or "cpu"
# import the package after setting the device
```
## Features
- Calculate Frechet Distance, Frechet Coefficient, and Hellinger Distance between two sets of images.
- Support for multiple pre-trained models.
- Option to calculate metrics on random patches of images.
## Citation
If you use this package in your research, please consider citing the following paper:
- not available yet
## License
This project is licensed under the MIT License. See the `LICENSE` file for details.
Raw data
{
"_id": null,
"home_page": null,
"name": "frechet-coefficient",
"maintainer": "Adrian Kucharski",
"docs_url": null,
"requires_python": "<3.13,>=3.9",
"maintainer_email": null,
"keywords": "fid, frechet, frechet-coefficient",
"author": "Adrian Kucharski",
"author_email": null,
"download_url": "https://files.pythonhosted.org/packages/eb/39/302843a39e0c1f0d943bac3924294b4c1ae23e8ce0b4a53f01b990d30e03/frechet_coefficient-2.0.7.tar.gz",
"platform": null,
"bugtrack_url": null,
"license": "MIT License Copyright (c) 2024 Adrian Kucharski Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the \"Software\"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.",
"summary": "Frechet Coefficient",
"version": "2.0.7",
"project_urls": {
"Documentation": "https://github.com/adriankucharski/frechet-coefficient",
"Homepage": "https://github.com/adriankucharski/frechet-coefficient",
"Repository": "https://github.com/adriankucharski/frechet-coefficient"
},
"split_keywords": [
"fid",
" frechet",
" frechet-coefficient"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "1d8eb0793dcf8fcc2cb7c1ee8837883bbe4cc4243e2d8bad95d155e198af0647",
"md5": "a303492cab0a0dc66b652ea11c80c839",
"sha256": "5a11776fe6bc2902356e837bc9e5a946e0c6297b621545ecd7199ee28feb1420"
},
"downloads": -1,
"filename": "frechet_coefficient-2.0.7-py3-none-any.whl",
"has_sig": false,
"md5_digest": "a303492cab0a0dc66b652ea11c80c839",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": "<3.13,>=3.9",
"size": 14532,
"upload_time": "2024-11-27T22:11:52",
"upload_time_iso_8601": "2024-11-27T22:11:52.137824Z",
"url": "https://files.pythonhosted.org/packages/1d/8e/b0793dcf8fcc2cb7c1ee8837883bbe4cc4243e2d8bad95d155e198af0647/frechet_coefficient-2.0.7-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "eb39302843a39e0c1f0d943bac3924294b4c1ae23e8ce0b4a53f01b990d30e03",
"md5": "85495ba09090f11b481bf51904d29146",
"sha256": "9676bee50f82ee7348b6ec300ff335477f9caf49e7386e3edddcf307577c7b92"
},
"downloads": -1,
"filename": "frechet_coefficient-2.0.7.tar.gz",
"has_sig": false,
"md5_digest": "85495ba09090f11b481bf51904d29146",
"packagetype": "sdist",
"python_version": "source",
"requires_python": "<3.13,>=3.9",
"size": 12380,
"upload_time": "2024-11-27T22:11:53",
"upload_time_iso_8601": "2024-11-27T22:11:53.354196Z",
"url": "https://files.pythonhosted.org/packages/eb/39/302843a39e0c1f0d943bac3924294b4c1ae23e8ce0b4a53f01b990d30e03/frechet_coefficient-2.0.7.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-11-27 22:11:53",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "adriankucharski",
"github_project": "frechet-coefficient",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "frechet-coefficient"
}