# leap-ie

| Field | Value |
| ---------- | ------- |
| Name | leap-ie |
| Version | 0.3.0 |
| Home page | https://www.leap-labs.com/ |
| Summary | Leap Labs Interpretability Engine |
| Author | Jessica Rumbelow |
| Requires Python | >=3.8 |
| License | Proprietary |
| Uploaded | 2024-03-20 12:20:06 |
# Leap Interpretability Engine

Congratulations on being a _very_ early adopter of our interpretability engine! Not sure what's going on? Check out the [FAQ](#faq).

## Installation

Use the package manager [pip](https://pip.pypa.io/en/stable/) to install leap-ie.

```bash
pip install leap-ie
```

During installation, `leap-ie` does not modify any dependencies related to PyTorch or Tensorflow, in order to preserve your development environment. However, `leap-ie` requires that the following minimum versions are met:

**PyTorch**
| Library | Version |
| ---------- | ------- |
| torch | >=1.13.0 |
| torchvision | >=0.14.0 |
| timm | >=0.9.12 |

**Tensorflow**
| Library | Version |
| ---------- | ------- |
| tensorflow | >=2.4.0 |

If you do not have the required libraries installed, you can quickly install them by specifying them as extras:

**PyTorch**

```bash
pip install leap-ie[with-torch]
```

**Tensorflow**

```bash
pip install leap-ie[with-tensorflow]
```

### Generating an API Key

Sign in and generate your API key in the [leap app](https://app.leap-labs.com/) - you'll need this to get started.

## Get started!

```python
from leap_ie.vision import engine
from leap_ie.vision.models import get_model

preprocessing_fn, model, class_list = get_model('resnet18', source='torchvision')

config = {"leap_api_key": "YOUR_API_KEY"}

results_df, results_dict = engine.generate(
    project_name="leap!",
    model=model,
    class_list=class_list,
    config=config,
    target_classes=[1],
    preprocessing=preprocessing_fn,
)
```

We provide easy access to all [image classification torchvision models](https://pytorch.org/vision/main/models.html#classification) via `leap_ie.vision.models.get_model("model_name", source="torchvision")`. We can also automatically pull image classification models from Hugging Face - just use the model id: `get_model('nateraw/vit-age-classifier', source='huggingface')`.
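
For example, a minimal sketch (using the Hugging Face model id mentioned above; the project name is an arbitrary placeholder) of loading a model that way and running it through the engine:

```python
from leap_ie.vision import engine
from leap_ie.vision.models import get_model

# Pull an image classification model from Hugging Face by its model id.
preprocessing_fn, model, class_list = get_model(
    "nateraw/vit-age-classifier", source="huggingface"
)

df_results, dict_results = engine.generate(
    project_name="age-classifier",
    model=model,
    class_list=class_list,
    config={"leap_api_key": "YOUR_LEAP_API_KEY"},
    preprocessing=preprocessing_fn,
)
```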

## Usage

Using the interpretability engine with your own models is really easy! All you need to do is import `leap_ie` and pass your model to our generate function:

```python

from leap_ie.vision import engine

df_results, dict_results = engine.generate(
    project_name="interpretability",
    model=your_model,
    class_list=["hotdog", "not_hotdog"],
    config={"leap_api_key": "YOUR_LEAP_API_KEY"},
)
```

Currently we support image classification models only. We expect the model to take a batch of images as input, and return a batch of logits (NOT probabilities). For most models this will work out of the box, but if your model returns something else (e.g. a dictionary, or probabilities) you might have to edit it, or add a wrapper before passing it to `engine.generate()`.

```python
import torch.nn as nn

class ModelWrapper(nn.Module):
    """Unwrap a dictionary output so that only the logits tensor is returned."""

    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, x):
        x = self.model(x)
        return x["logits"]

model = ModelWrapper(your_model)
```
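
If your model returns probabilities rather than logits, a hedged sketch of a wrapper (the class name here is illustrative, not part of `leap-ie`) that recovers logit-like values:

```python
import torch
import torch.nn as nn

class ProbToLogitWrapper(nn.Module):
    """Illustrative wrapper: if the wrapped model outputs probabilities,
    taking the log yields values that behave like logits under softmax."""

    def __init__(self, model, eps=1e-12):
        super().__init__()
        self.model = model
        self.eps = eps  # avoid log(0)

    def forward(self, x):
        probs = self.model(x)
        return torch.log(probs + self.eps)

model = ProbToLogitWrapper(your_model)
```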

## Results

The generate function returns a pandas dataframe and a dictionary of numpy arrays. If you're in a jupyter notebook, you can view this dataframe inline using `engine.display_df(df_results)`, but for the best experience we recommend you head to the [leap app](https://app.leap-labs.com/), or [log directly to your weights and biases dashboard](#weights-and-biases-integration).

For more information about the data we return, see [prototypes](#what-is-a-prototype), [entanglements](#what-is-entanglement), and [feature isolations](#what-is-feature-isolation). If used with samples (see [Sample Feature Isolation](#sample-feature-isolation)), the dataframe contains feature isolations for each sample, for the target classes (if provided), or for the top 3 predicted classes.
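
As a quick sanity check (a hedged sketch; the exact column names and dictionary keys aren't documented here, so we only print what comes back):

```python
# df_results is a pandas DataFrame, dict_results a dict of numpy arrays.
print(df_results.columns)
for name, array in dict_results.items():
    print(name, array.shape)
```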

## Supported Frameworks

We support both pytorch and tensorflow! Specify your package with the `mode` parameter, using `'tf'` for tensorflow and `'pt'` for pytorch.

If using pytorch, we expect the model to take images in channels-first format, e.g. of shape `[1, channels, height, width]`. If tensorflow, channels last, e.g. `[1, height, width, channels]`.
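
For instance, dummy batches in the two layouts (a sketch, assuming 224x224 RGB inputs) would look like:

```python
import numpy as np

# PyTorch mode ('pt'): channels first - [batch, channels, height, width]
pt_batch = np.zeros((1, 3, 224, 224), dtype=np.float32)

# Tensorflow mode ('tf'): channels last - [batch, height, width, channels]
tf_batch = np.zeros((1, 224, 224, 3), dtype=np.float32)
```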

## Weights and Biases Integration

We can also log results directly to your WandB projects. To do this, set `project_name` to the name of the WandB project where you'd like the results to be logged, and add your WandB API key and entity name to the `config` dictionary:

```python
config = {
    "wandb_api_key": "YOUR_WANDB_API_KEY",
    "wandb_entity": "your_wandb_entity",
    "leap_api_key": "YOUR_LEAP_API_KEY",
}
df_results, dict_results = engine.generate(
    project_name="your_wandb_project_name",
    model=your_model,
    class_list=["hotdog", "not_hotdog"],
    config=config,
)
```

## Prototype Generation

Given your model, we generate [prototypes](#what-is-a-prototype) and [entanglements](#what-is-entanglement). We also [isolate entangled features](#what-is-feature-isolation) in your prototypes.

```python
from leap_ie.vision import engine
from leap_ie.vision.models import get_model

config = {"leap_api_key": "YOUR_LEAP_API_KEY"}

# Replace this model with your own, or explore any imagenet classifier from torchvision (https://pytorch.org/vision/stable/models.html).
preprocessing_fn, model, class_list = get_model("resnet18", source="torchvision")

# indexes of classes to generate prototypes for. In this case, ['tench', 'goldfish', 'great white shark'].
target_classes = [0, 1, 2]

# generate prototypes
df_results, dict_results = engine.generate(
    project_name="resnet18",
    model=model,
    class_list=class_list,
    config=config,
    target_classes=target_classes,
    preprocessing=preprocessing_fn,
    samples=None,
    device=None,
    mode="pt",
)

# For the best experience, head to https://app.leap-labs.com/ to explore your prototypes and feature isolations in the browser!
# Or, if you're in a jupyter notebook, you can display your results inline:
engine.display_df(df_results)
```
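
To double-check which class names those target indices refer to (assuming `class_list` is an ordinary Python list, as returned by `get_model`):

```python
# Map the target indices back to class names.
print([class_list[i] for i in target_classes])
# For an ImageNet classifier: ['tench', 'goldfish', 'great white shark']
```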

## Sample Feature Isolation

Given some input image, we can show you which features your model thinks belong to each class. If you specify target classes, we'll isolate features for those, or if not, we'll isolate features for the three highest probability classes.

```python
from torchvision import transforms
from leap_ie.vision import engine
from leap_ie.vision.models import get_model
from PIL import Image

config = {"leap_api_key": "YOUR_LEAP_API_KEY"}

# Replace this model with your own, or explore any imagenet classifier from torchvision (https://pytorch.org/vision/stable/models.html).
preprocessing_fn, model, class_list = get_model("resnet18", source="torchvision")

# load an image
image_path = "tools.jpeg"
tt = transforms.ToTensor()
image = preprocessing_fn[0](tt(Image.open(image_path)).unsqueeze(0))

# to isolate features:
df_results, dict_results = engine.generate(
    project_name="resnet18",
    model=model,
    class_list=class_list,
    config=config,
    target_classes=None,
    preprocessing=preprocessing_fn,
    samples=image,
    mode="pt",
)

# For the best experience, head to https://app.leap-labs.com/ to explore your prototypes and feature isolations in the browser!
# Or, if you're in a jupyter notebook, you can display your results inline:
engine.display_df(df_results)
```

## engine.generate()

The generate function is used for both prototype generation directly from the model, and for feature isolation on your input samples.

```python
leap_ie.vision.engine.generate(
    project_name,
    model,
    class_list,
    config,
    target_classes=None,
    preprocessing=None,
    samples=None,
    device=None,
    mode="pt",
)
```

- **project_name** (`str`): Name of your project. Used for logging.

  - _Required_: Yes
  - _Default_: None

- **model** (`object`): Model for interpretation. Currently we support image classification models only. We expect the model to take a batch of images as input, and return a batch of logits (NOT probabilities). If using pytorch, we expect the model to take images in channels-first format, e.g. of shape `[1, channels, height, width]`. If tensorflow, channels last, e.g. `[1, height, width, channels]`.

  - _Required_: Yes
  - _Default_: None

- **class_list** (`list`): List of class names corresponding to your model's output classes, e.g. ['hotdog', 'not hotdog', ...].

  - _Required_: Yes
  - _Default_: None

- **config** (`dict` or `str`): Configuration dictionary, or path to a json file containing your configuration. At minimum, this must contain `{"leap_api_key": "YOUR_LEAP_API_KEY"}`.

  - _Required_: Yes
  - _Default_: None

- **target_classes** (`list`, optional): List of target class indices to generate prototypes or isolations for, e.g. `[0,1]`. If None, prototypes will be generated for the class at output index 0 only, e.g. 'hotdog', and feature isolations will be generated for the top 3 predicted classes.

  - _Required_: No
  - _Default_: None

- **preprocessing** (`function`, optional): Preprocessing function to be used for generation. This can be None, but for best results, use the preprocessing function used on inputs for inference.

  - _Required_: No
  - _Default_: None

- **samples** (`array`, optional): None, or a batch of images to perform feature isolation on. If provided, only feature isolation is performed (not prototype generation). We expect samples to be of shape `[num_images, height, width, channels]` if using tensorflow, or `[num_images, channels, height, width]` if using pytorch.

  - _Required_: No
  - _Default_: None

- **device** (`str`, optional): Device to be used for generation. If None, we will try to find a device.

  - _Required_: No
  - _Default_: None

- **mode** (`str`, optional): Framework to use, either 'pt' for pytorch or 'tf' for tensorflow. Default is 'pt'.
  - _Required_: No
  - _Default_: `pt`

## Config

Leap provides a number of configuration options to fine-tune the interpretability engine's performance with your models. You can provide the config as a dictionary or as a path to a .json file.
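
For example, a minimal sketch of writing a config to disk and passing its path instead of a dictionary (only `leap_api_key` is required; the file name, `your_model`, and the class list are placeholders carried over from the Usage example):

```python
import json

config = {"leap_api_key": "YOUR_LEAP_API_KEY", "max_steps": 1500}
with open("leap_config.json", "w") as f:
    json.dump(config, f)

# Pass the path instead of the dictionary:
df_results, dict_results = engine.generate(
    project_name="interpretability",
    model=your_model,
    class_list=["hotdog", "not_hotdog"],
    config="leap_config.json",
)
```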

- **hf_weight** (`int`): How much to penalise high-frequency patterns in the input. If you are generating very blurry and indistinct prototypes, decrease this. If you are getting very noisy prototypes, increase it. This depends on your model architecture and is hard for us to predict, so you might want to experiment. It's a bit like focussing a microscope. Best practice is to start with zero, and gradually increase.

  - _Default_: `0`

- **input_dim** (`list`): The dimensions of the input that your model expects.

  - _Default_: `[224, 224, 3]` if mode is "tf" else `[3, 224, 224]`

- **isolation** (`bool`): Whether to isolate features for entangled classes. Set to False if you want prototypes only.

  - _Default_: `True`

- **find_lr_steps** (`int`): How many steps to tune the learning rate over at the start of the generation process. We do this automatically for you, but if you want to tune the learning rate manually, set this to zero and provide a learning rate with **lr**.

  - _Default_: `500`

- **max_steps** (`int`): How many steps to run the prototype generation/feature isolation process for. If you get indistinct prototypes or isolations, try increasing this number.
  - _Default_: `1500`

Here are all of the config options currently available:

```python
config = {
    "alpha_mask": False,          # bool
    "alpha_only": False,          # bool
    "alpha_weight": 1,            # int
    "baseline_init": 0,           # int or str
    "diversity_weight": 0,        # int
    "find_lr_steps": 500,         # int
    "hf_weight": 0,               # int
    "input_dim": [3, 224, 224],   # list ([224, 224, 3] for mode="tf")
    "isolate_classes": None,      # list
    "isolation": True,            # bool
    "isolation_hf_weight": 1,     # int
    "isolation_lr": 0.05,         # float
    "log_freq": 100,              # int
    "lr": 0.05,                   # float
    "max_isolate_classes": 3,     # int
    "max_lr": 1.0,                # float
    "max_steps": 1500,            # int
    "min_lr": 0.0001,             # float
    "mode": "pt",                 # str
    "num_lr_windows": 50,         # int
    "project_name": "your_project_name",  # str, no default
    "samples": None,              # list
    "seed": 0,                    # int
    "stop_lr_early": True,        # bool
    "transform": "xl",            # str
    "use_alpha": False,           # bool
    "use_baseline": False,        # bool
    "use_hipe": False,            # bool
}
```

- **alpha_mask** (`bool`): If True, applies a mask during prototype generation which encourages the resulting prototypes to be minimal, centered and concentrated. Experimental.
  - _Default_: `False`
- **alpha_only** (`bool`): If True, during the prototype generation process, only an alpha channel is optimised. This results in generating prototypical shapes and textures only, with no colour information.
  - _Default_: `False`
- **baseline_init** (`int` or `str`): How to initialise the input. A sensible option is the mean of your expected input data, if you know it. Use 'r' to initialise with random noise for more varied results with different random seeds.
  - _Default_: `0`
- **diversity_weight** (`int`): When generating multiple prototypes for the same class, we can apply a diversity objective to push for more varied inputs. The higher this number, the harder the optimisation process will push for different inputs. Experimental.

  - _Default_: `0`

- **find_lr_steps** (`int`): How many steps to tune the learning rate over at the start of the generation process. We do this automatically for you, but if you want to tune the learning rate manually, set this to zero and provide a learning rate with **lr**.
  - _Default_: `500`
- **hf_weight** (`int`): How much to penalise high-frequency patterns in the input. If you are generating very blurry and indistinct prototypes, decrease this. If you are getting very noisy prototypes, increase it. This depends on your model architecture and is hard for us to predict, so you might want to experiment. It's a bit like focussing binoculars. Best practice is to start with zero, and gradually increase.
  - _Default_: `0`
- **input_dim** (`list`): The dimensions of the input that your model expects.
  - _Default_: `[224, 224, 3]` if mode is "tf" else `[3, 224, 224]`
- **isolate_classes** (`list`): If you'd like to isolate features for specific classes, rather than the top _n_, specify their indices here for EACH target, e.g. [[2,7,8], [2,3]].
  - _Default_: `None`
- **isolation** (`bool`): Whether to isolate features for entangled classes. Set to False if you want prototypes only.
  - _Default_: `True`
- **isolation_hf_weight** (`int`): How much to penalise high-frequency patterns in the feature isolation mask. See hf_weight.
  - _Default_: `1`
- **isolation_lr** (`float`): How much to update the isolation mask at each step during the feature isolation process.
  - _Default_: `0.05`
- **log_freq** (`int`): Interval at which to log images.
  - _Default_: `100`
- **lr** (`float`): How much to update the prototype at each step during the prototype generation process. We find this for you automatically between **max_lr** and **min_lr**, but if you would like to tune it manually, set **find_lr_steps** to zero and provide it here (see the sketch after this list).
  - _Default_: `0.05`
- **max_isolate_classes** (`int`): How many classes to isolate features for, if isolate_classes is not provided.

  - _Default_: `min(3, len(class_list))`

- **max_lr** (`float`): Maximum learning rate for learning rate finder.
  - _Default_: `1.0`

- **max_steps** (`int`): How many steps to run the prototype generation/feature isolation process for. If you get indistinct prototypes or isolations, try increasing this number.

  - _Default_: `1500`

- **min_lr** (`float`): Minimum learning rate for learning rate finder.
  - _Default_: `0.0001`

- **seed** (`int`): Random seed for initialisation.
  - _Default_: `0`
- **transform** (`str`): Random affine transformation applied to guard against adversarial noise. You can experiment with the following options: `['s', 'm', 'l', 'xl']`. You can also set this to `None` and provide your own transformation via `engine.generate(preprocessing=your_transformation)`.
  - _Default_: `xl`
- **use_alpha** (`bool`): If True, adds an alpha channel to the prototype. This results in the prototype generation process returning semi-transparent prototypes, allowing the model to express ambivalence about pixel values that don't change its prediction.
  - _Default_: `False`
- **use_baseline** (`bool`): Whether to generate an equidistant baseline input prior to the prototype generation process. It takes a bit longer, but setting this to True will ensure that all prototypes generated for a model are not biased by input initialisation.
  - _Default_: `False`
- **wandb_api_key** (`str`): Provide your weights and biases API key here to enable logging results directly to your WandB dashboard.
  - _Default_: `None`
- **wandb_entity** (`str`): If logging to WandB, make sure to provide your WandB entity name here.
  - _Default_: `None`
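
Putting a few of these together, a hedged sketch of a config for manual learning-rate tuning, as described under **find_lr_steps** and **lr** above:

```python
config = {
    "leap_api_key": "YOUR_LEAP_API_KEY",
    "find_lr_steps": 0,  # disable the automatic learning rate finder
    "lr": 0.05,          # supply the learning rate manually instead
    "hf_weight": 0,      # start at zero; increase gradually if prototypes look noisy
}
```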

## FAQ

### What is a prototype?

Prototype generation is a global interpretability method. It provides insight into what a model has learned _without_ looking at its performance on test data, by extracting learned features directly from the model itself. This is important, because there's no guarantee that your test data covers all potential failure modes. It's another way of understanding _what_ your model has learned, and helping you to predict how it will behave in deployment, on unseen data.

So what is a prototype? For each class that your model has been trained to predict, we can generate an input that maximises the probability of that output – this is the model's _prototype_ for that class. It's a representation of what the model 'thinks' that class _is_.

For example, if you have a model trained to diagnose cancer from biopsy slides, prototype generation can show you what the model has learned to look for - what it 'thinks' malignant cells look like. This means you can check to see if it's looking for the right stuff, and ensure that it hasn't learned any spurious correlations from its training data that would cause dangerous mistakes in deployment (e.g. looking for lab markings on the slides, rather than at cell morphology).

### What is entanglement?

During the prototype generation process we extract a lot of information from the model, including which other classes _share features_ with the class prototype that we're generating. Depending on your domain, some entanglement may be expected - for example, an animal classifier is likely to have significant entanglement between 'cat' and 'dog', because those classes share (at least) the 'fur' feature. However, entanglement - especially unexpected entanglement, that doesn't make sense in your domain - can also be a very good indicator of where your model is likely to make misclassifications in deployment.

### What is feature isolation?

Feature isolation does what it says on the tin - it isolates which features in the input the model is using to make its prediction.

We can apply feature isolation in two ways:

1. On a prototype that we've generated, to isolate which features are shared between entangled classes, and so help explain how those classes are entangled; and
2. On some input data, to explain individual predictions that your model makes, by isolating the features in the input that correspond to the predicted class (similar to saliency mapping).

So, you can use it to both understand properties of your model as a whole, and to better understand the individual predictions it makes.

            

            "md5_digest": "1a0e4ff088ebc8bfd88d65c8ec87e8ca",
            "packagetype": "bdist_wheel",
            "python_version": "cp39",
            "requires_python": ">=3.8",
            "size": 1371422,
            "upload_time": "2024-03-20T12:20:49",
            "upload_time_iso_8601": "2024-03-20T12:20:49.065206Z",
            "url": "https://files.pythonhosted.org/packages/22/b3/ba930c2a12e002f2f336c3b210ce3d410dd7defeadaf8f5def287dbaa718/leap_ie-0.3.0-cp39-cp39-macosx_10_9_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "e4c0e60ca8d284f1e6856d5c89440f7e57923acedfb8498cb29d12dc4f99ead4",
                "md5": "8163a32aff63b9a562076ef5e04fc915",
                "sha256": "b9588f7badde5344cd0a51a5c94ba0c5d76ff18702dc848a897aeb33cf6c5f2f"
            },
            "downloads": -1,
            "filename": "leap_ie-0.3.0-cp39-cp39-macosx_11_0_arm64.whl",
            "has_sig": false,
            "md5_digest": "8163a32aff63b9a562076ef5e04fc915",
            "packagetype": "bdist_wheel",
            "python_version": "cp39",
            "requires_python": ">=3.8",
            "size": 1303479,
            "upload_time": "2024-03-20T12:20:51",
            "upload_time_iso_8601": "2024-03-20T12:20:51.060789Z",
            "url": "https://files.pythonhosted.org/packages/e4/c0/e60ca8d284f1e6856d5c89440f7e57923acedfb8498cb29d12dc4f99ead4/leap_ie-0.3.0-cp39-cp39-macosx_11_0_arm64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "c38b43afd0ea41fb7f1a8c7d23192876d740c640ced2f855bbb5831b57b12544",
                "md5": "57f3968af6cb2bee21c6c71f70bfa118",
                "sha256": "b6272743e9446843b79923c3c67788fcb05e6da543dfb66e51e59366d8172116"
            },
            "downloads": -1,
            "filename": "leap_ie-0.3.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl",
            "has_sig": false,
            "md5_digest": "57f3968af6cb2bee21c6c71f70bfa118",
            "packagetype": "bdist_wheel",
            "python_version": "cp39",
            "requires_python": ">=3.8",
            "size": 6865718,
            "upload_time": "2024-03-20T12:20:53",
            "upload_time_iso_8601": "2024-03-20T12:20:53.434337Z",
            "url": "https://files.pythonhosted.org/packages/c3/8b/43afd0ea41fb7f1a8c7d23192876d740c640ced2f855bbb5831b57b12544/leap_ie-0.3.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "bbf2aec3ba63da2a49d3793d02b61a60e8fbb7379d868860e1b6f7dfd6c58eaa",
                "md5": "ba510822470525eb026baba049c5f68f",
                "sha256": "191c29290ab8333ffae95de69c494063e9606168fed8b02442771559130bfedf"
            },
            "downloads": -1,
            "filename": "leap_ie-0.3.0-cp39-cp39-win_amd64.whl",
            "has_sig": false,
            "md5_digest": "ba510822470525eb026baba049c5f68f",
            "packagetype": "bdist_wheel",
            "python_version": "cp39",
            "requires_python": ">=3.8",
            "size": 1175176,
            "upload_time": "2024-03-20T12:20:56",
            "upload_time_iso_8601": "2024-03-20T12:20:56.169266Z",
            "url": "https://files.pythonhosted.org/packages/bb/f2/aec3ba63da2a49d3793d02b61a60e8fbb7379d868860e1b6f7dfd6c58eaa/leap_ie-0.3.0-cp39-cp39-win_amd64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "5cc431491fc4bee0ac7ad3c752ee845ac4cce4a3fcd866009619af3310728ee8",
                "md5": "26de6ddfc1aec009436e7e1bf174a89d",
                "sha256": "99a93086c59c489e0d19d6c557b8a6ec4cea38b19be3c36e6b4f742e8673f4c6"
            },
            "downloads": -1,
            "filename": "leap_ie-0.3.0-cp39-cp39-win_arm64.whl",
            "has_sig": false,
            "md5_digest": "26de6ddfc1aec009436e7e1bf174a89d",
            "packagetype": "bdist_wheel",
            "python_version": "cp39",
            "requires_python": ">=3.8",
            "size": 988088,
            "upload_time": "2024-03-20T12:20:57",
            "upload_time_iso_8601": "2024-03-20T12:20:57.741390Z",
            "url": "https://files.pythonhosted.org/packages/5c/c4/31491fc4bee0ac7ad3c752ee845ac4cce4a3fcd866009619af3310728ee8/leap_ie-0.3.0-cp39-cp39-win_arm64.whl",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-03-20 12:20:06",
    "github": false,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "lcname": "leap-ie"
}
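
The file list above mirrors what PyPI's public JSON API reports for this release. As a minimal sketch only (assuming network access and the standard `https://pypi.org/pypi/<project>/<version>/json` endpoint; not part of the `leap-ie` package itself), the same records can be pulled programmatically:

```python
import json
from urllib.request import urlopen

# Fetch release metadata for leap-ie 0.3.0 from PyPI's public JSON API.
with urlopen("https://pypi.org/pypi/leap-ie/0.3.0/json") as resp:
    release = json.load(resp)

# Each entry in "urls" corresponds to one distribution file listed above:
# filename, size in bytes, digests, upload time, and download URL.
for f in release["urls"]:
    print(f["filename"], f["size"], f["digests"]["sha256"][:12])
```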
        