| Name | Version | Summary | Date |
|---|---|---|---|
| fasterai | 0.2.5 | A library to make neural networks lighter and faster with fastai | 2024-12-10 08:44:48 |
| sparsify-nightly | 1.7.0.20240304 | Easy-to-use UI for automatically sparsifying neural networks and creating sparsification recipes for better inference performance and a smaller footprint | 2024-03-05 13:38:57 |
| clika-inference | 0.0.2 | A fake package to warn the user they are not installing the correct package. | 2024-01-31 06:43:35 |
| clika-compression | 0.0.2 | A fake package to warn the user they are not installing the correct package. | 2024-01-31 06:43:33 |
| clika-client | 0.0.2 | A fake package to warn the user they are not installing the correct package. | 2024-01-31 06:43:31 |
| clika-ace | 0.0.2 | A fake package to warn the user they are not installing the correct package. | 2024-01-31 06:43:30 |
| pruningdistribution | 0.1.0 | Pruning of CNNs with distributions | 2024-01-19 02:49:02 |
| sparsify | 1.6.1 | Easy-to-use UI for automatically sparsifying neural networks and creating sparsification recipes for better inference performance and a smaller footprint | 2023-12-20 14:28:37 |
| optimum-deepsparse | 0.1.0.dev1 | Optimum DeepSparse is an extension of the Hugging Face Transformers library that integrates the DeepSparse inference runtime. DeepSparse offers GPU-class performance on CPUs, making it possible to run Transformers and other deep learning models on commodity hardware with sparsity. Optimum DeepSparse provides a framework for developers to easily integrate DeepSparse into their applications, regardless of the hardware platform. | 2023-10-26 02:02:45 |
| optimum-graphcore | 0.7.1 | Optimum Library is an extension of the Hugging Face Transformers library, providing a framework to integrate third-party libraries from Hardware Partners and interface with their specific functionality. | 2023-07-31 09:34:12 |
| torch-sconce | 0.0.5 | torch_sconce: torch helper | 2023-07-19 05:32:31 |
| apple-upscale | 0.1.1 | Export utility for unconstrained channel-pruned models | 2023-07-14 21:59:14 |
| mmenot | 0.9.5 | ENOT Framework integration package for the OpenMMLab codebase | 2023-07-03 07:12:59 |
| structured-pruning-adapters | 0.7.1 | Structured Pruning Adapters for PyTorch | 2023-06-29 09:34:38 |
| delve | 0.1.50 | Delve lets you monitor PyTorch model layer saturation during training | 2023-03-31 21:51:17 |