PyDigger - unearthing stuff about Python


| Name | Version | Summary | Date |
|------|---------|---------|------|
| compressed-tensors-nightly | 0.8.0.20241118 | Library for utilization of compressed safetensors of neural network models | 2024-11-18 01:07:43 |
| llmcompressor | 0.3.0 | A library for compressing large language models utilizing the latest techniques and research in the field for both training aware and post training techniques. The library is designed to be flexible and easy to use on top of PyTorch and HuggingFace Transformers, allowing for quick experimentation. | 2024-11-13 05:17:48 |
| llmcompressor-nightly | 0.3.0.20241112 | A library for compressing large language models utilizing the latest techniques and research in the field for both training aware and post training techniques. The library is designed to be flexible and easy to use on top of PyTorch and HuggingFace Transformers, allowing for quick experimentation. | 2024-11-12 16:42:51 |
| compressed-tensors | 0.7.1 | Library for utilization of compressed safetensors of neural network models | 2024-10-17 18:12:47 |
| deepsparse-ent | 1.8.0 | An inference runtime offering GPU-class performance on CPUs and APIs to integrate ML into your application | 2024-07-19 16:32:32 |
| deepsparse | 1.8.0 | An inference runtime offering GPU-class performance on CPUs and APIs to integrate ML into your application | 2024-07-19 16:29:01 |
| sparsezoo | 1.8.1 | Neural network model repository for highly sparse and sparse-quantized models with matching sparsification recipes | 2024-07-19 16:25:22 |
| sparseml-nightly | 1.8.0.20240630 | Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models | 2024-06-30 20:38:38 |
| sparseml | 1.8.0 | Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models | 2024-06-21 15:53:30 |
| deepsparse-nightly | 1.8.0.20240502 | An inference runtime offering GPU-class performance on CPUs and APIs to integrate ML into your application | 2024-05-06 19:42:19 |
| sparsezoo-nightly | 1.8.0.20240506 | Neural network model repository for highly sparse and sparse-quantized models with matching sparsification recipes | 2024-05-06 19:37:51 |
| sparsify-nightly | 1.7.0.20240304 | Easy-to-use UI for automatically sparsifying neural networks and creating sparsification recipes for better inference performance and a smaller footprint | 2024-03-05 13:38:57 |
| sparsify | 1.6.1 | Easy-to-use UI for automatically sparsifying neural networks and creating sparsification recipes for better inference performance and a smaller footprint | 2023-12-20 14:28:37 |
| optimum-deepsparse | 0.1.0.dev1 | Optimum DeepSparse is an extension of the Hugging Face Transformers library that integrates the DeepSparse inference runtime. DeepSparse offers GPU-class performance on CPUs, making it possible to run Transformers and other deep learning models on commodity hardware with sparsity. Optimum DeepSparse provides a framework for developers to easily integrate DeepSparse into their applications, regardless of the hardware platform. | 2023-10-26 02:02:45 |
Neuralmagic, Inc.