| Name | Version | Summary | Date |
| --- | --- | --- | --- |
| optimum-intel | 1.21.0 | Optimum Library is an extension of the Hugging Face Transformers library, providing a framework to integrate third-party libraries from Hardware Partners and interface with their specific functionality. | 2024-12-06 12:25:14 |
| minicons | 0.3.8 | A package of useful functions to analyze transformer-based language models. | 2024-11-30 22:04:08 |
| optimum-tpu | 0.2.0 | Optimum TPU is the interface between the Hugging Face Transformers library and Google Cloud TPU devices. | 2024-11-20 13:07:21 |
| optimum-neuron | 0.0.26 | Optimum Neuron is the interface between the Hugging Face Transformers and Diffusers libraries and AWS Trainium and Inferentia accelerators. It provides a set of tools enabling easy model loading, training, and inference on single- and multiple-Neuron-core settings for different downstream tasks. | 2024-11-15 15:54:07 |
| optimum | 1.23.3 | Optimum Library is an extension of the Hugging Face Transformers library, providing a framework to integrate third-party libraries from Hardware Partners and interface with their specific functionality. | 2024-10-29 17:43:32 |
| optimum-habana | 1.14.1 | Optimum Habana is the interface between the Hugging Face Transformers and Diffusers libraries and Habana's Gaudi processor (HPU). It provides a set of tools enabling easy model loading, training, and inference on single- and multi-HPU settings for different downstream tasks. | 2024-10-29 17:09:40 |
| the-fairly-project | 0.4 | Bias analysis toolkit for NLP and multimodal models. | 2024-10-29 00:43:16 |
| positional-encodings | 6.0.4 | 1D, 2D, and 3D sinusoidal positional encodings in PyTorch. | 2024-10-23 07:33:48 |
| lumenspark | 0.1.5 | Lumenspark: a transformer model optimized for text generation and classification with low compute and memory requirements. | 2024-10-15 22:53:11 |
| torchscale-gml | 0.2.3 | Transformers at any scale. | 2024-10-14 17:30:43 |
| os-llm-agents | 0.0.1 | A Python package for building LLM agents based on open-source models from the Hugging Face Hub. | 2024-10-07 14:53:55 |
| local-attention | 1.9.15 | Local attention (windowed, with lookback) for language modeling. | 2024-09-06 14:56:56 |
| text-embedder | 0.1.2 | A unified inference library for transformer-based pre-trained multilingual embedding models. | 2024-08-22 19:53:29 |
| kobert-transformers | 0.6.0 | Transformers library for KoBERT and DistilKoBERT. | 2024-08-20 11:15:56 |
| product-key-memory | 0.2.11 | Product Key Memory. | 2024-07-30 14:37:14 |
| sage-spelling | 1.1.0 | SAGE: Spell checking via Augmentation and Generative distribution Emulation. | 2024-07-21 11:44:46 |
| CandyLLM | 0.0.6 | CandyLLM: a unified framework for Hugging Face and OpenAI text-generation models. | 2024-07-17 18:59:30 |
| kimchima | 0.5.4 | A collection of tools for ML model development. | 2024-07-01 05:36:40 |
| nanodl | 1.2.5.dev1 | A JAX-based library for designing and training transformer models from scratch. | 2024-05-26 21:28:50 |
| tsdae | 1.1.0 | Transformer-based Sequential Denoising AutoEncoder (TSDAE) for unsupervised pre-training of Sentence Transformers. | 2024-05-26 11:48:12 |