| Name | Version | Summary | Date |
|------|---------|---------|------|
| slimformers | 1.4.6 | Lightweight optimization and model adaptation | 2025-08-01 15:36:17 |
| optimum | 1.27.0 | Optimum Library is an extension of the Hugging Face Transformers library, providing a framework to integrate third-party libraries from Hardware Partners and interface with their specific functionality. | 2025-07-30 16:40:44 |
| trl | 0.20.0 | Train transformer language models with reinforcement learning. | 2025-07-29 04:10:06 |
| optimum-habana | 1.18.1 | Optimum Habana is the interface between the Hugging Face Transformers and Diffusers libraries and Habana's Gaudi processor (HPU). It provides a set of tools enabling easy model loading, training, and inference in single- and multi-HPU settings for different downstream tasks. | 2025-07-24 16:52:56 |
| jllama-py | 1.1.13 | A simple AI tool set application | 2025-07-19 05:47:27 |
| local-attention | 1.11.2 | Local attention, window with lookback, for language modeling | 2025-07-16 14:30:37 |
| minicons | 0.3.14 | A package of useful functions to analyze transformer-based language models. | 2025-02-20 20:00:59 |
| optimum-neuron | 0.0.28 | Optimum Neuron is the interface between the Hugging Face Transformers and Diffusers libraries and AWS Trainium and Inferentia accelerators. It provides a set of tools enabling easy model loading, training, and inference on single- and multi-Neuron-core settings for different downstream tasks. | 2025-02-07 10:24:33 |
| optimum-intel | 1.21.0 | Optimum Library is an extension of the Hugging Face Transformers library, providing a framework to integrate third-party libraries from Hardware Partners and interface with their specific functionality. | 2024-12-06 12:25:14 |
| optimum-tpu | 0.2.0 | Optimum TPU is the interface between the Hugging Face Transformers library and Google Cloud TPU devices. | 2024-11-20 13:07:21 |
| the-fairly-project | 0.4 | Bias analysis toolkit for NLP and multimodal models | 2024-10-29 00:43:16 |
| positional-encodings | 6.0.4 | 1D, 2D, and 3D sinusoidal positional encodings in PyTorch | 2024-10-23 07:33:48 |
| lumenspark | 0.1.5 | Lumenspark: a transformer model optimized for text generation and classification with low compute and memory requirements. | 2024-10-15 22:53:11 |
| torchscale-gml | 0.2.3 | Transformers at any scale | 2024-10-14 17:30:43 |
| os-llm-agents | 0.0.1 | A Python package for building LLM agents based on open-source models from the Hugging Face Hub. | 2024-10-07 14:53:55 |
| text-embedder | 0.1.2 | A unified inference library for transformer-based pre-trained multilingual embedding models | 2024-08-22 19:53:29 |
| kobert-transformers | 0.6.0 | Transformers library for KoBERT and DistilKoBERT | 2024-08-20 11:15:56 |
| product-key-memory | 0.2.11 | Product Key Memory | 2024-07-30 14:37:14 |
| sage-spelling | 1.1.0 | SAGE: spell checking via Augmentation and Generative distribution Emulation | 2024-07-21 11:44:46 |
| CandyLLM | 0.0.6 | CandyLLM: unified framework for Hugging Face and OpenAI text-generation models | 2024-07-17 18:59:30 |