| Name | Version | Summary | Date |
|------|---------|---------|------|
| rrllm | 0.1.5 | Rubin's rules to account for performance instability of LLMs | 2023-03-20 19:29:16 |
| APAC-SCALE | 0.1.2 | Transformers at any scale | 2023-03-14 20:54:12 |
| trapper | 0.0.13 | State-of-the-art NLP through transformer models in a modular design and consistent APIs. | 2023-02-03 11:07:20 |
| transformers-collection | 0.2.0 | A collection of transformer models built using huggingface for various tasks. | 2023-01-26 15:03:12 |
| hugsvision | 0.75.5 | An easy-to-use huggingface wrapper for computer vision. | 2023-01-22 01:21:16 |
| nuwa-pytorch | 0.7.8 | NÜWA - Pytorch | 2023-01-17 17:57:29 |
| decoder-ring | 0.1.2 | Type-hinted interface to use several decoders on text-generation models | 2022-12-31 00:50:08 |
| transformers-visualizer | 0.2.2 | Explain your 🤗 transformers without effort! Display the internal behavior of your model. | 2022-12-29 16:12:43 |
| adjacent-attention-pytorch | 0.0.12 | Adjacent Attention Network - Pytorch | 2022-12-24 16:52:18 |
| transfusion | 0.1.0.dev0 | Transformers 🤝 diffusion | 2022-12-22 15:29:18 |
| Attention-and-Transformers | 0.0.15 | Building attention mechanisms and Transformer models from scratch. Alias ATF. | 2022-12-17 19:13:12 |
| flamingo-pytorch | 0.1.2 | Flamingo - Pytorch | 2022-10-18 21:46:03 |