| Name | Version | Summary | Date |
|---|---|---|---|
| strassen-attention | 0.1.5 | Strassen Attention | 2025-07-08 20:21:34 |
| alphagenome-pytorch | 0.0.40 | AlphaGenome | 2025-07-08 19:32:08 |
| simplicial-attention | 0.0.10 | Simplicial Attention | 2025-07-08 19:19:34 |
| flwr-nightly | 1.19.0.dev20250524 | Flower: A Friendly Federated AI Framework | 2025-05-24 23:06:21 |
| metnet | 4.1.18 | PyTorch MetNet Implementation | 2025-03-28 12:36:08 |
| native-sparse-attention-pytorch | 0.1.24 | Native Sparse Attention | 2025-03-20 14:38:57 |
| f5-tts-mlx | 0.2.6 | F5-TTS - MLX | 2025-03-19 02:11:37 |
| transfusion-pytorch | 0.10.2 | Transfusion in Pytorch | 2025-03-18 21:35:37 |
| nGPT-pytorch | 0.2.10 | nGPT | 2025-03-13 15:33:51 |
| vector-quantize-pytorch | 1.22.2 | Vector Quantization - Pytorch | 2025-03-11 21:43:33 |
| flwr | 1.16.0 | Flower: A Friendly Federated AI Framework | 2025-03-11 16:32:25 |
| danling | 0.3.13 | Scaffold for experienced Machine Learning Researchers | 2025-03-11 13:04:13 |
| python-lilypad | 0.0.23 | An open-source prompt engineering framework. | 2025-02-25 03:25:39 |
| swarms | 7.4.1 | Swarms - TGSC | 2025-02-24 01:00:44 |
| tab-transformer-pytorch | 0.4.2 | Tab Transformer - Pytorch | 2025-02-24 00:12:18 |
| nanospeech | 0.0.6 | Simple, hackable text-to-speech with PyTorch or MLX. | 2025-02-23 02:43:28 |
| x-transformers | 2.1.2 | X-Transformers | 2025-02-22 17:51:50 |
| pi-vae-pytorch | 1.1.0 | A PyTorch implementation of Poisson Identifiable VAE (pi-VAE), a variational autoencoder used to construct latent variable models of neural activity while simultaneously modeling the relation between the latent and task variables. | 2025-02-22 17:48:37 |
| PyEDCR | 1.1.3 | PyEDCR is a metacognitive neuro-symbolic method for learning error detection and correction rules in deployed ML models using combinatorial sub-modular set optimization. | 2025-02-19 06:34:54 |
| improving-transformers-world-model | 0.0.11 | Improving Transformers World Model for RL | 2025-02-18 19:37:12 |