Name | Version | Summary | Date |
---|---|---|---|
siren-pytorch | 0.1.7 | Implicit Neural Representations with Periodic Activation Functions | 2023-07-28 14:59:36 |
flash-attention-jax | 0.3.1 | Flash Attention - in Jax | 2023-07-18 02:45:18 |
memory-efficient-attention-pytorch | 0.1.6 | Memory Efficient Attention - Pytorch | 2023-07-18 02:42:57 |
discrete-key-value-bottleneck-pytorch | 0.1.1 | Discrete Key / Value Bottleneck - Pytorch | 2023-07-09 23:57:56 |
AttentionGrid | 0.0.2 | AttentionGrid - Library | 2023-07-06 15:45:09 |
block-recurrent-transformer-pytorch | 0.4.3 | Block Recurrent Transformer - Pytorch | 2023-07-05 20:32:13 |
graph-transformer-pytorch | 0.1.1 | Graph Transformer - Pytorch | 2023-06-22 14:42:19 |
dalle-pytorch | 1.6.6 | DALL-E - Pytorch | 2023-05-24 18:40:32 |
coordinate-descent-attention | 0.0.11 | Coordinate Descent Attention - Pytorch | 2023-05-22 16:58:51 |
conformer | 0.3.2 | The convolutional module from the Conformer paper | 2023-05-17 21:08:48 |
FLASH-pytorch | 0.1.8 | FLASH - Transformer Quality in Linear Time - Pytorch | 2023-05-12 18:10:32 |
tab-transformer-pytorch | 0.2.6 | Tab Transformer - Pytorch | 2023-05-10 13:47:09 |
rvq-vae-gpt | 0.0.4 | Yet another attempt at GPT in quantized latent space | 2023-04-14 16:37:31 |
memory-compressed-attention | 0.0.7 | Memory-Compressed Self Attention | 2023-04-10 03:40:06 |
perceiver-ar-pytorch | 0.0.10 | Perceiver AR | 2023-04-06 19:20:04 |
PaLM-rlhf-pytorch | 0.2.1 | PaLM + Reinforcement Learning with Human Feedback - Pytorch | 2023-04-05 14:29:18 |
memorizing-transformers-pytorch | 0.4.0 | Memorizing Transformer - Pytorch | 2023-03-24 18:28:40 |
nuwa-pytorch | 0.7.8 | NÜWA - Pytorch | 2023-01-17 17:57:29 |
isab-pytorch | 0.2.3 | Induced Set Attention Block - Pytorch | 2023-01-10 21:03:16 |
einops-exts | 0.0.4 | Einops Extensions | 2023-01-05 20:02:43 |

Last hour | Last day | Last week | Total |
---|---|---|---|
43 | 2439 | 10285 | 263404 |
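
Rows like the ones above can be reproduced from PyPI's public JSON API (`https://pypi.org/pypi/<name>/json`). The sketch below is a minimal example of querying it; the sample package list and the choice of fields are illustrative assumptions, not a description of how this table was actually generated.

```python
import json
import urllib.request

# Illustrative sample of the packages listed above.
PACKAGES = ["siren-pytorch", "flash-attention-jax", "conformer"]

def fetch_metadata(name: str) -> dict:
    """Fetch name, latest version, summary, and upload time from PyPI's JSON API."""
    url = f"https://pypi.org/pypi/{name}/json"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    info = data["info"]
    # "urls" lists the files of the latest release; upload_time is the release timestamp.
    uploaded = data["urls"][0]["upload_time"] if data["urls"] else ""
    return {
        "name": info["name"],
        "version": info["version"],
        "summary": info["summary"] or "",
        "date": uploaded,
    }

for pkg in PACKAGES:
    m = fetch_metadata(pkg)
    print(f"{m['name']} | {m['version']} | {m['summary']} | {m['date']} |")
```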