PyDigger - unearthing stuff about Python


Name                    | Version        | Summary                                                    | Upload date
flash-attn              | 2.7.2.post1    | Flash Attention: Fast and Memory-Efficient Exact Attention | 2024-12-08 05:58:10
causal-conv1d           | 1.5.0.post8    | Causal depthwise conv1d in CUDA, with a PyTorch interface  | 2024-12-06 09:48:49
quant-matmul            | 1.2.0          | Quantized MatMul in CUDA with a PyTorch interface          | 2024-03-20 03:44:36
fast-hadamard-transform | 1.0.4.post1    | Fast Hadamard Transform in CUDA, with a PyTorch interface  | 2024-02-13 05:49:17
flash-attn-wheels-test  | 2.0.8.post17   | Flash Attention: Fast and Memory-Efficient Exact Attention | 2023-08-13 21:27:09
flash-attn-xwyzsn       | 1.0.7          | Flash Attention: Fast and Memory-Efficient Exact Attention | 2023-06-01 03:53:40
Author: Tri Dao
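
The flash-attn package listed above exposes a fused, memory-efficient exact attention kernel for PyTorch. A minimal usage sketch follows, assuming a CUDA GPU, fp16 inputs, and the `flash_attn_func` entry point with the (batch, seqlen, nheads, headdim) tensor layout; shapes and sizes here are illustrative only.

```python
# Sketch: calling flash-attn's fused attention kernel (assumes `pip install flash-attn`
# and a CUDA-capable GPU; fp16 or bf16 inputs are expected by the kernel).
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 8, 64
q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# Exact attention computed without materializing the full seqlen x seqlen score matrix;
# causal=True applies an autoregressive mask.
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # (batch, seqlen, nheads, headdim)
```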