Name | Version | Summary | Date |
--- | --- | --- | --- |
scalable-softmax | 0.1.1 | PyTorch implementation of Scalable-Softmax for attention mechanisms | 2025-02-08 22:49:20 |
local-attention | 1.10.0 | Local attention, window with lookback, for language modeling | 2025-01-12 16:54:15 |
positional-encodings | 6.0.4 | 1D, 2D, and 3D Sinusoidal Positional Encodings in PyTorch | 2024-10-23 07:33:48 |
graph_attention_student | 0.18.4 | MEGAN: Multi Explanation Graph Attention Network | 2024-10-16 11:54:18 |
quartic-transformer | 0.0.14 | Quartic Transformer | 2024-10-06 01:08:27 |
neuralstockprophet | 0.0.3 | LSTM-ARIMA with attention mechanisms and multiplicative decomposition for sophisticated stock forecasting | 2024-09-06 08:34:38 |
graph-attention-student | 0.18.2 | MEGAN: Multi Explanation Graph Attention Network | 2024-08-09 16:18:16 |
scpram | 0.0.3 | scPRAM accurately predicts single-cell gene expression perturbation response based on attention mechanism | 2024-05-24 05:49:25 |
pyfoal | 1.0.1 | Python forced aligner | 2024-04-12 23:12:24 |
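
As an illustration of how one of the packages above is typically used, here is a minimal sketch with `local-attention` (windowed attention with lookback for language modeling). The constructor arguments and tensor shapes follow its README-style example, but they are assumptions against version 1.10.0 rather than a definitive reference.

```python
import torch
from local_attention import LocalAttention  # the local-attention package listed above (assumed API)

# Queries/keys/values shaped (batch, heads, sequence length, head dimension).
q = torch.randn(2, 8, 2048, 64)
k = torch.randn(2, 8, 2048, 64)
v = torch.randn(2, 8, 2048, 64)

# Causal windowed attention where each window also looks back one window.
# Parameter names are taken from the package README and are assumptions here.
attn = LocalAttention(
    dim=64,           # per-head dimension
    window_size=512,  # attention window size
    causal=True,      # autoregressive masking for language modeling
    look_backward=1,  # each window attends to the previous window as well
    dropout=0.1,
)

mask = torch.ones(2, 2048).bool()  # key padding mask over the sequence
out = attn(q, k, v, mask=mask)     # -> (2, 8, 2048, 64)
```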