| Name | Version | Summary | Date |
|------|---------|---------|------|
| savis | 0.4.3 | A sentence-level visualization tool for attention in large language models | 2024-12-16 11:12:47 |
| slot-attention | 1.4.0 | Implementation of Slot Attention in PyTorch | 2024-08-20 18:35:29 |
| AttentionMechanism | 1.0.2 | An implementation of many attention mechanism models | 2024-08-12 10:27:32 |
| keras-cv-attention-models | 1.4.1 | TensorFlow Keras computer vision attention models. Alias kecam. https://github.com/leondgarse/keras_cv_attention_models | 2024-04-10 05:31:21 |
| kecam | 1.4.1 | TensorFlow Keras computer vision attention models. Alias kecam. https://github.com/leondgarse/keras_cv_attention_models | 2024-04-10 05:31:19 |
| fusilli | 1.2.3 | Comparing multi-modal data fusion methods. Don't be silly, use Fusilli! | 2024-02-01 12:27:40 |
| linformer | 0.2.3 | Linformer implementation in PyTorch | 2024-01-05 20:40:37 |
| agent-attention-pytorch | 0.1.7 | Agent Attention, in PyTorch | 2023-12-22 22:39:35 |
| scann-model | 1.0 | SCANN: Self-Consistent Attention-based Neural Network | 2023-12-17 16:30:48 |
| warp-attention | 0.1.9 | Warp attention: hardware-efficient implementation of scaled dot-product attention | 2023-10-07 13:53:07 |
| bisum | 0.1.1 | Binary sparse and dense tensor partial-tracing | 2023-07-27 00:31:51 |
| vision-xformer | 0.2.0 | Vision Xformers | 2023-05-10 22:25:40 |
| ospark | 0.1.4 | Ospark: an open-source library for quickly building Transformer-family models | 2023-04-19 13:52:43 |
| x-Metaformer | 0.3.1 | A PyTorch implementation of "MetaFormer Baselines" with optional extensions | 2023-04-14 10:38:47 |
| perceiver-ar-pytorch | 0.0.10 | Perceiver AR | 2023-04-06 19:20:04 |
| local-attention-tf | 0.0.3 | Local attention (windowed, with lookback) for image, audio, and language modeling | 2023-03-26 07:21:46 |
| Attention-and-Transformers | 0.0.15 | Building attention mechanisms and Transformer models from scratch. Alias ATF. | 2022-12-17 19:13:12 |
| phyaat | 0.0.3 | PhyAAt: Physiology of Auditory Attention | 2022-07-21 12:45:52 |
| reformer-pytorch | 1.4.4 | Reformer, the Efficient Transformer, in PyTorch | 2021-11-06 23:09:02 |
| axial-attention | 0.6.1 | Axial Attention | 2021-08-26 01:15:01 |