| Name | Version | Summary | Date |
| --- | --- | --- | --- |
| sibila | 0.4.5 | Structured queries from local or online LLM models | 2024-06-21 16:11:02 |
| belt-nlp | 1.1.0 | BELT (BERT For Longer Texts). BERT-based text classification model for processing texts longer than 512 tokens. | 2024-06-19 12:48:26 |
| openllm-core | 0.5.7 | OpenLLM Core: Core components for OpenLLM. | 2024-06-14 02:49:51 |
| openllm-client | 0.5.7 | OpenLLM Client: Interacting with OpenLLM HTTP/gRPC server, or any BentoML server. | 2024-06-14 02:49:50 |
| taxonerd | 1.5.4 | A Python package and CLI tool based on spaCy for detecting mentions of taxonomic entities in text | 2024-06-12 08:11:47 |
| querent | 3.1.2 | The Asynchronous Data Dynamo and Graph Neural Network Catalyst | 2024-06-10 11:13:54 |
| spacyrerank | 0.0.6 | Rank phrases and text based on query by leveraging hugging-face models. | 2024-05-17 06:44:03 |
| self-reasoning-tokens-pytorch | 0.0.4 | Self Reasoning Tokens | 2024-05-05 18:01:34 |
| audiolm-superfeel | 2.1.7 | AudioLM - Language Modeling Approach to Audio Generation from Google Research - Pytorch | 2024-05-04 12:58:19 |
| soundstorm-superfeel | 0.4.5 | SoundStorm - Efficient Parallel Audio Generation from Google Deepmind, in Pytorch | 2024-05-04 10:33:55 |
| REaLTabFormer | 0.1.7 | A novel method for generating tabular and relational data using language models. | 2024-04-28 18:00:11 |
| s5-pytorch | 0.2.1 | S5 - Simplified State Space Layers for Sequence Modeling - Pytorch | 2024-04-26 09:39:13 |
| deformable-attention | 0.0.19 | Deformable Attention - from the paper "Vision Transformer with Deformable Attention" | 2024-04-23 23:45:52 |
| graph-transformer | 0.2.1 | This is the implementation of Graph Transformer (https://www.ijcai.org/proceedings/2021/0214.pdf) | 2024-04-20 11:09:13 |
| inseq | 0.6.0 | Interpretability for Sequence Generation Models 🔍 | 2024-04-13 13:37:37 |
| andromeda-torch | 0.0.9 | Andromeda - Pytorch | 2024-03-21 22:02:59 |