| Name | Version | Summary | Date |
| --- | --- | --- | --- |
| nagisa-bert | 0.0.4 | A BERT model for nagisa | 2023-12-23 07:24:39 |
| toolformer-pytorch | 0.0.30 | Toolformer - Pytorch | 2023-12-21 05:02:20 |
| multivision | 0.1.9 | Create Object Segmentation Labels | 2023-12-19 12:41:54 |
| bampe-weights | 0.1.0 | An alternative approach to building foundational generative AI models with visualizations using Blender | 2023-12-18 05:01:45 |
| kosmosx | 0.3.1 | Transformers at zeta scales | 2023-12-16 21:26:18 |
| romanian-embeddings | 0.4 | Romanian language embeddings using transformers and ONNX. | 2023-12-12 13:40:03 |
| parti-pytorch | 0.2.0 | Parti - Pathways Autoregressive Text-to-Image Model - Pytorch | 2023-12-08 14:49:01 |
| autolabel4seg | 0.0.4 | Create Object Segmentation Labels | 2023-12-07 08:11:29 |
| grazier | 0.1.1 | A tool for calling (and calling out to) large language models. | 2023-12-06 07:20:09 |
| mypackage-falahgs | 1.1.1 | Create Object Segmentation Labels | 2023-12-05 14:57:19 |
| falahgsauto | 1.3.4 | Create Object Segmentation Labels | 2023-12-05 14:17:48 |
| simple-hierarchical-transformer | 0.2.0 | Simple Hierarchical Transformer | 2023-12-01 02:27:41 |
| eventdetector-ts | 1.1.0 | EventDetector introduces a universal event detection method for multivariate time series. Unlike traditional deep-learning methods, it is regression-based, requiring only reference events. Its robust stacked ensemble, ranging from feed-forward neural networks to Transformers, ensures accuracy by mitigating biases. The package supports practical implementation, excelling at detecting events with precision, validated across diverse domains. | 2023-11-28 15:31:31 |
| minerva-torch | 0.0.1 | Transformers at zeta scales | 2023-11-25 09:34:27 |
| attention-sinks | 0.4.0 | Extend LLMs to infinite length without sacrificing efficiency and performance, and without retraining | 2023-11-23 12:11:05 |
| flash-attention-softmax-n | 0.3.2 | CUDA and Triton implementations of Flash Attention with SoftmaxN. | 2023-11-21 14:15:29 |
| nlpbaselines | 0.0.49 | Quickly establish strong baselines for NLP tasks | 2023-11-14 17:25:59 |
| tril | 0.2.1 | Transformers Reinforcement and Imitation Learning Library | 2023-11-13 19:12:44 |
| llama-client-aic | 1.0.8 | Python client for redten - a platform for building and testing distributed, self-hosted LLMs with native RAG and Reinforcement Learning from Human Feedback (RLHF): https://api.redten.io | 2023-11-10 20:32:10 |
| JakeSilbersteinMachineLearning | 0.2.14 | Non-Functional Machine Learning Library | 2023-11-09 18:38:03 |