| Name | Version | Summary | Date |
|---|---|---|---|
| simple-hierarchical-transformer | 0.2.0 | Simple Hierarchical Transformer | 2023-12-01 02:27:41 |
| eventdetector-ts | 1.1.0 | EventDetector introduces a universal event detection method for multivariate time series. Unlike traditional deep-learning methods, it is regression-based, requiring only reference events. A robust stacked ensemble, ranging from feed-forward neural networks to Transformers, ensures accuracy by mitigating biases. The package supports practical implementation, excelling at detecting events with precision, and has been validated across diverse domains. | 2023-11-28 15:31:31 |
| minerva-torch | 0.0.1 | Transformers at zeta scales | 2023-11-25 09:34:27 |
| attention-sinks | 0.4.0 | Extend LLMs to infinite length without sacrificing efficiency and performance, without retraining | 2023-11-23 12:11:05 |
| flash-attention-softmax-n | 0.3.2 | CUDA and Triton implementations of Flash Attention with SoftmaxN | 2023-11-21 14:15:29 |
| REaLTabFormer | 0.1.5 | A novel method for generating tabular and relational data using language models | 2023-11-20 03:18:21 |
| nlpbaselines | 0.0.49 | Quickly establish strong baselines for NLP tasks | 2023-11-14 17:25:59 |
| multimodal-transformers | 0.3.0 | Multimodal extension library for PyTorch Hugging Face Transformers | 2023-11-14 16:26:54 |
| tril | 0.2.1 | Transformers Reinforcement and Imitation Learning Library | 2023-11-13 19:12:44 |
| llama-client-aic | 1.0.8 | Python client for redten, a platform for building and testing distributed, self-hosted LLMs with native RAG and Reinforcement Learning from Human Feedback (RLHF): https://api.redten.io | 2023-11-10 20:32:10 |
| JakeSilbersteinMachineLearning | 0.2.14 | Non-Functional Machine Learning Library | 2023-11-09 18:38:03 |
| JakeSilberstein-ML | 0.2.11 | Non-Functional Machine Learning Library | 2023-11-08 22:28:12 |
| JSML | 0.2.10 | Non-Functional Machine Learning Library | 2023-11-08 19:40:19 |
| genai_stack | 0.2.6 | End-to-end secure and private generative AI for all | 2023-11-08 16:22:53 |
| datawords | 0.7.4 | A library for working with text data | 2023-11-08 09:58:49 |
| lightning-gpt | 0.1.1 | GPT training in Lightning | 2023-11-01 12:23:33 |
| span-marker | 1.5.0 | Named Entity Recognition using Span Markers | 2023-10-31 11:43:56 |
| optimum-deepsparse | 0.1.0.dev1 | Optimum DeepSparse is an extension of the Hugging Face Transformers library that integrates the DeepSparse inference runtime. DeepSparse offers GPU-class performance on CPUs, making it possible to run Transformers and other deep learning models on commodity hardware with sparsity. Optimum DeepSparse provides a framework for developers to easily integrate DeepSparse into their applications, regardless of the hardware platform. | 2023-10-26 02:02:45 |
| torchscale | 0.3.0 | Transformers at any scale | 2023-10-20 09:23:58 |
| zorro-pytorch | 0.1.1 | Zorro - PyTorch | 2023-10-20 01:42:53 |