Name | Version | Summary | Date |
--- | --- | --- | --- |
optimum-nvidia | 0.1.0b6 | Optimum Nvidia is the interface between the Hugging Face Transformers library and NVIDIA GPUs. | 2024-04-11 21:13:38 |
wordslab-llms | 0.0.3 | Python library for easy and efficient inference and fine-tuning of popular open-source LLMs. | 2024-04-01 17:06:34 |
nano-askllm | 0.2.3 | Unofficial implementation of the Ask-LLM paper "How to Train Data-Efficient LLMs" (arXiv:2402.09668). | 2024-03-22 14:50:06 |
nendo-plugin-textgen | 0.1.1 | A text generation plugin using local LLMs or other text generation methods. Builds on top of `transformers` by Hugging Face. | 2024-03-14 09:17:28 |
local-llm-function-calling | 0.1.23 | A tool for generating function arguments and choosing which function to call with local LLMs. | 2024-03-12 12:29:43 |
transformers-stream-generator | 0.0.5 | A text generation method that returns a generator, streaming out each token in real time during inference, based on Hugging Face Transformers. | 2024-03-11 14:18:02 |
cprex | 0.3.0 | Chemical Properties Relation Extraction | 2024-03-08 16:47:08 |
muse-maskgit-pytorch | 0.3.4 | MUSE - Text-to-Image Generation via Masked Generative Transformers, in Pytorch | 2024-02-24 15:45:14 |
universalmodels | 0.0.6 | A series of wrappers that allow multiple AI model sources to behave as Hugging Face Transformers models. | 2024-02-17 20:15:27 |
medpalm | 0.2.0 | MedPalm - Pytorch | 2024-02-17 09:06:15 |
auto-gptq | 0.7.0 | An easy-to-use LLM quantization package with user-friendly APIs, based on the GPTQ algorithm (see the sketch after this table). | 2024-02-16 12:52:41 |
chemical-converters | 0.0.1 | Chemical-Converters, developed by Knowledgator, showcases our technological capabilities in the chemical domain with entry-level models for a glimpse into potential applications. It is a collection of tools for converting one chemical format into another. You can choose any model on Hugging Face trained for a specific conversion and set up your pipeline with the framework. | 2024-02-14 19:22:38 |
h-transformer-1d | 0.1.9 | H-Transformer 1D - Pytorch | 2024-02-12 14:53:11 |
recurrent-memory-transformer-pytorch | 0.5.6 | Recurrent Memory Transformer - Pytorch | 2024-02-11 18:29:35 |
scikit-transformers | 0.3.1 | scikit-transformers is a useful package that provides custom transformers such as LogColumnTransformer, BoolColumnTransformers, and other fancy transformers. | 2024-02-09 23:42:52 |
infer-camembert | 0.2.0 | Python implementation for text classification inference with CamemBERT fine-tuned models | 2024-02-08 09:52:19 |
codebook-features | 0.1.2 | Sparse and discrete interpretability tool for neural networks | 2024-02-05 22:09:52 |
qwen | 0.1.1 | Qwen VL - Pytorch | 2024-01-29 18:49:16 |
palme | 0.1.2 | palme - Pytorch | 2024-01-29 18:46:46 |
fttjax | 0.1.1 | Feature Tokenizer + Transformer - JAX | 2024-01-08 17:37:06 |