Name | Version | Summary | Date |
transformers | 4.48.1 | State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow | 2025-01-20 16:36:07 |
t-draft-123 | 4.48.0.dev0 | State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow | 2025-01-07 14:25:43 |
spark-nlp | 5.5.2 | John Snow Labs Spark NLP is a natural language processing library built on top of Apache Spark ML. It provides simple, performant & accurate NLP annotations for machine learning pipelines, that scale easily in a distributed environment. | 2024-12-18 16:04:11 |
sagemaker-huggingface-inference-toolkit | 2.4.1 | Open source library for running inference workload with Hugging Face Deep Learning Containers on Amazon SageMaker. | 2024-10-18 08:11:50 |
divyanx-transformers | 4.45.0.dev0 | State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow | 2024-08-21 09:24:22 |
transformers-backport | 4.33.4 | transformers backport - version 4.33.x | 2024-07-04 23:15:53 |
nm-transformers-nightly | 1.7.0.20240304 | State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow | 2024-03-05 13:39:02 |
nm-transformers | 1.5.1.42301 | State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow | 2023-08-23 01:27:39 |
together-worker | 0.1.22 | | 2023-08-17 04:29:12 |
together-web3 | 0.1.5 | | 2023-04-28 17:38:31 |
transformers-machinify | 4.27.0.dev0 | State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow | 2023-03-25 00:41:42 |
ckls-test-lib | 4.2.7 | John Snow Labs Spark NLP is a natural language processing library built on top of Apache Spark ML. It provides simple, performant & accurate NLP annotations for machine learning pipelines, that scale easily in a distributed environment. | 2023-01-23 15:14:00 |
in-transformers | 1.0.0 | State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow | 2023-01-19 01:56:21 |
simplerepresentations | 0.0.4 | Easy-to-use text representations extraction library based on the Transformers library. | 2020-01-03 15:33:31 |
pytorch-transformers | 1.2.0 | Repository of pre-trained NLP Transformer models: BERT & RoBERTa, GPT & GPT-2, Transformer-XL, XLNet and XLM | 2019-09-04 11:36:39 |