Name | Version | Summary | Date |
---|---|---|---|
forgellm | 0.4.7 | A comprehensive toolkit for end-to-end continued pre-training, fine-tuning, monitoring, testing and publishing of language models with MLX-LM | 2025-07-20 11:38:46 |
tsdae | 1.1.0 | Transformer-based Denoising AutoEncoder for unsupervised pre-training of Sentence Transformers. | 2024-05-26 11:48:12 |
nano-askllm | 0.2.3 | Unofficial implementation of the Ask-LLM paper 'How to Train Data-Efficient LLMs', arXiv:2402.09668. | 2024-03-22 14:50:06 |

Hour | Day | Week | Total |
---|---|---|---|
111 | 1787 | 10447 | 309705 |