| Name | Version | Summary | Date |
| --- | --- | --- | --- |
| triton-model-analyzer | 1.44.0 | Triton Model Analyzer is a tool to profile and analyze the runtime performance of one or more models on the Triton Inference Server | 2024-09-27 17:03:49 |
| figaro | 1.7.1 | FIGARO: Fast Inference for GW Astronomy, Research & Observations | 2024-09-25 15:38:02 |
| anemoi-inference | 0.2.0 | A package to hold various functions to support training of ML models. | 2024-09-25 12:38:24 |
| finsim | 1.0.0 | Financial simulation and inference | 2024-09-23 03:45:55 |
| triton-model-navigator | 0.12.0 | Triton Model Navigator: An inference toolkit for optimizing and deploying machine learning models and pipelines on the Triton Inference Server and PyTriton. | 2024-09-10 13:08:57 |
| lightweight-mcnnm | 1.0.2 | Lightweight implementation of Athey et al. (2021)'s MC-NNM estimator | 2024-09-01 16:21:15 |
| emd-falsify | 1.0.1 | Original implementation of the EMD (empirical model discrepancy) model comparison criterion | 2024-08-09 22:34:50 |
| optimum-tpu | 0.1.5 | Optimum TPU is the interface between the Hugging Face Transformers library and Google Cloud TPU devices. | 2024-08-08 14:56:25 |
| everai-builtin-autoscaler | 0.1.52 | everai built-in autoscaler | 2024-08-02 08:01:09 |
| tsdate | 0.2.1 | Infer node ages from a tree sequence topology. | 2024-07-31 10:55:58 |
| everai-autoscaler | 0.1.23 | Base library of the everai autoscaler | 2024-07-31 10:19:38 |
| optimum-benchmark | 0.4.0 | Optimum-Benchmark is a unified multi-backend utility for benchmarking Transformers, Timm, Diffusers and Sentence-Transformers with full support of Optimum's hardware optimizations & quantization schemes. | 2024-07-31 08:35:53 |
| hidet | 0.4.1 | Hidet: a compilation-based DNN inference framework. | 2024-07-30 03:50:43 |
| digitalhub-runtime-nefertem | 0.6.0 | Nefertem runtime for DHCore | 2024-07-25 08:29:36 |
| lampe | 0.9.0 | Likelihood-free AMortized Posterior Estimation with PyTorch | 2024-07-20 21:52:17 |
| tsinfer | 0.3.3 | Infer tree sequences from genetic variation data. | 2024-07-17 10:52:08 |
| tritony | 0.0.17 | Tiny configuration for Triton Inference Server | 2024-07-11 08:21:24 |
| expert-ceylon | 2.0.0a2 | Expert Systems for Python, maintained by Ceylon AI | 2024-07-03 18:44:38 |
| azcausal | 0.2.4 | Causal Inference | 2024-06-27 15:12:05 |
| forsys | 1.0.1 | A tissue stress and cell pressure inference package with dynamical information | 2024-06-26 05:06:47 |