| Name | Version | Summary | Date |
| --- | --- | --- | --- |
| orientLoss | 0.0.8 | A classification loss function for PyTorch, less prone to overfitting than cross-entropy. Targets may be random vectors/embeddings whose direction indicates the category, probabilistic labels, or one-hot labels (sketched below). | 2025-08-17 18:01:56 |
| nncase-kpu | 2.10.0 | KPU plug-in for nncase. | 2025-08-08 01:46:49 |
| nncase | 2.10.0 | A neural network compiler for AI accelerators. | 2025-08-08 01:42:00 |
| glassInit | 0.0.6 | A way to initialize the parameters of a neural network layer in PyTorch, suited to shortcut connections where tensor sizes need to be adjusted; it maximizes the flow of information through both forward and backward propagation while resizing the tensor (sketched below). | 2025-08-06 13:26:50 |
| Atiny | 0.0.4 | A gradient-based PyTorch optimizer that uses half of Adam's memory while achieving results no worse than Adam's. | 2025-08-03 05:29:32 |
| fftLoss | 0.0.1 | A frequency-domain loss function for PyTorch that prevents weak frequency components from being suppressed by strong ones, as can happen with a regular loss function (sketched below). | 2025-07-30 13:41:24 |
| FWU | 0.0.3 | A series of PyTorch-based neural network units that offer higher information utilization, prevent dead neurons, and eliminate the need for additional activation functions. | 2025-07-30 10:27:36 |
| RRPad | 0.0.2 | ReplicationReflectionPad combines ReplicationPad and ReflectionPad, preventing the errors that non-constant padding types raise on small input sizes (sketched below). | 2025-07-30 06:25:35 |
| Afine | 0.0.2 | A gradient-based PyTorch optimizer. | 2025-01-17 14:02:41 |
| hox | 1.2.0 | Lightweight neural network library project. | 2024-12-13 23:13:44 |
| modelconv | 0.3.1 | Converter for neural models into various formats. | 2024-11-12 15:11:50 |
| terge | 0.1.1 | An easy-to-use Python library for merging PyTorch models (sketched below). | 2024-06-13 05:05:57 |
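
The orientLoss summary does not document the package's API, so the following is a minimal, hypothetical sketch of a direction-based classification loss of the kind it describes: each class is assigned a random unit direction vector, and the loss penalizes angular misalignment between the prediction and the target direction. The names `orient_loss` and `class_dirs` are illustrative, not the package's.

```python
import torch
import torch.nn.functional as F

def orient_loss(pred, target_dirs):
    # Hypothetical sketch, not orientLoss's actual implementation:
    # penalize the angle between the predicted embedding and the
    # direction vector that encodes the target class.
    cos = F.cosine_similarity(pred, target_dirs, dim=-1)
    return (1.0 - cos).mean()

# Fixed random per-class directions, as the summary's "random
# vectors/embeddings with direction-indicating categories" suggests.
num_classes, dim = 10, 64
class_dirs = F.normalize(torch.randn(num_classes, dim), dim=-1)
pred = torch.randn(8, dim, requires_grad=True)
labels = torch.randint(0, num_classes, (8,))
orient_loss(pred, class_dirs[labels]).backward()
```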
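
glassInit's actual algorithm is not given in the summary. As a stand-in, the sketch below uses plain orthogonal initialization, a standard and openly different technique, on a 1x1 projection shortcut: orthogonal weights preserve signal norms through both the forward and the backward pass while the tensor size changes, which matches the stated goal.

```python
import torch
import torch.nn as nn

# Hypothetical sketch, not glassInit's method: orthogonally initialize the
# 1x1 convolution that resizes the tensor along a shortcut connection.
shortcut = nn.Conv2d(64, 128, kernel_size=1, stride=2, bias=False)
nn.init.orthogonal_(shortcut.weight)

x = torch.randn(2, 64, 32, 32)
print(shortcut(x).shape)  # torch.Size([2, 128, 16, 16])
```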
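
For fftLoss, a minimal sketch of the general idea (not the package's actual API): compare log-magnitude spectra via `torch.fft`, since the logarithm compresses the dynamic range so strong frequency components no longer dominate weak ones the way they would under a plain L1/L2 loss. The name `fft_loss` is illustrative.

```python
import torch

def fft_loss(pred, target, eps=1e-8):
    # Hypothetical sketch: compare log-magnitude spectra so that strong
    # frequency components do not drown out weak ones.
    pred_mag = torch.abs(torch.fft.rfft(pred, dim=-1))
    target_mag = torch.abs(torch.fft.rfft(target, dim=-1))
    return torch.mean(torch.abs(torch.log(pred_mag + eps)
                                - torch.log(target_mag + eps)))

pred = torch.randn(4, 256, requires_grad=True)
target = torch.randn(4, 256)
fft_loss(pred, target).backward()
```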
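
The error RRPad addresses comes from `ReflectionPad`, which requires the padding to be smaller than the corresponding input dimension. The sketch below (not RRPad's actual API) shows the combination strategy the summary describes: use reflection when the input is large enough, otherwise fall back to replication, which has no minimum-size requirement.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReplicationReflectionPad2d(nn.Module):
    # Hypothetical sketch of the RRPad idea.
    def __init__(self, padding):
        super().__init__()
        self.padding = padding  # (left, right, top, bottom)

    def forward(self, x):
        left, right, top, bottom = self.padding
        h, w = x.shape[-2], x.shape[-1]
        # Reflection requires padding < input size on each axis.
        if max(left, right) < w and max(top, bottom) < h:
            return F.pad(x, self.padding, mode="reflect")
        return F.pad(x, self.padding, mode="replicate")

pad = ReplicationReflectionPad2d((2, 2, 2, 2))
print(pad(torch.randn(1, 3, 8, 8)).shape)  # reflection -> (1, 3, 12, 12)
print(pad(torch.randn(1, 3, 1, 1)).shape)  # replication fallback -> (1, 3, 5, 5)
```

A finer-grained variant could pick the mode per axis; the all-or-nothing fallback keeps the sketch short.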
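
terge's own API is not shown in the summary; a common merging baseline it presumably automates is uniform parameter averaging across models that share an architecture, sketched here with plain `state_dict` arithmetic. The helper `average_models` is hypothetical.

```python
import torch
import torch.nn as nn

def average_models(models):
    # Hypothetical sketch: uniform parameter averaging ("model soup")
    # over models with identical architectures.
    merged = {k: v.clone().float() for k, v in models[0].state_dict().items()}
    for model in models[1:]:
        for k, v in model.state_dict().items():
            merged[k] += v.float()
    for k in merged:
        merged[k] /= len(models)
    return merged

models = [nn.Linear(16, 4) for _ in range(3)]
target = nn.Linear(16, 4)
target.load_state_dict(average_models(models))
```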