SoftmaxCrossEntropyLoss#
- class vulkpy.nn.SoftmaxCrossEntropyLoss#
Bases: CrossEntropyLoss
Softmax Cross Entropy Loss
See also
vulkpy.nn.Softmax
Softmax layer
vulkpy.nn.CrossEntropyLoss
Cross Entropy loss without Softmax
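Example
A minimal end-to-end sketch. The `vk.GPU()` and `vk.Array(gpu, data=...)` constructors and the one-hot label encoding are assumptions for illustration, not taken from this page:

```python
import vulkpy as vk
from vulkpy import nn

gpu = vk.GPU()

# Two samples, three classes: logits and one-hot targets.
# The Array constructor signature is assumed here for illustration.
x = vk.Array(gpu, data=[[2.0, 1.0, 0.1],
                        [0.5, 2.5, 0.3]])
y = vk.Array(gpu, data=[[1.0, 0.0, 0.0],
                        [0.0, 1.0, 0.0]])

loss_f = nn.SoftmaxCrossEntropyLoss()  # reduce="mean" by default
loss = loss_f(x, y)   # __call__: forward pass, caches x/y for grad()
dx = loss_f.grad()    # dL/dx with the reduction scale applied
```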
Methods Summary
__call__(x, y) – Compute Loss
backward() – Backward pass
forward(x, y) – Forward pass
grad() – Compute Gradients
__init__(*args, **kwargs) – Initialize Softmax Cross Entropy Loss
Methods Documentation
- __call__(x: Array, y: Array) → Array #
Compute Loss
- Parameters:
x (vulkpy.Array) – Batch input features
y (vulkpy.Array) – Batch labels/targets
- Returns:
loss – Loss reduced over the batch
- Return type:
vulkpy.Array
- backward() → Array #
Backward pass
- Returns:
dx – Batch gradients
- Return type:
vulkpy.Array
Notes
\[dx = \mathrm{softmax}(x) - y\]
Warning
Generally, users should not call this method directly. Use grad() instead, where reduction scale is corrected.
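The gradient identity above can be checked numerically. The following sketch uses plain NumPy (not the vulkpy API) and verifies dx = softmax(x) - y against central finite differences for a single sample:

```python
import numpy as np

def softmax(v):
    e = np.exp(v - v.max())  # shift for numerical stability
    return e / e.sum()

def loss(v, y):
    return -np.sum(y * np.log(softmax(v)))

x = np.array([2.0, 1.0, 0.1])
y = np.array([1.0, 0.0, 0.0])  # one-hot target

dx = softmax(x) - y  # analytic gradient

# Independent check via central finite differences
eps = 1e-6
num = np.array([
    (loss(x + eps * np.eye(3)[i], y) - loss(x - eps * np.eye(3)[i], y)) / (2 * eps)
    for i in range(3)
])

assert np.allclose(dx, num, atol=1e-4)
```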
- forward(x: Array, y: Array) → Array #
Forward pass
- Parameters:
x (vulkpy.Array) – Batch input features
y (vulkpy.Array) – Batch labels
- Returns:
loss – Loss reduced over the batch
- Return type:
vulkpy.Array
Notes
\[L = - f_{\text{reduce}}\left( y_i \log \mathrm{softmax}(x)_i \right)\]
Warning
Generally, users should not call this method directly. Use __call__ instead, where input / output are stored for training.
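As a concrete illustration of the formula, the following plain-NumPy sketch (not the vulkpy API) computes the per-sample cross entropy over softmax probabilities and both reductions:

```python
import numpy as np

def softmax(v):
    e = np.exp(v - v.max(axis=-1, keepdims=True))  # stabilized softmax
    return e / e.sum(axis=-1, keepdims=True)

x = np.array([[2.0, 1.0, 0.1],
              [0.5, 2.5, 0.3]])  # (batch, class) logits
y = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])  # one-hot labels

# y_i * log softmax(x)_i, summed over classes, per sample
per_sample = -np.sum(y * np.log(softmax(x)), axis=1)

loss_mean = per_sample.mean()  # f_reduce = mean (the default)
loss_sum = per_sample.sum()    # f_reduce = sum
```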
- grad() → Array #
Compute Gradients
- Returns:
dx – Batch gradients (dL/dx)
- Return type:
vulkpy.Array
Notes
This method calculates gradients for the last __call__(x, y).
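The reduction-scale correction mentioned in backward() amounts to dividing the raw per-sample gradient by the batch size when reduce="mean". A plain-NumPy sketch of that relationship (an illustration of the math, not of vulkpy internals):

```python
import numpy as np

def softmax(v):
    e = np.exp(v - v.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

x = np.array([[2.0, 1.0, 0.1],
              [0.5, 2.5, 0.3]])
y = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

raw = softmax(x) - y         # per-sample gradient (cf. backward())
dx_mean = raw / x.shape[0]   # gradient of the mean-reduced loss
dx_sum = raw                 # gradient of the sum-reduced loss
```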
- __init__(*args, **kwargs)#
Initialize Softmax Cross Entropy Loss
- Parameters:
reduce ({"mean", "sum"}) – Reduction method over batch. The default is "mean".
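A short construction sketch for both reductions; the keyword form reduce=... follows the parameter documented above:

```python
from vulkpy import nn

# Average loss/gradients over the batch (the default).
loss_mean = nn.SoftmaxCrossEntropyLoss(reduce="mean")

# Sum loss/gradients over the batch; gradients are batch-size times larger.
loss_sum = nn.SoftmaxCrossEntropyLoss(reduce="sum")
```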