CrossEntropyLoss#

class vulkpy.nn.CrossEntropyLoss#

Bases: ReduceLoss

Cross Entropy Loss

Methods Summary

__call__(x, y)

Compute Loss

backward()

Backward

forward(x, y)

Forward

grad()

Compute Gradients

Methods Documentation

__call__(x: Array, y: Array) → Array#

Compute Loss

Parameters:

x (vulkpy.Array) – Predicted values

y (vulkpy.Array) – Target values

Returns:

loss – Loss

Return type:

vulkpy.Array

backward() → Array#

Backward

Returns:

dx – Batch gradients

Return type:

vulkpy.Array

Notes

\[dx = \frac{-y}{x + \epsilon}\]

Warning

Generally, users should not call this method directly. Use grad() instead, which corrects for the reduction scale.
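The elementwise gradient above can be sketched in plain NumPy. This is an illustration of the formula only, not the vulkpy API; the epsilon value is an assumption:

```python
import numpy as np

def cross_entropy_backward(x, y, eps=1e-8):
    """Elementwise gradient dL/dx = -y / (x + eps)."""
    return -y / (x + eps)

# One sample, three classes; y is a one-hot target.
x = np.array([[0.7, 0.2, 0.1]])
y = np.array([[1.0, 0.0, 0.0]])
dx = cross_entropy_backward(x, y)
# Only the target class gets a (negative) gradient; others are zero.
```

Note this is the raw per-element gradient, before any reduction scaling is applied.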

forward(x: Array, y: Array) → Array#

Forward

Parameters:

x (vulkpy.Array) – Predicted values

y (vulkpy.Array) – Target values

Returns:

loss – Cross Entropy Loss

Return type:

vulkpy.Array

Notes

\[L = - f _{\text{reduce}} ( y_i \log (x_i) )\]

Warning

Generally, users should not call this method directly. Use __call__ instead, which stores the input and output for training.
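The forward formula can be sketched with NumPy stand-ins (not the vulkpy API), assuming a per-sample sum over classes followed by the batch reduction:

```python
import numpy as np

def cross_entropy_forward(x, y, reduce="mean"):
    """L = -f_reduce( y_i * log(x_i) ), reduced over the batch."""
    per_sample = -(y * np.log(x)).sum(axis=-1)
    return per_sample.mean() if reduce == "mean" else per_sample.sum()

x = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1]])
y = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
loss = cross_entropy_forward(x, y)  # mean of -log(0.7) and -log(0.8)
```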

grad() → Array#

Compute Gradients

Returns:

dx – Batch gradients of dL/dx

Return type:

vulkpy.Array

Notes

This method calculates gradients for the last __call__(x, y).
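One plausible reading of the reduction-scale correction mentioned under backward() can be sketched in NumPy (an assumption for illustration, not the vulkpy implementation): with reduce="mean" the raw elementwise gradient is rescaled by the batch size.

```python
import numpy as np

def grad(x, y, reduce="mean", eps=1e-8):
    """Sketch: elementwise -y/(x+eps), rescaled by 1/batch
    when the forward reduction was a mean (assumption)."""
    dx = -y / (x + eps)
    if reduce == "mean":
        dx = dx / x.shape[0]
    return dx

x = np.array([[0.50, 0.50],
              [0.25, 0.75]])
y = np.array([[1.0, 0.0],
              [0.0, 1.0]])
dx = grad(x, y)  # -1/0.5 scaled by 1/2 in the first entry
```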

__init__(*args, **kwargs)#

Initialize Cross Entropy Loss

Parameters:

reduce ({"mean", "sum"}, optional) – Reduction method over batch. The default is "mean".
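The reduce option only changes how per-sample losses are combined over the batch. With NumPy stand-ins (not the vulkpy API), "sum" equals batch-size times "mean":

```python
import numpy as np

x = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1]])
y = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
per_sample = -(y * np.log(x)).sum(axis=-1)
loss_mean = per_sample.mean()  # reduce="mean" (the default)
loss_sum = per_sample.sum()    # reduce="sum"
```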