SmartEngine  1.6.0
SmartEngine::RegularizationLossInfo Struct Reference

Regularization keeps the weight values from exploding during gradient descent by adding L1 and L2 penalties on the weights to the training loss. More...

#include <GradientDescentTrainer.h>

Public Attributes

float L1Weights = 0.0f
 L1 loss multiplier applied to weights.
 
float L1Biases = 0.0f
 L1 loss multiplier applied to biases.
 
float L2Weights = 1e-6f
 L2 loss multiplier applied to weights.
 
float L2Biases = 1e-6f
 L2 loss multiplier applied to biases.
 

Detailed Description

Regularization keeps the weight values from exploding during gradient descent by adding L1 and L2 penalties on the weights to the training loss.

L1 loss is defined as Sum(Abs(Weight_i)).

L2 loss is defined as Sum(Weight_i ^ 2) / 2.

The total regularization loss is L1 + L2, with each term scaled by its corresponding multiplier.

Member Data Documentation

◆ L1Biases

float SmartEngine::RegularizationLossInfo::L1Biases = 0.0f

L1 loss multiplier applied to biases.

◆ L1Weights

float SmartEngine::RegularizationLossInfo::L1Weights = 0.0f

L1 loss multiplier applied to weights.

◆ L2Biases

float SmartEngine::RegularizationLossInfo::L2Biases = 1e-6f

L2 loss multiplier applied to biases.

◆ L2Weights

float SmartEngine::RegularizationLossInfo::L2Weights = 1e-6f

L2 loss multiplier applied to weights.