#include "LossTrainer.h"

class IGradientDescentTrainer;

struct GradientDescentTrainerCInfo : public LossTrainerCInfo

virtual void Train(int generationCount) = 0;

SMARTENGINE_EXPORT void GradientDescentTrainer_Train(ObjPtr object, int generationCount);
SMARTENGINE_EXPORT void GradientDescentTrainer_Reset(ObjPtr object);
float L2Weights
L2 loss multiplier applied to weights.
Definition: GradientDescentTrainer.h:49
SMARTENGINE_EXPORT ObjectPtr< IGradientDescentTrainer > CreateGradientDescentTrainer(const GradientDescentTrainerCInfo &cinfo)
Creates an instance of IGradientDescentTrainer
GradientDescentTrainingAlgorithm algorithm
The training algorithm to use
Definition: GradientDescentTrainer.h:78
Trains a set of networks using gradient descent. Training with GradientDescentTrainer requires a Loss object to train against.
Definition: GradientDescentTrainer.h:119
Regularization tries to keep the weight values from exploding during gradient descent by adding L1 and L2 loss terms.
Definition: GradientDescentTrainer.h:35
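The L1/L2 multipliers documented below can be illustrated with a short sketch. The `RegSketch` struct and `RegularizationPenalty` function are hypothetical stand-ins showing how such multipliers are conventionally combined into a penalty term; they are not SmartEngine's implementation.

```cpp
#include <cmath>
#include <vector>

// Hypothetical sketch mirroring the four multipliers in
// RegularizationLossInfo. Not SmartEngine's actual type.
struct RegSketch
{
    float L1Weights = 0.0f; // L1 loss multiplier applied to weights
    float L2Weights = 0.0f; // L2 loss multiplier applied to weights
    float L1Biases  = 0.0f; // L1 loss multiplier applied to biases
    float L2Biases  = 0.0f; // L2 loss multiplier applied to biases
};

// Conventional combination: L1 scales |value|, L2 scales value^2,
// summed over all weights and biases and added to the total loss.
float RegularizationPenalty(const RegSketch& reg,
                            const std::vector<float>& weights,
                            const std::vector<float>& biases)
{
    float penalty = 0.0f;
    for (float w : weights)
        penalty += reg.L1Weights * std::fabs(w) + reg.L2Weights * w * w;
    for (float b : biases)
        penalty += reg.L1Biases * std::fabs(b) + reg.L2Biases * b * b;
    return penalty;
}
```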
virtual void Reset()=0
Resets the internal state of the trainer.
GradientDescentTrainingAlgorithm
Definition: GradientDescentTrainer.h:58
float learnRate
ADAM optimizer learn rate
Definition: GradientDescentTrainer.h:94
virtual void Train(int generationCount)=0
Synchronously trains for the specified number of iterations
Data used to construct an IGradientDescentTrainer instance
Definition: GradientDescentTrainer.h:18
Smart pointer to an IObject. Automatic ref counting.
Definition: ObjectPtr.h:16
float L1Biases
L1 loss multiplier applied to biases.
Definition: GradientDescentTrainer.h:44
GradientDescentTrainer training info
Definition: GradientDescentTrainer.h:74
Base class for NeuralNetwork loss trainers
Definition: LossTrainer.h:81
float clipGradients
If this value is greater than 0, the gradients will be clipped to the range [-ClipGradients, ClipGradients].
Definition: GradientDescentTrainer.h:89
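The clipping behavior described above amounts to a per-gradient clamp. The helper below is an illustrative sketch of that rule, not SmartEngine's code:

```cpp
#include <algorithm>
#include <vector>

// Hypothetical sketch of the rule described for clipGradients:
// when the value is greater than 0, each gradient is clamped to
// [-clipGradients, clipGradients]; otherwise clipping is disabled.
void ClipGradients(std::vector<float>& gradients, float clipGradients)
{
    if (clipGradients <= 0.0f)
        return; // clipping disabled
    for (float& g : gradients)
        g = std::clamp(g, -clipGradients, clipGradients);
}
```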
float epsilon
ADAM optimizer epsilon value
Definition: GradientDescentTrainer.h:109
float beta1
ADAM optimizer beta1 value
Definition: GradientDescentTrainer.h:99
ILoss * loss
The loss of the network output to train against.
Definition: GradientDescentTrainer.h:22
float beta2
ADAM optimizer beta2 value
Definition: GradientDescentTrainer.h:104
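The four ADAM parameters exposed here (learnRate, beta1, beta2, epsilon) play the roles they have in the textbook ADAM formulation. The sketch below shows one standard ADAM update step using those parameters; SmartEngine's internal implementation may differ in detail.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Per-parameter optimizer state for the textbook ADAM update.
struct AdamState
{
    std::vector<float> m; // first-moment (mean) estimates
    std::vector<float> v; // second-moment (uncentered variance) estimates
    int t = 0;            // update step counter
};

// One ADAM step: update moment estimates, bias-correct them,
// then move each parameter against its scaled gradient.
void AdamStep(std::vector<float>& params,
              const std::vector<float>& grads,
              AdamState& state,
              float learnRate, float beta1, float beta2, float epsilon)
{
    if (state.m.empty())
    {
        state.m.assign(params.size(), 0.0f);
        state.v.assign(params.size(), 0.0f);
    }
    ++state.t;
    for (std::size_t i = 0; i < params.size(); ++i)
    {
        state.m[i] = beta1 * state.m[i] + (1.0f - beta1) * grads[i];
        state.v[i] = beta2 * state.v[i] + (1.0f - beta2) * grads[i] * grads[i];
        float mHat = state.m[i] / (1.0f - std::pow(beta1, state.t));
        float vHat = state.v[i] / (1.0f - std::pow(beta2, state.t));
        params[i] -= learnRate * mHat / (std::sqrt(vHat) + epsilon);
    }
}
```

On the first step the bias correction makes the update magnitude approximately equal to learnRate, which is why small epsilon values only matter once gradients become tiny.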
The loss of a NeuralNetwork is computed using the formula (Expected Output - Actual Output)^2. The mean of this squared error over the network outputs gives the loss.
Definition: Loss.h:39
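The formula quoted in the entry above is the standard mean squared error. A minimal sketch of that computation (an illustration, not SmartEngine's ILoss implementation):

```cpp
#include <cstddef>
#include <vector>

// Mean squared error: (expected - actual)^2 per output,
// averaged over all outputs.
float MeanSquaredError(const std::vector<float>& expected,
                       const std::vector<float>& actual)
{
    float sum = 0.0f;
    for (std::size_t i = 0; i < expected.size(); ++i)
    {
        float diff = expected[i] - actual[i];
        sum += diff * diff;
    }
    return expected.empty() ? 0.0f
                            : sum / static_cast<float>(expected.size());
}
```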
float L1Weights
L1 loss multiplier applied to weights.
Definition: GradientDescentTrainer.h:39
float L2Biases
L2 loss multiplier applied to biases.
Definition: GradientDescentTrainer.h:54
RegularizationLossInfo regularizationLoss
The regularization loss parameters to apply.
Definition: GradientDescentTrainer.h:83
Adam
Good default choice