NNFS
Neural network library from scratch
NNFS::Adagrad Class Reference

Adagrad optimizer (Adaptive Gradient)

#include <Adagrad.hpp>

Inheritance diagram for NNFS::Adagrad: NNFS::Optimizer → NNFS::Adagrad

Public Member Functions

 Adagrad (double lr, double decay=0.0, double epsilon=1e-7)
 Construct a new Adagrad object.
 
void update_params (std::shared_ptr< Dense > &layer)
 Update the parameters of the layer.
 
- Public Member Functions inherited from NNFS::Optimizer
 Optimizer (double lr, double decay)
 Construct a new Optimizer object.
 
virtual ~Optimizer ()=default
 Basic destructor.
 
virtual void update_params (std::shared_ptr< Dense > &layer)=0
 Update the parameters of the layer.
 
void pre_update_params ()
 Pre-update parameters (e.g. learning rate decay)
 
void post_update_params ()
 Post-update parameters (e.g. increase iteration count)
 
double & current_lr ()
 Get the current learning rate.
 
int & iterations ()
 Get current iteration count.
 

Additional Inherited Members

- Protected Attributes inherited from NNFS::Optimizer
const double _lr
 
double _current_lr
 
int _iterations
 
double _decay
 

Detailed Description

Adagrad optimizer (Adaptive Gradient)

This class implements the Adagrad optimizer, which adapts the learning rate per parameter: each parameter's step is scaled by the inverse square root of its accumulated squared gradients, so frequently updated parameters receive progressively smaller steps.

Constructor & Destructor Documentation

◆ Adagrad()

inline NNFS::Adagrad::Adagrad (double lr, double decay = 0.0, double epsilon = 1e-7)

Construct a new Adagrad object.

Parameters
    lr       Learning rate
    decay    Learning rate decay (default: 0.0)
    epsilon  Epsilon value to avoid division by zero (default: 1e-7)

Member Function Documentation

◆ update_params()

inline virtual void NNFS::Adagrad::update_params (std::shared_ptr<Dense> &layer)

Update the parameters of the layer.

Parameters
    [in,out]  layer  Layer to update

Implements NNFS::Optimizer.


The documentation for this class was generated from the following file: Adagrad.hpp