NNFS
Neural network library from scratch
NNFS::Adam Class Reference

Adam optimizer - Adaptive Moment Estimation, one of the most popular and efficient gradient-based optimization algorithms. More...

#include <Adam.hpp>

Inheritance diagram for NNFS::Adam: inherits NNFS::Optimizer.

Public Member Functions

 Adam (double lr=1e-3, double decay=.0, double epsilon=1e-7, double beta_1=.9, double beta_2=.999)
 Construct a new Adam object.
 
void update_params (std::shared_ptr< Dense > &layer)
 Update the parameters of the layer.
 
- Public Member Functions inherited from NNFS::Optimizer
 Optimizer (double lr, double decay)
 Construct a new Optimizer object.
 
virtual ~Optimizer ()=default
 Basic destructor.
 
virtual void update_params (std::shared_ptr< Dense > &layer)=0
 Update the parameters of the layer.
 
void pre_update_params ()
 Pre-update parameters (e.g. learning rate decay)
 
void post_update_params ()
 Post-update parameters (e.g. increase iteration count)
 
double & current_lr ()
 Get the current learning rate.
 
int & iterations ()
 Get current iteration count.
 

Additional Inherited Members

- Protected Attributes inherited from NNFS::Optimizer
const double _lr
 
double _current_lr
 
int _iterations
 
double _decay
 

Detailed Description

Adam optimizer - Adaptive Moment Estimation, one of the most popular and efficient gradient-based optimization algorithms.

This class implements the Adam optimizer, which keeps exponentially decaying averages of past gradients (first moment) and past squared gradients (second moment), applies bias correction to both, and uses them to scale each parameter update.
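
For reference, the standard Adam update rule (Kingma & Ba) that the constructor hyperparameters correspond to is sketched below; the symbols g_t (gradient at step t) and theta (the trainable parameters) are notation introduced here, not names from the library.

\begin{aligned}
m_t       &= \beta_1 \, m_{t-1} + (1 - \beta_1) \, g_t \\
v_t       &= \beta_2 \, v_{t-1} + (1 - \beta_2) \, g_t^2 \\
\hat{m}_t &= m_t / (1 - \beta_1^t) \\
\hat{v}_t &= v_t / (1 - \beta_2^t) \\
\theta_t  &= \theta_{t-1} - \mathrm{lr} \cdot \hat{m}_t / (\sqrt{\hat{v}_t} + \epsilon)
\end{aligned}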

Constructor & Destructor Documentation

◆ Adam()

NNFS::Adam::Adam(double lr = 1e-3,
                 double decay = 0.0,
                 double epsilon = 1e-7,
                 double beta_1 = 0.9,
                 double beta_2 = 0.999)
[inline]

Construct a new Adam object.

Parameters
    lr       Learning rate (default: 1e-3)
    decay    Learning rate decay (default: 0.0)
    epsilon  Epsilon value to avoid division by zero (default: 1e-7)
    beta_1   Exponential decay rate for the first moment estimates (default: 0.9)
    beta_2   Exponential decay rate for the second moment estimates (default: 0.999)
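A minimal construction sketch, assuming the class is reachable through the <Adam.hpp> header shown above; the variable names are illustrative only:

#include <Adam.hpp>

int main()
{
    // Default hyperparameters: lr = 1e-3, decay = 0.0, epsilon = 1e-7,
    // beta_1 = 0.9, beta_2 = 0.999
    NNFS::Adam adam_default;

    // Custom learning rate and decay; the remaining defaults are kept
    NNFS::Adam adam_custom(5e-4, 1e-5);

    return 0;
}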

Member Function Documentation

◆ update_params()

void NNFS::Adam::update_params(std::shared_ptr<Dense> &layer)
[inline, virtual]

Update the parameters of the layer.

Parameters
    [in,out]  layer  Layer to update

Implements NNFS::Optimizer.
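
A hedged sketch of how update_params is typically driven together with the inherited pre_update_params() and post_update_params() hooks. The <Dense.hpp> header name and the assumption that the layer's gradients have already been filled in by a backward pass are assumptions, not part of this page:

#include <memory>
#include <Adam.hpp>
#include <Dense.hpp>  // assumed header for NNFS::Dense

void apply_adam_step(NNFS::Adam &optimizer, std::shared_ptr<NNFS::Dense> &layer)
{
    // Inherited from NNFS::Optimizer: apply learning-rate decay before the update
    optimizer.pre_update_params();

    // Adam update of the layer's parameters using its stored gradients
    optimizer.update_params(layer);

    // Inherited from NNFS::Optimizer: advance the iteration counter
    optimizer.post_update_params();
}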


The documentation for this class was generated from the following file: Adam.hpp