NNFS
Neural network library from scratch

Base class for all optimizers.
#include <Optimizer.hpp>
Public Member Functions

  Optimizer(double lr, double decay)
      Construct a new Optimizer object.

  virtual ~Optimizer() = default
      Basic destructor.

  virtual void update_params(std::shared_ptr<Dense> &layer) = 0
      Update the parameters of the layer.

  void pre_update_params()
      Pre-update parameters (e.g. learning rate decay).

  void post_update_params()
      Post-update parameters (e.g. increase iteration count).

  double &current_lr()
      Get the current learning rate.

  int &iterations()
      Get the current iteration count.

Protected Attributes

  const double _lr
  double _current_lr
  int _iterations
  double _decay
Detailed Description

This class is the base class for all optimizers in NNFS. It defines the common interface that every concrete optimizer implements.
Constructor & Destructor Documentation

Optimizer() [inline]

  Construct a new Optimizer object.

  Parameters:
    lr     Learning rate
    decay  Learning rate decay (default: 0.0)

~Optimizer() [virtual, default]

  Basic destructor.

Member Function Documentation

current_lr() [inline]

  Get the current learning rate.

iterations() [inline]

  Get the current iteration count.

post_update_params() [inline]

  Post-update parameters (e.g. increase the iteration count).

pre_update_params() [inline]

  Pre-update parameters (e.g. apply learning rate decay).
|
pure virtual |
Update the parameters of the layer.
| [in,out] | layer | Layer to update |
Implemented in NNFS::Adagrad, NNFS::Adam, NNFS::RMSProp, and NNFS::SGD.
Member Data Documentation

const double _lr [protected]

double _current_lr [protected]

int _iterations [protected]

double _decay [protected]