Adagrad optimizer (Adaptive Gradient)
#include <Adagrad.hpp>
This class implements the Adagrad (Adaptive Gradient) optimizer, which adapts each parameter's step size by scaling the learning rate with the inverse square root of that parameter's accumulated squared gradients.
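For reference, the textbook Adagrad update is sketched below; the exact implementation lives in Adagrad.hpp, and the decay schedule shown is a common convention assumed here, not confirmed by this page:

    c_t = c_{t-1} + g_t^{2}
    \theta_t = \theta_{t-1} - \frac{\eta_t}{\sqrt{c_t} + \epsilon}\, g_t
    \eta_t = \frac{\eta_0}{1 + \mathrm{decay} \cdot t}

Here g_t is the gradient, c_t the per-parameter cache of squared gradients, \eta_0 the constructor's lr, and \epsilon the constructor's epsilon guarding against division by zero.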
◆ Adagrad()
NNFS::Adagrad::Adagrad(double lr, double decay = 0.0, double epsilon = 1e-7)    [inline]
Construct a new Adagrad object.
Parameters
  lr      | Learning rate
  decay   | Learning rate decay (default: 0.0)
  epsilon | Epsilon value to avoid division by zero (default: 1e-7)
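A minimal construction sketch based on the signature documented above; it assumes Adagrad.hpp is on the include path, as the include directive at the top of this page suggests:

    #include <Adagrad.hpp>

    int main()
    {
        // Learning rate only; decay and epsilon keep their defaults (0.0 and 1e-7).
        NNFS::Adagrad optimizer(0.01);

        // All three parameters given explicitly.
        NNFS::Adagrad tuned(0.01, /*decay=*/1e-4, /*epsilon=*/1e-7);
        return 0;
    }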
◆ update_params()
void NNFS::Adagrad::update_params(std::shared_ptr< Dense > &layer)    [inline, virtual]
Update the parameters of the layer.
Parameters
  [in,out] layer | Layer to update
Implements NNFS::Optimizer.
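For intuition, here is a minimal sketch of the per-parameter work an Adagrad update typically performs. The flat vectors and the names params, grads, and cache are hypothetical stand-ins, not the Dense layer's actual members:

    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Hypothetical illustration of one Adagrad step over flattened parameters.
    // 'params', 'grads', and 'cache' stand in for a layer's weights, their
    // gradients, and the running sum of squared gradients; none of these
    // names are taken from the NNFS API.
    void adagrad_step(std::vector<double> &params,
                      const std::vector<double> &grads,
                      std::vector<double> &cache,
                      double lr, double epsilon)
    {
        for (std::size_t i = 0; i < params.size(); ++i)
        {
            cache[i] += grads[i] * grads[i];                               // accumulate squared gradients
            params[i] -= lr * grads[i] / (std::sqrt(cache[i]) + epsilon);  // scaled step
        }
    }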
The documentation for this class was generated from the following file: Adagrad.hpp