ReLU activation function.
#include <ReLU.hpp>
| Member | Description |
|---|---|
| ReLU () | Construct a new ReLU object. |
| void forward (Eigen::MatrixXd &out, const Eigen::MatrixXd &x) override | Forward pass of the ReLU activation function. |
| void backward (Eigen::MatrixXd &out, const Eigen::MatrixXd &dx) override | Backward pass of the ReLU activation function. |
| Activation (ActivationType activation_type) | Construct a new Activation object. |
| Layer (LayerType type) | Construct a new Layer object. |
| virtual ~Layer ()=default | Basic destructor. |
| virtual void forward (Eigen::MatrixXd &out, const Eigen::MatrixXd &x)=0 | Forward pass of the layer. |
| virtual void backward (Eigen::MatrixXd &out, const Eigen::MatrixXd &dx)=0 | Backward pass of the layer. |
ReLU activation function.
This class implements the ReLU (Rectified Linear Unit) activation function, ReLU(x) = max(0, x), applied elementwise to its input.
◆ ReLU()
Construct a new ReLU object.
◆ backward()
void NNFS::ReLU::backward (Eigen::MatrixXd &out, const Eigen::MatrixXd &dx)  [inline, override, virtual]
Backward pass of the ReLU activation function.
- Parameters
| [out] | out | Gradient with respect to the layer input |
| [in] | dx | Incoming gradient with respect to the layer output |
Implements NNFS::Layer.
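The ReLU derivative is 1 where the forward input was positive and 0 elsewhere, so the backward pass simply masks the incoming gradient. A minimal sketch using plain std::vector in place of Eigen::MatrixXd; passing the cached forward input explicitly is an assumption made here for self-containment (the real NNFS::ReLU presumably stores it as a member during forward()):

```cpp
#include <cstddef>
#include <vector>

// ReLU backward: out[i] = dx[i] if the cached forward input was
// positive, else 0. x_cached is a hypothetical explicit argument
// standing in for state the layer would normally keep internally.
std::vector<double> relu_backward(const std::vector<double> &dx,
                                  const std::vector<double> &x_cached) {
    std::vector<double> out(dx.size());
    for (std::size_t i = 0; i < dx.size(); ++i)
        out[i] = (x_cached[i] > 0.0) ? dx[i] : 0.0;
    return out;
}
```

Because the mask depends on the forward input, forward() must run before backward() on each batch.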
◆ forward()
void NNFS::ReLU::forward (Eigen::MatrixXd &out, const Eigen::MatrixXd &x)  [inline, override, virtual]
Forward pass of the ReLU activation function.
- Parameters
| [out] | out | Output of the ReLU activation function |
| [in] | x | Input to the ReLU activation function |
Implements NNFS::Layer.
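The forward pass applies max(0, x) to every element. A minimal sketch using plain std::vector in place of Eigen::MatrixXd (an Eigen implementation would typically use something like a coefficient-wise max instead of an explicit loop):

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Elementwise ReLU: out[i] = max(0, x[i]).
// Negative entries are clamped to zero; non-negative entries pass through.
std::vector<double> relu_forward(const std::vector<double> &x) {
    std::vector<double> out(x.size());
    for (std::size_t i = 0; i < x.size(); ++i)
        out[i] = std::max(0.0, x[i]);
    return out;
}
```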
The documentation for this class was generated from the following file: ReLU.hpp