NNFS
Neural network library from scratch

Classes

| Class | Description |
| --- | --- |
| Activation | Base class for all activation functions. |
| Adagrad | Adagrad (Adaptive Gradient) optimizer. |
| Adam | Adam (Adaptive Moment Estimation) optimizer, one of the most popular and efficient gradient-based optimization algorithms. |
| CCE | Cross-entropy loss function. |
| CCESoftmax | Cross-entropy loss function with softmax activation. |
| Dense | Dense layer. |
| Layer | Base class for all layers. |
| Loss | Base class for all loss functions. |
| Metrics | Metrics class. |
| Model | Abstract base class for the model in a neural network. |
| NeuralNetwork | A neural network model. |
| Optimizer | Base class for all optimizers. |
| ReLU | ReLU activation function. |
| RMSProp | Root Mean Square Propagation optimizer. |
| SGD | Stochastic Gradient Descent optimizer. |
| Sigmoid | Sigmoid activation function. |
| Softmax | Softmax activation function. |
| Tanh | Tanh activation function. |
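
The classes above compose in the usual layers / loss / optimizer pattern. The sketch below is a minimal, hypothetical usage example only: the umbrella header name, the NNFS namespace qualification, the constructor arguments, and the `add_layer`/`compile`/`fit` method names are assumptions made for illustration and are not documented on this index page; consult the individual class pages for the real signatures.

```cpp
// Hypothetical usage sketch -- header path, namespace, constructor arguments,
// and the add_layer/compile/fit method names are assumptions, not taken from
// this index page.
#include "NNFS/NNFS.hpp"   // assumed umbrella header
#include <memory>

int main() {
    using namespace NNFS;  // assumed namespace

    // Stack Dense layers with ReLU activations; the CCESoftmax loss below
    // applies the softmax itself, so no explicit Softmax layer is added.
    auto model = std::make_shared<NeuralNetwork>();
    model->add_layer(std::make_shared<Dense>(784, 128));  // assumed (inputs, outputs) ctor
    model->add_layer(std::make_shared<ReLU>());
    model->add_layer(std::make_shared<Dense>(128, 10));

    // Pair the fused cross-entropy + softmax loss with the Adam optimizer.
    auto loss = std::make_shared<CCESoftmax>();
    auto optimizer = std::make_shared<Adam>(0.001);        // assumed learning-rate ctor

    model->compile(loss, optimizer);                       // assumed API
    // model->fit(x_train, y_train, /*epochs=*/10, /*batch_size=*/32);  // assumed API
    return 0;
}
```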

Enumerations

| Enum class | Description |
| --- | --- |
| ActivationType { RELU, SIGMOID, TANH, SOFTMAX, NONE } | Enum class for activation types. |
| LayerType { DENSE, ACTIVATION } | Enum class for layer types. |
| LossType { CCE, CCE_SOFTMAX } | Enum class for loss types. |
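
As a small illustration of these scoped enums, the sketch below maps each documented ActivationType value to a display name. Only the enumerator names come from the table above; the header path and NNFS namespace qualification are assumptions.

```cpp
// Minimal sketch: maps the documented ActivationType values to names.
// The header path and NNFS:: qualification are assumptions; only the
// enumerator names come from the table above.
#include "NNFS/NNFS.hpp"   // assumed umbrella header
#include <string>

std::string activation_name(NNFS::ActivationType type) {
    switch (type) {
        case NNFS::ActivationType::RELU:    return "ReLU";
        case NNFS::ActivationType::SIGMOID: return "Sigmoid";
        case NNFS::ActivationType::TANH:    return "Tanh";
        case NNFS::ActivationType::SOFTMAX: return "Softmax";
        case NNFS::ActivationType::NONE:    return "None";
    }
    return "Unknown";
}
```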