FlexNN 1
Fully connected neural network built from scratch with flexible n-layer design and multiple activations.
Represents a single layer in a neural network.
#include <Layer.h>
Public Member Functions

Layer(int inputSize, int outputSize, const std::string &activationFunction = "relu")
    Constructor for the Layer class.
Eigen::MatrixXd getWeights() const
    Getter for the weights.
Eigen::VectorXd getBiases() const
    Getter for the biases.
void updateWeights(const Eigen::MatrixXd &dW, const Eigen::VectorXd &db, double learningRate)
    Update weights and biases.
std::pair<Eigen::MatrixXd, Eigen::MatrixXd> forward(const Eigen::MatrixXd &input)
    Forward pass through the layer.
Eigen::MatrixXd backward(const Eigen::MatrixXd &nextW, const Eigen::MatrixXd &nextdZ, const Eigen::MatrixXd &currZ)
    Backward pass through the layer.
The Layer class encapsulates the properties and methods required for a neural network layer, including weights, biases, forward and backward passes, and weight updates.
This class is designed to be flexible and can be used with different activation functions. It supports both relu and softmax activation functions by default, but can be extended to include others.
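The two default activations can be sketched in plain C++ (without Eigen). This is an illustration of the standard definitions, not FlexNN's actual implementation; the max-shift in softmax is a common numerical-stability trick that FlexNN may or may not use.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// ReLU applied element-wise: max(0, z).
std::vector<double> relu(const std::vector<double>& z) {
    std::vector<double> a(z.size());
    for (std::size_t i = 0; i < z.size(); ++i) a[i] = std::max(0.0, z[i]);
    return a;
}

// Softmax over one vector: exponentiate (shifted by the max for stability)
// and normalize so the outputs sum to 1.
std::vector<double> softmax(const std::vector<double>& z) {
    const double m = *std::max_element(z.begin(), z.end());
    std::vector<double> a(z.size());
    double sum = 0.0;
    for (std::size_t i = 0; i < z.size(); ++i) {
        a[i] = std::exp(z[i] - m);
        sum += a[i];
    }
    for (double& v : a) v /= sum;
    return a;
}
```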
FlexNN::Layer::Layer(int inputSize, int outputSize, const std::string &activationFunction = "relu")
Constructor for the Layer class.
Initializes the layer with random weights and biases.
Parameters:
    inputSize: The size of the input to this layer.
    outputSize: The size of the output from this layer (also the number of neurons in this layer).
    activationFunction: The activation function used in this layer (default "relu").
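The documentation says the weights and biases are initialized randomly but does not state the scheme. A common choice for relu layers is He initialization (zero-mean Gaussian with variance 2/inputSize); the sketch below illustrates that idea in plain C++ and is an assumption, not FlexNN's actual code.

```cpp
#include <cmath>
#include <random>
#include <vector>

// Hypothetical He-style initialization: weights ~ N(0, 2/inputSize).
// Biases are typically started at zero alongside this.
std::vector<std::vector<double>> initWeights(int inputSize, int outputSize,
                                             unsigned seed = 42) {
    std::mt19937 gen(seed);
    std::normal_distribution<double> dist(0.0, std::sqrt(2.0 / inputSize));
    // One row per neuron, one column per input.
    std::vector<std::vector<double>> W(outputSize,
                                       std::vector<double>(inputSize));
    for (auto& row : W)
        for (double& w : row) w = dist(gen);
    return W;
}
```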
Eigen::MatrixXd FlexNN::Layer::backward(const Eigen::MatrixXd &nextW, const Eigen::MatrixXd &nextdZ, const Eigen::MatrixXd &currZ)
Backward pass through the layer.
This method computes the gradient of the loss with respect to the inputs of this layer given the gradients from the next layer.
Parameters:
    nextW: The weights of the next layer.
    nextdZ: The gradients from the next layer.
    currZ: The linear output (Z) of this layer.
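Given these parameters, the backward rule for a relu layer is presumably dZ = (nextW^T * nextdZ) gated element-wise by relu'(currZ). The sketch below shows that computation for a single sample using plain vectors; the shapes and the relu assumption are mine, not taken from FlexNN's source.

```cpp
#include <vector>

// Backward rule sketch for a relu layer (single sample):
// dA = nextW^T * nextdZ, then dZ = dA where currZ > 0, else 0.
std::vector<double> backwardRelu(
    const std::vector<std::vector<double>>& nextW,  // (nextOut x thisOut)
    const std::vector<double>& nextdZ,              // length nextOut
    const std::vector<double>& currZ) {             // length thisOut
    std::vector<double> dZ(currZ.size(), 0.0);
    for (std::size_t j = 0; j < dZ.size(); ++j) {
        // Transpose of nextW maps the next layer's gradient back to this layer.
        double dA = 0.0;
        for (std::size_t i = 0; i < nextdZ.size(); ++i)
            dA += nextW[i][j] * nextdZ[i];
        dZ[j] = (currZ[j] > 0.0) ? dA : 0.0;  // relu derivative gate
    }
    return dZ;
}
```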
std::pair<Eigen::MatrixXd, Eigen::MatrixXd> FlexNN::Layer::forward(const Eigen::MatrixXd &input)
Forward pass through the layer.
This method computes the output of the layer given an input matrix. It applies the activation function to the linear combination of inputs and weights.
Parameters:
    input: The input data for the forward pass.
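Since backward later consumes the linear output currZ, the returned pair is presumably (activation A, linear output Z). A plain-C++ sketch of the forward rule for one sample, assuming a relu layer; the pair ordering and shapes are assumptions:

```cpp
#include <algorithm>
#include <utility>
#include <vector>

// Forward rule sketch: Z = W*x + b, A = relu(Z); returns {A, Z},
// mirroring Layer::forward's pair return type (ordering assumed).
std::pair<std::vector<double>, std::vector<double>>
forwardRelu(const std::vector<std::vector<double>>& W,  // (out x in)
            const std::vector<double>& b,               // length out
            const std::vector<double>& x) {             // length in
    std::vector<double> Z(W.size()), A(W.size());
    for (std::size_t i = 0; i < W.size(); ++i) {
        double z = b[i];
        for (std::size_t j = 0; j < x.size(); ++j) z += W[i][j] * x[j];
        Z[i] = z;
        A[i] = std::max(0.0, z);  // relu activation
    }
    return {A, Z};
}
```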
Eigen::VectorXd FlexNN::Layer::getBiases() const
Getter for the biases.
Returns the bias vector of the layer.
Eigen::MatrixXd FlexNN::Layer::getWeights() const
Getter for the weights.
Returns the weight matrix of the layer.
void FlexNN::Layer::updateWeights(const Eigen::MatrixXd &dW, const Eigen::VectorXd &db, double learningRate)
Update weights and biases.
This method updates the weights and biases of the layer using the provided gradients and a specified learning rate.
Parameters:
    dW: The gradient of the weights.
    db: The gradient of the biases.
    learningRate: The learning rate for updating the weights and biases.