- Eigen library
  - Move it to the path `/usr/local/include/eigen3` (or create a symbolic link): `sudo ln -s $INSTALLATION_PATH$ /usr/local/include/eigen3`
- C++20
- indicators library
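As a quick sanity check that the Eigen headers are reachable, you can compile a small program that prints Eigen's version macros. This snippet is only an illustration, not part of the library, and the include flag mentioned in the comment is an assumption based on the path above:

```cpp
#include <iostream>
#include <Eigen/Core>  // found when compiling with -I/usr/local/include/eigen3

int main() {
    // These macros are defined by Eigen itself and confirm which version is picked up.
    std::cout << "Eigen " << EIGEN_WORLD_VERSION << '.'
              << EIGEN_MAJOR_VERSION << '.'
              << EIGEN_MINOR_VERSION << std::endl;
    return 0;
}
```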
A 3D convolutional layer is constructed from the number of filters, the kernel shape, the activation function, and the input shape:

```cpp
Convolutional3D conv3d{
    N_Filters,         // number of filters (output channels)
    {3, 3, 1},         // kernel shape
    activation::relu,  // activation function
    {28, 28, 1}};      // input shape
```
To better understand the dimensions, one can use the `.print_structure()` method. For example, the output of this layer would be a Tensor with shape {26, 26, N_Filters} (assuming a stride of 1 and no padding):

```cpp
conv3d.print_structure();
```
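The spatial size of a valid convolution output follows the usual formula (input - kernel) / stride + 1. The sketch below is not part of the library; it only reproduces that arithmetic under the stride-1, no-padding assumption made above:

```cpp
#include <iostream>

// Hypothetical helper (not a library function): output size of one spatial
// dimension for a "valid" convolution with no padding.
constexpr int conv_output_size(int input, int kernel, int stride = 1) {
    return (input - kernel) / stride + 1;
}

int main() {
    // Input {28, 28, 1} with a {3, 3, 1} kernel: 28 - 3 + 1 = 26 per spatial axis,
    // so the output shape is {26, 26, N_Filters}.
    std::cout << conv_output_size(28, 3) << " x "
              << conv_output_size(28, 3) << " x N_Filters" << std::endl;
    return 0;
}
```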
A 2D Max Pooling layer is constructed from the input shape, the pooling grid, and the stride:

```cpp
Shape input_shape{4, 16};
Shape grid{2, 2};
Shape stride{2, 2};
MaxPool2D maxPool2D{input_shape, grid, stride};
```
```cpp
Shape input_shape{20, 20, 16};
Shape grid{2, 2, 1};
Shape stride{2, 2, 1};
MaxPool3D maxPool3D{input_shape, grid, stride};
```

This is an example of a 3D Max Pooling layer; its output would be a Tensor with shape {10, 10, 16}.
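The same arithmetic explains both pooling examples: each dimension shrinks to (input - grid) / stride + 1. The helper below is again only an illustration, not a library function:

```cpp
#include <iostream>

// Hypothetical helper (not a library function): output size of one dimension
// after pooling with the given grid and stride, with no padding.
constexpr int pool_output_size(int input, int grid, int stride) {
    return (input - grid) / stride + 1;
}

int main() {
    // MaxPool2D: input {4, 16}, grid {2, 2}, stride {2, 2}  ->  {2, 8}
    std::cout << pool_output_size(4, 2, 2) << ", "
              << pool_output_size(16, 2, 2) << std::endl;
    // MaxPool3D: input {20, 20, 16}, grid {2, 2, 1}, stride {2, 2, 1}  ->  {10, 10, 16}
    std::cout << pool_output_size(20, 2, 2) << ", "
              << pool_output_size(20, 2, 2) << ", "
              << pool_output_size(16, 1, 1) << std::endl;
    return 0;
}
```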
Flattening layers are used when one needs to reshape a multidimensional Tensor into a 1D Tensor, or simply a Vector. These layers are essential when one wants to combine Convolutional and Dense layers in a network.
For a 2D Tensor:

```cpp
Eigen::Tensor<double, 2> input_tensor(4, 16);
input_tensor.setRandom();  // fill with sample values

FlatteningLayer2D flattening2D;
Eigen::VectorXd flattened = flattening2D.flatten(input_tensor);  // 4 x 16 -> vector of 64 elements
Eigen::Tensor<double, 2> original_tensor = flattening2D.back_to_tensor(flattened);  // back to 4 x 16
```
For a 3D Tensor:

```cpp
Eigen::Tensor<double, 3> input_tensor(4, 16, 3);
input_tensor.setRandom();  // fill with sample values

FlatteningLayer3D flattening3D;
Eigen::VectorXd flattened = flattening3D.flatten(input_tensor);  // 4 x 16 x 3 -> vector of 192 elements
Eigen::Tensor<double, 3> original_tensor = flattening3D.back_to_tensor(flattened);  // back to 4 x 16 x 3
```
In both examples, `input_tensor` and `original_tensor` are equal.
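If you want to verify this, the difference can be reduced with standard Eigen::Tensor operations. The snippet continues the 3D example above (it assumes `<iostream>`, the Eigen Tensor header, and whatever header provides FlatteningLayer3D are already included):

```cpp
// Continues the 3D example above. maximum() with no arguments reduces over all
// dimensions, yielding a rank-0 tensor holding the largest absolute difference
// across the 4 * 16 * 3 = 192 entries; it should print 0.
Eigen::Tensor<double, 0> max_diff = (input_tensor - original_tensor).abs().maximum();
std::cout << "max |difference| = " << max_diff() << std::endl;
```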
Testing
To test the library yourself, you can download the CIFAR-10 dataset:

```sh
sh CIFAR10/download_cifar10.sh
```