This repository is a reproduction of the paper "On joint parameterizations of linear and nonlinear functionals in neural networks".
"Joint parameterization of linear and nonlinear functionals" refers to representing both linear and nonlinear mappings within a single, shared parameterization of a neural network. This matters because many real-world problems (such as speech recognition, financial forecasting, or predictive maintenance) mix linear and nonlinear structure. A joint parameterization lets one network capture both kinds of relationship in the data, which can improve model accuracy.
Published: 18 June 2023
Contributors: Wassim Chakroun, Ayoub Damak
Supervisor: Dominique Pastor
Keywords: activation functions, rectified parametric sigmoid units (RePSU, RePSKU, RePSHU), (p)MISH, (p)SWISH
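Of the activations listed above, SWISH (x · sigmoid(x)) and MISH (x · tanh(softplus(x))) have standard published definitions; their parametric "(p)" variants add a trainable slope parameter. The sketch below illustrates these two in NumPy. The placement of the beta parameter inside pMISH is an assumption for illustration, not taken from the paper, and the RePSU family is not reproduced here since its exact formula is defined in the paper itself.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pswish(x, beta=1.0):
    # Parametric SWISH: x * sigmoid(beta * x).
    # With beta = 1 this reduces to the standard SWISH / SiLU.
    return x * sigmoid(beta * x)

def pmish(x, beta=1.0):
    # Parametric MISH variant: x * tanh(softplus(beta * x)).
    # NOTE: where beta enters is an assumption made for this sketch.
    # log1p(exp(.)) is softplus, written in a numerically safer form.
    return x * np.tanh(np.log1p(np.exp(beta * x)))
```

In a training framework, `beta` would be declared as a learnable parameter (e.g. an `nn.Parameter` in PyTorch) so it is optimized jointly with the weights.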
The datasets used in this work to train and evaluate the various models are as follows:
- MNIST: handwritten digits
- CIFAR-10: images in 10 classes (airplanes, cars, birds, cats, deer, dogs, frogs, horses, ships, and trucks)
- Generalized Fractional Brownian Field (GFBF): synthetic data generated by convolution operations over modulated fractional Brownian fields
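The GFBF construction itself (2-D modulated fields plus convolutions) is specific to the paper and not reproduced here. As a minimal starting point, a 1-D fractional Brownian motion path can be sampled directly from its well-known covariance, Cov(B(s), B(t)) = ½(s^2H + t^2H − |s − t|^2H), via a Cholesky factorization; the function name and defaults below are illustrative choices, not the paper's pipeline.

```python
import numpy as np

def fbm_sample(n=256, hurst=0.7, seed=0):
    """Sample one 1-D fractional Brownian motion path on (0, 1]
    by Cholesky factorization of the fBm covariance matrix.
    Toy illustration only; the GFBF dataset is a 2-D construction."""
    t = np.linspace(1.0 / n, 1.0, n)  # skip t = 0, where variance is 0
    s, u = np.meshgrid(t, t)
    # fBm covariance: 0.5 * (s^2H + t^2H - |s - t|^2H)
    cov = 0.5 * (s ** (2 * hurst) + u ** (2 * hurst)
                 - np.abs(s - u) ** (2 * hurst))
    # Small jitter keeps the factorization numerically stable.
    chol = np.linalg.cholesky(cov + 1e-12 * np.eye(n))
    rng = np.random.default_rng(seed)
    return t, chol @ rng.standard_normal(n)
```

The Hurst exponent `hurst` controls path roughness: values below 0.5 give rough, anti-persistent paths, values above 0.5 give smoother, persistent ones.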