FannToolExpanded Features

eddybogosian edited this page Mar 27, 2020 · 6 revisions


This is a non-exhaustive list of FannToolExpanded features, covering both the features added in this fork and those inherited from the original FannTool project.


Log

The log outputs the current epoch, the training error and, if a test file is set, the testing error. It also shows the up-to-date results of the Detect Functions.

Graphic

The "Graphic" tab contains a graph showing how the training error rate evolves over the training epochs.

Fine Tuning Options

The Fine Tuning tab contains the parameters that are used during training and testing. Here is a brief explanation of each:

  • Desired Error (MSE)
  • Bit Fail Limit
  • Connection Rate
  • Learning Rate
  • Error Function
  • Hidden Activation Steepness
  • Output Activation Steepness
  • Momentum
  • Quickprop Decay Factor
  • Quickprop Mu Factor
  • RPROP Increase Factor
  • RPROP Decrease Factor
  • RPROP Minimum Step-Size
  • RPROP Maximum Step-Size
  • Initialize the Weights: creates random initial weights for the connections in the neural network.
  • Overtraining Caution System: when enabled, FannToolExpanded both trains and tests the neural network, so you can watch for overfitting. You must have a test file for this to work.
  • Shuffle Train Data: shuffles the order of the lines in your training data set. Only whole lines are moved; the inputs within a line are not reordered.
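The last two options above can be illustrated with a short sketch. FannToolExpanded itself is a C++ tool built on the FANN library; the Python below only models the described behavior, with `train_step` and `test_error` as hypothetical caller-supplied stubs:

```python
import random

def shuffle_train_data(patterns, seed=None):
    """Shuffle whole training patterns (lines). Each input/output
    pair stays together -- only the order of the lines changes."""
    shuffled = list(patterns)
    random.Random(seed).shuffle(shuffled)
    return shuffled

def train_with_overtraining_caution(train_step, test_error,
                                    max_epochs, patience=50):
    """Train while also testing each epoch, and stop early once the
    test error has not improved for `patience` epochs -- the usual
    guard against overfitting that the caution system implies."""
    best_test, best_epoch = float("inf"), 0
    for epoch in range(1, max_epochs + 1):
        train_step()        # one epoch on the training set
        err = test_error()  # error on the held-out test file
        if err < best_test:
            best_test, best_epoch = err, epoch
        elif epoch - best_epoch >= patience:
            break           # test error stopped improving
    return best_test, best_epoch
```

Note that `shuffle_train_data` returns the same patterns in a new order, which mirrors the tool's promise that shuffling only shifts lines.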

Cascade Tuning

These are the options for the Cascade Training Method.

Detect Functions

There are two Detect Functions: Detect Optimum Training Algorithm and Detect Optimum Activation Function.

The Detect Optimum Training Algorithm function will train on the supplied data set for 2,000 epochs with each of the 5 training algorithms:

Fann_Train_Incremental, Batch, Rprop, Quickprop, and Sarprop.

At the end of the training session, the log will show in red which training algorithm had the lowest error rate.
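The selection logic amounts to a simple sweep. As a sketch only (the real tool does this in C++ via FANN), with `train_and_score` as a hypothetical stub that trains a fresh network and returns its final MSE:

```python
# The five FANN training algorithms the detection sweeps over.
TRAINING_ALGORITHMS = ["INCREMENTAL", "BATCH", "RPROP",
                       "QUICKPROP", "SARPROP"]

def detect_optimum_training_algorithm(train_and_score, epochs=2000):
    """Train a fresh network with each algorithm for `epochs` epochs
    and return the algorithm that reached the lowest error,
    along with every algorithm's score."""
    results = {alg: train_and_score(alg, epochs)
               for alg in TRAINING_ALGORITHMS}
    best = min(results, key=results.get)
    return best, results
```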

The Detect Optimum Activation Function will try all combinations of the activation functions for the hidden nodes and the output nodes. The program does this by training for 2,000 epochs with each of the 13 activation functions in turn:

Fann_Linear, Sigmoid, Sigmoid_Stepwise, Sigmoid_Symmetric, Sigmoid_Symmetric_Stepwise, Gaussian, Gaussian_Symmetric, Elliot, Elliot_Symmetric, Linear_Piece, Linear_Piece_Symmetric, Sin_Symmetric, and Cos_Symmetric.

As with the training algorithm detection, after trying all combinations the program will output in the log which combination of activation functions had the lowest error rate.
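Trying every hidden/output pairing is a grid search over 13 × 13 = 169 combinations. A minimal sketch, again with a hypothetical `train_and_score` stub standing in for the tool's actual FANN training run:

```python
from itertools import product

# The 13 FANN activation functions the detection sweeps over.
ACTIVATION_FUNCTIONS = [
    "LINEAR", "SIGMOID", "SIGMOID_STEPWISE", "SIGMOID_SYMMETRIC",
    "SIGMOID_SYMMETRIC_STEPWISE", "GAUSSIAN", "GAUSSIAN_SYMMETRIC",
    "ELLIOT", "ELLIOT_SYMMETRIC", "LINEAR_PIECE",
    "LINEAR_PIECE_SYMMETRIC", "SIN_SYMMETRIC", "COS_SYMMETRIC",
]

def detect_optimum_activation_functions(train_and_score, epochs=2000):
    """Try every (hidden, output) activation combination, training
    for `epochs` epochs each, and return the pair with the lowest
    error, along with every combination's score."""
    results = {
        (hidden, output): train_and_score(hidden, output, epochs)
        for hidden, output in product(ACTIVATION_FUNCTIONS, repeat=2)
    }
    best = min(results, key=results.get)
    return best, results
```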
