From 220db379dac97b911188647738ce1a2799faab86 Mon Sep 17 00:00:00 2001
From: Luigi Selmi
Date: Wed, 6 Sep 2023 18:18:52 +0200
Subject: [PATCH] minor change

---
 python/iia/crop_production.ipynb | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/python/iia/crop_production.ipynb b/python/iia/crop_production.ipynb
index 09cb20e..d3d3880 100755
--- a/python/iia/crop_production.ipynb
+++ b/python/iia/crop_production.ipynb
@@ -235,10 +235,10 @@
 "\n",
 "* Model: neural network with one hidden layer using the tanh activation function (see [scikit-learn Multi-layer Perceptron](https://scikit-learn.org/stable/modules/neural_networks_supervised.html))\n",
 "* Cost function: quadratic loss (or mean squared error) with momentum regularization\n",
+ "* Training procedure: [leave-one-out](https://scikit-learn.org/stable/modules/cross_validation.html#leave-one-out-loo) cross validation (one example used for test, a subset (10% random examples) used for validation and the rest for training) with [early stopping](https://scikit-learn.org/stable/auto_examples/linear_model/plot_sgd_early_stopping.html)\n",
 "* Optimization algorithm: stochastic gradient descent (fixed learning rate) or Quasi-Newton BFGS (see scikit-learn Multi-layer Perceptron [algorithms](https://scikit-learn.org/stable/modules/neural_networks_supervised.html#algorithms)) \n",
 "* Data pre-processing: mean normalization \n",
- "* Training procedure: [leave-one-out](https://scikit-learn.org/stable/modules/cross_validation.html#leave-one-out-loo) cross validation (one example used for test, a subset (10% random examples) used for validation and the rest for training) with [early stopping](https://scikit-learn.org/stable/auto_examples/linear_model/plot_sgd_early_stopping.html)\n",
- "* [Ensemble method](https://scikit-learn.org/stable/modules/ensemble.html): neural network model with different weights initialization and validation set \n",
+ "* [Ensemble method](https://scikit-learn.org/stable/modules/ensemble.html): neural network models with different weight initializations and validation sets (bootstrap replicates: sampling with replacement from the training dataset after the test set has been selected)\n",
 "\n",
 "\n",
 "We create an MLP with one hidden layer to approximate a function that represents the yield of a crop (maize or millet) depending on the yearly values of precipitation, temperature, number of hours with temperatures above 30°C, amount of nitrogen fertilizer, and amount of manure fertilizer. We use a 4-5-1 architecture. We can compute the preactivation of the hidden layer by a matrix multiplication between the units of the input layer and the weights of the hidden layer.\n",
@@ -274,7 +274,7 @@
 "metadata": {},
 "source": [
 "## Ensemble method\n",
- "The methodology can be described algorithmically. The same procedure is used for the two ensembles, one for maize model (4 input units, 5 hidden, 1 output) and one for millet model (5 input units, 5 hidden, 1 output)\n"
+ "The methodology can be described algorithmically. The same procedure is used for the two ensembles, one for the maize model (4 input units, 5 hidden, 1 output) and one for the millet model (5 input units, 5 hidden, 1 output). The construction of the predictors from the bootstrap samples can be done in parallel on a multicore CPU.\n"
 ]
},
{
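The procedure described in the hunks above (leave-one-out cross validation around a bagged ensemble of single-hidden-layer tanh MLPs with early stopping and mean normalization) does not appear as code in this patch. The sketch below shows one way it could look with scikit-learn; it is illustrative only and not part of the notebook. The helper name `loo_bagged_mlp`, the number of bootstrap models, the learning-rate and iteration settings, and the use of `StandardScaler` as a stand-in for the mean normalization step are all assumptions.

```python
# Hypothetical sketch of the described training procedure, not part of the patch.
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.utils import resample


def loo_bagged_mlp(X, y, n_models=10, seed=0):
    """Leave-one-out prediction with a bagged ensemble of MLPs (hypothetical helper)."""
    X, y = np.asarray(X, dtype=float), np.asarray(y, dtype=float)
    predictions = np.zeros(len(y))
    for train_idx, test_idx in LeaveOneOut().split(X):
        # Pre-processing fitted on the training fold only (standardization is
        # used here as a stand-in for the mean normalization step).
        scaler = StandardScaler().fit(X[train_idx])
        X_train, X_test = scaler.transform(X[train_idx]), scaler.transform(X[test_idx])
        y_train = y[train_idx]
        fold_preds = []
        for m in range(n_models):
            # Bootstrap replicate: sample the training fold with replacement
            # after the test example has been set aside.
            boot = resample(np.arange(len(train_idx)), replace=True, random_state=seed + m)
            mlp = MLPRegressor(
                hidden_layer_sizes=(5,),    # one hidden layer of 5 units
                activation="tanh",
                solver="sgd",               # or "lbfgs" for Quasi-Newton BFGS
                learning_rate="constant",   # fixed learning rate
                early_stopping=True,        # hold out part of the fold for validation
                validation_fraction=0.1,    # 10% of the training fold
                max_iter=2000,
                random_state=seed + m,      # different weight initialization per model
            )
            mlp.fit(X_train[boot], y_train[boot])
            fold_preds.append(mlp.predict(X_test)[0])
        # Ensemble prediction for the held-out example: average over the models.
        predictions[test_idx[0]] = np.mean(fold_preds)
    return predictions
```

Because each bootstrap model in the inner loop is trained independently, the loop over `n_models` could be dispatched with `joblib.Parallel` to use several cores, in line with the sentence added in the second hunk about building the predictors in parallel on a multicore CPU.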