Commit
CircleCI build openturns 16553
CircleCI committed Mar 20, 2024
1 parent 4506a3c commit aea7ccc
Showing 308 changed files with 6,464 additions and 2,948 deletions.
@@ -1,5 +1,5 @@
"""
Kriging :configure the optimization solver
Kriging: configure the optimization solver
==========================================
"""
# %%
@@ -25,7 +25,7 @@
# Often, the parameter :math:`{\bf \theta}` is a scale parameter.
# This step involves an optimization algorithm.
#
# All these parameters are estimated with the `GeneralLinearModelAlgorithm` class.
# All these parameters are estimated with the :class:`~openturns.GeneralLinearModelAlgorithm` class.
#
# The estimation of the :math:`{\bf \theta}` parameters is the step which has the highest CPU cost.
# Moreover, the maximization of the likelihood may be associated with difficulties, e.g. many local maxima or even the non-convergence of the optimization algorithm.
@@ -75,7 +75,7 @@
II.setDescription("I")

# %%
# Finally, we define the dependency using a `NormalCopula`.
# Finally, we define the dependency using a :class:`~openturns.NormalCopula`.

# %%
dim = 4 # number of inputs
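# Editor's illustrative sketch of this step (not part of the commit): the
# marginal distributions and the correlation value below are placeholders,
# only the NormalCopula construction is taken from the text above.
R = ot.CorrelationMatrix(dim)
R[2, 3] = 0.3  # hypothetical correlation between the last two inputs
copula = ot.NormalCopula(R)
marginals = [ot.Normal(0.0, 1.0) for i in range(dim)]  # placeholder marginals
myDistribution = ot.JointDistribution(marginals, copula)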
@@ -89,7 +89,8 @@
# --------------------------------

# %%
# We consider a simple Monte-Carlo sampling as a design of experiments. This is why we generate an input sample using the `getSample` method of the distribution.
# We consider a simple Monte-Carlo sampling as a design of experiments.
# This is why we generate an input sample using the `getSample` method of the distribution.
# Then we evaluate the output using the `model` function.

# %%
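# Editor's illustrative sketch of this step (not part of the commit); the
# training sample size below is an assumption.
sampleSize_train = 20
X_train = myDistribution.getSample(sampleSize_train)
Y_train = model(X_train)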
@@ -102,9 +103,9 @@
# --------------------

# %%
# In order to create the kriging metamodel, we first select a constant trend with the `ConstantBasisFactory` class.
# In order to create the kriging metamodel, we first select a constant trend with the :class:`~openturns.ConstantBasisFactory` class.
# Then we use a squared exponential covariance model.
# Finally, we use the `KrigingAlgorithm` class to create the kriging metamodel,
# Finally, we use the :class:`~openturns.KrigingAlgorithm` class to create the kriging metamodel,
# taking the training sample, the covariance model and the trend basis as input arguments.

# %%
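# Editor's illustrative sketch of the elided cell, based on the classes named
# above; the exact arguments used in the example may differ.
basis = ot.ConstantBasisFactory(dim).build()
covarianceModel = ot.SquaredExponential(dim)
algo = ot.KrigingAlgorithm(X_train, Y_train, covarianceModel, basis)
algo.run()
result = algo.getResult()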
@@ -132,9 +133,10 @@
krigingMetamodel = result.getMetaModel()

# %%
# The `run` method has optimized the hyperparameters of the metamodel.
# The :meth:`~openturns.KrigingAlgorithm.run` method has optimized the hyperparameters of the metamodel.
#
# We can then print the constant trend of the metamodel, which have been estimated using the least squares method.
# We can then print the constant trend of the metamodel, which has been
# estimated using the least squares method.

# %%
result.getTrendCoefficients()
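# The optimized covariance model can be inspected in the same way (editor's
# sketch, not shown in this diff).
result.getCovarianceModel()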
@@ -151,7 +153,10 @@
# ---------------------------

# %%
# The `getOptimizationAlgorithm` method returns the optimization algorithm used to optimize the :math:`{\bf \theta}` parameters of the `SquaredExponential` covariance model.
# The :meth:`~openturns.KrigingAlgorithm.getOptimizationAlgorithm` method
# returns the optimization algorithm used to optimize the
# :math:`{\bf \theta}` parameters of the
# :class:`~openturns.SquaredExponential` covariance model.

# %%
solver = algo.getOptimizationAlgorithm()
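# Editor's sketch of the elided step: the actual solver is accessed through
# its implementation.
solverImplementation = solver.getImplementation()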
@@ -164,7 +169,10 @@
solverImplementation.getClassName()

# %%
# The `getOptimizationBounds` method returns the bounds. The dimension of these bounds correspond to the spatial dimension of the covariance model.
# The :meth:`~openturns.KrigingAlgorithm.getOptimizationBounds` method
# returns the bounds.
# The dimension of these bounds corresponds to the spatial dimension of
# the covariance model.
# In the metamodeling context, this corresponds to the input dimension of the model.

# %%
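# Editor's sketch of the elided lines: retrieve the optimization bounds as an
# Interval and extract its lower and upper bounds.
bounds = algo.getOptimizationBounds()
lbounds = bounds.getLowerBound()
print(lbounds)
ubounds = bounds.getUpperBound()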
@@ -182,7 +190,7 @@
print(ubounds)

# %%
# The `getOptimizeParameters` method returns `True` if these parameters are to be optimized.
# The :meth:`~openturns.KrigingAlgorithm.getOptimizeParameters` method returns `True` if these parameters are to be optimized.

# %%
isOptimize = algo.getOptimizeParameters()
@@ -195,7 +203,8 @@

# %%
# The starting point of the optimization is based on the parameters of the covariance model.
# In the following example, we configure the parameters of the covariance model to the arbitrary values `[12.,34.,56.,78.]`.
# In the following example, we configure the parameters of the covariance model to
# the arbitrary values `[12.0, 34.0, 56.0, 78.0]`.

# %%
covarianceModel = ot.SquaredExponential([12.0, 34.0, 56.0, 78.0], [1.0])
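# Editor's sketch (assumption): the elided lines presumably re-create the
# algorithm with this covariance model, so that these values become the
# starting point of the optimization, and then run it.
algo = ot.KrigingAlgorithm(X_train, Y_train, covarianceModel, basis)
algo.run()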
@@ -226,15 +235,16 @@

# %%
# It is sometimes useful to completely disable the optimization of the parameters.
# In order to see the effect of this, we first initialize the parameters of the covariance model with the arbitrary values `[12.,34.,56.,78.]`.
# In order to see the effect of this, we first initialize the parameters of
# the covariance model with the arbitrary values `[12.0, 34.0, 56.0, 78.0]`.

# %%
covarianceModel = ot.SquaredExponential([12.0, 34.0, 56.0, 78.0], [91.0])
algo = ot.KrigingAlgorithm(X_train, Y_train, covarianceModel, basis)
algo.setOptimizationBounds(scaleOptimizationBounds) # Trick B

# %%
# The `setOptimizeParameters` method can be used to disable the optimization of the parameters.
# The :meth:`~openturns.KrigingAlgorithm.setOptimizeParameters` method can be
# used to disable the optimization of the parameters.

# %%
algo.setOptimizeParameters(False)
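# Editor's sketch of the likely follow-up: run the algorithm and check that
# the covariance model parameters are left unchanged.
algo.run()
result = algo.getResult()
print(result.getCovarianceModel())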
@@ -361,25 +371,31 @@ def printCovarianceParameterChange(covarianceModel1, covarianceModel2):
# ----------------------------------------

# %%
# The following example checks the robustness of the optimization of the kriging algorithm with respect to
# the optimization of the likelihood function in the covariance model parameters estimation.
# We use a `MultiStart` algorithm in order to avoid to be trapped by a local minimum.
# Furthermore, we generate the design of experiments using a `LHSExperiments`, which guarantees that the points will fill the space.
# The following example checks the robustness of the optimization of the
# kriging algorithm with respect to the optimization of the likelihood
# function used to estimate the covariance model parameters.
# We use a :class:`~openturns.MultiStart` algorithm in order to avoid being trapped in a local minimum.
# Furthermore, we generate the design of experiments using a
# :class:`~openturns.LHSExperiment`, which helps the points
# fill the space.

# %%
sampleSize_train = 10
X_train = myDistribution.getSample(sampleSize_train)
Y_train = model(X_train)

# %%
# First, we create a multivariate distribution, based on independent `Uniform` marginals which have the bounds required by the covariance model.
# First, we create a multivariate distribution, based on independent
# :class:`~openturns.Uniform` marginals which have the bounds required
# by the covariance model.

# %%
distributions = [ot.Uniform(lbounds[i], ubounds[i]) for i in range(dim)]
boundedDistribution = ot.JointDistribution(distributions)

# %%
# We first generate a Latin Hypercube Sampling (LHS) design made of 25 points in the sample space. This LHS is optimized so as to fill the space.
# We first generate a Latin Hypercube Sampling (LHS) design made of 25 points in the sample space.
# This LHS is optimized so as to fill the space.

# %%
K = 25 # design size
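# Editor's sketch of the elided optimization of the LHS design; the use of
# SimulatedAnnealingLHS with the C2 space-filling criterion is an assumption.
experiment = ot.LHSExperiment(boundedDistribution, K)
lhs_optimizer = ot.SimulatedAnnealingLHS(experiment, ot.SpaceFillingC2())
starting_points = lhs_optimizer.generate()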
@@ -393,7 +409,8 @@ def printCovarianceParameterChange(covarianceModel1, covarianceModel2):
starting_points.getSize()

# %%
# We can check that the minimum and maximum in the sample correspond to the bounds of the design of experiment.
# We can check that the minimum and maximum in the sample correspond to the
# bounds of the design of experiments.

# %%
print(lbounds, ubounds)
@@ -402,14 +419,14 @@ def printCovarianceParameterChange(covarianceModel1, covarianceModel2):
starting_points.getMin(), starting_points.getMax()

# %%
# Then we create a `MultiStart` algorithm based on the LHS starting points.
# Then we create a :class:`~openturns.MultiStart` algorithm based on the LHS starting points.

# %%
solver.setMaximumIterationNumber(10000)
multiStartSolver = ot.MultiStart(solver, starting_points)

# %%
# Finally, we configure the optimization algorithm so as to use the `MultiStart` algorithm.
# Finally, we configure the optimization algorithm so as to use the :class:`~openturns.MultiStart` algorithm.

# %%
algo = ot.KrigingAlgorithm(X_train, Y_train, covarianceModel, basis)
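# Editor's sketch of the elided remainder of this cell: plug the multi-start
# solver into the algorithm and run it.
algo.setOptimizationAlgorithm(multiStartSolver)
algo.run()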
Binary file not shown.
@@ -18,7 +18,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Introduction\n\nTwo fundamental objects in the library are:\n\n* `Point`: a multidimensional point in $D$ dimensions ($\\in \\mathbb{R}^D$) ;\n* `Sample`: a multivariate sample made of $N$ points in $D$ dimensions.\n\n\n"
"## Introduction\n\nTwo fundamental objects in the library are:\n\n* `Point`: a multidimensional point in $d$ dimensions ($\\in \\mathbb{R}^d$) ;\n* `Sample`: a multivariate sample made of $n$ points in $d$ dimensions.\n\n\n"
]
},
{
@@ -108,14 +108,14 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## The `Sample` class\n\nThe `Sample` class represents a multivariate sample made of $N$ points in $\\mathbb{R}^D$.\n\n* $D$ is the *dimension* of the sample,\n* $N$ is the *size* of the sample.\n\n\nA `Sample` can be seen as an array of with $N$ rows and $D$ columns.\n\n*Remark.* The :class:`~openturns.ProcessSample` class can be used to manage a sample of stochastic processes.\n\n"
"## The `Sample` class\n\nThe `Sample` class represents a multivariate sample made of $n$ points in $\\mathbb{R}^d$.\n\n* $d$ is the *dimension* of the sample,\n* $n$ is the *size* of the sample.\n\n\nA `Sample` can be seen as an array of with $n$ rows and $d$ columns.\n\n*Remark.* The :class:`~openturns.ProcessSample` class can be used to manage a sample of stochastic processes.\n\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The script below creates a `Sample` with size $N=5$ and dimension $D=3$.\n\n"
"The script below creates a `Sample` with size $n=5$ and dimension $d=3$.\n\n"
]
},
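{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"# Editor's illustrative sketch (not part of the commit): a Sample of size\n# n=5 and dimension d=3, filled with zeros here as a placeholder.\nimport openturns as ot\ndata = ot.Sample(5, 3)\ndata"
]
},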
{
@@ -249,7 +249,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"We see that:\n\n* the `row` is a `Point`,\n* the `column` is a `Sample`.\n\nThis is consistent with the fact that, in a dimension $D$ `Sample`, a row is a $D$-dimensional `Point`.\n\n"
"We see that:\n\n* the `row` is a `Point`,\n* the `column` is a `Sample`.\n\nThis is consistent with the fact that, in a dimension $d$ `Sample`, a row is a $d$-dimensional `Point`.\n\n"
]
},
{
@@ -270,6 +270,13 @@
"data.getMarginal([0, 2])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Set a row or a column of a `Sample`\n\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -285,7 +292,32 @@
},
"outputs": [],
"source": [
"sample = ot.Sample([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])\np = [8.0, 10.0]\nsample[2, :] = p\nsample"
"sample = ot.Sample([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])\nsample"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Set the third row: this must be a `Point` or must be convertible to.\n\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"p = [8.0, 10.0]\nsample[2, :] = p\nsample"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Set the second column: this must be a `Sample` or must be convertible to.\n\n"
]
},
{
@@ -299,6 +331,24 @@
"sample = ot.Sample([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])\ns = ot.Sample([[3.0], [5.0], [7.0]])\nsample[:, 1] = s\nsample"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Sometimes, we want to set a column with a list of floats.\nThis can be done using the :meth:`~openturns.Sample.BuildFromPoint` static method.\n\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"sample = ot.Sample([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])\ns = ot.Sample.BuildFromPoint([3.0, 5.0, 7.0])\nsample[:, 1] = s\nsample"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -469,7 +519,7 @@
},
"outputs": [],
"source": [
"type(array)"
"print(type(array))"
]
},
{
@@ -570,7 +620,39 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"If we do not set the optional `size` parameter, the library cannot solve the\ncase and an `InvalidArgumentException` is generated.\nMore precisely, the code::\n\n sample = ot.Sample(u)\n\nproduces the exception::\n\n TypeError: InvalidArgumentException : Invalid array dimension: 1\n\n\n"
"When there is an ambiguous case, the library cannot solve the\nissue and an `InvalidArgumentException` is generated.\n\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"More precisely, the code:\n\n```\nsample = ot.Sample(u)\n```\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"produces the exception:\n\n```\nTypeError: InvalidArgumentException : Invalid array dimension: 1\n```\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"In order to solve that problem, we can use the :meth:`~openturns.Sample.BuildFromPoint`\nstatic method.\n\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"sample = ot.Sample.BuildFromPoint([ui for ui in u])\nsample"
]
}
],
