Mohammad moving adaptive X-over-n-mutation #2344

Open: wants to merge 116 commits into base: devel

Changes from all commits (116 commits)
061058e
NSGA-II implementation with properly printing optimal solutions at th…
JunyungKim Feb 19, 2023
ab4315c
Unnecessary changes in DataSet.py have been removed.
JunyungKim Feb 19, 2023
8b7f5d3
Unnecessary changes in DataSet.py have been removed.
JunyungKim Feb 19, 2023
3fcde82
ZDT test is added.
JunyungKim Feb 22, 2023
15debe4
Optimizer.py and RavenSampled.py are updated after having regression …
JunyungKim Feb 24, 2023
64510df
minor update on Optimizer.py
JunyungKim Feb 24, 2023
b1f0c3f
temporary fix, not the way I want
Jimmy-INL Mar 11, 2023
52389c3
NSGA-II testing fiels (multiSum wConstratint and ZDT1) are added.
JunyungKim Mar 13, 2023
391b9c3
moving models, xmls, and trying to resolve GD after converting object…
Jimmy-INL Mar 14, 2023
da9e0dd
fixing simulated annealing to accept a list of objectives
Jimmy-INL Mar 21, 2023
1fd2175
fixing rook to compare infs
Jimmy-INL Mar 22, 2023
7cedf83
Merge branch 'junyung-Mohammad-NSGAII' into JunyungKim-junyung-Mohamm…
Jimmy-INL Mar 22, 2023
305c2ac
making one mod in RAVENSAmpled
Jimmy-INL Apr 1, 2023
c820eea
making self._minMax a list
Jimmy-INL Apr 3, 2023
21bf42d
erroring out if type is not in ['min', 'max']
Jimmy-INL Apr 3, 2023
e639803
updating HERON to b316024
Jimmy-INL Apr 3, 2023
12e11f0
Merge branch 'devel' into enablingMinMaxList
Jimmy-INL Apr 3, 2023
be64a4d
updating dependencies
Jimmy-INL Apr 4, 2023
ccde4d9
Merge branch 'enablingMinMaxList' of github.com:Jimmy-INL/raven into …
Jimmy-INL Apr 4, 2023
95682a1
removing a trailing space
Jimmy-INL Apr 4, 2023
c3688e2
removing windows line endings
Jimmy-INL Apr 4, 2023
e25cc37
change to unix ending
Jimmy-INL Apr 5, 2023
f0d1412
adding the zdt_model.py
Jimmy-INL Apr 5, 2023
c2ca46e
converting zdt to unix line endings
Jimmy-INL Apr 5, 2023
1f1b969
Juan's change to simulateData for the interface
Jimmy-INL Apr 6, 2023
c7aebf3
resolving diff based on different batch Size, thanks @wangcj05
Jimmy-INL Apr 6, 2023
64e97a9
converting SimukateData.py to unix line endings
Jimmy-INL Apr 8, 2023
b29661b
regolding to print all batches in MOO
Jimmy-INL Apr 11, 2023
9626956
slight mods
Jimmy-INL Apr 12, 2023
34d5cb2
regolding and reverting inf in fitness
Jimmy-INL Apr 12, 2023
e0df314
trying to add all outputs to the rlz
Jimmy-INL Apr 12, 2023
c0476f7
adding everything to bestPoint
Jimmy-INL Apr 13, 2023
81dc580
chenging type==str to len(self._objectVar) == 1
Jimmy-INL Apr 13, 2023
3f27965
removing unnecessary if statement, this needs revisiting
Jimmy-INL Apr 18, 2023
facf74e
modifying reverting cycle length to its value not the inverse
Jimmy-INL Apr 20, 2023
a92049c
simulateData updating cost model.
Jun 12, 2023
0faeb9c
minor change is made in ZDT1 test.
JunyungKim Jul 15, 2023
e9ea9a2
Merge branch 'enablingMinMaxList' of https://github.com/Jimmy-INL/rav…
JunyungKim Jul 17, 2023
dbad22c
myConstraints for MultiSum is updated.
JunyungKim Jul 27, 2023
699b3de
Two issues are resolved: population and objective value mismatch, min…
JunyungKim Aug 8, 2023
8cffedb
minor things are corrected. Nothing important.
JunyungKim Aug 8, 2023
9f4eecd
Additional minor changes are made. Nothing important.
JunyungKim Aug 8, 2023
2487621
Additional minor change is made. Nothing important.
JunyungKim Aug 8, 2023
3657634
fitness data structure is changed from data xarray to dataSet. It wor…
Aug 30, 2023
285575f
single objective optimization works well with three different types o…
Aug 31, 2023
7707f67
NSGA-II improvement is in progress.
Sep 4, 2023
a32a45c
fitness-based NSGA-II is in progress. min-min is working well with to…
Sep 6, 2023
a9577f4
NSGA-II fitness-based rank and CD calcuration is completed. Temporary…
Sep 7, 2023
9b42d7d
minor bugs are fixed.
Sep 10, 2023
f6ecb5f
Every type of fitness is now working with newly updated GA interface …
Sep 17, 2023
8a26285
multi-objective optimization using invLinear and logistics now works.
Sep 21, 2023
51eb867
constraint handling for single and multi objective optimization in _u…
Sep 21, 2023
061c3bc
1. If-else statement for survivorSelection in _useRealization is remo…
Sep 25, 2023
59d43e1
1. Mohammad's comments are reflected; 2. Unneccesary if-else statemen…
Sep 26, 2023
8fb32c3
1. missing descriptions of self are added.
Sep 26, 2023
4dc0e57
tournamentSelection method in parentSelectors.py is enhanced followin…
JunyungKim Oct 14, 2023
4f457fe
tournamemntSelection for multi-objective is completed. RouletteWheel …
JunyungKim Oct 15, 2023
9d27568
simpleKnapsackTournament optOut file is regoldened. Final solution is…
JunyungKim Oct 15, 2023
255b58f
Comments from Mohammad are reflected.
JunyungKim Jan 29, 2024
f1ad2b3
Minor fixes to the fitness though a list of objective and penalty wei…
Jimmy-INL Jan 29, 2024
fae31be
Merge branch 'Junyung-Jimmy-enablingMinMaxList' into JunyungLatest-en…
JunyungKim Jan 29, 2024
ea59893
Merge pull request #3 from Jimmy-INL/JunyungLatest-enablingMinMaxList
JunyungKim Jan 29, 2024
d23ef44
test file for multi-objective optimization changed: the number of ite…
JunyungKim Jan 29, 2024
df6b98d
Junyung-Jimmy-enablingMinMaxList_vf_desk is merged to most-updated de…
Feb 26, 2024
f564f29
devel is merged with enabling MinMaxList_vf_desk.
Mar 4, 2024
02a961e
Modifications are done: All unit tests and GeneticAlgorithms-related …
JunyungKim Mar 6, 2024
ed460f9
SimulateData.py is now identical with the one from devel branch.
JunyungKim Mar 6, 2024
f339bf3
GeneticAlgorithm.py is updated. new file beale_flipped2.py is added. …
JunyungKim Mar 6, 2024
261799a
Issues that RAVEN could not catch error when non-rankNCrowdingBased s…
JunyungKim Mar 20, 2024
677b474
RAVEN Manual related changes only are made.
JunyungKim Apr 2, 2024
cf67660
Minor changes are made. Functionally identical, just for readibility …
JunyungKim Apr 2, 2024
916eda0
two methods related to survivorSelectors are moved to survivorSelecto…
JunyungKim Apr 2, 2024
366974e
Some comments are left in fitness.py for future reference. invLinear …
JunyungKim Apr 2, 2024
ceb701d
Some comments are left in fitness.py for future reference. invLinear …
JunyungKim Apr 2, 2024
35e65e7
Some comments are added/deleted.
JunyungKim Apr 2, 2024
c8ac5c9
some files in NSGAII folder which are already relocated to other fold…
JunyungKim Apr 2, 2024
ce0aaad
commentations and code cleaning is dnoe in GeneticAlgorithm.py. Funct…
JunyungKim Apr 2, 2024
cfd5b31
rlzDict in def _resolveNewGenerationMulti is updated to avoid SIMULAT…
JunyungKim Apr 3, 2024
9d8941c
user manual related update - Equation correction
JunyungKim Apr 3, 2024
a055d68
Merge branch 'devel' into Junyung-Jimmy-NSGAII-ManualUpdate-DefectsFix
JunyungKim Apr 3, 2024
580cad8
user manual related update - Equation correction
JunyungKim Apr 3, 2024
06f4a46
very minor change made for user manuel.
JunyungKim Apr 3, 2024
a903939
survivorSelection.py is created.
JunyungKim Apr 3, 2024
8482a39
contaminated HERON and TEAL is now back to RAVEN.
May 23, 2024
97a20bc
the branch, Junyung-Jimmy-NSGAII-ManualUpdate-DefectsFix, is now merg…
May 23, 2024
372f384
trailing whitespaces are removed.
May 23, 2024
289094e
Issues created due to having objective variable type be List are part…
May 23, 2024
7b66055
MultiObjective_Beale-Bealeflipped is added.
May 23, 2024
a255665
adding gold files and initial fix in RavenSampled.py
Jimmy-INL Jun 4, 2024
52717ae
relaxing image rel error for one test and changing setuptools in the …
Jimmy-INL Jun 4, 2024
0abf956
readding TEAL to .gitmodules
Jimmy-INL Jun 7, 2024
5e3d977
resolving dependencies conflict
Jimmy-INL Jun 7, 2024
460b79b
adding missing dockstrings
Jimmy-INL Jun 8, 2024
7916f3a
removing adaptive mutation and crossover
Jimmy-INL Jun 8, 2024
b5c7e02
adding __init__.py for the pip install
Jimmy-INL Jun 10, 2024
bc9ab9f
updating xarray version to 2024
Jimmy-INL Jun 10, 2024
f9272a4
updating xarray version to 2023.12
Jimmy-INL Jun 11, 2024
beaba5a
updating xarray version to 2023.12
Jimmy-INL Jun 11, 2024
163eb87
updating xarray version to 2023.12
Jimmy-INL Jun 11, 2024
eadd2d4
relaxing images rel error
Jimmy-INL Jun 12, 2024
bfc0157
trying to make it print first iteration after moving function in surv…
Jimmy-INL Jun 19, 2024
ca2bd63
pushing the gold files for a new test
Jimmy-INL Jun 19, 2024
e52c609
removing commented lines and relaxing image tolerance
Jimmy-INL Jun 20, 2024
66822ee
removing a white space
Jimmy-INL Jun 20, 2024
729187c
relaxing image diff
Jimmy-INL Jun 20, 2024
ba8fc08
writing only final result for beale
Jimmy-INL Jul 12, 2024
9f77186
writing only final result for beale
Jimmy-INL Jul 14, 2024
8c9c81f
adding adaptive tests and some regolds
Jimmy-INL Jul 28, 2024
3f354fc
pushing tests
Jimmy-INL Jul 28, 2024
ee02ae1
regolding
Jimmy-INL Jul 28, 2024
98cfe81
pushing tests
Jimmy-INL Jul 29, 2024
b32986b
renameing the adaptive test
Jimmy-INL Jul 29, 2024
c418b8c
fixing beale and making it only writes final to be consistent with devel
Jimmy-INL Jul 29, 2024
6b41a5b
modifying the bloody xsd
Jimmy-INL Aug 3, 2024
46384e0
escaping underscores in tests
Jimmy-INL Aug 5, 2024
3c6da21
tests manual fix
Jimmy-INL Aug 6, 2024
6 changes: 2 additions & 4 deletions dependencies.xml
@@ -43,7 +43,7 @@ Note all install methods after "main" take
<scikit-learn>1.0</scikit-learn>
<pandas/>
<!-- Note most versions of xarray work, but some (such as 0.20) don't -->
<xarray>2023</xarray>
<xarray>2023.12</xarray>
<netcdf4 source="pip">1.6</netcdf4>
<matplotlib>3.5</matplotlib>
<statsmodels>0.13</statsmodels>
@@ -84,11 +84,9 @@ Note all install methods after "main" take
<ipopt skip_check='True' optional='True'/>
<cyipopt optional='True'/>
<pyomo-extensions source="pyomo" skip_check='True' optional='True'/>
<setuptools/>
<!-- This is because liblapack 3.9.0 build 21 is broken (and can probably be removed if there ever is a build 22). This can also be removed when scipy is updated to version 1.12 -->
<liblapack skip_check='True' os='linux' machine='x86_64'>3.9.0=20_linux64_openblas</liblapack>
<liblapack skip_check='True' os='windows' machine='x86_64'>3.9.0=20_win64_mkl</liblapack>
<liblapack skip_check='True' os='mac' machine='x86_64'>3.9.0=20_osx64_openblas</liblapack>
<setuptools>69</setuptools> <!-- ray 2.6 can't be installed with setuptools 70 -->
<!-- source="mamba" are the ones installed when mamba is installed -->
<mamba source="mamba" skip_check='True'/>
<serpentTools optional='True' source="pip"/>
68 changes: 64 additions & 4 deletions developer_tools/XSDSchemas/Optimizers.xsd
@@ -1,6 +1,6 @@
<?xml version="1.0"?>
<?xml version="1.0" encoding="UTF-8"?>
<xsd:schema version="1.0" xmlns:xsd="http://www.w3.org/2001/XMLSchema" elementFormDefault="qualified">
<!-- *********************************************************************** -->
<!-- *********************************************************************** -->
<!-- Optimizers -->
<!-- *********************************************************************** -->
<xsd:complexType name="OptimizerData">
@@ -227,19 +227,79 @@
</xsd:complexType>
<xsd:complexType name="crossoverType">
<xsd:all>
<xsd:element name="crossoverProb" type="xsd:float" />
<xsd:element name="crossoverProb" type="crossoverProbType" />
</xsd:all>
<xsd:attribute name="type" type="xsd:string" />
</xsd:complexType>

<!-- Define a new complex type for crossoverProbType -->
<xsd:complexType name="crossoverProbType">
<xsd:simpleContent>
<xsd:extension base="crossoverProbValueType">
<xsd:attribute name="type" type="crossoverProbTypeAttr" use="optional"/>
</xsd:extension>
</xsd:simpleContent>
</xsd:complexType>

<!-- Define the simple type for the crossoverProb value -->
<xsd:simpleType name="crossoverProbValueType">
<xsd:union memberTypes="xsd:float crossoverAdaptiveType"/>
</xsd:simpleType>

<!-- Define the simple type for the adaptive crossover probability -->
<xsd:simpleType name="crossoverAdaptiveType">
<xsd:restriction base="xsd:string">
<xsd:enumeration value="linear"/>
<xsd:enumeration value="quadratic"/>
</xsd:restriction>
</xsd:simpleType>

<!-- Define the attribute type for the crossoverProb type -->
<xsd:simpleType name="crossoverProbTypeAttr">
<xsd:restriction base="xsd:string">
<xsd:enumeration value="static"/>
<xsd:enumeration value="adaptive"/>
</xsd:restriction>
</xsd:simpleType>

<xsd:complexType name="mutationType">
<xsd:all>
<xsd:element name="mutationProb" type="xsd:float" />
<xsd:element name="mutationProb" type="mutationProbType" />
<xsd:element name="locs" type="xsd:string" minOccurs="0" maxOccurs="1"/>
</xsd:all>
<xsd:attribute name="type" type="xsd:string" />
</xsd:complexType>

<!-- Define a new complex type for mutationProbType -->
<xsd:complexType name="mutationProbType">
<xsd:simpleContent>
<xsd:extension base="mutationProbValueType">
<xsd:attribute name="type" type="mutationProbTypeAttr" use="optional"/>
</xsd:extension>
</xsd:simpleContent>
</xsd:complexType>

<!-- Define the simple type for the mutationProb value -->
<xsd:simpleType name="mutationProbValueType">
<xsd:union memberTypes="xsd:float mutationAdaptiveType"/>
</xsd:simpleType>

<!-- Define the simple type for the adaptive mutation probability -->
<xsd:simpleType name="mutationAdaptiveType">
<xsd:restriction base="xsd:string">
<xsd:enumeration value="linear"/>
<xsd:enumeration value="quadratic"/>
</xsd:restriction>
</xsd:simpleType>

<!-- Define the attribute type for the mutationProb type -->
<xsd:simpleType name="mutationProbTypeAttr">
<xsd:restriction base="xsd:string">
<xsd:enumeration value="static"/>
<xsd:enumeration value="adaptive"/>
</xsd:restriction>
</xsd:simpleType>

<xsd:complexType name="reproductionType">
<xsd:all>
<xsd:element name="crossover" type="crossoverType" />
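The schema additions above let `crossoverProb` and `mutationProb` carry either a float (a static probability) or one of the keywords `linear`/`quadratic` (an adaptive schedule), with an optional `type="static|adaptive"` attribute. As a rough illustration of what an adaptive schedule could mean, here is a minimal sketch; the decay formulas and parameter names below are assumptions for illustration, not RAVEN's actual implementation:

```python
def adaptive_prob(schedule, generation, max_gen, p_start=0.9, p_end=0.1):
    """Illustrative crossover/mutation probability schedule.

    schedule: a float is returned unchanged (static probability);
              'linear' or 'quadratic' decays from p_start to p_end
              over max_gen generations.
    NOTE: the decay formulas here are illustrative assumptions only.
    """
    if isinstance(schedule, float):
        return schedule  # static, e.g. <crossoverProb>0.8</crossoverProb>
    frac = generation / max_gen  # 0.0 at the first generation, 1.0 at the last
    if schedule == 'linear':
        return p_start - (p_start - p_end) * frac
    if schedule == 'quadratic':
        return p_start - (p_start - p_end) * frac ** 2
    raise ValueError(f"unknown schedule: {schedule!r}")
```

A quadratic schedule keeps the probability high for longer early on and drops it faster near the end, which is one common motivation for offering both forms.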
14 changes: 7 additions & 7 deletions doc/user_manual/generated/generateOptimizerDoc.py
@@ -152,7 +152,7 @@ def insertSolnExport(tex, obj):
</samplerInit>

<GAparams>
<populationSize>20</populationSize>
<populationSize>10</populationSize>
<parentSelection>rouletteWheel</parentSelection>
<reproduction>
<crossover type="onePointCrossover">
@@ -177,32 +177,32 @@ def insertSolnExport(tex, obj):

<variable name="x1">
<distribution>uniform_dist_woRepl_1</distribution>
<initial>1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20</initial>
<initial>1,2,3,4,5,6,7,8,9,10</initial>
</variable>

<variable name="x2">
<distribution>uniform_dist_woRepl_1</distribution>
<initial>2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,1</initial>
<initial>2,3,4,5,6,7,8,9,10,1</initial>
</variable>

<variable name="x3">
<distribution>uniform_dist_woRepl_1</distribution>
<initial>3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,1,2</initial>
<initial>3,4,5,6,7,8,9,10,1,2</initial>
</variable>

<variable name="x4">
<distribution>uniform_dist_woRepl_1</distribution>
<initial>4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,1,2,3</initial>
<initial>4,5,6,7,8,9,10,1,2,3</initial>
</variable>

<variable name="x5">
<distribution>uniform_dist_woRepl_1</distribution>
<initial>5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,1,2,3,4</initial>
<initial>5,6,7,8,9,10,1,2,3,4</initial>
</variable>

<variable name="x6">
<distribution>uniform_dist_woRepl_1</distribution>
<initial>6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,1,2,3,4,5</initial>
<initial>6,7,8,9,10,1,2,3,4,5</initial>
</variable>

<objective>ans</objective>
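The regold above shrinks the documentation example from 20 to 10 population members, and every variable's `<initial>` list shrinks with it: each list is a rotation of 1..populationSize, so its length has to track the new population size. A quick consistency check over the new values (a sketch; the equal-length convention is inferred from the example itself):

```python
population_size = 10
# <initial> lists from the regolded example; each is a rotation of 1..populationSize
initials = {
    "x1": [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
    "x2": [2, 3, 4, 5, 6, 7, 8, 9, 10, 1],
    "x3": [3, 4, 5, 6, 7, 8, 9, 10, 1, 2],
    "x4": [4, 5, 6, 7, 8, 9, 10, 1, 2, 3],
    "x5": [5, 6, 7, 8, 9, 10, 1, 2, 3, 4],
    "x6": [6, 7, 8, 9, 10, 1, 2, 3, 4, 5],
}
for name, vals in initials.items():
    # every list supplies exactly one initial value per population member,
    # and each is a permutation of 1..populationSize
    assert len(vals) == population_size, name
    assert sorted(vals) == list(range(1, population_size + 1)), name
```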
29 changes: 15 additions & 14 deletions ravenframework/Optimizers/BayesianOptimizer.py
@@ -150,6 +150,7 @@ def __init__(self):
self._paramSelectionOptions = {'ftol':1e-10, 'maxiter':200, 'disp':False} # Optimizer options for hyperparameter selection
self._externalParamOptimizer = 'fmin_l_bfgs_b' # Optimizer for external hyperparameter selection
self._resetModel = False # Reset regression model if True
self._canHandleMultiObjective = False # boolean indicator whether optimization is a sinlge-objective problem or a multi-objective problem

def handleInput(self, paramInput):
"""
@@ -232,8 +233,8 @@ def initialize(self, externalSeeding=None, solutionExport=None):
elif len(self._model.supervisedContainer[0].target) != 1:
self.raiseAnError(RuntimeError, f'Only one target allowed when using GPR ROM for Bayesian Optimizer! '
f'Received {len(self._model.supervisedContainer[0].target)}')
elif self._objectiveVar not in self._model.supervisedContainer[0].target:
self.raiseAnError(RuntimeError, f'GPR ROM <target> should be obective variable: {self._objectiveVar}, '
elif self._objectiveVar[0] not in self._model.supervisedContainer[0].target:
self.raiseAnError(RuntimeError, f'GPR ROM <target> should be obective variable: {self._objectiveVar[0]}, '
f'Received {self._model.supervisedContainer[0].target}')

if self._resetModel:
@@ -265,8 +266,8 @@ def initialize(self, externalSeeding=None, solutionExport=None):
trainingData = self.normalizeData(trainingData)
for varName in self.toBeSampled.keys():
self._trainingInputs[0][varName] = list(trainingData[varName])
self._trainingTargets.append(list(trainingData[self._objectiveVar]))
self.raiseAMessage(f"{self._model.name} ROM has been already trained with {len(trainingData[self._objectiveVar])} samples!",
self._trainingTargets.append(list(trainingData[self._objectiveVar[0]]))
self.raiseAMessage(f"{self._model.name} ROM has been already trained with {len(trainingData[self._objectiveVar[0]])} samples!",
"This pre-trained ROM will be used by Optimizer to evaluate the next best point!")
# retrieving the best solution is based on the acqusition function's utility
# Constraints are considered in the following method.
@@ -333,7 +334,7 @@ def _useRealization(self, info, rlz):
# Add new inputs and model evaluations to the dataset
for varName in list(self.toBeSampled):
self._trainingInputs[traj][varName].extend(getattr(rlz, varName).values)
self._trainingTargets[traj].extend(getattr(rlz, self._objectiveVar).values)
self._trainingTargets[traj].extend(getattr(rlz, self._objectiveVar[0]).values)
# Generate posterior with training data
self._generatePredictiveModel(traj)
self._resolveMultiSample(traj, rlz, info)
@@ -343,10 +344,10 @@
# Add new input and model evaluation to the dataset
for varName in list(self.toBeSampled):
self._trainingInputs[traj][varName].append(rlz[varName])
self._trainingTargets[traj].append(rlz[self._objectiveVar])
self._trainingTargets[traj].append(rlz[self._objectiveVar[0]])
# Generate posterior with training data
self._generatePredictiveModel(traj)
optVal = rlz[self._objectiveVar]
optVal = rlz[self._objectiveVar[0]]
self._resolveNewOptPoint(traj, rlz, optVal, info)

# Use acquisition to select next point
Expand Down Expand Up @@ -555,7 +556,7 @@ def _trainRegressionModel(self, traj):

for varName in list(self.toBeSampled):
trainingSet[varName] = np.asarray(self._trainingInputs[traj][varName])
trainingSet[self._objectiveVar] = np.asarray(self._trainingTargets[traj])
trainingSet[self._objectiveVar[0]] = np.asarray(self._trainingTargets[traj])
self._model.train(trainingSet)
# NOTE It would be preferrable to use targetEvaluation;
# however, there does not appear a built in normalization method and as
@@ -596,8 +597,8 @@ def _evaluateRegressionModel(self, featurePoint):
# Evaluating the regression model
resultsDict = self._model.evaluate(featurePoint)
# NOTE only allowing single targets, needs to be fixed when multi-objective optimization is added
mu = resultsDict[self._objectiveVar]
std = resultsDict[self._objectiveVar+'_std']
mu = resultsDict[self._objectiveVar[0]]
std = resultsDict[self._objectiveVar[0]+'_std']
return mu, std

# * * * * * * * * * * * *
@@ -627,7 +628,7 @@ def _resolveMultiSample(self, traj, rlz, info):
for index in range(info['batchSize']):
for varName in rlzVars:
singleRlz[varName] = getattr(rlz, varName)[index].values
optVal = singleRlz[self._objectiveVar]
optVal = singleRlz[self._objectiveVar[0]]
self._resolveNewOptPoint(traj, singleRlz, optVal, info)
singleRlz = {} # FIXME is this necessary?
self.raiseADebug(f'Multi-sample resolution completed')
@@ -664,7 +665,7 @@ def _resolveNewOptPoint(self, traj, rlz, optVal, info):
currentPoint = {}
for decisionVarName in list(self.toBeSampled):
currentPoint[decisionVarName] = rlz[decisionVarName]
rlz[self._objectiveVar] = self._evaluateRegressionModel(currentPoint)[0][0]
rlz[self._objectiveVar[0]] = self._evaluateRegressionModel(currentPoint)[0][0]
self.raiseADebug('*' * 80)
if acceptable in ['accepted', 'first']:
# record history
@@ -675,13 +676,13 @@
# If the last recommended solution point is the same, update the expected function value
if all(old[var] == xStar[var] for var in list(self.toBeSampled)):
newEstimate = copy.copy(old)
newEstimate[self._objectiveVar] = muStar
newEstimate[self._objectiveVar[0]] = muStar
self._optPointHistory[traj].append((newEstimate, info))
else:
newRealization = copy.copy(old)
for var in list(self.toBeSampled):
newRealization[var] = xStar[var]
newRealization[self._objectiveVar] = muStar
newRealization[self._objectiveVar[0]] = muStar
else:
self.raiseAnError(f'Unrecognized acceptability: "{acceptable}"')
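A recurring edit throughout this BayesianOptimizer.py diff replaces `self._objectiveVar` with `self._objectiveVar[0]`: the objective variable is now stored as a list so multi-objective optimizers can share the attribute, while single-objective code paths index the first (and only) entry. A minimal sketch of that pattern follows; the class and method names are illustrative, not RAVEN's actual API:

```python
class OptimizerSketch:
    """Illustrative: objective name(s) are normalized to a list internally."""

    def __init__(self, objective_var):
        # accept either a single objective name or a list of names
        if isinstance(objective_var, str):
            objective_var = [objective_var]
        self._objective_var = list(objective_var)

    @property
    def is_multi_objective(self):
        return len(self._objective_var) > 1

    def opt_value(self, realization):
        # single-objective paths take the first (only) entry, mirroring
        # the rlz[self._objectiveVar[0]] accesses in the diff above
        if self.is_multi_objective:
            return [realization[v] for v in self._objective_var]
        return realization[self._objective_var[0]]
```

This keeps one attribute type across single- and multi-objective optimizers, at the cost of `[0]` indexing in components (like this Bayesian optimizer) that can only handle one objective.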
