Releases · SMTorg/smt
1.2.0
- Add EGO optimization with GEKPLS model (#340, #346, thanks @Laurentww)
- Breaking change: Remove scikit-learn < 0.22 support for the KPLS surrogates family
- Remove Python 3.6 from CI tests as it has reached its end-of-life date (#342).
- Fix MOE when test data are specified (#347)
- Fix MFK to make it work even with one fidelity (#339, #341)
- Fix Kriging based surrogates to allow constant function modeling (#338)
- Fix KPLS automatic determination of the number of components and update notebook (#335); see the sketch below
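A minimal sketch of KPLS with automatic selection of the number of PLS components, assuming the estimation is switched on through an `eval_n_comp` option (the option name and the toy data are assumptions; check the KPLS documentation for your version):

```python
import numpy as np
from smt.surrogate_models import KPLS

# Toy training data: 50 samples of a smooth 3D function.
rng = np.random.RandomState(1)
xt = rng.rand(50, 3)
yt = np.sum(xt**2, axis=1).reshape(-1, 1)

# `eval_n_comp=True` is assumed to trigger the automatic estimation
# of the number of PLS components (#325, fixed in #335).
sm = KPLS(eval_n_comp=True)
sm.set_training_values(xt, yt)
sm.train()

print(sm.options["n_comp"])  # number of components actually retained
```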
1.1.0
- Mixed integer surrogate enhancements (thanks @Paul-Saves)
- Add number of components estimation in KPLS surrogate models (#325)
- Add ordered variables management in mixed integer surrogates (#326, #327). Deprecation warning: INT type is deprecated and superseded by ORD type.
- Update version of the GOWER distance model (#330)
- Implement generalization of the homoscedastic hypersphere kernel from Pelamatti et al. (#330)
- Refactor MixedInteger (#328, #330)
- Add `propagate_uncertainty` option in the MFK method (#320 thanks @anfelopera): when True, the variances of lower fidelity levels are taken into account; see the sketch at the end of this release's notes
- Add LHS expansion method (#303, #323 thanks @rconde1997)
- MOE: Fix computation of errors when choosing expert surrogates (#334)
- Breaking Changes:
  - In EGO SMT, the `UCB` criterion, mistakenly named with respect to the literature, is renamed `LCB` (#321)
  - In MixedInteger surrogate: the `use_gower_distance=True` option is replaced by `categorical_kernel=GOWER`
- Documentation:
  - Add Colab links in Tutorial README (#322)
  - Add notebook about MFK with noise handling (#320)
  - Fix typos (#320, #321)
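A minimal multi-fidelity sketch showing the new `propagate_uncertainty` option; the toy functions and DOE sizes are illustrative, not from the release:

```python
import numpy as np
from smt.applications import MFK

# Illustrative 1D high- and low-fidelity functions.
def hf(x):
    return ((x * 6 - 2) ** 2) * np.sin(x * 12 - 4)

def lf(x):
    return 0.5 * hf(x) + 10.0 * (x - 0.5)

xt_lf = np.linspace(0, 1, 11).reshape(-1, 1)  # cheap, dense DOE
xt_hf = xt_lf[::3]                            # expensive, sparse DOE

sm = MFK(theta0=[1e-2], propagate_uncertainty=True)
sm.set_training_values(xt_lf, lf(xt_lf), name=0)  # lowest fidelity level
sm.set_training_values(xt_hf, hf(xt_hf))          # highest fidelity level
sm.train()

x = np.linspace(0, 1, 101).reshape(-1, 1)
mean = sm.predict_values(x)
var = sm.predict_variances(x)  # now accounts for lower-fidelity variances
```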
1.0.0
It is a good time to release SMT 1.0 (just after 0.9!).
The SMT architecture has proven useful and resilient since the 0.2 version presented in the article (more additions than actual breaking changes since then). Special thanks to @bouhlelma and @hwangjt, and thanks to all contributors.
This is a smooth transition from SMT 0.9, with small additions and bug fixes:
- Add `random_state` option to `NestedLHS` for result reproducibility (#296 thanks @anfelopera); see the sketch after this list
- Add `use_gower_distance` option to EGO to use the Gower distance kernel instead of continuous relaxation in the presence of mixed integer variables (#299 thanks @Paul-Saves)
- Fix Kriging-based bug to allow `n_start=1` (#301)
- Work around PLS changes in scikit-learn 0.24 which impact the KPLS surrogate model family (#306)
- Add documentation about saving and loading surrogate models (#308)
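A sketch of reproducible nested designs with `NestedLHS`; the import path and the meaning of the call argument (number of high-fidelity points) are assumptions based on the MFK documentation:

```python
import numpy as np
from smt.applications.mfk import NestedLHS

xlimits = np.array([[0.0, 1.0], [0.0, 1.0]])

# Same seed, same nested DOEs (one array per fidelity level).
xt_a = NestedLHS(nlevel=2, xlimits=xlimits, random_state=42)(10)
xt_b = NestedLHS(nlevel=2, xlimits=xlimits, random_state=42)(10)
assert all(np.array_equal(a, b) for a, b in zip(xt_a, xt_b))
```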
0.9.0
- Mixture of Experts improvements (#282 thanks @jbussemaker, #283):
  - add variance prediction API (i.e. `predict_variances()`), enabled when the `variances_support` option is set; see the sketch after this block
  - add `MOESurrogateModel` class which adapts `MOE` to the `SurrogateModel` interface
  - allow selection of the experts taking part in the mixture (see the `allow`/`deny` options); `MOE.AVAILABLE_EXPERTS` lists all possible experts, and the `enabled_experts` property of an MOE instance lists the experts consistent with the `derivatives/variances_support` and `allow/deny` options
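A sketch of the new variance API and expert selection, using the option names quoted above (`variances_support`, `allow`); the toy data and the expert names passed to `allow` are assumptions:

```python
import numpy as np
from smt.applications import MOE

print(MOE.AVAILABLE_EXPERTS)  # every expert SMT can put in a mixture

rng = np.random.RandomState(0)
xt = rng.rand(100, 2)
yt = (xt[:, 0] ** 2 + np.sin(5 * xt[:, 1])).reshape(-1, 1)

# Restrict the mixture to variance-capable experts so that
# predict_variances() is usable.
moe = MOE(n_clusters=2, variances_support=True, allow=["KRG", "KPLS"])
print(moe.enabled_experts)  # experts compatible with the options above
moe.set_training_values(xt, yt)
moe.train()

x = rng.rand(5, 2)
y = moe.predict_values(x)
v = moe.predict_variances(x)  # enabled by variances_support
```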
- Sampling method interface refactoring (#284 thanks @LDAP):
  - create an intermediate `ScaledSamplingMethod` class as the base class for sampling methods which generate samples in the [0, 1] hypercube; see the subclass sketch after this block
  - allow future implementation of sampling methods generating samples directly in the input space (i.e. within xlimits)
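A sketch of a custom sampler built on the new base class; the import path and the `_compute` hook follow SMT's SamplingMethod convention but are assumptions to verify against your version:

```python
import numpy as np
from smt.sampling_methods.sampling_method import ScaledSamplingMethod

class UniformUnitSampler(ScaledSamplingMethod):
    """Toy sampler: uniform random points in the unit hypercube.

    ScaledSamplingMethod is assumed to scale the [0, 1] output of
    _compute() into the user-provided xlimits.
    """

    def _compute(self, nt):
        nx = self.options["xlimits"].shape[0]
        return np.random.rand(nt, nx)  # points in the [0, 1] hypercube

xlimits = np.array([[0.0, 4.0], [-1.0, 1.0]])
x = UniformUnitSampler(xlimits=xlimits)(20)  # 20 points scaled into xlimits
```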
- Use of Gower distance in Kriging-based mixed integer surrogates (#289 thanks @raul-rufato):
  - add `use_gower_distance` option to `MixedIntegerSurrogate`
  - add `gower` correlation model to Kriging-based surrogates; see the MixedInteger notebook for usage
- Improve Kriging-based surrogates with a multistart method (#293 thanks @Paul-Saves):
  - run several hyperparameter optimizations and keep the best result
  - the number of optimizations is controlled by the new `n_start` option (default 10); see the sketch below
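A short sketch of the multistart option on an ordinary Kriging model (toy data assumed):

```python
import numpy as np
from smt.surrogate_models import KRG

rng = np.random.RandomState(0)
xt = rng.rand(30, 2)
yt = (np.sin(6 * xt[:, 0]) * xt[:, 1]).reshape(-1, 1)

# Run 20 hyperparameter optimizations from different starting points
# and keep the best result (default is n_start=10).
sm = KRG(theta0=[1e-2, 1e-2], n_start=20)
sm.set_training_values(xt, yt)
sm.train()
```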
- Update documentation for MOE and SamplingMethod (#285)
- Fixes (#279, #281)
0.8.0
- Noise API changes for Kriging-based surrogates (#276, #257 thanks @anfelopera):
  - add a new tutorial notebook on how to deal with noise in SMT
  - rename the `noise` option to `noise0`, which is now a list of values
  - add option `use_het_noise` to manage heteroscedastic noise
  - improve noise management for MFK (different noise per level)
  - add option `nugget` to enable the handling of numerical instability; see the sketch after this block
  - add Matern kernel documentation
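A sketch of the renamed noise options on a Kriging model; the data and option values are illustrative:

```python
import numpy as np
from smt.surrogate_models import KRG

rng = np.random.RandomState(0)
xt = np.linspace(0, 1, 40).reshape(-1, 1)
yt = np.sin(7 * xt) + 0.05 * rng.randn(40, 1)  # noisy observations

# `noise0` is now a list (one value here: homoscedastic case);
# `nugget` adds a small diagonal term against numerical instability.
sm = KRG(noise0=[1e-2], eval_noise=True, use_het_noise=False, nugget=1e-10)
sm.set_training_values(xt, yt)
sm.train()
```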
- Add `predict_variance_derivatives` API (#256, #259 thanks @Paul-Saves); a sketch follows below:
  - add spatial derivatives for Kriging-based surrogates
  - fix respect of parameter bounds in Kriging-based surrogates
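A sketch of the new variance-derivative API next to the existing `predict_derivatives`; the exact signature of `predict_variance_derivatives` may differ between SMT versions, so treat the call below as an assumption:

```python
import numpy as np
from smt.surrogate_models import KRG

rng = np.random.RandomState(0)
xt = rng.rand(25, 2)
yt = (xt[:, 0] ** 2 - xt[:, 1]).reshape(-1, 1)

sm = KRG(theta0=[1e-2, 1e-2])
sm.set_training_values(xt, yt)
sm.train()

x = rng.rand(5, 2)
dy = sm.predict_derivatives(x, 0)        # spatial derivative dy/dx_0
dv = sm.predict_variance_derivatives(x)  # assumed signature: derivatives of the predicted variance
```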
- Notebooks updates (#262, #275 thanks @NatOnera, #277 thanks @Paul-Saves )
- Kriging-based surrogates refactoring (#261 thanks @anfelopera):
  - inheritance changes: MFKPLS now derives from MFK; KPLSK and GEKPLS now derive from KPLS
  - improve noise options consistency
  - improve options validity checking
- Code quality (#264, #267, #268 thanks @LDAP):
  - use the abc metaclass to enforce the developer API
  - type hinting
  - add 'build system' specification and requirements.txt for tests, setup cleanup
0.7.1
- allow noise evaluation for Kriging-based surrogates (#251)
- fix optimizer bounds in Kriging-based surrogates (#252)
- fix MFK parameterization by level (#252)
- add `random_state` option to the LHS sampling method for test repeatability (#253); see the sketch after this list
- add `random_state` option to the EGO application for test repeatability (#255)
- cleanup tests (#255)
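A minimal reproducibility check with the new LHS option:

```python
import numpy as np
from smt.sampling_methods import LHS

xlimits = np.array([[0.0, 4.0], [-1.0, 1.0]])

# Same seed, identical designs.
x1 = LHS(xlimits=xlimits, random_state=42)(10)
x2 = LHS(xlimits=xlimits, random_state=42)(10)
assert np.array_equal(x1, x2)
```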
Marginal Gaussian Process surrogate model
- add Marginal Gaussian Process surrogate models (#236, thanks @repriem)
- add Matern kernels for Kriging-based surrogates (#236, thanks @repriem)
- add gradient-based hyperparameter optimization for Kriging-based surrogates: the new `hyper_opt` option selects the TNC SciPy gradient-based optimizer; the gradient-free Cobyla optimizer remains the default (#236, thanks @repriem); see the sketch after this list
- add `MixedIntegerContext` documentation (#234)
- fix bug in `mixed_integer::unfold_with_enum_mask` (#233)
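A sketch combining the two #236 additions, a Matern kernel and gradient-based tuning (toy data assumed):

```python
import numpy as np
from smt.surrogate_models import KRG

rng = np.random.RandomState(0)
xt = rng.rand(30, 2)
yt = (np.cos(4 * xt[:, 0]) + xt[:, 1]).reshape(-1, 1)

# Matern 5/2 kernel; hyperparameters tuned with the gradient-based TNC
# SciPy optimizer instead of the default gradient-free Cobyla.
sm = KRG(corr="matern52", hyper_opt="TNC")
sm.set_training_values(xt, yt)
sm.train()
```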
Mixed Integer Sampling Method and Surrogate
- Application: Mixed integer sampling methods and surrogates (#229); a usage sketch follows these notes
- handling of categorical and integer variables in Kriging (#219, thanks @Paul-Saves)
- handling of categorical and integer variables in EGO optimizer (#220, thanks @Paul-Saves)
- remove initial doe returned value from EGO optimize method (#224)
- drop Python 2.7 (#215, #227)
- fix MFK variance computation (#211)
- fix MOE experts selection (#223)
- fix MOE RMTS usage (#225)
- fix QP as used in run_examples (#226)
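A sketch of the mixed integer application; the `build_sampling_method`/`build_surrogate_model` helpers and the xlimits layout follow the SMT mixed-integer documentation of that era and should be verified, and the responses are placeholders (note that `INT` was later superseded by `ORD`, see the 1.1.0 notes above):

```python
import numpy as np
from smt.applications.mixed_integer import MixedIntegerContext, FLOAT, INT, ENUM
from smt.sampling_methods import LHS
from smt.surrogate_models import KRG

# One float, one integer, one 3-level categorical variable.
xtypes = [FLOAT, INT, (ENUM, 3)]
xlimits = [[0.0, 4.0], [0, 5], ["red", "green", "blue"]]

mixint = MixedIntegerContext(xtypes, xlimits)
xt = mixint.build_sampling_method(LHS, criterion="ese")(15)  # mixed DOE
yt = np.random.rand(15, 1)  # placeholder responses for the sketch

sm = mixint.build_surrogate_model(KRG())
sm.set_training_values(xt, yt)
sm.train()
```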
MFKPLSK bug fix
- fix bug when `eval_noise` is `True`
Fix packaging bug
- add `packaging` dependency in setup