
Releases: JuliaNLSolvers/Optim.jl

v1.0.0

05 Sep 22:17
cf2750e

Optim v1.0.0

Diff since v0.22.0

Closed issues:

  • usage with JuMP? (#107)
  • Generalize Optim.jl to be an interface for nonlinear optimization (#309)
  • Multidimensional arrays (#399)
  • ProgressMeter (#442)
  • Missing docstrings (#469)
  • L-BFGS-(B) (#521)
  • Finite differencing should respect box constraints (#541)
  • Todo: SAMIN (#555)
  • Some questions about convergence assessment and results (#574)
  • Going below sqrt(eps) (#631)
  • Project manifold tangent when we reset search direction (#649)
  • Rename optimize to minimize? (#685)
  • Preallocate caches in Fminbox to avoid unnecessary value_gradient!! calls (#704)
  • No default objective type for IPNewton (#711)
  • Fminbox() using parameters outside of the box specified (#712)
  • only_fgh! doesn't work if algorithm does not need Hessian (#718)
  • Bug in Stiefel_CholQR (#752)
  • Intermittent assertion error in HagerZhang (#802)
  • Success when none of convergence measures is satisfied (#806)
  • feature request: default to gradient-based method when user passes fg! (#816)
  • Use AbstractConstrainedOptimizer as the parent type for all the constrained optimizers? (#818)
  • Type instability (#820)
  • Confusing behavior of f_abstol with Fminbox (#821)
  • Univariate optimization does not stop when callback returns true (#822)
  • Optim: No method matching iterate (#824)
  • Brent still broken for flat functions (#827)
  • Comment doesn't really seem to reflect logic in code (#830)
  • Computing the inverse of Hessian in Newton's method (#832)
  • Fminbox optimization does not exit when callback returns true (#834)
  • Particle Swarm algorithm fails in very simple objective function (#835)
  • ParticleSwarm does not respect Optim.Options(iterations=x) (#836)
  • Missing website (#841)
  • ERROR: ArgumentError: Value and slope at step length = 0 must be finite. (#842)
  • Problems with BigFloat (#844)
  • indexing issue with ArrayPartition arguments (#848)
  • NelderMead recalculates same points (#852)
  • Functor Support with Univariate analysis (#853)
  • Warning (again) on LinearAlgebra.dot (#855)

Merged pull requests:

  • WIP: Corrected Stiefel_CholQR (#753) (@jagot)
  • Add callback stops for univariate optimization. (#831) (@pkofod)
  • Test fgh! (#838) (@pkofod)
  • Update .travis.yml (#839) (@pkofod)
  • Bump FillArrays version (#840) (@JeffFessler)
  • Make promote_obj noop for ipnewton and twicediffed (#845) (@pkofod)
  • Add methods for (Not)InplaceObjective to optimize when not method is … (#846) (@pkofod)
  • Allow that f can increase from iteration to iteration (#847) (@pkofod)
  • Fix project_tangent! and retract! args in manual (#849) (@thisrod)
  • Don't re-create the cache in every iteration of Fminbox as it calls the objective too often (#850) (@pkofod)
  • Don't use Function signatures. (#854) (@pkofod)
  • Fix some outstanding Fminbox issues (#856) (@pkofod) (see the callback sketch after this list)
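
Several of the fixes in this release concern callbacks that return true to stop a run (#822, #831) and Fminbox honoring the callback and the usual stopping options (#834, #836, #856). A minimal sketch of the pattern, assuming the standard Optim.Options(callback = ...) hook with default (non-trace) options; the objective, bounds, and the stop_early predicate are illustrative only:

    using Optim

    rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

    # The callback receives the current optimization state; returning `true`
    # asks the solver to stop early.
    stop_early(state) = state.iteration >= 5

    lower = [-2.0, -2.0]
    upper = [2.0, 2.0]
    x0    = [0.0, 0.0]

    res = optimize(rosenbrock, lower, upper, x0,
                   Fminbox(GradientDescent()),
                   Optim.Options(callback = stop_early, iterations = 100))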

v0.22.0

26 Jun 14:10
0794f44

Optim v0.22.0

Diff since v0.21.0

Closed issues:

  • Complex optimization example in documentation not working (#817)

Merged pull requests:

v0.21.0

30 Apr 14:06
f7ddb71

Optim v0.21.0

Diff since v0.20.6

Closed issues:

  • Document KrylovTrustRegion (#737)
  • Inconsistencies with fg! in Krylov vs LBFGS (#738)
  • New minor release for StatsBase v0.33 compatibility? (#803)
  • numerical values for alphaguess (#805)
  • V1.4 tests fail (#810)

Merged pull requests:

  • only_* constructors for newton-krylov (#742) (@tlienart)
  • Support creation of TwiceDifferentiable from IPNewton methods (#808) (@timholy)
  • Skip updates if dx_dg is non-positive in BFGS (#809) (@pkofod)
  • Use inversediagonal in precon test. (#811) (@pkofod)
  • Add simple InitialStatic alpha choice by providing just a number to t… (#812) (@pkofod) (see the sketch after this list)
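
The last item (#812) makes the initial step length easier to set: instead of constructing LineSearches.InitialStatic by hand, a plain number should now be accepted for alphaguess. A hedged sketch of both spellings; the objective and starting point are illustrative only:

    using Optim, LineSearches

    f(x) = sum(abs2, x .- 1)
    x0 = zeros(2)

    # Explicit construction, as before:
    res1 = optimize(f, x0, LBFGS(alphaguess = LineSearches.InitialStatic(alpha = 0.5)))

    # After #812, a bare number should be wrapped into an InitialStatic for you:
    res2 = optimize(f, x0, LBFGS(alphaguess = 0.5))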

v0.20.6

02 Apr 07:06
a5e61c7

Optim v0.20.6

Diff since v0.20.5

Closed issues:

  • Gradients not working correctly with Double64 type from DoubleFloats (#761)
  • Unused return value from update_state!() in optimize() (#789)
  • brent returns spurious minimum (#798)
  • It seems like dot is being incorrectly overwritten (#801)

Merged pull requests:

v0.20.5

27 Mar 09:21
d018c39
Update Project.toml

v0.20.4

14 Mar 07:10
3edf9b1

v0.20.4 (2020-03-14)

Diff since v0.20.3

Closed issues:

  • Least square minimization (#797)

Merged pull requests:

v0.20.3

10 Mar 07:33
a103513

v0.20.3 (2020-03-10)

Diff since v0.20.2

Merged pull requests:

v0.20.2

03 Mar 12:11
1e0ab0a

v0.20.2 (2020-03-03)

Diff since v0.20.1

Closed issues:

  • Maximizing a complicated nonlinear loglikelihood function (#787)
  • Nelder-Mead stopping condition (#759)
  • Roadmap-ish stuff (#662)
  • linesearch warning (#579)
  • Trust-Region code does not handle the hard case (#540)
  • Finite Difference: Remove Calculus when DiffEqDiffTools Hessians are implemented (#519)
  • Type instability in result object (#510)
  • Feature request: Callback function gets passed current best solution (#452)
  • optimize with autodiff accumulates calls counter (#443) (see the autodiff sketch after the merged pull requests below)

Merged pull requests:

  • Use Compat.jl for dot. (#791) (@pkofod)
  • Fix swapped relative and absolute tolerances (#788) (@gwater)
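
Issue #443 above concerns the function-call counters when optimize builds gradients via automatic differentiation. A minimal sketch of that usage, assuming the autodiff = :forward keyword and the Optim.f_calls / Optim.g_calls accessors on the result object:

    using Optim

    f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

    # Request ForwardDiff-based gradients instead of the default finite differences.
    res = optimize(f, [0.0, 0.0], BFGS(); autodiff = :forward)

    # The result records how often the objective and gradient were evaluated.
    println(Optim.f_calls(res), " objective calls, ", Optim.g_calls(res), " gradient calls")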

v0.20.1

30 Jan 17:21
74794f2

v0.20.1 (2020-01-30)

Diff since v0.20.0

Merged pull requests:

v0.20.0

23 Jan 22:57
75ab088

v0.20.0 (2020-01-23)

Diff since v0.19.7

Closed issues:

  • OnceDifferentiable strange error (#769)
  • Parallel Auto Differentiation (#760)
  • Documentation suggests no support for global optimization (#731)

Merged pull requests: