Releases · JuliaManifolds/Manopt.jl
v0.5.3
Manopt v0.5.3
Added
- `StopWhenChangeLess`, `StopWhenGradientChangeLess`, and `StopWhenGradientNormLess` can now use the new idea (ManifoldsBase.jl 0.15.18) of different outer norms on manifolds with components, like power and product manifolds, as well as all others from the Manifolds.jl library that support this, like `Euclidean`.
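A rough sketch of how such an outer norm could be specified is below; note that the `outer_norm` keyword name is an assumption here, not taken from these notes, so the documentation of the stopping criteria should be consulted for the actual interface.

```julia
# Sketch, assuming an `outer_norm` keyword on the stopping criterion:
# on a power manifold the component-wise norms can be combined with an
# outer norm, for example the supremum norm over all components.
using Manopt, Manifolds

M = PowerManifold(Sphere(2), NestedPowerRepresentation(), 5)
sc = StopWhenGradientNormLess(1e-8; outer_norm=Inf)  # keyword name assumed
# used via the usual keyword of any solver, for example
# gradient_descent(M, f, grad_f, p; stopping_criterion=sc)
```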
Merged pull requests:
- Make `max_stepsize` more user friendly (#416) (@kellertuer)
- Introduce outer norms to three stopping criteria (#417) (@kellertuer)
v0.5.2
Manopt v0.5.2
Added
- three new symbols to more easily specify recording the `:Gradient`, the `:GradientNorm`, and the `:Stepsize`.
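A minimal sketch of recording with the new symbols; the cost and gradient below are placeholders, while the `record` keyword and `get_record` are the usual Manopt.jl recording interface.

```julia
# Sketch: record gradient norm and step size during a gradient descent run.
using Manopt, Manifolds

M = Sphere(2)
f(M, p) = p[3]                                 # placeholder cost
grad_f(M, p) = project(M, p, [0.0, 0.0, 1.0])  # its Riemannian gradient

s = gradient_descent(M, f, grad_f, rand(M);
    record = [:Iteration, :GradientNorm, :Stepsize],
    return_state = true,
)
get_record(s)  # access the recorded (iteration, gradient norm, stepsize) values
```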
Changed
- fix a few typos in the documentation
- improved the documentation for the initial guess of `ArmijoLinesearchStepsize`.
Merged pull requests:
- CompatHelper: bump compat for JLD2 to 0.5 for package docs, (keep existing compat) (#411) (@github-actions[bot])
- update readme logo with text. (#412) (@kellertuer)
- Fix a few typos on ALM docs. (#413) (@kellertuer)
- Improve a few places in the docs. (#415) (@kellertuer)
v0.5.1
Manopt v0.5.1
Changed
- slightly improved the text for `ExponentialFamilyProjection.jl` on the about page.
Added
- the `proximal_point` method.
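A minimal sketch of calling the new solver, assuming the usual Manopt.jl convention of a proximal map `prox_f(M, λ, p)`; the concrete cost, half the squared distance to a point `q`, and its closed-form proximal map are chosen purely for illustration.

```julia
# Sketch: proximal point method for f(p) = 1/2 d²(p, q) on the sphere.
# The prox of λf at p is the point on the shortest geodesic from p to q
# at parameter λ/(1+λ).
using Manopt, Manifolds

M = Sphere(2)
q = [0.0, 0.0, 1.0]
prox_f(M, λ, p) = shortest_geodesic(M, p, q, λ / (1 + λ))

p0 = [1.0, 0.0, 0.0]
p_star = proximal_point(M, prox_f, p0)  # assumed positional signature, see docs
```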
Merged pull requests:
- Adjust text for ExponentialFamilyProjection.jl (#409) (@bvdmitri)
- Proximal Point Method (#410) (@kellertuer)
Closed issues:
- PPA on Stiefel (#252)
v0.5.0
Manopt v0.5.0
This breaking release is mainly concerned with stability and usability
- all interfaces have been unified, especially the order of arguments and the names of keywords
- for gradient rules like CG or the average gradient, and stepsizes like the Armijo linesearch, specifying the manifold (yet again) is no longer necessary thanks to an idea from Dmitry (see the sketch below)
- the documentation has been reworked to use a glossary internally
- we now use Aqua.jl to avoid ambiguities
- after this rework we are back to supporting Julia 1.6 as well.
For a full list of breaking changes see the Changelog.md.
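As a hedged illustration of the second point, a stepsize rule can now be passed without repeating the manifold; the placeholder cost and gradient are for illustration only and the exact factory behaviour should be checked against the 0.5 documentation.

```julia
# Sketch: in 0.5 the Armijo linesearch no longer needs the manifold in its
# constructor; the solver couples the rule to M internally.
using Manopt, Manifolds

M = Sphere(2)
f(M, p) = p[3]                                 # placeholder cost
grad_f(M, p) = project(M, p, [0.0, 0.0, 1.0])  # its Riemannian gradient

gradient_descent(M, f, grad_f, rand(M);
    stepsize = ArmijoLinesearch(),  # previously ArmijoLinesearch(M)
)
```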
Merged pull requests:
- Rework high level interfaces (#392) (@kellertuer)
- Rework presentation of keywords (#393) (@kellertuer)
- Bump tarides/changelog-check-action from 2 to 3 (#405) (@dependabot[bot])
v0.4.69
Manopt v0.4.69
Changed
- Improved the performance of the Interior Point Newton Method.
Merged pull requests:
- Improve performance of IPM (#404) (@mateuszbaran)
v0.4.68
Manopt v0.4.68
Added
- an Interior Point Newton Method, the `interior_point_newton` (see the sketch after this list)
- a `conjugate_residual` algorithm to solve a linear system on a tangent space
- `ArmijoLinesearch` now allows for `additional_decrease_condition` and `additional_increase_condition` keywords to add further conditions on when to accept a decrease or increase of the stepsize
- a `DebugFeasibility` to print debug information about the feasibility of points in constrained optimisation, employing the new `is_feasible` function
- an `InteriorPointCentralityCondition` check that can be added for step candidates within the line search of `interior_point_newton`
- several new functors
  - the `LagrangianCost`, `LagrangianGradient`, and `LagrangianHessian`, which, based on a constrained objective, allow constructing the Hessian objective of its Lagrangian
  - the `CondensedKKTVectorField` and its `CondensedKKTVectorFieldJacobian`, which are used to solve a linear system within `interior_point_newton`
  - the `KKTVectorField` as well as its `KKTVectorFieldJacobian` and `KKTVectorFieldAdjointJacobian`
  - the `KKTVectorFieldNormSq` and its `KKTVectorFieldNormSqGradient`, used within the Armijo line search of `interior_point_newton`
- new stopping criteria
  - a `StopWhenRelativeResidualLess` for the `conjugate_residual`
  - a `StopWhenKKTResidualLess` for the `interior_point_newton`
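A hedged usage sketch of `interior_point_newton`; the constraint keyword names (`g`, `grad_g`, `Hess_g`) are assumed to follow the convention of the other constrained solvers in Manopt.jl and should be checked against the documentation, and the problem itself is purely illustrative.

```julia
# Sketch: minimise f(p) = p₃ on the sphere subject to g(p) = -p₁ ≤ 0,
# starting from a strictly feasible point.
using Manopt, Manifolds

M = Sphere(2)
f(M, p) = p[3]
grad_f(M, p) = project(M, p, [0.0, 0.0, 1.0])
Hess_f(M, p, X) = -p[3] .* X                    # Riemannian Hessian of a linear cost

g(M, p) = [-p[1]]                               # one inequality constraint g(p) ≤ 0
grad_g(M, p) = [project(M, p, [-1.0, 0.0, 0.0])]
Hess_g(M, p, X) = [p[1] .* X]

p0 = [0.5, 0.5, 1 / sqrt(2)]                    # strictly feasible start point
interior_point_newton(M, f, grad_f, Hess_f, p0; g=g, grad_g=grad_g, Hess_g=Hess_g)
```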
Merged pull requests:
- The Riemannian Interior Point Newton Method (#399) (@kellertuer)
Closed issues:
- Refine storage of subsolver states (#403)
v0.4.67
Manopt v0.4.67
Merged pull requests:
- fix typos (#398) (@spaette)
- max_stepsize improvements (#402) (@mateuszbaran)
Closed issues:
- typos (#397)
v0.4.66
v0.4.65
Manopt v0.4.65
Changed
- refactor stopping criteria to not store a `sc.reason` internally, but instead only generate the reason (and hence allocate a string) when actually asked for it.
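For code that inspects why a solver stopped, the string is now only assembled on demand via `get_reason`, roughly as sketched here:

```julia
# Sketch: the reason string is only built when explicitly requested.
using Manopt

sc = StopAfterIteration(100)
# ... run a solver that uses `sc` as (part of) its stopping criterion ...
get_reason(sc)  # assembles and returns the human-readable reason (may be empty)
```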
Merged pull requests:
- CompatHelper: bump compat for DocumenterInterLinks to 1 for package docs, (keep existing compat) (#394) (@github-actions[bot])
- Refactor get_reason (#395) (@kellertuer)
Closed issues:
- Built-in stopping criterions allocate new error strings on each check, even when not required (#389)
v0.4.64
Manopt v0.4.64
Added
- Remodel the constraints and their gradients into separate `VectorGradientFunction`s to reduce code duplication and encapsulate the inner model of these functions and their gradients
- Introduce a `ConstrainedManoptProblem` to model different ranges for the gradients in the new `VectorGradientFunction`s beyond the default `NestedPowerRepresentation`
- Introduce a `VectorHessianFunction` to also model that one can provide the vector of Hessians to constraints
- Introduce a more flexible indexing beyond single indexing, to also include arbitrary ranges when accessing vector functions and their gradients, and hence also for constraints and their gradients.
Changed
- Remodel `ConstrainedManifoldObjective` to store an `AbstractManifoldObjective` internally instead of directly `f` and `grad_f`, also allowing Hessian objectives therein and implementing access to this Hessian
- Fixed a bug where Lanczos produced NaNs when started exactly in a minimizer, since we divide by the gradient norm.
Deprecated
- deprecate `get_grad_equality_constraints(M, o, p)`, use `get_grad_equality_constraint(M, o, p, :)` from the more flexible indexing instead.
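In practice, the deprecation reads as follows, with `M`, `co`, and `p` standing in for a manifold, a constrained objective (or a corresponding problem), and a point:

```julia
# old, now deprecated:
# get_grad_equality_constraints(M, co, p)
# new, via the more flexible indexing:
get_grad_equality_constraint(M, co, p, :)    # all equality constraint gradients
get_grad_equality_constraint(M, co, p, 1)    # only the first one
get_grad_equality_constraint(M, co, p, 1:2)  # or an arbitrary range
```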
Merged pull requests:
- Modularise Constraints (#386) (@kellertuer)
- Fix Initial condition on Lanczos (#391) (@kellertuer)
Closed issues: