The Riemannian Interior Point Newton Method (#399)
* Implemented conjugate residual method as abstract solver
* implement the interior point method
* introduce several KKT vector field related functions
* Introduce a ClosedFormSolverState for consistency

---------
Co-authored-by: mstokkenes <[email protected]>
Co-authored-by: Mateusz Baran <[email protected]>
kellertuer authored Aug 2, 2024
1 parent 21dd646 commit 0b05c8c
Showing 47 changed files with 3,271 additions and 254 deletions.
5 changes: 5 additions & 0 deletions .zenodo.json
Original file line number Diff line number Diff line change
Expand Up @@ -53,6 +53,11 @@
"name": "Kjemsås, Even Stephansen",
"type": "ProjectMember"
},
{
"affiliation": "NTNU Trondheim",
"name": "Stokkenes, Markus A.",
"type": "ProjectMember"
},
{
"name": "Daniel VandenHeuvel",
"type": "other",
Expand Down
18 changes: 18 additions & 0 deletions Changelog.md
Original file line number Diff line number Diff line change
Expand Up @@ -5,6 +5,24 @@ All notable Changes to the Julia package `Manopt.jl` will be documented in this
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.4.68] – August 2, 2024

### Added

* an Interior Point Newton Method, the `interior_point_Newton`
* a `conjugate_residual` algorithm to solve a linear system on a tangent space
* `ArmijoLinesearch` now accepts `additional_decrease_condition` and `additional_increase_condition` keywords to impose further conditions on when to accept a decrease or increase of the stepsize
* a `DebugFeasibility` to print debug information on the feasibility of points in constrained optimisation, employing the new `is_feasible` function
* an `InteriorPointCentralityCondition` check that can be applied to step candidates within the line search of `interior_point_Newton`
* several new functors
  * the `LagrangianCost`, `LagrangianGradient`, and `LagrangianHessian`, which, based on a constrained objective, allow constructing the Hessian objective of its Lagrangian
  * the `CondensedKKTVectorField` and its `CondensedKKTVectorFieldJacobian`, which are used to solve a linear system within `interior_point_Newton`
  * the `KKTVectorField` as well as its `KKTVectorFieldJacobian` and `KKTVectorFieldAdjointJacobian`
  * the `KKTVectorFieldNormSq` and its `KKTVectorFieldNormSqGradient`, used within the Armijo line search of `interior_point_Newton`
* new stopping criteria
  * a `StopWhenRelativeResidualLess` for the `conjugate_residual`
  * a `StopWhenKKTResidualLess` for the `interior_point_Newton`
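
A minimal usage sketch of the new solver could look as follows; the sphere, cost, constraint, and their derivatives below are purely illustrative, and the keyword names are assumed to follow the same `g`/`grad_g`/`Hess_g` convention as the other constrained solvers:

```julia
# Hedged sketch: minimise the third coordinate on the sphere S²,
# subject to the inequality constraint g(p) = -p[1] ≤ 0 (i.e. p[1] ≥ 0).
using Manopt, Manifolds

M = Sphere(2)
f(M, p) = p[3]
grad_f(M, p) = project(M, p, [0.0, 0.0, 1.0])       # Riemannian gradient
Hess_f(M, p, X) = project(M, p, -p[3] * X)          # Riemannian Hessian of a linear cost
g(M, p) = [-p[1]]
grad_g(M, p) = [project(M, p, [-1.0, 0.0, 0.0])]
Hess_g(M, p, X) = [project(M, p, p[1] * X)]

q = interior_point_Newton(M, f, grad_f, Hess_f, rand(M);
    g=g, grad_g=grad_g, Hess_g=Hess_g)
```

Consult the solver documentation for the exact positional and keyword arguments.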

## [0.4.67] – July 25, 2024

### Added
Expand Down
2 changes: 1 addition & 1 deletion Project.toml
Original file line number Diff line number Diff line change
@@ -1,7 +1,7 @@
name = "Manopt"
uuid = "0fc0a36d-df90-57f3-8f93-d78a9fc72bb5"
authors = ["Ronny Bergmann <[email protected]>"]
version = "0.4.67"
version = "0.4.68"

[deps]
ColorSchemes = "35d6a980-a343-548e-a6ea-1d62b119f2f4"
Expand Down
2 changes: 2 additions & 0 deletions docs/make.jl
Original file line number Diff line number Diff line change
Expand Up @@ -168,13 +168,15 @@ makedocs(;
"Chambolle-Pock" => "solvers/ChambollePock.md",
"CMA-ES" => "solvers/cma_es.md",
"Conjugate gradient descent" => "solvers/conjugate_gradient_descent.md",
"Conjugate Residual" => "solvers/conjugate_residual.md",
"Convex bundle method" => "solvers/convex_bundle_method.md",
"Cyclic Proximal Point" => "solvers/cyclic_proximal_point.md",
"Difference of Convex" => "solvers/difference_of_convex.md",
"Douglas—Rachford" => "solvers/DouglasRachford.md",
"Exact Penalty Method" => "solvers/exact_penalty_method.md",
"Frank-Wolfe" => "solvers/FrankWolfe.md",
"Gradient Descent" => "solvers/gradient_descent.md",
"Interior Point Newton" => "solvers/interior_point_Newton.md",
"Levenberg–Marquardt" => "solvers/LevenbergMarquardt.md",
"Nelder–Mead" => "solvers/NelderMead.md",
"Particle Swarm Optimization" => "solvers/particle_swarm.md",
Expand Down
1 change: 1 addition & 0 deletions docs/src/about.md
Original file line number Diff line number Diff line change
Expand Up @@ -11,6 +11,7 @@ The following people contributed
* Even Stephansen Kjemsås contributed to the implementation of the [Frank Wolfe Method](solvers/FrankWolfe.md) solver
* Mathias Ravn Munkvold contributed most of the implementation of the [Adaptive Regularization with Cubics](solvers/adaptive-regularization-with-cubics.md) solver
* [Tom-Christian Riemer](https://www.tu-chemnitz.de/mathematik/wire/mitarbeiter.php) implemented the [trust regions](solvers/trust_regions.md) and [quasi Newton](solvers/quasi_Newton.md) solvers.
* [Markus A. Stokkenes](https://www.linkedin.com/in/markus-a-stokkenes-b41bba17b/) contributed most of the implementation of the [Interior Point Newton Method](solvers/interior_point_Newton.md)
* [Manuel Weiss](https://scoop.iwr.uni-heidelberg.de/author/manuel-weiß/) implemented most of the [conjugate gradient update rules](@ref cg-coeffs)

as well as various [contributors](https://github.com/JuliaManifolds/Manopt.jl/graphs/contributors) providing small extensions, finding small bugs and mistakes and fixing them by opening [PR](https://github.com/JuliaManifolds/Manopt.jl/pulls)s.
Expand Down
8 changes: 8 additions & 0 deletions docs/src/extensions.md
Original file line number Diff line number Diff line change
Expand Up @@ -62,6 +62,14 @@ Manopt.max_stepsize(::TangentBundle, ::Any)
mid_point
```

Internally, `Manopt.jl` provides two additional functions to choose a suitable representation of a Euclidean space when needed:

```@docs
Manopt.Rn
Manopt.Rn_default
```
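
For illustration, a minimal sketch of how this dispatch behaves, assuming the `Manifolds.jl` extension is loaded:

```julia
# Hedged sketch: with Manifolds.jl loaded, Rn dispatches to the richer
# Euclidean manifold type; otherwise Rn_default selects a plain fallback.
using Manopt, Manifolds

M = Manopt.Rn(3)   # here a 3-dimensional Euclidean space from Manifolds.jl
```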

## JuMP.jl

Manopt can be used via the [JuMP.jl](https://github.com/jump-dev/JuMP.jl) interface.
Expand Down
19 changes: 18 additions & 1 deletion docs/src/plans/objective.md
Original file line number Diff line number Diff line change
Expand Up @@ -211,6 +211,16 @@ It might be beneficial to use the adapted problem to specify different ranges fo
ConstrainedManoptProblem
```

as well as the helper functions

```@docs
AbstractConstrainedFunctor
AbstractConstrainedSlackFunctor
LagrangianCost
LagrangianGradient
LagrangianHessian
```

#### Access functions

```@docs
Expand All @@ -223,9 +233,16 @@ get_grad_equality_constraint
get_grad_inequality_constraint
get_hess_equality_constraint
get_hess_inequality_constraint
is_feasible
```

#### Internal functions

```@docs
Manopt.get_feasibility_status
```

### A vectorial cost function
### Vectorial objectives

```@docs
Manopt.AbstractVectorFunction
Expand Down
7 changes: 7 additions & 0 deletions docs/src/plans/stepsize.md
Original file line number Diff line number Diff line change
Expand Up @@ -33,6 +33,13 @@ Tangent bundle with the Sasaki metric has 0 injectivity radius, so the maximum s
`Hyperrectangle` also has 0 injectivity radius and an estimate based on maximum of dimensions along each index is used instead.
For manifolds with corners, however, a line search capable of handling break points along the projected search direction should be used, and such algorithms do not call `max_stepsize`.

Some solvers keep an iterate that differs from the point used within the line search. The following state can be used to wrap
these locally

```@docs
StepsizeState
```

## Literature

```@bibliography
Expand Down
22 changes: 22 additions & 0 deletions docs/src/references.bib
Original file line number Diff line number Diff line change
Expand Up @@ -341,6 +341,16 @@ @article{DuranMoelleSbertCremers:2016
% --- E
%
%
@article{El-BakryTapiaTsuchiyaZhang:1996,
AUTHOR = {El-Bakry, A. S. and Tapia, R. A. and Tsuchiya, T. and Zhang, Y.},
DOI = {10.1007/bf02275347},
JOURNAL = {Journal of Optimization Theory and Applications},
NUMBER = {3},
PAGES = {507–541},
TITLE = {On the formulation and theory of the Newton interior-point method for nonlinear programming},
VOLUME = {89},
YEAR = {1996}
}

% --- F
%
Expand Down Expand Up @@ -532,6 +542,18 @@ @article{Karcher:1977
% --- L
%
%
@article{LaiYoshise:2024,
AUTHOR = {Lai, Zhijian and Yoshise, Akiko},
DOI = {10.1007/s10957-024-02403-8},
EPRINT = {2203.09762},
EPRINTTYPE = {arXiv},
JOURNAL = {Journal of Optimization Theory and Applications},
NUMBER = {1},
PAGES = {433–469},
TITLE = {Riemannian Interior Point Methods for Constrained Optimization on Manifolds},
VOLUME = {201},
YEAR = {2024}
}
@article{LausNikolovaPerschSteidl:2017,
AUTHOR = {Laus, F. and Nikolova, M. and Persch, J. and Steidl, G.},
YEAR = {2017},
Expand Down
34 changes: 34 additions & 0 deletions docs/src/solvers/conjugate_residual.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,34 @@
# Conjugate Residual Solver in a Tangent space

```@meta
CurrentModule = Manopt
```

```@docs
conjugate_residual
```

## State

```@docs
ConjugateResidualState
```

## Objective

```@docs
SymmetricLinearSystemObjective
```

## Additional stopping criterion

```@docs
StopWhenRelativeResidualLess
```

## Literature

```@bibliography
Pages = ["conjugate_residual.md"]
Canonical=false
```
21 changes: 19 additions & 2 deletions docs/src/solvers/index.md
Original file line number Diff line number Diff line change
Expand Up @@ -88,16 +88,24 @@ For these you can use

* The [Augmented Lagrangian Method](augmented_Lagrangian_method.md) (ALM), where both `g` and `grad_g` as well as `h` and `grad_h` are keyword arguments, and one of these pairs is mandatory.
* The [Exact Penalty Method](exact_penalty_method.md) (EPM) uses a penalty term instead of augmentation, but has the same interface as ALM.
* The [Interior Point Newton Method](interior_point_Newton.md) (IPM) rephrases the KKT system of a constrained problem as a vector field equation, on which a Newton step is performed in every iteration.
* [Frank-Wolfe algorithm](FrankWolfe.md), where besides the gradient of ``f`` either a closed form solution or a (maybe even automatically generated) sub problem solver for ``\operatorname*{arg\,min}_{q ∈ C} ⟨\operatorname{grad} f(p_k), \log_{p_k}q⟩`` is required, where ``p_k`` is a fixed point on the manifold (changed in every iteration).

## On the tangent space

* [Conjugate Residual](conjugate_residual.md) a solver for a linear system ``\mathcal A[X] + b = 0`` on a tangent space.
* [Steihaug-Toint Truncated Conjugate-Gradient Method](truncated_conjugate_gradient_descent.md) a solver for a constrained problem defined on a tangent space.
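
In a flat (Euclidean) tangent space, the conjugate residual iteration for solving ``\mathcal A[X] + b = 0`` with a self-adjoint ``\mathcal A`` can be sketched as follows; the function name and loop structure are illustrative, not the `conjugate_residual` implementation:

```julia
using LinearAlgebra

# Sketch of the conjugate residual method for A x + b = 0, A symmetric.
function conjugate_residual_sketch(A, b, x; maxiter=100, tol=1e-12)
    r = -b - A * x               # residual of A x + b = 0
    p = copy(r)                  # search direction
    Ar, Ap = A * r, A * p
    for _ in 1:maxiter
        α = dot(r, Ar) / dot(Ap, Ap)
        x = x + α * p
        r_new = r - α * Ap
        norm(r_new) < tol && break
        Ar_new = A * r_new
        β = dot(r_new, Ar_new) / dot(r, Ar)
        p = r_new + β * p        # update direction
        Ap = Ar_new + β * Ap     # avoids a second application of A
        r, Ar = r_new, Ar_new
    end
    return x
end
```

Unlike conjugate gradients, this iteration minimises the residual norm and only requires ``\mathcal A`` to be symmetric, not positive definite.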


## Alphabetical list of algorithms

| Solver | Function | State |
|:---------|:----------------|:---------|
| [Adaptive Regularisation with Cubics](adaptive-regularization-with-cubics.md) | [`adaptive_regularization_with_cubics`](@ref) | [`AdaptiveRegularizationState`](@ref) |
| [Augmented Lagrangian Method](augmented_Lagrangian_method.md) | [`augmented_Lagrangian_method`](@ref) | [`AugmentedLagrangianMethodState`](@ref) |
| [Chambolle-Pock](ChambollePock.md) | [`ChambollePock`](@ref) | [`ChambollePockState`](@ref) |
| [Conjugate Gradient Descent](conjugate_gradient_descent.md) | [`conjugate_gradient_descent`](@ref) | [`ConjugateGradientDescentState`](@ref) |
| [Conjugate Residual](conjugate_residual.md) | [`conjugate_residual`](@ref) | [`ConjugateResidualState`](@ref) |
| [Convex Bundle Method](convex_bundle_method.md) | [`convex_bundle_method`](@ref) | [`ConvexBundleMethodState`](@ref) |
| [Cyclic Proximal Point](cyclic_proximal_point.md) | [`cyclic_proximal_point`](@ref) | [`CyclicProximalPointState`](@ref) |
| [Difference of Convex Algorithm](@ref solver-difference-of-convex) | [`difference_of_convex_algorithm`](@ref) | [`DifferenceOfConvexState`](@ref) |
Expand All @@ -106,6 +114,7 @@ For these you can use
| [Exact Penalty Method](exact_penalty_method.md) | [`exact_penalty_method`](@ref) | [`ExactPenaltyMethodState`](@ref) |
| [Frank-Wolfe algorithm](FrankWolfe.md) | [`Frank_Wolfe_method`](@ref) | [`FrankWolfeState`](@ref) |
| [Gradient Descent](gradient_descent.md) | [`gradient_descent`](@ref) | [`GradientDescentState`](@ref) |
| [Interior Point Newton](interior_point_Newton.md) | [`interior_point_Newton`](@ref) | [`InteriorPointNewtonState`](@ref) |
| [Levenberg-Marquardt](LevenbergMarquardt.md) | [`LevenbergMarquardt`](@ref) | [`LevenbergMarquardtState`](@ref) | ``f = \sum_i f_i`` ``\operatorname{grad} f_i`` (Jacobian)|
| [Nelder-Mead](NelderMead.md) | [`NelderMead`](@ref) | [`NelderMeadState`](@ref) |
| [Particle Swarm](particle_swarm.md) | [`particle_swarm`](@ref) | [`ParticleSwarmState`](@ref) |
Expand Down Expand Up @@ -189,4 +198,12 @@ also use the third (lowest level) and just call

```
solve!(problem, state)
```

### Closed-form subsolvers

If a subsolver solution is available in closed form, `ClosedFormSubSolverState` is used to indicate that.

```@docs
Manopt.ClosedFormSubSolverState
```
48 changes: 48 additions & 0 deletions docs/src/solvers/interior_point_Newton.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,48 @@
# Interior Point Newton method

```@meta
CurrentModule = Manopt
```

```@docs
interior_point_Newton
interior_point_Newton!
```

## State

```@docs
InteriorPointNewtonState
```

## Subproblem functions

```@docs
CondensedKKTVectorField
CondensedKKTVectorFieldJacobian
KKTVectorField
KKTVectorFieldJacobian
KKTVectorFieldAdjointJacobian
KKTVectorFieldNormSq
KKTVectorFieldNormSqGradient
```
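
In the Euclidean case, the vector field these functors represent can be sketched as follows for ``\min f(p)`` subject to ``g(p) ≤ 0`` and ``h(p) = 0``, with multipliers ``μ, λ`` and slacks ``s``; the function name, argument layout, and component order are illustrative, not the Manopt.jl API:

```julia
# Hedged Euclidean sketch of the KKT vector field F(p, μ, λ, s).
# A Newton step solves the linearised system J F = -F (with a barrier /
# centrality perturbation); Jg and Jh denote the constraint Jacobians.
function kkt_vector_field(grad_f, g, Jg, h, Jh, p, μ, λ, s)
    grad_L = grad_f(p) + Jg(p)' * μ + Jh(p)' * λ   # gradient of the Lagrangian
    return (grad_L, g(p) .+ s, h(p), μ .* s)       # the four KKT residual blocks
end
```

A KKT point is characterised by this field vanishing (with ``μ ≥ 0``, ``s ≥ 0``), which is why its squared norm serves as the merit function in the Armijo line search.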

## Helpers

```@docs
InteriorPointCentralityCondition
Manopt.calculate_σ
```

## Additional stopping criteria

```@docs
StopWhenKKTResidualLess
```

## References

```@bibliography
Pages = ["interior_point_Newton.md"]
Canonical=false
```
2 changes: 2 additions & 0 deletions docs/styles/config/vocabularies/Manopt/accept.txt
Original file line number Diff line number Diff line change
Expand Up @@ -49,6 +49,7 @@ Jasa
Jax
JuMP.jl
kwargs
Lai
Levenberg
Lagrangian
Lanczos
Expand Down Expand Up @@ -112,5 +113,6 @@ tridiagonal
Weinmann
Willem
Wolfe
Yoshise
Yuan
Zhang
6 changes: 5 additions & 1 deletion ext/ManoptManifoldsExt/ManoptManifoldsExt.jl
Original file line number Diff line number Diff line change
Expand Up @@ -10,7 +10,9 @@ import Manopt:
get_gradient!,
set_manopt_parameter!,
reflect,
reflect!
reflect!,
Rn,
Rn_default
using LinearAlgebra: cholesky, det, diag, dot, Hermitian, qr, Symmetric, triu, I, Diagonal
import ManifoldsBase: copy, mid_point, mid_point!

Expand All @@ -24,6 +26,8 @@ else
using ..Manifolds
end

Rn(::Val{:Manifolds}, args...; kwargs...) = Euclidean(args...; kwargs...)

const NONMUTATINGMANIFOLDS = Union{Circle,PositiveNumbers,Euclidean{Tuple{}}}
include("manifold_functions.jl")
include("ChambollePockManifolds.jl")
Expand Down

2 comments on commit 0b05c8c

@kellertuer
Member Author

@JuliaRegistrator register

Release notes:

Added

  • an Interior Point Newton Method, the interior_point_Newton
  • a conjugate_residual algorithm to solve a linear system on a tangent space
  • ArmijoLinesearch now accepts additional_decrease_condition and additional_increase_condition keywords to impose further conditions on when to accept a decrease or increase of the stepsize
  • a DebugFeasibility to print debug information on the feasibility of points in constrained optimisation, employing the new is_feasible function
  • an InteriorPointCentralityCondition check that can be applied to step candidates within the line search of interior_point_Newton
  • several new functors
    • the LagrangianCost, LagrangianGradient, and LagrangianHessian, which, based on a constrained objective, allow constructing the Hessian objective of its Lagrangian
    • the CondensedKKTVectorField and its CondensedKKTVectorFieldJacobian, which are used to solve a linear system within interior_point_Newton
    • the KKTVectorField as well as its KKTVectorFieldJacobian and KKTVectorFieldAdjointJacobian
    • the KKTVectorFieldNormSq and its KKTVectorFieldNormSqGradient, used within the Armijo line search of interior_point_Newton
  • new stopping criteria
    • a StopWhenRelativeResidualLess for the conjugate_residual
    • a StopWhenKKTResidualLess for the interior_point_Newton

@JuliaRegistrator

Registration pull request created: JuliaRegistries/General/112306

Tagging

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if the Julia TagBot GitHub Action is installed, or can be done manually through the github interface, or via:

git tag -a v0.4.68 -m "<description of version>" 0b05c8cec17b44e6c62ae288628ba5eb98380a90
git push origin v0.4.68
