Added ExaModels.jl #57
Conversation
Very cool @sshin23. I have a few hopefully easy updates. Once those are done I will merge into a branch for a test on PGLib and then merge into master.

Also, I will need to know how to set Ipopt's linear solver to ma27. I am guessing it would be the same way that we did it for NLPModels, see https://github.com/lanl-ansi/rosetta-opf/blob/cli/nlpmodels.jl#L325
examodels.jl (Outdated)
```julia
result = NLPModelsIpopt.ipopt(model)

cost = result.objective

x = result.solution
g = ExaModels.NLPModels.cons(model, x)

constraint_violation = sum(
    mapreduce.(x -> max(x, 0), max, [
        x - model.meta.uvar,
        model.meta.lvar - x,
        g - model.meta.ucon,
        model.meta.lcon - g,
    ])
)

feasible = constraint_violation <= constraint_tol
```
Request for revision: is there a way we can extract the termination status returned from Ipopt? I think that would be the most consistent way to set the "feasible" flag here. You might have a look at https://github.com/lanl-ansi/rosetta-opf/blob/main/nlpmodels.jl as a reference. There I did,

```julia
output = NLPModelsIpopt.ipopt(nlp)
cost = output.objective
feasible = (output.primal_feas <= 1e-6)
```

because I could not find out how to extract Ipopt's termination status. If it is not possible to get Ipopt's termination status from NLPModelsIpopt, then maybe this is the best solution we can have for now?
`output.status == :first_order` should be equivalent to `JuMP.termination_status(model) == MOI.Status`. I'll make changes accordingly.
@ccoffrin right. ExaModels is built on
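A minimal sketch of the status-based feasibility check discussed above (assuming NLPModelsIpopt is installed and `model` is the NLPModel being solved; whether `:first_order` maps exactly onto JuMP's `MOI.LOCALLY_SOLVED` for Ipopt is the equivalence being claimed in this thread, not something verified here):

```julia
using NLPModelsIpopt

output = NLPModelsIpopt.ipopt(model)
cost = output.objective

# NLPModelsIpopt returns a GenericExecutionStats whose `status` field is a
# Symbol; `:first_order` means the solver reached a first-order stationary
# point, which is the success case this PR proposes using for `feasible`.
feasible = (output.status == :first_order)
```

This replaces the hand-rolled `constraint_violation` reduction with the solver's own verdict, which is the "most consistent" approach requested in the review.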
* Added ExaModels.jl (#57)
* Update project files
* added examodels to testing
* fix solution building

Co-authored-by: Sungho Shin <[email protected]>
Co-authored-by: Oscar Dowson <[email protected]>
Hey @ccoffrin, thanks for the great project!
This PR adds a benchmark script for ExaModels.jl. We're planning to add a built-in timer for different callbacks, so hopefully we can uncomment the callback timing part soon.