
Eliminate bounds relaxation in WaterTAP Ipopt #1162

Merged
merged 12 commits into from
Oct 5, 2023

Conversation

@bknueven (Contributor) commented Oct 4, 2023

Replaces #1140

@avdudchenko reports inconsistencies in the way domain=pyo.NonNegativeReals is handled versus bounds=(0, None). I tracked this down to an issue in how "bounds relaxation" is handled in the WaterTAP wrapper for Ipopt.

This PR would resolve this issue by turning off bounds relaxation altogether for the WaterTAP wrapper for Ipopt.

Summary/Motivation:

By default, Ipopt relaxes the bounds internally, e.g., (0, 1) becomes (-1e-10, 1 + 1e-10). Because of WaterTAP's use of variable scaling, the ipopt-watertap wrapper was doing this operation itself. Unfortunately, the wrapper contained a bug wherein explicit bounds of (0, None) would get relaxed to (-1e-10, None), but domain=pyo.NonNegativeReals would not get relaxed and would be represented as (0, None) within Ipopt.
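As a minimal sketch (not WaterTAP's actual implementation), Ipopt's relaxation, controlled by its `bound_relax_factor` option, moves each finite bound outward by `factor * max(1, |bound|)`:

```python
def relax_bounds(lb, ub, factor=1e-8):
    """Sketch of Ipopt-style bound relaxation: each finite bound is
    moved outward by factor * max(1, |bound|). Function name and
    structure are illustrative, not taken from the wrapper code."""
    if lb is not None:
        lb = lb - factor * max(1.0, abs(lb))
    if ub is not None:
        ub = ub + factor * max(1.0, abs(ub))
    return lb, ub

# With factor=1e-10, bounds (0, 1) become (-1e-10, 1 + 1e-10),
# and (0, None) becomes (-1e-10, None), as described above.
```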

Because these should be consistent, this PR opts to turn off bounds relaxation entirely within the ipopt-watertap wrapper. The main effect of this is that Ipopt will always return a solution strictly inside the specified bounds. Therefore, it is generally not a good idea to specify implied bounds on variables -- a few commits (19a7dec, 3650135, 5e7d621) remove or loosen bounds on variables for model stability.
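The inconsistency being fixed can be sketched as follows (function and constant names are hypothetical, not from the wrapper source): explicit bounds were relaxed, while domain-derived bounds passed through untouched.

```python
RELAX = 1e-10  # illustrative relaxation amount

def buggy_effective_lower_bound(explicit_lb, domain_lb):
    """Hypothetical sketch of the pre-fix behavior: only an
    explicitly set bound was relaxed."""
    if explicit_lb is not None:
        return explicit_lb - RELAX   # bounds=(0, None) -> -1e-10
    return domain_lb                 # NonNegativeReals -> 0, unrelaxed
```

So two variables that should behave identically were handed different lower bounds inside Ipopt.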

Changes proposed in this PR:

  • Set bounds_relax_factor = 0 as the default behavior for ipopt-watertap
  • Remove now redundant code and tests from ipopt-watertap
  • Adjust models and tests as needed for the changed default settings.

Legal Acknowledgement

By contributing to this software project, I agree to the following terms and conditions for my contribution:

  1. I agree my contributions are submitted under the license terms described in the LICENSE.txt file at the top level of this directory.
  2. I represent I am authorized to make the contributions and grant the license. If my employer has rights to intellectual property that includes these contributions, I represent that I have received permission to make contributions and grant the required license on behalf of that employer.

codecov bot commented Oct 4, 2023

Codecov Report

All modified lines are covered by tests ✅

Comparison is base (5b582df) 94.75% compared to head (c449a2e) 94.89%.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1162      +/-   ##
==========================================
+ Coverage   94.75%   94.89%   +0.13%     
==========================================
  Files         341      341              
  Lines       34549    34524      -25     
==========================================
+ Hits        32737    32760      +23     
+ Misses       1812     1764      -48     
Files Coverage Δ
watertap/core/plugins/solvers.py 99.10% <100.00%> (+0.56%) ⬆️
watertap/unit_models/electrodialysis_1D.py 96.92% <ø> (ø)
watertap/unit_models/mvc/components/compressor.py 100.00% <ø> (ø)
watertap/unit_models/reverse_osmosis_base.py 96.81% <ø> (ø)

... and 5 files with indirect coverage changes


@avdudchenko (Contributor) left a comment

I think this is a positive change to WT: in my tests it results in more consistent solving behavior on complex flowsheets. If a model solves, it will generally solve consistently, and changing scaling, bounds, etc. produces immediate effects. With the current main implementation, changes to bounds in scaling/initialization might help solve one state, but the model would be marginally stable afterwards. This is an anecdotal observation rather than a quantitative one. My hypothesis is that relaxing the 0 bound allows small negative values that can drive instability in some flowsheets, and this was the primary impact of using Reals vs. NonNegativeReals; the behavior was really an issue if the model was marginally scaled or not well conditioned. I've seen this directly when initializing the NF-DSPM-DE model with a wrong initial guess: I would get a warning that a value was set below the lower bound of 0 to a tiny negative value (-1e-15 or -1e-25). Sometimes this would not impact the model and it would solve fine; sometimes it would break.

P.S. Looks like updates to MCAS and NF scaling made those flowsheets solve with this change.
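The instability hypothesis above can be illustrated with a toy example (mine, not from the PR): an expression defined only for nonnegative arguments fails once a relaxed 0 bound admits a tiny negative value.

```python
import math

x = -1e-15  # a value a relaxed lower bound of 0 can admit

try:
    math.sqrt(x)   # only defined for x >= 0; raises for negative input
    ok = True
except ValueError:
    ok = False
# ok is False: even a -1e-15 violation breaks the evaluation.
```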

@@ -946,13 +946,16 @@ def run_case2(
init_options = {**solver.options}
init_options["tol"] = init_tol
init_options["constr_viol_tol"] = init_tol
init_options["ma27_pivtol"] = 5e-1
Contributor

Have we tested this as a global default? Or does that break many flowsheets?

Contributor Author

I have not tested it extensively. I think I will explore it as a next step.

Ipopt will increase this automatically if it thinks it's "needed", so an in-between would be to set ma27_pivtolmax higher than the current default of 1e-04.
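The escalation described here can be sketched roughly as a capped update; the growth factor below is a placeholder for illustration, not Ipopt's actual update rule.

```python
def next_pivtol(current, pivtolmax=1e-4, growth=10.0):
    """Sketch: when factorization looks unstable, the MA27 pivot
    tolerance is increased, but never beyond ma27_pivtolmax.
    The growth factor of 10 is an assumed illustration."""
    return min(current * growth, pivtolmax)

# Raising ma27_pivtolmax above the 1e-4 default would give Ipopt
# more headroom before this cap is hit.
```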

@@ -105,7 +105,7 @@ def test_solution(self, model):
value(model.fs.unit.inlet.flow_mass_comp[t, j]), rel=1e-5
) == value(model.fs.unit.outlet.flow_mass_comp[t, j])

assert pytest.approx(8.4491446e-11, rel=1e-5) == value(
assert pytest.approx(1.8449145e-10, rel=1e-5) == value(
Contributor

For all intents and purposes this is 0 in our eyes. Should we not typically test that the value is <= 1e-8, which is linked to the tolerances of our constraints?

Contributor Author

Yes, probably. But there are lots of small value comparisons in the ZO unit model tests, so I chose to preserve this for consistency.
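The alternative check discussed in this thread could look like the following sketch (the helper name is mine, not from the codebase): assert against an absolute tolerance tied to the constraint tolerance instead of pinning an exact tiny value.

```python
def assert_effectively_zero(value, tol=1e-8):
    """Treat solver-level noise below tol as zero, rather than
    asserting the exact value Ipopt happens to return."""
    assert abs(value) <= tol, f"{value} exceeds tolerance {tol}"

# Passes for either 8.4491446e-11 or 1.8449145e-10, so the test
# would not need updating when solver settings change.
```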

watertap/examples/chemistry/tests/test_pure_water_pH.py (outdated review thread, resolved)
@TimBartholomew (Contributor) left a comment

LGTM

@@ -658,7 +658,7 @@ def build(self):
self.flowsheet().time,
self.diluate.length_domain,
initialize=0.9,
bounds=(0, 1),
bounds=(0, 1 + 1e-10),
Contributor

What happened without this?

Contributor Author

If it were possible to eliminate these bounds that would probably be better.

@@ -148,7 +148,7 @@ def test_mvc():
solver = get_solver()
initialize(m, solver=solver)

results = solver.solve(m, tee=False)
results = solver.solve(m, tee=True)
Contributor

Should we set back to False?

Contributor Author

No, because pytest suppresses the output by default, and if the test fails we always want to look at the solver logs.

Which is part of the reason I left some other debugging outputs in the tests as well. I don't see any point in not having them immediately when needed.

@lbibl (Contributor) commented Oct 5, 2023

Changes on ED files look ok to me.

@ksbeattie ksbeattie merged commit eb50c18 into watertap-org:main Oct 5, 2023
24 checks passed
Labels
bugfix Priority:High High Priority Issue or PR