add calculation of dely for components of a composite model #831
Conversation
Codecov Report
@@ Coverage Diff @@
## master #831 +/- ##
==========================================
- Coverage 93.71% 93.68% -0.03%
==========================================
Files 9 9
Lines 3483 3501 +18
==========================================
+ Hits 3264 3280 +16
- Misses 219 221 +2
azure-pipelines.yml
Outdated
python -m pip install --upgrade build pip wheel
python -m pip install asteval==0.9.22 numpy==1.18 scipy==1.4.0 uncertainties==3.1.4
python -m pip install --upgrade pip wheel
pip install setuptools==57.5.0
arggghhh.... setuptools is breaking things again :( It looks like NumPy pins their version to 59.2.0, but even if I try that I still see failures in the CI - similar to the ones here in this PR.
I'll try and fix that, so don't worry about or waste time on that part - hopefully soon, and then I'll rebase/push to this PR and it should be good to go...
@reneeotten I kind of think we actually have a serious problem here. When I see errors, my default reaction has become "oh, something in the test setup is in error" - either some value in our own tests (as with 99b13a4) or the CI system itself. I trust that running the tests locally will identify problems with the code. But it kind of seems like passing the Azure tests is more about "getting the Azure tests to pass" and rarely has anything to do with our code. Honestly, that could all be restated as: I have lost confidence in our tests. That is not good. I am going to merge this PR and release 1.0.4 by Nov 28. Tests that are failing before they even attempt to install lmfit will be removed.
This adds calculation of dely for model components to ModelResult.eval_uncertainties, addressing #761
It replaces #826
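The per-component dely follows the same idea as the existing composite dely: propagate the fitted parameter covariance through the model to get a 1-sigma band on the prediction. A minimal, library-free sketch of that technique (the function name and the finite-difference gradient here are illustrative assumptions, not lmfit's actual implementation):

```python
import numpy as np

def eval_uncertainty(model, params, covar, x, eps=1e-8):
    """Propagate the parameter covariance `covar` to a 1-sigma
    uncertainty band on model(x, params).

    Illustrative sketch only: lmfit computes the gradients and
    handles constrained parameters differently.
    """
    params = np.asarray(params, dtype=float)
    f0 = model(x, params)
    grads = []
    for i, p in enumerate(params):
        # forward finite-difference derivative of the model w.r.t. p_i
        dp = eps * max(abs(p), 1.0)
        pp = params.copy()
        pp[i] += dp
        grads.append((model(x, pp) - f0) / dp)
    J = np.array(grads)  # shape (n_params, n_x)
    # dely_j = sqrt( J_:,j^T  C  J_:,j ) at each x_j
    return np.sqrt(np.einsum('ij,ik,kj->j', J, covar, J))

# Example: y = a*x + b with sigma_a = 0.1, sigma_b = 0.2 (uncorrelated),
# so dely = sqrt(x**2 * 0.01 + 0.04)
linear = lambda x, p: p[0] * x + p[1]
x = np.array([0.0, 1.0])
dely = eval_uncertainty(linear, [2.0, 1.0], np.diag([0.01, 0.04]), x)
```

For a composite model, applying the same propagation to each component's own function (with the shared covariance matrix) yields the per-component bands this PR exposes.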
Type of Changes
Tested on
Python: 3.9.13 | packaged by conda-forge | (main, May 27 2022, 17:01:00)
[Clang 13.0.1 ]
lmfit: 1.0.3.post76+g44eb363.d20221120, scipy: 1.9.3, numpy: 1.23.4, asteval: 0.9.28, uncertainties: 3.1.7
Verification
Have you