
add calculation of dely for components of a composite model #831

Merged
merged 5 commits on Nov 21, 2022

Conversation

newville
Member

This adds calculation of dely for model components to ModelResult.eval_uncertainty, addressing #761.

It replaces #826.
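For context, here is a minimal sketch of how the new per-component dely might be used. The dely_comps attribute and its keying by component prefix are taken from the PR discussion, so treat them as assumptions; the model and data are purely illustrative.

import numpy as np
from lmfit.models import GaussianModel, LinearModel

# Synthetic data: a Gaussian peak on a linear background.
np.random.seed(0)
x = np.linspace(0, 10, 201)
y = (3.0 * np.exp(-(x - 5.0)**2 / (2 * 0.8**2))
     + 0.1 * x + 0.5
     + np.random.normal(scale=0.1, size=x.size))

model = GaussianModel(prefix='peak_') + LinearModel(prefix='bkg_')
params = model.make_params(peak_amplitude=5, peak_center=5,
                           peak_sigma=1, bkg_slope=0, bkg_intercept=1)
result = model.fit(y, params, x=x)

# 1-sigma uncertainty band for the full composite model.
dely = result.eval_uncertainty(sigma=1)

# With this PR, per-component bands should also be available,
# keyed by component prefix ('peak_', 'bkg_'); the attribute
# name is an assumption based on the PR description.
for name, comp_dely in result.dely_comps.items():
    print(name, comp_dely.max())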

Type of Changes
  • Bug fix
  • New feature
  • Refactoring / maintenance
  • Documentation / examples
Tested on

Python: 3.9.13 | packaged by conda-forge | (main, May 27 2022, 17:01:00)
[Clang 13.0.1 ]

lmfit: 1.0.3.post76+g44eb363.d20221120, scipy: 1.9.3, numpy: 1.23.4, asteval: 0.9.28, uncertainties: 3.1.7

Verification

Have you

  • included docstrings that follow PEP 257?
  • referenced existing Issue and/or provided relevant link to mailing list?
  • verified that existing tests pass locally?
  • verified that the documentation builds locally?
  • squashed/minimized your commits and written descriptive commit messages?
  • added or updated existing tests to cover the changes?
  • updated the documentation and/or added an entry to the release notes (doc/whatsnew.rst)?
  • added an example?

codecov bot commented Nov 21, 2022

Codecov Report

Merging #831 (997c0bd) into master (44eb363) will decrease coverage by 0.03%.
The diff coverage is 92.30%.

@@            Coverage Diff             @@
##           master     #831      +/-   ##
==========================================
- Coverage   93.71%   93.68%   -0.03%     
==========================================
  Files           9        9              
  Lines        3483     3501      +18     
==========================================
+ Hits         3264     3280      +16     
- Misses        219      221       +2     
Impacted Files   | Coverage Δ
lmfit/model.py   | 91.19% <92.30%> (-0.06%) ⬇️


python -m pip install --upgrade build pip wheel
python -m pip install asteval==0.9.22 numpy==1.18 scipy==1.4.0 uncertainties==3.1.4
python -m pip install --upgrade pip wheel
pip install setuptools==57.5.0
Contributor

arggghhh.... setuptools is breaking things again :( It looks like NumPy pins their version to 59.2.0, but even if I try that I still see failures in the CI, similar to the ones in this PR.

I'll try to fix that, so don't worry about or waste time on that part. Hopefully it'll be resolved soon; then I'll rebase/push to this PR and it should be good to go...
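For illustration only, pinning setuptools in the install step quoted above might look like the line below; 59.2.0 is simply the NumPy-pinned version mentioned in this comment and may itself need adjusting:

python -m pip install "setuptools==59.2.0"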

@newville
Member Author

@reneeotten I kind of think we actually have a serious problem here. When I see errors, my default reaction has become "oh, something in the test setup is in error", either some value in our own tests (as with 99b13a4) or the CI system itself.

I trust that running the tests locally will identify problems with the code. But it seems like passing the Azure tests is mostly about "getting the Azure tests to pass" and rarely has anything to do with our code.

Honestly, that could all be restated as: I have lost confidence in our tests. That is not good.

I am going to merge this PR and release 1.0.4 by Nov 28. Tests that are failing before they even attempt to install lmfit will be removed.
