loaded ModelResult bug (eval_uncertainty() not working) #909

Closed
edr-choi opened this issue Aug 12, 2023 · 1 comment

@edr-choi
First Time Issue Code

Yes, I read the instructions and I am sure this is a GitHub Issue.

Description

Calling eval_uncertainty() raises an AttributeError, but only on a ModelResult that has been loaded with load_modelresult(). The same call works as expected on the ModelResult returned directly by Model.fit().

Matt Newville confirmed this bug on the lmfit mailing list (linked below) and suggested the following fix:

File "..../lmfit/model.py", line 1598, in eval_uncertainty
    for comp in self.components:
should be:
    for comp in self.model.components:
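
Until a release includes that fix, a possible workaround is sketched below. It is not from the mailing-list thread; it assumes that copying the components from the loaded result's own Model is enough to satisfy the failing attribute lookup, and it only papers over that one line, so other differences in loaded results are not addressed.

import numpy as np
from lmfit.model import load_modelresult

loaded_result = load_modelresult("save_result.sav")

# Workaround sketch: give the loaded ModelResult the attribute that the
# buggy line in eval_uncertainty() expects, taken from its own Model.
loaded_result.components = loaded_result.model.components

x = np.array([1, 2, 3, 4, 5])
print(loaded_result.eval_uncertainty(x=x))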

A Minimal, Complete, and Verifiable example
import numpy as np
from lmfit.models import LinearModel
from lmfit.model import save_modelresult, load_modelresult

x = np.array([1, 2, 3, 4, 5])
y = np.array([1, 2, 3, 4, 5])

model = LinearModel()
params = model.guess(y, x=x)
result = model.fit(y, params, x=x)
save_modelresult(result, "save_result.sav")
print(result.eval_uncertainty(x=x))  # THIS WORKS

loaded_result = load_modelresult("save_result.sav")
print(loaded_result.eval_uncertainty(x=x))  # THIS THROWS AttributeError
Fit report:
Error message:
File "..../lmfit/model.py", line 1598, in eval_uncertainty
    for comp in self.components:

AttributeError: 'ModelResult' object has no attribute 'components'
Version information

lmfit: 1.2.2, scipy: 1.11.1, numpy: 1.25.1, asteval: 0.9.31, uncertainties: 3.1.7

Link(s)

https://groups.google.com/g/lmfit-py/c/1Brvz0YamOs/m/u_1g_bubCwAJ?utm_medium=email&utm_source=footer

@newville mentioned this issue Aug 12, 2023

@newville (Member)
This should be fixed with #910, so I'll close this.
