
Splits integration / examples / unit tests #204

Merged
merged 18 commits into from
Feb 23, 2024

Conversation

BradyPlanden
Member

This PR adds the following to close #139,

  • Splits integration and unit tests
  • Updates pytest --examples
  • Adds a scheduled workflow trigger
  • Adds a test_parameterisation cost/optimiser test matrix
  • Tightens the performance assertions for integration tests

It also,

  • updates the default max_unchanged_iterations from 25 to 5. This value was selected as a trade-off between robustness and performance; I'm open to other defaults, however.
  • adds _evaluateS1 to the RootMeanSquared cost function
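The split pytest flags listed above are typically wired up through a conftest.py. Below is a minimal sketch of how such flags can gate marked tests — the group names, layout, and helper function are assumptions for illustration, not the actual PyBOP conftest:

```python
# conftest.py -- hypothetical sketch of marker-based test splitting.
import pytest

GROUPS = ("unit", "integration", "examples", "plots")


def should_skip(test_markers, enabled_groups):
    """True if the test belongs to a group whose flag was not passed."""
    return any(g in test_markers and g not in enabled_groups for g in GROUPS)


def pytest_addoption(parser):
    # Register one boolean flag per test group, e.g. --unit, --examples.
    for g in GROUPS:
        parser.addoption(f"--{g}", action="store_true", default=False,
                         help=f"run tests marked '{g}'")


def pytest_configure(config):
    # Declare the markers so pytest does not warn about unknown marks.
    for g in GROUPS:
        config.addinivalue_line("markers", f"{g}: {g} tests")


def pytest_collection_modifyitems(config, items):
    # Skip any collected test whose group flag was not given on the CLI.
    enabled = {g for g in GROUPS if config.getoption(f"--{g}")}
    for item in items:
        if should_skip(set(item.keywords), enabled):
            item.add_marker(pytest.mark.skip(reason="group flag not given"))
```

With a layout like this, `pytest --unit --integration --plots` runs exactly the groups a coverage session needs, while `pytest --examples` runs the examples alone.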


codecov bot commented Feb 16, 2024

Codecov Report

Attention: Patch coverage is 81.81818%, with 2 lines in your changes missing coverage. Please review.

Project coverage is 94.29%. Comparing base (8a17a99) to head (54e97f5).
Report is 14 commits behind head on develop.

❗ Current head 54e97f5 differs from pull request most recent head 29c6ff2. Consider uploading reports for the commit 29c6ff2 to get more accurate results

Files Patch % Lines
pybop/costs/fitting_costs.py 80.00% 2 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##           develop     #204      +/-   ##
===========================================
- Coverage    94.70%   94.29%   -0.41%     
===========================================
  Files           44       32      -12     
  Lines         1887     1614     -273     
===========================================
- Hits          1787     1522     -265     
+ Misses         100       92       -8     


@BradyPlanden
Member Author

@NicolaCourtier @martinjrobins this PR splits the tests into unit / integration / examples / plots and now uses pytest-xdist for quite a speed-up.

Unfortunately, we have been relying on the examples to boost our coverage, which has resulted in the large drop we are experiencing. I'm keen to move away from using examples as we shouldn't include asserts within them, but that leads us to the current situation. My preference moving forward would be to take the one-time hit and merge this PR once it is reviewed, then add new tests to increase our coverage in the PRs leading up to our next release (the one after 24.2). Does that sound fine to everyone?

@NicolaCourtier
Member

Hi @BradyPlanden, quick question: which tests are still included in the coverage, and when is test_examples used?

@BradyPlanden
Member Author

BradyPlanden commented Feb 22, 2024

The coverage tests are defined in the noxfile located here. This runs the unit, integration, and plots tests as per below.

import nox

# PYBOP_SCHEDULED and PYBAMM_VERSION are module-level constants in the noxfile.
@nox.session
def coverage(session):
    session.install("-e", ".[all,dev]", silent=False)
    if PYBOP_SCHEDULED:
        session.run("pip", "install", f"pybamm=={PYBAMM_VERSION}", silent=False)
    session.run(
        "pytest",
        "--unit",
        "--integration",
        "--plots",
        "--cov",
        "--cov-report=xml",
    )

The current state of this PR doesn't run the examples within GitHub Actions, but I think adding them to the test_on_push workflow would be a good addition.

@NicolaCourtier
Member

If test_examples were added to test_on_push, would this solve the coverage issue or is there another issue here? I mean, ideally everything should be covered by unit and integration tests, but I think it would be sensible to keep the coverage up until we have expanded the other tests. What do you think?

@BradyPlanden
Member Author

BradyPlanden commented Feb 22, 2024

Unfortunately, I don't think it would. It would ensure that we test the examples, but the current coverage values are inflated by including these example tests. While the examples exercise our code base, we don't have any tests on them, so they only tell us that they don't throw errors.

The examples could be added back to the coverage session, but I think that this would be a plaster on the current situation. We would lose the information needed to sort out the coverage (since Codecov would treat these areas as tested).

Given the above, two potential options are:

  • Add all the necessary tests to this PR to improve our coverage.
  • Merge this PR and open new PRs that add coverage (most of the missing areas seem to be in plots, so adding them to Increase plotting capabilities #177 would be a good fit).

My preference would be the latter option, as adding all the tests to this PR would be quite cumbersome. Although, perhaps I'm missing an option! Open to other suggestions.

@NicolaCourtier
Member

I think it's important we keep testing whether the examples throw any errors. My preference would be to include test_examples in the coverage for now and merge this PR, and then update the tests and check coverage in a separate PR.

@martinjrobins
Contributor

> I think it's important we keep testing whether the examples throw any errors. My preference would be to include test_examples in the coverage for now and merge this PR, and then update the tests and check coverage in a separate PR.

We can separate these two things, though: keep testing the examples, and just exclude the examples from the coverage reports. I think that including examples in coverage is a bit misleading, as we're not really testing all the code that the examples run. Happy to take a coverage hit on this PR as long as we're adding more tests in subsequent PRs.
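One way to realise the suggestion above — still running the examples in CI while keeping them out of the report — is to exclude the example files from coverage measurement. This is a sketch with hypothetical paths; the actual PyBOP layout and configuration may differ:

```
# .coveragerc -- hypothetical sketch; adjust paths to the real repository layout.
[run]
omit =
    examples/*
    */examples/*
```

With this in place, a coverage session can still invoke the example tests without Codecov treating the example files themselves as covered source.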

@BradyPlanden
Member Author

BradyPlanden commented Feb 22, 2024

Excellent, @NicolaCourtier and I had a chat offline and we are both on board with moving this forward and making up the coverage in follow-on PRs. Thanks for the discussion everyone!

@agriyakhetarpal, this is now ready for a review when you get a chance.

@BradyPlanden BradyPlanden linked an issue Feb 22, 2024 that may be closed by this pull request
Member

@agriyakhetarpal agriyakhetarpal left a comment


Thanks, @BradyPlanden! I agree with Martin: the examples tests are best left out of the coverage report, since the rest of the tests should take care of that.

I would also suggest taking advantage of the recent Python 3.12 functionality that benefits the coverage module: see python-pillow/Pillow#7820 for an example, and the coverage module's release notes. This is experimental for now, but it does not hurt to try it out if it makes the coverage tests slightly breezier!

Resolved review threads: .github/workflows/test_on_push.yaml (2), noxfile.py (4)
BradyPlanden and others added 6 commits February 23, 2024 08:58
Co-authored-by: Agriya Khetarpal <74401230+agriyakhetarpal@users.noreply.github.com>
Co-authored-by: Agriya Khetarpal <74401230+agriyakhetarpal@users.noreply.github.com>
Co-authored-by: Agriya Khetarpal <74401230+agriyakhetarpal@users.noreply.github.com>
@BradyPlanden
Member Author

Ready for another look @agriyakhetarpal. Let's take a look at the 3.12 coverage improvements in a new issue :)

Member

@agriyakhetarpal agriyakhetarpal left a comment


Thanks @BradyPlanden, these changes all look good to me now!

NicolaCourtier and others added 2 commits February 23, 2024 11:38
* Create test_plots.py

* Parametrize test_models

* Add check on n_states

* Add test for json parameter set

* Add test for json export

* Add test for invalid max values

* Add optimisation tests

* Add observer tests

* Add tests on observer evaluate

* Add tests on invalid parameter inputs

* Add invalid sample size tests
@BradyPlanden BradyPlanden merged commit c482583 into develop Feb 23, 2024
29 checks passed
@BradyPlanden BradyPlanden deleted the 139-split-integration-unit-tests branch February 23, 2024 12:47
@agriyakhetarpal agriyakhetarpal mentioned this pull request Feb 25, 2024
3 tasks
Development

Successfully merging this pull request may close these issues.

  • Revisit pytest-xdist for parallel testing
  • Split tests into integration & unit
4 participants