
Use assertions rather than print statements in tests #166

Open · wants to merge 16 commits into main
Conversation

jwallwork23 (Contributor):
Closes #142.

As part of this PR, we introduce a module of test utilities, e.g., for printing test results and making assertions.

@jwallwork23 jwallwork23 added the testing Related to FTorch testing label Sep 13, 2024
@jwallwork23 jwallwork23 self-assigned this Sep 13, 2024
@jwallwork23 jwallwork23 marked this pull request as ready for review September 13, 2024 15:55
jwallwork23 (Contributor, Author):
One discussion to have about these changes is whether we're happy that the examples are becoming more verbose. The versions without assertions will still be used in the docs, but (e.g.) SimpleNet now contains some extra code that might be confusing for beginners.

src/utils.f90 Outdated
Comment on lines 60 to 65
test_pass = (abs(got - expect) <= rtol_value * abs(expect))

if (print_result_value) then
write(message,'("relative tolerance = ", E11.4)') rtol_value
call test_print(test_name, message, test_pass)
end if
jwallwork23 (Contributor, Author):
We refactored the relative error check to avoid divide-by-zero errors. However, a downside is that, upon failure, we can no longer print the relative error itself without further computation and the special-case handling we wanted to avoid. Instead, I just print the relative tolerance used.

Perhaps we could just print the two values being compared?
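To make the divide-by-zero point concrete, here is a minimal sketch of the two forms of the check. The PR's actual code is Fortran; this is Python purely for illustration, and the function names are hypothetical, not part of FTorch.

```python
def relative_check(got: float, expect: float, rtol: float = 1e-5) -> bool:
    """Multiplicative form of the relative tolerance check.

    Equivalent to abs(got - expect) / abs(expect) <= rtol whenever
    expect != 0, but with no division, so expect == 0 is safe: in that
    case the right-hand side is 0 and the test passes only if got == expect.
    """
    return abs(got - expect) <= rtol * abs(expect)


def naive_relative_check(got: float, expect: float, rtol: float = 1e-5) -> bool:
    """Division form: raises ZeroDivisionError when expect == 0."""
    return abs(got - expect) / abs(expect) <= rtol


print(relative_check(1.0000001, 1.0))  # within the default rtol
print(relative_check(0.0, 0.0))        # exact match against zero passes
print(relative_check(1e-12, 0.0))      # any nonzero value vs. zero fails
```

The trade-off described above is visible here: the multiplicative form never computes `abs(got - expect) / abs(expect)`, so there is no relative-error value available to print on failure without recomputing it and special-casing `expect == 0`.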

jwallwork23 (Contributor, Author):

Huh! The CI has spontaneously hit a shape error again:
https://github.com/Cambridge-ICCS/FTorch/actions/runs/10880645825/job/30187925536

As suggested by @TomMelt, we should use valgrind to see if there's a memory leak.
