Otter Grade Write Individual Notebook Results #826

Merged · 2 commits · Aug 28, 2024
CHANGELOG.md (1 addition, 1 deletion)
@@ -1,7 +1,7 @@
# Changelog

**v5.6.0 (unreleased):**

* Updated Otter Grade to write a grading summary for each notebook per [#814](https://github.com/ucbds-infra/otter-grader/issues/814)
* Updated Otter Grade CSV to indicate which notebooks time out per [#813](https://github.com/ucbds-infra/otter-grader/issues/813)
* Updated Otter Grade CSV to include the number of points per question in the first row
* Updated Otter Grade CSV to include a total points column
otter/cli.py (1 addition)

@@ -132,6 +132,7 @@ def generate_cli(*args, **kwargs):
@click.option("-a", "--autograder", default=defaults["autograder"], help="Path to autograder zip file")
@click.option("-o", "--output-dir", default=defaults["output_dir"], help="Directory to which to write output")
@click.option("--ext", default=defaults["ext"], type=click.Choice(_ALLOWED_EXTENSIONS), help="The extension to glob for submissions")
@click.option("--summaries", is_flag=True, help="Whether to write the otter run results for each graded notebook")
@click.option("--pdfs", is_flag=True, help="Whether to copy notebook PDFs out of containers")
@click.option("--containers", default=defaults["containers"], type=click.INT, help="Specify number of containers to run in parallel")
@click.option("--image", default=defaults["image"], help="A Docker image tag to use as the base image")
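For context, here is a minimal sketch of exercising the new flag through Click's test runner. The top-level `cli` entry point and the positional submissions path are assumptions about the otter-grader CLI layout, not details confirmed by this PR:

```python
# Sketch: invoking the grade command with the new --summaries flag via
# click.testing.CliRunner. The `cli` import and the positional submissions
# path are assumptions; only the flags shown in the diff above are real.
from click.testing import CliRunner

from otter.cli import cli  # assumed name of the top-level click group

runner = CliRunner()
result = runner.invoke(
    cli,
    [
        "grade",
        "-a", "./autograder.zip",   # path to the autograder zip
        "-o", "./grading-output",   # directory for final_grades.csv etc.
        "--summaries",              # new flag added in this PR
        "./submissions",            # assumed positional submissions path
    ],
)
assert result.exit_code == 0
```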
otter/grade/__init__.py (13 additions)

@@ -10,6 +10,7 @@
from .utils import (
    merge_csv,
    prune_images,
    POINTS_POSSIBLE_LABEL,
    SCORES_DICT_FILE_KEY,
    SCORES_DICT_PERCENT_CORRECT_KEY,
    SCORES_DICT_TOTAL_POINTS_KEY,
@@ -33,6 +34,7 @@ def main(
    autograder: str = "./autograder.zip",
    containers: int = 4,
    ext: str = "ipynb",
    summaries: bool = False,
    no_kill: bool = False,
    image: str = "ubuntu:22.04",
    pdfs: bool = False,
@@ -155,6 +157,17 @@ def main(
# write to CSV file
output_df.to_csv(os.path.join(output_dir, "final_grades.csv"), index=False)

# write score summaries to files
if summaries:
    grading_summary_path = os.path.join(output_dir, "grading-summaries")
    if not os.path.exists(grading_summary_path):
        os.mkdir(grading_summary_path)
    for df in grade_dfs:
        df_dict = df.to_dict()
        if df_dict['file'][0] != POINTS_POSSIBLE_LABEL:
            with open(os.path.join(grading_summary_path, f"{df_dict['file'][0]}.txt"), mode="w") as f:
                f.write(df_dict["summary"][0])

# return percentage if a single file was graded
if len(paths) == 1 and os.path.isfile(paths[0]):
    return output_df[SCORES_DICT_PERCENT_CORRECT_KEY][1]
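With the block above in place, a programmatic run can opt in to per-notebook summaries. A minimal usage sketch follows; the assignment name and paths are placeholders, and a real run requires Docker to be available:

```python
# Hypothetical programmatic use of the new `summaries` flag. The name and
# paths below are placeholders, not values from this PR, and the keyword
# arguments are assumed to match the test shown later in this diff.
import os

from otter.grade import main

main(
    name="hw01",                    # assumed assignment name
    paths=["./submissions"],        # directory of .ipynb submissions
    output_dir="./grading-output",
    autograder="./autograder.zip",
    summaries=True,                 # write grading-summaries/<file>.txt files
)

# Each graded notebook should now have a plain-text summary on disk.
for fn in os.listdir("./grading-output/grading-summaries"):
    print(fn)
```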
test/test_cli.py (4 additions)

@@ -479,6 +479,10 @@ def test_grade(mocked_grade, run_cli):
assert_cli_result(result, expect_error=False)
mocked_grade.assert_called_with(**{**std_kwargs, "ext": ext})

result = run_cli([*cmd_start, "--summaries"])
assert_cli_result(result, expect_error=False)
mocked_grade.assert_called_with(**{**std_kwargs, "summaries": True})

result = run_cli([*cmd_start, "--pdfs"])
assert_cli_result(result, expect_error=False)
mocked_grade.assert_called_with(**{**std_kwargs, "pdfs": True})
test/test_grade/test_integration.py (23 additions)

@@ -37,6 +37,8 @@ def cleanup_output(cleanup_enabled):
shutil.rmtree("test/submission_pdfs")
if os.path.exists(ZIP_SUBM_PATH):
    os.remove(ZIP_SUBM_PATH)
if os.path.exists("test/grading-summaries"):
    shutil.rmtree("test/grading-summaries")


@pytest.fixture(autouse=True, scope="module")
@@ -362,3 +364,24 @@ def test_config_overrides_integration():
got = got.reindex(sorted(got.columns), axis=1)
want = want.reindex(sorted(want.columns), axis=1)
assert got.equals(want)


@pytest.mark.slow
@pytest.mark.docker
def test_grade_summaries():
"""
Checks that are grading summaries are written to the disck
"""
    grade(
        name=ASSIGNMENT_NAME,
        paths=[FILE_MANAGER.get_path("notebooks")],
        output_dir="test/",
        autograder=AG_ZIP_PATH,
        summaries=True,
    )
    for filename in os.listdir(FILE_MANAGER.get_path("notebooks")):
        file_path = os.path.join("test/grading-summaries", filename)
        if os.path.isfile(file_path):
            with open(file_path, 'r') as file:
                assert file.read()
Member:
Regardless of what's being written, this test should check that the contents are correct.
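
A sketch of what such a contents check could look like inside the loop above; the expected fragment and the `.txt` file naming are assumptions, not details confirmed by this PR:

```python
# Sketch of a contents assertion for each written summary file.
# EXPECTED_FRAGMENT is a placeholder for text the deterministic grading
# output is known to contain; the "<notebook>.txt" naming is inferred
# from the __init__.py diff above.
import os

EXPECTED_FRAGMENT = "q1"  # assumed text appearing in every summary

summary_path = os.path.join("test/grading-summaries", f"{filename}.txt")
with open(summary_path) as f:
    contents = f.read()
assert EXPECTED_FRAGMENT in contents
```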

Member:
PTAL at this as well.

Contributor Author:
Good deal -- checking the contents of summary files now

Contributor Author:
I am not quite sure how you would mock this -- the summary files are written in the main function of grade/__init__.py. I am checking the files that are written to disk. Maybe I am missing something here?

Member:
Since the changes are in __init__.py, you can mock out the launch_containers function so it doesn't actually run but instead just returns some predetermined results. There's an example of this in test_config_overrides above (the mock is created with the @mock.patch(...) decorator, and the return value is set via mocked_launch_grade.return_value).
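
For reference, a sketch of this suggestion; the patch target, DataFrame columns, and summary file naming are inferred from the diff above and not verified against the full otter-grader source, and a real test may need extra columns for merge_csv to succeed:

```python
# Sketch of the reviewer's suggestion: patch launch_containers so the test
# never starts Docker, and feed main() a predetermined grading result.
# The patch target and DataFrame shape are assumptions based on this diff.
from unittest import mock

import pandas as pd

from otter.grade import main as grade


@mock.patch("otter.grade.launch_containers")
def test_grade_summaries_mocked(mocked_launch_grade, tmp_path):
    # Pretend one notebook was graded, with a known summary text.
    mocked_launch_grade.return_value = [
        pd.DataFrame({"file": ["hw01.ipynb"], "summary": ["All tests passed."]}),
    ]

    grade(
        name="hw01",                   # hypothetical assignment name
        paths=["./submissions"],       # hypothetical submissions path
        output_dir=str(tmp_path),
        autograder="./autograder.zip",
        summaries=True,
    )

    # The summary file name follows f"{df_dict['file'][0]}.txt" in the diff.
    summary_file = tmp_path / "grading-summaries" / "hw01.ipynb.txt"
    assert summary_file.read_text() == "All tests passed."
```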

Contributor Author:
Ok, wait, I am with you. Give me a second.

Contributor Author:
I rebased and force-pushed the last commit to avoid adding another commit here.

I think I got it. Good deal
