Grading Jupyter Notebooks

Setting up your Jupyter Notebook for autograding

Written by Kevin Wesley

This article refers to the Jupyter Notebook lab type. If you are using the JupyterLab lab type, please refer to this article instead.

When autograding a Jupyter notebook, you can always extract the source code using nbextract and then use the Vocareum grading method just like any other assignment. For Python notebooks, the platform supports the nbgrader flow for automatic assessment.

1. Select "Configure Workspace" and then Jupyter => Launch Jupyter Server. This will launch the Jupyter notebook server from the "work" directory.

2. You can upload your notebook with or without the nbgrader tags at this point. If your notebook already has the auto-grade tags defined, you can skip to the next step. Otherwise, you can tag the cells using the drop-down menu that appears on each cell:

Read-only - This cell will not be editable by the learner. Before auto-grading, this cell is replaced with the original cell in case the learner's submission has changed it. This is the best place for instructors to put course content and directions.

Solution - Learners can edit this cell.

Autograder Tests - This cell is identified as a test block. The learner will not be able to edit the cell. When you select this option, you will be asked to specify a name and the points associated with that cell. When auto-grading is executed, the learner is assigned 0 points for this item if an assertion is triggered in this cell, and the maximum points otherwise. A sketch of such a cell follows.
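For example, a minimal autograder test cell might look like the following sketch (the function name add_numbers and the point value are hypothetical, not part of the platform):

# Hypothetical test cell, e.g. named "test_add" and worth 5 points.
# If an assertion is triggered (fails), the learner gets 0 points for this item;
# if the cell runs without error, they receive the full points.
assert add_numbers(2, 3) == 5, "add_numbers(2, 3) should return 5"
assert add_numbers(-1, 1) == 0, "add_numbers(-1, 1) should return 0"
print('Correct.')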

3. After you have defined the notebook cells, select Jupyter => Release Notebook(s). You will be presented with a dialog box. Set the options as follows -

- Extract rubric items from notebook(s) - You will want to select this most of the time. The platform will use the cell names as the names of the grading items.

- Delete unused grading items - If selected, all unused grading items will be deleted. If you have manually added grading items, deselect this option to prevent them from being deleted.

- Create resource link in starter code - If selected, a link to the resource directory is created in the student work area.

- Overwrite existing grading script - If selected, the platform will overwrite the contents of the grading script with an auto-generated one.

- Show traceback for errors - Select this to show the traceback of assertions that are triggered in the autograder cells.

- Show html feedback report - If selected, the "grading report" that students see is replaced with the HTML of the notebook the student submitted. You can provide feedback to the student via assertion messages in the autograder cells or by also selecting "Show traceback for errors". If deselected, a plain-text file is generated for the student report instead.

4. Select "Update" to deploy the changes. Students should now see the "released" notebook when they start the assignment. When they submit it, the grading script should autograde the cells.

Special tags in starter code

Any section of a Python cell starting with ### BEGIN SOLUTION will be stripped out until ### END SOLUTION is encountered and replaced with # YOUR SOLUTION HERE. You can replace "# YOUR SOLUTION HERE" with your own string by adding a "TEMPLATE" attribute -

### BEGIN SOLUTION TEMPLATE=<string>
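For illustration, a solution cell in the instructor's notebook might look like this (the function add_numbers is hypothetical):

def add_numbers(a, b):
    ### BEGIN SOLUTION
    return a + b
    ### END SOLUTION

In the released starter code, students would then see the marked section replaced with the placeholder:

def add_numbers(a, b):
    # YOUR SOLUTION HERE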

In the auto-grader test cell of the starter code, if the platform encounters ### BEGIN HIDDEN TESTS, the code following it will be removed until ### END HIDDEN TESTS is reached. Note that the removed section is restored before the auto-grader is run.
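As a sketch, a test cell in the starter code might combine a visible assertion with hidden ones (again using the hypothetical add_numbers):

assert add_numbers(2, 3) == 5, "please check your work again"
### BEGIN HIDDEN TESTS
# Students never see these lines, but they are restored before auto-grading runs.
assert add_numbers(10, -4) == 6, "hidden test failed"
### END HIDDEN TESTS
print('Correct.')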

Generating Feedback

When auto-grading is run, a file <notebook name>_feedback.html is generated. You can manually add the following line to the grading script to make this file the report file that the student sees when they select Details => View Grading Report.

vocReportFile <notebook name>_feedback.html
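For example, if the notebook were named assignment1.ipynb (a hypothetical name), the line added to the grading script would be:

vocReportFile assignment1_feedback.html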

This file includes an HTML representation of the original notebook submitted by the student. In addition, the assertion messages generated by the auto-grade test blocks are included. You can use these to provide additional automated feedback to students. Only the message associated with the first assertion that is triggered is reported.

For example, your auto-graded test might include the following statements -

assert <condition>, "please check your work again"

print('Correct.')

If the test fails, the student sees the message "please check your work again"; if it succeeds, they see "Correct."

Adding Data to Notebooks

You can always upload data to the "publicdata" directory in "Standard View". Students can refer to this directory in the notebook by the path ../resource/asnlib/publicdata.
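As a minimal sketch, a notebook cell could then read an uploaded file from that path (the filename data.csv is hypothetical):

# Read a hypothetical file uploaded to the "publicdata" directory.
with open('../resource/asnlib/publicdata/data.csv') as f:
    data = f.read()
print(data[:200])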
