Grading with Jupyter notebook
To autograde a Jupyter notebook, you can always extract the source code using nbextract and then use the Vocareum grading method just like any other assignment.
For Python notebooks, the platform supports the nbgrader flow for automatic assessment.
1. Select "Configure Workspace" and then Jupyter=>Launch Jupyter Server. This launches a Jupyter notebook server from the "work" directory.
2. You can upload your notebook with or without the nbgrader tags at this point. If your notebook already has the auto-grade tags defined, you can skip to the next step. Otherwise, tag the cells using the drop-down menu that appears before each cell -
- Solution - Students can edit this cell.
- Read-only - This cell will not be editable by the student. Before auto-grading, this cell is replaced with the original cell in case the student's submission changed it.
- Auto grader tests - This cell is identified as a test block. The student will not be able to edit the cell. When you select it, you will be asked to specify a name and the points associated with that cell. When auto-grading is executed, the student is assigned the maximum points for this item if no assertion is triggered in this cell, and 0 otherwise.
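As a sketch of how these cell types fit together, here is a hypothetical pair of cells for a simple assignment (the function name `square`, the test-cell name, and the point value are all illustrative, not part of the platform):

```python
# --- Solution cell (tagged "Solution"; students edit this) ---
def square(x):
    # The instructor's reference answer; in the released starter code
    # this body would be stripped out for students to fill in.
    return x * x

# --- Autograder test cell (tagged "Auto grader tests",
# named e.g. "test_square" and worth e.g. 5 points) ---
# Full points are awarded only if no assertion is triggered here.
assert square(2) == 4, "square(2) should be 4"
assert square(-3) == 9, "square(-3) should be 9"
```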
3. After you have created the notebook, you can select Jupyter=>Release Notebook(s). You will be presented with the dialog box shown below. Set the options as follows -
- Extract rubric items from notebook(s) - You will need to select this most of the time. The platform uses the cell names as the names of the grading items.
- Delete unused grading items - If selected, all unused grading items will be deleted. If you have manually added grading items, deselect this option to prevent their deletion.
- Create resource link in starter code - If selected, a link to the resource directory is created in the student work area.
- Overwrite existing grading script - If selected, the platform overwrites the contents of the grading script with an auto-generated one.
- Show traceback for errors - Select this to show the tracebacks of the assertions that are triggered in the autograder cells.
- Show HTML feedback report - If selected, the student's "grading report" is replaced with the HTML rendering of the notebook the student submitted. You can provide feedback to the student via assertion messages in the autograder cells or by selecting "Show traceback for errors". If deselected, a text file is generated for the student report instead.
Select the "Release" button after setting the options.
4. Select "Update" to deploy the changes. Students should now see the "released" notebook when they start the assignment. When they submit it, the grading script autogrades the cells.
Special tags in starter code
Any section of a Python cell starting with ### BEGIN SOLUTION will be stripped out until ### END SOLUTION is encountered and replaced with # YOUR SOLUTION HERE. You can replace "# YOUR SOLUTION HERE" with your own string by adding the "TEMPLATE" attribute -
### BEGIN SOLUTION TEMPLATE=<string>
In an auto-grader test cell of the starter code, if the platform encounters ### BEGIN HIDDEN TESTS, the code following it will be removed until ### END HIDDEN TESTS is reached. Note that the removed section is restored before the auto-grader is run.
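Putting the two marker pairs together, an instructor's notebook cell might look like the following minimal sketch (the function `cube` and its tests are hypothetical examples):

```python
def cube(x):
    ### BEGIN SOLUTION TEMPLATE=# Implement cube here
    # This body is stripped from the student's copy and replaced
    # with the TEMPLATE string above.
    return x ** 3
    ### END SOLUTION

# Visible test: students see this assertion in their copy.
assert cube(2) == 8, "cube(2) should be 8"

### BEGIN HIDDEN TESTS
# These checks are removed from the released starter code but
# restored before the auto-grader runs.
assert cube(-2) == -8, "cube(-2) should be -8"
assert cube(0) == 0, "cube(0) should be 0"
### END HIDDEN TESTS
```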
When auto-grading is run, a file <notebook name>_feedback.html is generated. You can manually add the following line to the grading script to make this file the report that students see when they select Details=>View Grading Report.
vocReportFile <notebook name>_feedback.html
This file includes an HTML representation of the original notebook submitted by the student. In addition, the assertion messages generated by the auto-grade test blocks are included. You can use this to provide additional automated feedback to students. Note that only the message associated with the first assertion that is triggered is reported.
For example, your auto-graded test might include the following statement -
print('Your solution passed the tests.')