For autograding Jupyter notebooks, you can always extract the source code using nbextract and then use the standard Vocareum grading method, just like any other assignment.

For Python notebooks, the platform supports the nbgrader flow for automatic assessment.

3. After you have created the notebook, select Jupyter=>Release Notebook(s). You will be presented with the dialog box shown below. Set the options as follows -

- Extract rubric items from notebook(s) - You will need to select this most of the time. The platform will use the cell names as the names of the grading items.

- Delete unused grading item - If selected, all unused grading items will be deleted. If you have manually added grading items, deselect this option to prevent them from being deleted.

- Create resource link in starter code - If selected, a link to the resource directory is created in the student work area.

- Overwrite existing grading script - If selected, the platform will overwrite the contents of the grading script with an auto-generated one.

- Show traceback for errors - Select this to show the traceback of assertions triggered in the autograder cells.

- Show html feedback report - If selected, the student's "grading report" will be replaced with the HTML of the notebook that the student submitted. You can provide feedback to the student through assertion messages in the autograder cells or by selecting "Show traceback for errors". If deselected, a text file is generated as the student report.

Select "Release" button after setting the options -

4. Select "Update" to deploy the changes. Students should now see the "released" notebook when they start the assignment. When they submit it, the grading script should autograde the cells.

Special tags in starter code

Any section of a Python cell starting with ### BEGIN SOLUTION will be stripped out until ### END SOLUTION is encountered and replaced with # YOUR SOLUTION HERE. You can replace "# YOUR SOLUTION HERE" with your own string by adding the TEMPLATE attribute -

### BEGIN SOLUTION TEMPLATE=<string>
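For example, here is a sketch of an instructor solution cell (the square function is a hypothetical example, and this assumes the TEMPLATE string may contain spaces). On release, the platform strips the body between the markers and substitutes the TEMPLATE string -

```python
def square(x):
    ### BEGIN SOLUTION TEMPLATE=# TODO: compute and return x squared
    return x * x  # instructor solution, removed from the released notebook
    ### END SOLUTION
```

Students would receive the cell with the function body replaced by # TODO: compute and return x squared.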

In an auto-grader test cell of the starter code, if the platform encounters ### BEGIN HIDDEN TESTS, the code following it will be removed until ### END HIDDEN TESTS is reached. Note that the removed section is restored before the auto-grader is run, so hidden tests still count toward the grade.
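For example, a sketch of an autograder test cell with hidden tests (the square function and assertion messages are hypothetical) -

```python
# Visible test: included in the released notebook that students see.
assert square(2) == 4, "square(2) should equal 4"
### BEGIN HIDDEN TESTS
# Hidden tests: stripped from the released notebook, restored at grading time.
assert square(-3) == 9, "check that negative inputs are handled"
### END HIDDEN TESTS
```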

Generating Feedback

When auto-grading is run, a file named <notebook name>_feedback.html is generated. You can manually add the following command to the grading script to make this file the report file that the student sees when they select Details=>View Grading Report -

vocReportFile <notebook name>_feedback.html
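For example, if the released notebook were named assignment1.ipynb (a hypothetical name), the line in the grading script would be -

vocReportFile assignment1_feedback.html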

This file includes an HTML representation of the original notebook submitted by the student. In addition, the assertion messages generated by the auto-grader test blocks are included. You can use this to provide additional automated feedback to students. Note that only the message associated with the first assertion that is triggered is reported.

For example, your auto-graded test might include the following statement -

assert <condition>, "please check your work again"
print('Your solution passed the tests.')

If the test fails, the student will see the message "please check your work again"; if it succeeds, they will see "Your solution passed the tests." (the print statement only runs when the assertion passes).

Adding data to your notebooks

You can always upload data to the "publicdata" directory in "Standard View". Students can refer to this directory in the notebook using the path ../resource/asnlib/publicdata.
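For example, a minimal sketch of reading an uploaded file from the notebook (scores.csv is a hypothetical file name) -

```python
# Open a data file that was uploaded to the "publicdata" directory.
with open("../resource/asnlib/publicdata/scores.csv") as f:
    data = f.read()
```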
