Grading JupyterLab

Setting up JupyterLab for autograding

Written by Kevin Wesley
Updated over a week ago

Using the autograde feature in JupyterLab, you can extract source code from your Jupyter Notebook with nbextract and then grade it with the Vocareum grading system.

1. Select Configure Workspace from the assignment page to open JupyterLab.

2. If your notebook already has auto-grade tags defined, you can upload it directly to the /home/labuser directory in the Jupyter Notebook environment and skip the next step.

3. Once your notebook has been successfully uploaded, open it to start tagging the cells for autograding.

Read-only - The learner will not be able to edit this cell. Before auto-grading, this cell is replaced with the original cell in case the learner's submission has changed it. This is the best place for instructors to put course content and directions.

Solution - Learners can edit this cell.

Autograder Tests - This cell is identified as a test block. The learner will not be able to edit the cell. When you select this tag, you will be asked to specify a name and the points associated with that cell. When auto-grading is executed, the learner is assigned 0 points for this item if an assertion is triggered (fails) in this cell, and the maximum points otherwise, as in the example below.
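For example, an autograder test cell might contain nothing but assertions and an optional success message. This is an illustrative sketch; is_prime is a hypothetical function the learner is asked to implement.

    # Autograder test cell (illustrative sketch).
    # If any assert fails, the learner gets 0 points for this item.
    assert is_prime(7), "is_prime(7) should return True"
    assert not is_prime(8), "is_prime(8) should return False"
    print('Correct.')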

4. After you have defined the notebook cells, select Generate Notebook(s). You will be presented with a dialog box. Please read it carefully and set the options as described below.

This process strips out the code marked with the BEGIN_SOLUTION and BEGIN_HIDDEN_TESTS pragmas and copies the resulting notebook to the Starter Code.

It will also, optionally, extract nbgrader-style grading items and sync them with the Vocareum environment. New grading items will be added, and existing ones will be updated with new scores. If no submissions have been made, an option to delete unused grading items will be available.

A grading script will also be generated (overwriting any existing grading script) unless that option is deselected.

  • Notebook(s) to be released

  • Data Files -

  • Extract rubric items from notebook(s) - You will need to select this most of the time. The platform uses the "Autograder Tests" cell names as the names of the grading items.

  • Delete unused grading items - If selected, all unused grading items will be deleted. If you have manually added grading items, deselect this option to prevent them from being deleted.

  • Create resource link in starter code - If selected, a link to the resource directory is created in the student work area.

  • Overwrite existing grading script - If selected, the platform will overwrite the contents of the grading script with an auto-generated one.

  • Show traceback for errors - Select this to show the traceback of any assertions triggered in the autograder cells.

  • Show html feedback report - If selected, the "grading report" for students will be replaced with the HTML of the notebook that the student submitted. You can provide feedback to the student through assertion messages in the autograder cells or by selecting "Show traceback for errors". If deselected, a text file is generated as the student report.

5. Select "Deploy" to deploy the changes. Students should now see the notebook when they start the assignment. When they submit their work, the grading script will autograde the cells.

Special tags in starter code

In a Python cell tagged as "Solution", any section of the cell starting with ### BEGIN SOLUTION will be stripped out until ### END SOLUTION is encountered.

The learner will see #### YOUR CODE HERE between the begin and end solution tags, in place of the stripped-out code. You can replace #### YOUR CODE HERE with your own string by adding the TEMPLATE attribute, e.g. ### BEGIN SOLUTION TEMPLATE=<string>. An example follows.
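For example (square is a hypothetical function used only for illustration), a Solution cell in the instructor's notebook might look like this:

    def square(x):
        ### BEGIN SOLUTION
        return x * x
        ### END SOLUTION

In the generated starter code, the learner would see something like:

    def square(x):
        #### YOUR CODE HERE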

In an auto-grader test cell of the starter code, if the platform encounters ### BEGIN HIDDEN TESTS, the code following it will be removed until ### END HIDDEN TESTS is reached. Note that the removed section is restored before the auto-grader is run.
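For example, an autograder test cell with hidden tests might look like this in the instructor's notebook (add is a hypothetical function the learner implements):

    # Visible test: remains in the starter code the learner sees.
    assert add(2, 3) == 5, "add(2, 3) should return 5"
    ### BEGIN HIDDEN TESTS
    # Hidden test: stripped from the starter code but restored before auto-grading.
    assert add(-1, 1) == 0, "add(-1, 1) should return 0"
    ### END HIDDEN TESTS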

Generating Feedback

When auto-grading is run, a file named <notebook name>_feedback.html is generated. You can manually add the following line to the grading script to make this file the report file the student sees when they select Details => View Grading Report.

vocReportFile <notebook name>_feedback.html

This file includes an HTML representation of the original notebook submitted by the student. In addition, the assertion messages generated by the auto-grader test blocks are included. You can use this to provide additional automated feedback to students. Note that only the message associated with the first assertion that is triggered is reported.

For example, your auto-grader test cell might include the following statements:

assert <condition>, "please check your work again"

print('Correct.')

If the test fails, the student will see the message "please check your work again"; if it succeeds, they will see "Correct."
