How to Set Up Grading in JupyterLab

Setting up JupyterLab for manual grading and autograding

Written by Kevin Wesley
Updated this week

Students can submit their work in Vocareum's JupyterLab environment and have it graded manually, automatically, or through a combination of both. Below we will go over how to set up these grading features.

Manual Grading

Rubrics and Submissions

  1. From the Part settings of your assignment, navigate to 'Rubrics' and select '+ Grading Criterion'. Give your criterion a name and set a Max Score. Select 'Save Part' when you are done.

  2. Now when you visit a student submission you will see the rubric options for grading. You can also leave a review comment that the student will see once their submission has been graded.

Auto-Grading

Cells and Special Tags

Vocareum's JupyterLab environment provides custom cell types, built on nbgrader, that introduce autograding to your assignment:

Autograded tests

  1. This cell is identified as a test block. The learner will not be able to edit the cell. When you select it, you will be asked to specify a name and the points associated with that cell. When autograding is executed, if the cell runs without triggering an assertion, the learner is assigned the maximum points for this item; otherwise, 0.

Read-only

  1. This cell will not be editable by the learner. Before autograding, this cell is replaced with the original in case the learner's submission changed it. This is the best place for instructors to put course content and directions.

Manually graded answer

  1. The learner writes their response in this cell, and you grade it by hand using the rubrics described in the Manual Grading section above.

Autograded answer

  1. The learner writes their solution in this cell. When the notebook is published, any code between the SOLUTION pragmas is stripped from the learner's copy (see 'Special tags in starter code' below).

Creating Autograded Tests

  1. Select Configure Workspace from the assignment page to open JupyterLab.

  2. Right-click in the work folder to upload or create a new notebook.

  3. Within your notebook, identify the cells you will use for autograded tests. Choose one of the grading cell types from the dropdown inside the cell. For this example we will use 'Autograded tests'.

  4. After you have defined the notebook cells, you can select "Publish Notebook" at the top of your lab.

    You will be presented with a dialog box. Please read it carefully and select the options as follows:

    1. The following actions will be performed:

      • Place a modified copy of this notebook into /voc/startercode with the code between "SOLUTION" and "HIDDEN TESTS" pragmas removed.

        • An original copy of this notebook will be viewable by graders.

      • Copy any selected data files in the work area into the startercode folder.

      • If autograded cells are present:

        • Create or update Vocareum rubrics based on the names and points of autograded cells.

        • Generate a grading script for the autograder, with any existing script being overwritten.

  5. Select "Start" to deploy the changes. Students should now see the notebook when they start the assignment. When they submit their work, the grading script will autograde the cells.

Special tags in starter code

In a Python cell set as "solution", any section of the cell starting with ### BEGIN SOLUTION will be stripped out, up to the matching ### END SOLUTION.

The learner will see #### YOUR CODE HERE in place of the stripped section. You can replace #### YOUR CODE HERE with your own string by adding the TEMPLATE attribute, e.g. ### BEGIN SOLUTION TEMPLATE=<string>
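For instance, a solution cell might look like the following before publishing. This is a minimal sketch; the function square is a hypothetical example, not part of Vocareum's API:

def square(x):
    ### BEGIN SOLUTION
    return x * x
    ### END SOLUTION

After publishing, the learner's copy keeps the function signature, with the stripped body replaced by #### YOUR CODE HERE.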

In an autograder test cell of the starter code, if the platform encounters ### BEGIN HIDDEN TESTS, the code following it will be removed, up to ### END HIDDEN TESTS. Note that the removed section is restored before the autograder is run.
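An autograded test cell combining visible and hidden tests might look like this before publishing (a minimal sketch, reusing the hypothetical square function from above):

assert square(2) == 4, "please check your work again"
### BEGIN HIDDEN TESTS
assert square(-3) == 9, "square should also handle negative inputs"
### END HIDDEN TESTS
print('Correct.')

The learner's starter code contains only the visible assertion; the hidden block is stripped out and then restored when the autograder runs.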

Generating Feedback

When autograding is run, a file named <notebook name>_feedback.html is generated. You can manually add the following line to the grading script to make this file the report file that the student sees when they select Details => View Grading Report.

vocReportFile <notebook name>_feedback.html
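For example, if your notebook were named assignment1.ipynb (a hypothetical name), the line in your grading script would be:

vocReportFile assignment1_feedback.html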

This file includes an HTML representation of the original notebook submitted by the student, along with the assertion messages generated by the autograded test blocks. You can use this to provide additional automated feedback to students. Only the message associated with the first assertion that is triggered is reported.

For example, your autograded test might include the following statements:

assert <condition>, "please check your work again"

print('Correct.')

If the test fails, the student sees the message "please check your work again"; if it succeeds, they see "Correct."
