Using Custom Grading Scripts in Vocareum Labs

Configure a custom grading script to control how student submissions are evaluated, scored, and reported in Vocareum Notebook, VS Code, and JupyterLab assignments.

Written by Mary Gordanier
Updated this week

For Teachers & Admins

This article covers the grading script setup process and environment for Vocareum Notebook, VS Code, and JupyterLab assignments. The same mechanism applies to all three lab types.

If you are using Vocareum Notebook and deciding whether to use a custom script or the built-in nbgrader-based autograder, check out Using Custom Grading Scripts in Vocareum Notebook first. For code examples covering test-based, AI-powered, and hybrid grading approaches, refer to Custom Grading Script Examples.


How custom grading works

When a student submits an assignment, Vocareum:

  1. Creates a fresh, isolated grading environment.

  2. Copies all student-submitted files to /voc/work/ (the working directory).

  3. Copies all files from /voc/scripts/ into the environment.

  4. Runs /voc/scripts/grade.sh from /voc/work/.

  5. Records grades and displays the report to the student.

The grading script and all supporting files live in /voc/scripts/. The student's submitted files live in /voc/work/. Your script reads the submission from /voc/work/ and writes its outputs to two files whose paths Vocareum provides as environment variables: $vocareumGradeFile (the grade file) and $vocareumReportFile (the report file). The path to the working directory is also available as the environment variable $VOC_HOME_DIR.

/voc/work/                 ← working directory (student files, $VOC_HOME_DIR)
├── submit.py
└── ...
/voc/scripts/              ← your files (deployed via Configure Workspace)
├── grade.sh               ← entry point — Vocareum runs this
├── grade.py               ← your grading logic (optional)
└── ...

Grade file

The grade file ($vocareumGradeFile) is a CSV where each line maps a rubric criterion to a score:

<Rubric Criterion Name>,<Score>

Criterion names must exactly match the names defined in your assignment rubric, including capitalization and spacing. Scores are numeric, up to the maximum defined for that criterion.
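Producing this file takes only a few lines of Python. A minimal sketch — the helper name write_grade_file and the criteria "Correctness" and "Style" are illustrative, not Vocareum conventions:

```python
import os

def write_grade_file(scores, path=None):
    """Write rubric scores as '<Rubric Criterion Name>,<Score>' lines.

    `path` defaults to the file named by the $vocareumGradeFile
    environment variable that Vocareum sets for the grading run.
    """
    path = path or os.environ["vocareumGradeFile"]
    with open(path, "w") as f:
        for criterion, score in scores.items():
            f.write(f"{criterion},{score}\n")

# Example with two hypothetical rubric criteria:
# write_grade_file({"Correctness": 8, "Style": 2})
```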

Report file

The report file ($vocareumReportFile) contains free-form text or HTML displayed to the student as feedback after grading.

By default, Vocareum also appends the grading script's stdout and stderr to the report. To suppress this and show only what your script writes explicitly, have your script write the marker line VOC_NO_REPORT_OUTPUT to the report file, for example:

echo "VOC_NO_REPORT_OUTPUT" > "$vocareumReportFile"
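Inside grade.sh this can look like the following sketch. The HTML feedback lines are placeholders, and the mktemp fallback exists only so the snippet runs outside a Vocareum grading environment (where $vocareumReportFile is already set):

```shell
#!/bin/bash
# $vocareumReportFile is set by Vocareum during grading; fall back to a
# temp file so this sketch also runs outside the grading environment.
: "${vocareumReportFile:=$(mktemp)}"

# First line: marker that suppresses automatic stdout/stderr capture.
echo "VOC_NO_REPORT_OUTPUT" > "$vocareumReportFile"

# Then append the feedback (plain text or HTML) the student should see.
{
  echo "<h3>Results</h3>"
  echo "<p>All visible tests passed.</p>"
} >> "$vocareumReportFile"
```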

Setting up the grading script

The following steps apply to Vocareum Notebook, VS Code, and JupyterLab assignments.

Step 1: Create the script files

Create grade.sh as the entry point; Vocareum executes this file on every submission. Create any supporting scripts (such as grade.py) alongside it and call them from grade.sh.

A minimal grade.sh:

#!/bin/bash
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
python3 "$SCRIPT_DIR/grade.py" "$vocareumGradeFile" "$vocareumReportFile"
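A matching grade.py might look like the following sketch. The criterion name "Exercise 1", the expected output, the point value, and the 60-second timeout are illustrative assumptions, not Vocareum requirements:

```python
import subprocess
import sys

def grade(grade_path, report_path):
    # Run the student's submission from the working directory.
    # Illustrative check: the script is expected to print "42".
    result = subprocess.run(
        [sys.executable, "submit.py"],
        capture_output=True, text=True, timeout=60,
    )
    score = 10 if result.stdout.strip() == "42" else 0

    # One "<Rubric Criterion Name>,<Score>" line per rubric criterion.
    with open(grade_path, "w") as f:
        f.write(f"Exercise 1,{score}\n")

    # Student-facing feedback.
    with open(report_path, "w") as f:
        f.write("Exercise 1: " + ("passed" if score else "failed") + "\n")

if __name__ == "__main__" and len(sys.argv) == 3:
    # grade.sh passes the two output-file paths as arguments.
    grade(sys.argv[1], sys.argv[2])
```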

Step 2: Upload scripts via Configure Workspace

All files in /voc/scripts/ are deployed through Configure Workspace in the assignment settings. Vocareum places these files in the grading environment automatically on each submission.

To upload your scripts:

  1. Open the assignment in the Vocareum instructor interface.

  2. Select Configure Workspace to open the lab environment.

  3. Place grade.sh and any supporting files in the /voc/scripts/ directory.

  4. Save and close the workspace.

Step 3: Define rubric criteria

Rubric criteria define the score categories your script writes to $vocareumGradeFile.

  1. Open the assignment settings and navigate to the Rubric section.

  2. Add one criterion for each score your script will report.

  3. Set the maximum points for each criterion.

  4. Confirm that the criterion names exactly match what your script writes to the grade file, including capitalization and spacing.
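The name check in step 4 is easy to script. A quick sketch, assuming you keep the expected criterion names in a list (the helper name and both criterion names are hypothetical):

```python
def check_grade_file(path, expected_criteria):
    """Return criterion names in the grade file not found in the rubric."""
    with open(path) as f:
        written = [line.split(",", 1)[0] for line in f if line.strip()]
    # Exact string comparison: capitalization and spacing must match.
    return [name for name in written if name not in expected_criteria]
```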

Step 4: Test the script

Before publishing, test the grading script against a sample submission inside Configure Workspace.

  1. Place a representative student submission in /voc/work/.

  2. Set $vocareumGradeFile and $vocareumReportFile to writable temp paths, or run the script with test values.

  3. Run bash /voc/scripts/grade.sh and inspect both output files.

  4. Verify criterion names in the grade file match your rubric exactly.

  5. Verify the report file contains clear, correct feedback.
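Run as a terminal session inside Configure Workspace, the steps above can be sketched as a dry run (the directory guard exists only so the snippet also runs outside the lab environment, where the /voc paths don't exist):

```shell
#!/bin/bash
set -e

# Point the two output variables at writable temp files for a dry run.
export vocareumGradeFile="$(mktemp)"
export vocareumReportFile="$(mktemp)"

# Run the entry point from the working directory, as Vocareum would.
# (The /voc paths exist only inside the lab environment.)
if [ -d /voc/work ]; then
  cd /voc/work
  bash /voc/scripts/grade.sh
fi

# Inspect both outputs: rubric scores and student-facing feedback.
cat "$vocareumGradeFile"
cat "$vocareumReportFile"
```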

Step 5: Publish the assignment

Once tested, publish the assignment. Students will submit through the standard lab interface, and each submission will trigger a fresh grading-environment run.

