For Teachers & Admins
This article covers the grading script setup process and environment for Vocareum Notebook, VS Code, and JupyterLab assignments. The same mechanism applies to all three lab types.
If you are using Vocareum Notebook and deciding whether to use a custom script or the built-in nbgrader-based autograder, check out Using Custom Grading Scripts in Vocareum Notebook first. For code examples covering test-based, AI-powered, and hybrid grading approaches, refer to Custom Grading Script Examples.
How custom grading works
When a student submits an assignment, Vocareum:
1. Creates a fresh, isolated grading environment.
2. Copies all student-submitted files to /voc/work/ (the working directory).
3. Copies all files from /voc/scripts/ into the environment.
4. Runs /voc/scripts/grade.sh from /voc/work/.
5. Records grades and displays the report to the student.
The grading script and all supporting files live in /voc/scripts/. The student's submitted files live in /voc/work/. Your script reads from /voc/work/ and writes its results to two output files whose paths are provided in environment variables: $vocareumGradeFile and $vocareumReportFile. The path to the working directory is also available as the environment variable $VOC_HOME_DIR.
/voc/work/       ← working directory (student files, $VOC_HOME_DIR)
├── submit.py
└── ...

/voc/scripts/    ← your files (deployed via Configure Workspace)
├── grade.sh     ← entry point — Vocareum runs this
├── grade.py     ← your grading logic (optional)
└── ...
Grade file
The grade file ($vocareumGradeFile) is a CSV where each line maps a rubric criterion to a score:
<Rubric Criterion Name>,<Score>
Criterion names must exactly match the names defined in your assignment rubric, including capitalization and spacing. Scores are numeric, up to the maximum defined for that criterion.
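As a sketch, a grading script can simply append one line per criterion. The criterion names and scores below are illustrative; the fallback temp path exists only so the snippet can run outside the grading environment, where Vocareum sets $vocareumGradeFile for you:

```shell
# Write one "<criterion>,<score>" line per rubric criterion.
# In the grading environment, $vocareumGradeFile is already set;
# the fallback path is only for running this snippet elsewhere.
: "${vocareumGradeFile:=/tmp/grade.csv}"

# "Correctness" and "Style" are example criteria; they must match
# your rubric names exactly, including capitalization and spacing.
echo "Correctness,8" >> "$vocareumGradeFile"
echo "Style,2" >> "$vocareumGradeFile"
```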
Report file
The report file ($vocareumReportFile) contains free-form text or HTML displayed to the student as feedback after grading.
By default, Vocareum also appends the grading script's stdout and stderr to the report. To suppress this and show only what your script writes explicitly, have your script write the marker line VOC_NO_REPORT_OUTPUT to the report file:
echo "VOC_NO_REPORT_OUTPUT" > "$vocareumReportFile"
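A script might write the marker first and then append its own feedback. The feedback text below is illustrative, and the fallback path is only so the snippet runs outside the grading environment:

```shell
# In the grading environment, $vocareumReportFile is already set;
# the fallback path is only for running this snippet elsewhere.
: "${vocareumReportFile:=/tmp/report.html}"

# First line: suppress automatic stdout/stderr capture.
echo "VOC_NO_REPORT_OUTPUT" > "$vocareumReportFile"

# Then append whatever feedback the student should see.
{
  echo "<h3>Autograder results</h3>"
  echo "<p>Passed 8 of 10 tests.</p>"
} >> "$vocareumReportFile"
```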
Setting up the grading script
The following steps apply to Vocareum Notebook, VS Code, and JupyterLab assignments.
Step 1: Create the script files
Create grade.sh as the entry point. Vocareum executes this file on every submission. Supporting scripts (such as a grade.py) should also be created and called from grade.sh.
A minimal grade.sh:
#!/bin/bash
# Resolve the directory containing this script, then hand off to
# the Python grading logic, passing both output file paths.
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
python3 "$SCRIPT_DIR/grade.py" "$vocareumGradeFile" "$vocareumReportFile"
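If the grading logic is simple, grade.sh can also do everything itself with no supporting script. The sketch below is one possible approach, not a Vocareum requirement: the "Submission exists" criterion, the point values, and the submit.py check are all assumptions to adapt to your assignment.

```shell
#!/bin/bash
# Fallbacks let this sketch run outside the grading environment;
# inside it, Vocareum sets all three variables.
WORK_DIR="${VOC_HOME_DIR:-/voc/work}"
GRADE="${vocareumGradeFile:-/tmp/grade.csv}"
REPORT="${vocareumReportFile:-/tmp/report.html}"

# Show only what this script writes, not captured stdout/stderr.
echo "VOC_NO_REPORT_OUTPUT" > "$REPORT"

# "Submission exists" is an example rubric criterion.
if [ -f "$WORK_DIR/submit.py" ]; then
  echo "Submission exists,5" >> "$GRADE"
  echo "<p>Found submit.py.</p>" >> "$REPORT"
else
  echo "Submission exists,0" >> "$GRADE"
  echo "<p>submit.py was not found in the working directory.</p>" >> "$REPORT"
fi
```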
Step 2: Upload scripts via Configure Workspace
All files in /voc/scripts/ are deployed through Configure Workspace in the assignment settings. Vocareum places these files in the grading environment automatically on each submission.
To upload your scripts:
Open the assignment in the Vocareum instructor interface.
Select Configure Workspace to open the lab environment.
Place grade.sh and any supporting files in the /voc/scripts/ directory.
Save and close the workspace.
Step 3: Define rubric criteria
Rubric criteria define the score categories your script writes to $vocareumGradeFile.
Open the assignment settings and navigate to the Rubric section.
Add one criterion for each score your script will report.
Set the maximum points for each criterion.
Confirm that the criterion names exactly match what your script writes to the grade file, including capitalization and spacing.
Step 4: Test the script
Before publishing, test the grading script against a sample submission inside Configure Workspace.
Place a representative student submission in /voc/work/.
Set $vocareumGradeFile and $vocareumReportFile to writable temp paths, or run the script with test values.
Run bash /voc/scripts/grade.sh and inspect both output files.
Verify criterion names in the grade file match your rubric exactly.
Verify the report file contains clear, correct feedback.
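A manual test run from a terminal in Configure Workspace might look like the following. The temp paths are arbitrary choices for the test, and the existence guard simply lets the snippet exit cleanly anywhere grade.sh has not been deployed yet:

```shell
# Point the output variables at scratch files for this test run.
export vocareumGradeFile=/tmp/test_grade.csv
export vocareumReportFile=/tmp/test_report.html
rm -f "$vocareumGradeFile" "$vocareumReportFile"

if [ -f /voc/scripts/grade.sh ]; then
  bash /voc/scripts/grade.sh
  echo "--- grade file ---";  cat "$vocareumGradeFile"
  echo "--- report file ---"; cat "$vocareumReportFile"
else
  echo "No grading script at /voc/scripts/grade.sh"
fi
```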
Step 5: Publish the assignment
Once tested, publish the assignment. Students will submit through the standard lab interface and each submission will trigger a fresh grading environment run.
Further reading
Grading Environment Variables — full reference for all environment variables available in the grading environment, including lab-type-specific variables and GenAI variables
Using Custom Grading Scripts in Vocareum Notebook — when to use a custom script vs. the built-in nbgrader-based autograder, accessing the submitted notebook, and hybrid grading patterns
Custom Grading Script Examples — complete code examples for test-based, AI-powered, and hybrid grading
GenAI Gateway — configuring AI model access for grading scripts
Installing Packages in Vocareum Labs — adding Python packages to the grading environment
