Discuss with your group:
Write down your answers (and why you chose them!) in your group’s shared doc.
In this week’s lab you will write an automatic “grader” for some of the methods we worked on in week 3.
In particular, you’ll write a script and a test file that together give a score to the functionality of a student-submitted ListExamples file and class (see ListExamples.java). Concretely, you’ll write a bash script that takes the URL of a Github repository and prints out a grade:
$ bash grade.sh https://github.com/some-username/some-repo-name
... messages about points ...
This will work with a test file that you write in order to grade students’ work. You can use this repository to get started with your grader implementation; you should make a fork:
https://github.com/ucsd-cse15l-w23/list-examples-grader
Do the work below in pairs. As a pair, you should produce one implementation; push it to one member's fork of the starter Github repository and include the link to that repository in your notes.
When your script gets a student submission, it should produce either a score based on how many of your tests pass, or a clear message explaining why the submission couldn't be graded (for example, a missing ListExamples.java or a compile error).
A general workflow for your script could be:

Clone the repository given on the command line into a fresh directory.
Check that the cloned repository contains a file called ListExamples.java; useful tools here are if and the -e/-f file tests. You can use the exit command to quit a bash script early.
Copy the student's .java file and your test file into the same directory (cp, and maybe mkdir, are useful here).
Compile the student's code together with your tests using javac, and check whether compilation succeeded; useful tools here are output redirection and error codes ($?) along with if. Think about whether you want set -e in this script. Why or why not?
Run the tests and count how many passed and failed; grep could be helpful here. (One possible structure for the whole script is sketched below.)

Write down in notes: screenshots of what your grader does on each of the sample student cases below.
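Here is a minimal sketch of how these steps might fit together in grade.sh. It assumes your tests are in a file called TestListExamples.java and that JUnit 4 and Hamcrest jars sit in a lib/ directory; the directory names and jar versions are placeholders, so adjust them to match your setup.

#!/bin/bash
# grade.sh -- a rough sketch, not a finished grader.

if [ $# -ne 1 ]; then
  echo "Usage: bash grade.sh <github-repo-url>"
  exit 1
fi

CPATH=".:lib/hamcrest-core-1.3.jar:lib/junit-4.13.2.jar"

# Start from a clean slate on every run
rm -rf student-submission grading-area
mkdir grading-area

# 1. Clone the student's repository
git clone "$1" student-submission
echo "Finished cloning"

# 2. Check that the expected file exists; quit early with a score of 0 if not
if [ ! -f student-submission/ListExamples.java ]; then
  echo "ListExamples.java not found. Score: 0"
  exit 1
fi

# 3. Put the submission, the tests, and the jars in one directory
cp student-submission/ListExamples.java grading-area
cp TestListExamples.java grading-area
cp -r lib grading-area
cd grading-area

# 4. Compile the submission and the tests; use $? to check the exit code
javac -cp "$CPATH" *.java 2> compile-errors.txt
if [ $? -ne 0 ]; then
  echo "Compile error. Score: 0"
  cat compile-errors.txt
  exit 1
fi

# 5. Run the tests and turn JUnit's summary into a score
java -cp "$CPATH" org.junit.runner.JUnitCore TestListExamples > test-output.txt
# JUnit 4 ends its output with either "OK (N tests)" or
# "Tests run: N,  Failures: M"; grep those lines (and do a little
# arithmetic) to print the points earned.
grep -E "OK \(|Failures:" test-output.txt

Notice that the sketch checks $? itself instead of relying on set -e; think about what set -e would do in the compile-error case.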
Assume the assignment spec was to submit:
a file called ListExamples.java
containing a class called ListExamples
with these two static methods:
static List<String> filter(List<String> s, StringChecker sc)
static List<String> merge(List<String> list1, List<String> list2)
You should use the following repositories to test your grader:
One of them implements filter in the wrong order, so it doesn't match the expected behavior.
Another uses the name pa1 rather than the name the spec asks for.
(Be careful about assertSame, which compares with == rather than .equals(), and think hard about duplicates for merge.)

After you're satisfied with the behavior on all of those submissions, write your own. Try to come up with at least two example submissions of your own.
You should create these as new, public Github repositories, so that you can run them using the same grader script by providing the Github URL.
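One way to set these up, assuming you first create an empty public repository on github.com: the directory name, commit message, and URL below are placeholders, so substitute your own.

# Put one hand-written "student submission" in its own directory
mkdir filter-wrong-order
cd filter-wrong-order
# ... write the deliberately flawed ListExamples.java here ...

git init
git add ListExamples.java
git commit -m "Sample submission: filter returns results in the wrong order"

# Point it at the empty public repository you created on github.com
git remote add origin https://github.com/your-username/filter-wrong-order.git
git push -u origin main   # or master, depending on your default branch name

After pushing, running bash grade.sh on that repository's URL should work just like it does on the provided samples.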
Write down in notes: Run everyone's newly developed student submissions on everyone's grader. That means each team should be running commands like:
bash grade.sh <student-submission-from-some-group>
Whose grading script is the most user-friendly across those tests?
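If your group collects everyone's submission URLs in one place, a small loop makes it quick to run them all through one grader; the URLs here are placeholders.

# Replace these with the repositories other groups share with you
for url in \
  https://github.com/group-a/their-test-submission \
  https://github.com/group-b/their-test-submission
do
  echo "=== Grading $url ==="
  bash grade.sh "$url"
done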
We've also provided our Server.java, along with a server we wrote for you called GradeServer.java, in the starter repository.
You can compile them and use
java GradeServer 4000
to run the server.
Look at the code in GradeServer.java to understand the expected path and parameters. Then try loading a URL at the /grade path with one of the repos above as the query parameter. What happens?
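For example, you might try something like the following. Note that the query parameter name repo is only a guess here; check GradeServer.java for the parameter it actually expects.

# Compile the provided server code (assuming it has no extra dependencies)
javac Server.java GradeServer.java

# Run it on port 4000
java GradeServer 4000

# In a browser, or from another terminal with curl, request the /grade path
# with a repository URL as the query parameter:
curl "http://localhost:4000/grade?repo=https://github.com/some-username/some-repo-name"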
That’s quite a bit of the way towards an autograder like Gradescope!
Write down in notes: Show a screenshot of the server running your autograder in a browser.
Discuss and write down: What other features are needed to make this work more like Gradescope's autograder? (Think about running it for different students, storing grades, presenting results, etc.)
Congratulations! You’ve done one kind of the work that your TAs do when setting up classes 🙂