We wish we'd thought of it, but we didn't! This technique was documented by Prof. Anderson and used as the final assessment process for APS105 during the Winter 2020 term. It explains the considerations among different final assessment tools and why and how the course settled on the process it did. You can read the process below (or download the PDF).
APS105 is a first-year programming course. There are about 430 students, 4 instructors, and 24 teaching assistants.
Preparation and Decisions
We considered Crowdmark and Quercus as platforms. Since the students would be uploading code, Quercus seemed the most convenient way to do this. The distribution of submissions, allocation of markers, and recording of marks were done online, the last two through shared Google spreadsheets.
The logistics and methods were checked out using a short trial “exam”; all students were invited to participate, and some of the students who did well on the midterm specifically asked to test it out. Many of the decisions made were based on the results and feedback from this trial. Students were also encouraged to use the trial to become familiar with the format.
The students had 3.5 hours to complete the exam once they opened it (with extensions for accommodated students). There was a 12-hour window during which they could open the exam and finish it.
The first question on Quercus was a commitment to ethics, and in the question were links to an exam (in the usual format) and a file of “provided code”, a code skeleton for each question in which the students could complete the functions to answer it.
We ran six slightly different versions of the questions and provided code, randomly assigned to students. Since the exams all had the same name and students could only see their own exam, the students did not know this. Forty (40) students were found to have used exams/provided code that was not assigned to them. Most indicated that they had obtained copies, usually early, often from friends or online study groups.
Production of the multiple exams and assignment of the groups was fairly straightforward, once you knew what you were doing. The online help was not much help. Joanna Lau, an Instructional Technologist with the faculty, was stellar in finding the answers and helping to make everything work. Some backend help was also needed to get the students into groups without their knowledge, which she also facilitated.
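The assignment step above was done through Quercus groups rather than by script, but the idea can be sketched in a few lines of Python. This is a hypothetical illustration, not the actual mechanism: it shuffles a roster with a fixed seed and deals students round-robin into six versions, so the split is reproducible and the group sizes differ by at most one.

```python
import random

def assign_versions(student_ids, n_versions=6, seed=2020):
    """Randomly assign each student an exam version (1..n_versions).

    Hypothetical helper -- the real assignment was done via Quercus
    groups. A fixed seed keeps the assignment reproducible if the
    roster is re-processed.
    """
    rng = random.Random(seed)
    ids = sorted(student_ids)  # sort first so the seed alone fixes the outcome
    rng.shuffle(ids)
    # Deal students round-robin so version sizes differ by at most 1.
    return {sid: (i % n_versions) + 1 for i, sid in enumerate(ids)}

# Example with a 430-student roster of placeholder IDs.
groups = assign_versions([f"s{i:03d}" for i in range(430)])
```

With 430 students and six versions, four versions end up with 72 students and two with 71.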
a. To get submissions for each question, two quiz questions of a “file upload” type had to be created, since the students had to submit code and “background/planning information” separately. One thing that was not great was that Quercus insisted on numbering the questions itself, so a student might be answering question 5 on the exam but had to put the answers into questions 8 and 9 on Quercus (figure follows).
Despite all warnings, a fair number of students didn’t submit or didn’t update files before their exam closed. Sometimes this was a computer issue, but more often they just didn’t pay enough attention.
b. The Quercus SpeedGrader was not really useful to us, for several reasons:
- It was “not the usual” way these are marked (not much of an excuse, but it was made)
- some students misfiled their uploads because the exam and Quercus question numbers were out of sync
- two questions on Quercus needed to be checked to get the mark for one exam question
- I didn’t feel assured that more than one marker could work on a question at the same time without one marker’s work being overwritten by a “save” from the other
- It also did not allow offline work (an inconvenience for some of my remote markers)
This left us needing to leave a trail showing why a student obtained a certain mark on a certain question. This was done using comments on a spreadsheet, which was not entirely satisfactory.
c. With a large window and students only needing a portion of it, managing students requiring accommodation (extended time) was problematic. The only facility I could find was to extend the exam after the student started, which would have required monitoring. Rather than take a chance on missing a student, five more copies of the exam were created with longer-than-normal open windows for the students in this category.
d. General comment on time: Nothing about this did anything but extend the time needed to prepare, run, and mark the exam compared to the usual paper-exam process. I did get some interest in the Crowdmark facility and will explore it as a way to manage paper-exam marking in the future.
Suggestions for Quercus
- Set up online help for specific use cases, such as my setup to run alternate exams without the students knowing. Generally the human help was great; the online help and its search fell short.
- Allow Teaching Team to have access to batch enrol/unenrol tool (UTQAT) that allows enrolling students in sections without student knowledge [new feature].
- Allow multiple file uploads in answer to a single question [new feature].
- Allow instructors to set the question numbers [new feature].
- Verify, or add, the feature that multiple graders can use SpeedGrader on the same student for different questions, and on multiple students for the same question, simultaneously without interference.