Peer assessment seems to be a persistent buzzword in education, and it is something I have used in the past with my students. However, until recently, I had never found it to be as useful as others suggest it might be. Recently, though, I supplemented verbal peer feedback with a Google Docs form containing a number of questions on a numeric scale, as well as a field for comments. We video-recorded each student performing their assessment piece (a rapid presentation on human-computer interaction, 60 seconds at most), and then reviewed the videos as a class. Whilst students did not necessarily enjoy the process of watching themselves on screen, it certainly made them think about how they present themselves. For each video reviewed, every student in the class (including the presenter), as well as the teacher, submitted feedback via the Google form. In addition, students were asked to comment on each other's work verbally, which they were at times reluctant to do. This was supplemented by verbal feedback from the teacher for each student.

The result was immediate feedback, backed by a giant spreadsheet containing around 400 rows for each group of 20 students. This spreadsheet contained self-, peer- and teacher-generated data, each of which was weighted equally (no teacher bias here). The spreadsheet was sorted by the name of the presenting student, and averages were then calculated for each of the questions asked. These were then averaged into an overall grade for the student, and comments were aggregated into a single field. This anonymised data was then delivered back to students using our school platform, Gibbon.
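For anyone curious what that aggregation step looks like in practice, here is a minimal sketch in plain Python. The question names, score scale, and sample responses are all hypothetical placeholders, not the actual form I used; the logic simply mirrors the process described above: group rows by presenter, average each question, average those into an overall grade, and join the comments.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical form responses: each row is one reviewer's feedback on one
# presenter. Question names ("clarity", "pacing") and the 1-5 scale are
# assumptions for illustration only.
responses = [
    {"presenter": "Alice", "clarity": 4, "pacing": 5, "comment": "Confident delivery."},
    {"presenter": "Alice", "clarity": 3, "pacing": 4, "comment": "Slightly rushed."},
    {"presenter": "Bob",   "clarity": 5, "pacing": 4, "comment": "Great visuals."},
]

def summarise(rows, questions=("clarity", "pacing")):
    """Group responses by presenter, average each question, average those
    into an overall grade, and aggregate comments into one field."""
    by_presenter = defaultdict(list)
    for row in rows:
        by_presenter[row["presenter"]].append(row)

    summary = {}
    for name, feedback in by_presenter.items():
        # Mean score for each question across all reviewers.
        per_question = {q: mean(r[q] for r in feedback) for q in questions}
        summary[name] = {
            **per_question,
            # Overall grade: the average of the per-question averages.
            "overall": mean(per_question.values()),
            # All free-text comments joined into a single field.
            "comments": " | ".join(r["comment"] for r in feedback),
        }
    return summary

print(summarise(responses))
```

In a real classroom this would run over the exported Google Sheets CSV rather than a hard-coded list, but the grouping and averaging are the same.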

Interestingly, whilst the students were a little more generous than I had been, the overall distribution of grades was broadly in line with previous teacher-generated grades, with most students retaining their usual position within the class in terms of academic performance.

In total, the assessment took two 70-minute classes (one to present and record, one to watch and grade), and all data processing and entry was completed within 90 minutes. This represents a significant decrease in my marking time; more importantly, the whole exercise gave students ownership of their grades, and demoted the teacher from the position of god-like arbiter of success.

I am calling this approach mass assessment, and am very interested to hear of other teachers doing similar things, or willing to give this a go.