I love trying out new ideas with students – taking a traditional activity and reshaping and enhancing it using the digital tools that are readily available. So when it was time for an 11/12th grade AP Environmental Science class to evaluate their peer water quality projects, we (the teacher, a fellow instructional coach and I) asked ourselves, “How can we make this process better and elicit truly awesome and helpful feedback?” Here’s what we came up with:
The idea: Student-created screencast evaluations of a peer’s digital project.
A little background: Students sampled the water quality of the Chicago River. They analyzed organisms, performed a chemical analysis, drew conclusions and reported their findings. Students chose the digital media with which to present their findings (popular choices included Weebly, Piktochart and Smore).
The peer evaluation process: Students submitted links to their finished projects via a Google Form and were given editing rights to the destination Google Sheet, and the teacher assigned peer review partners. Each student received a paper copy of the evaluation rubric and spent 10-15 minutes combing through their peer’s project (hopefully taking copious notes in the process).
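(For a big class, assigning partners by hand can get tedious. Here’s a minimal sketch of one way to automate it – purely hypothetical, since our teacher assigned partners manually – that shuffles the roster and rotates it by one so nobody ends up reviewing their own project:

```python
import random

def assign_reviewers(students):
    """Shuffle the roster, then pair each student with the next
    one in the shuffled order (wrapping around), so every student
    reviews exactly one project and never their own."""
    shuffled = students[:]
    random.shuffle(shuffled)
    return {shuffled[i]: shuffled[(i + 1) % len(shuffled)]
            for i in range(len(shuffled))}

# Example roster (made-up names)
pairs = assign_reviewers(["Ana", "Ben", "Cal", "Dee"])
for reviewer, author in pairs.items():
    print(f"{reviewer} reviews {author}'s project")
```

You could paste the resulting pairs straight into the partner column of the Sheet.)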
The digital component: We are a GAFE district, and we walked students through installing the Snagit App and Extension for Google Chrome. (Snagit allows for simple screencast creation, and it saves the video directly to your Google Drive. Huzzah!) Students did a quick microphone test to make sure the video saved directly into their Drive. Tech note: Make sure to run the test! We were in a lab, so the students had to sign in to Chrome from the browser settings and accept numerous Snagit terms/conditions, and the computers behaved inconsistently; this would undoubtedly be easier on a Chromebook or personal device. If a student was able to save a video with sound to their Drive, they were ready to record.
The recording: Students strapped on their headsets and launched into the recording. They used their notes, clicked and paged around their peer’s project, and when they were finished, they pasted the link to this Drive-hosted video on the project Google Sheet in the column next to their peer’s project.
The results: At the risk of using hyperbole, it was the best dang peer feedback I’ve ever heard. I reached that conclusion very quickly – just walking around and listening to the students record live was a marvel to behold (the teacher was also impressed with the students’ focus, engagement and willingness to fully examine another student’s work). The students’ opinions and analysis were thorough, in-depth, thoughtful and honest. The finished screencasts ranged from 3–9 minutes.
Other thoughts: The very detailed rubric is a MUST for this activity. Just having the students create a screencast and give thoughts off the top of their heads would have been a disaster. Having the students double down – completing the rubric first – gave them a lot of time to reflect and put their thoughts together. And, without realizing it, the students were also self-reflecting on their own projects in a way, especially if they were evaluating a digital medium that they didn’t choose. Choosing the right medium for a project is super important, and the students will hopefully be more educated when selecting digital media in the future.
Student thoughts: The students weren’t entirely sold. After reflecting on this peer review process, the feelings were decidedly mixed. Here are some student opinions about peer review screencasting:
- “I prefer written feedback – it’s easier to go back and look at, so you don’t have to search when going back.”
- “Recording and talking made it easier to share your thoughts…face-to-face you might be overly nice and less honest.”
- “In conversations people can get defensive.”
- “Writing peer feedback is simple and easier. Screencasting was overly complicated.”
- “It’s better to have a conversation, but with a screencast you can review.”
So… reaction was mixed. I suspect it was the doubling-down aspect (completing the rubric first AND then recording) that made the peer review seem kind of cumbersome (not to mention they were in the same room as their peers while recording). But honestly, after reviewing the students’ videos, it was the BEST across-the-board peer feedback I’ve ever heard. I would totally do this again.
Oh – one final benefit of this process: I’ve been a part of situations where students turn in digital projects and a decent number of them don’t work (the student sent the wrong link, the sharing settings are off, etc.). Because another student had to examine the project from the spreadsheet first, any non-functioning link had to be fixed immediately. So when it comes time for the teacher to grade, you can be sure that all the links work because they’ve already been independently checked. Awesome!
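(If you wanted to automate that first sanity check, a quick script could flag obviously broken submissions – blank cells, links missing http:// – before the reviewers even sit down. This is a hypothetical sketch, not something we did; the column layout and function name are my own invention, assuming the Sheet is exported as student/link rows:

```python
def flag_bad_links(rows):
    """Given (student, link) rows exported from the Sheet, return
    the students whose submitted link is blank or malformed."""
    flagged = []
    for student, link in rows:
        link = (link or "").strip()
        # A valid submission should at least start with a web scheme
        if not link.startswith(("http://", "https://")):
            flagged.append(student)
    return flagged

submissions = [
    ("Ana", "https://ana-water.weebly.com"),
    ("Ben", ""),                      # forgot to paste a link
    ("Cal", "cal-water.weebly.com"),  # missing the scheme
]
print(flag_bad_links(submissions))  # ['Ben', 'Cal']
```

Of course, this only catches formatting problems – a wrong link or locked-down sharing settings still needs a human clicking through, which is exactly what the peer reviewers did.)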