
PeerMark

Last year, I wrote a lot about my experience with SWoRD, a site that facilitates peer review of writing (including generating grades from peer review scores). Although I think there are a lot of neat things about SWoRD, there were also a lot of problems and I decided not to use it for the writing class this past spring. Instead, I used Turnitin's PeerMark tool, which is integrated into my school's Blackboard system.

Compared to last year, I made a few adjustments to the writing and reviewing process. The general pattern was that students would submit first drafts on Mondays, by class time; those papers would be made available to reviewers at the end of a two-hour grace period (i.e., class started at 3:30pm and papers were available to reviewers at 5:30pm, so slightly late papers could still get reviewed without messing up any of the assignments), and reviews were due by class time on Wednesday (again with a two-hour grace period). Depending on the assignment, students reviewed three to five papers (the longest assignment was 5 pages, and most were a lot shorter, so this should not have been too huge a burden). Final drafts were due the following Monday and 'reflective memos' were due Wednesday. The reflective memos included students' reflections on their own work (e.g., 'what did you learn') and also their back evaluations of the peer reviews they received. For the back evaluations, I asked them to provide both a score of 1 to 5 and an explanation for their score. I did not have students do a peer review of the final draft (I just graded those myself).

Some big benefits of PeerMark, particularly compared to SWoRD, include:
  • I have lots of control over things like when the assignment is due, when the reviews are available, and how many reviews students must complete. Reviewers can either be randomly assigned or I can assign specific students to review specific authors, or some combination;
  • The reviewing interface allows students to highlight things in the paper and make notes at the exact spot they've highlighted;
  • Students can give both quantitative scores (on a 1-5 scale) and qualitative explanations;
  • Reviews can be anonymous to the authors but I can easily see who they are (you could also allow students to see who their reviewers are).
One of the obvious downsides, compared to SWoRD, is that grading is less 'automated' and therefore more work for me. With PeerMark, it's possible to get the average of the reviewing scores (overall and for individual items on the review), which could be used as at least part of the draft grade for the authors, but there's no fancy weighting system like SWoRD has. It's also possible to let authors give each review a score within the PeerMark interface, but only numeric scores between 1 and 10 are allowed and there's no place for an explanation of those scores.

So I ended up reading all of the first drafts and the reviews much more carefully and assigning my own grades (part of each reviewer's score was based on whether the score they gave a draft was similar to the one I came up with myself). I also used my own weighting system for the draft grades: the grade for the first draft was a weighted average of my own assessment and the peer review scores, with each peer review weighted by the score that review received (so if someone did a bad job as a reviewer, their score did not count as much). This was all a bit of a pain, but on the plus side, you can create whatever kind of weighting system you want and make it completely transparent.
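To make the weighting idea concrete, here is a minimal sketch of that kind of calculation. The function name, the 50/50 blend between the instructor's assessment and the peer component, and the score scales are all illustrative assumptions, not the exact scheme I used:

```python
def weighted_draft_grade(instructor_score, peer_scores, review_quality,
                         instructor_weight=0.5):
    """Blend an instructor's draft grade with peer review scores.

    peer_scores: the scores reviewers gave the draft
    review_quality: the back-evaluation score each review received (e.g., 1-5),
                    so weaker reviews count for less
    instructor_weight: share of the grade from the instructor's own assessment
                       (the 50/50 default is just an example)
    """
    # Peer component: average of peer scores, weighted by review quality
    peer_component = (
        sum(score * quality for score, quality in zip(peer_scores, review_quality))
        / sum(review_quality)
    )
    # Final grade: blend of instructor assessment and peer component
    return (instructor_weight * instructor_score
            + (1 - instructor_weight) * peer_component)

# A reviewer whose review got a low back-evaluation score (1) influences
# the grade less than one whose review got a high score (4):
grade = weighted_draft_grade(80, peer_scores=[100, 60], review_quality=[4, 1])
```

The point is just that once the raw scores are exported from PeerMark, any transparent weighting scheme is a few lines of arithmetic.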

There are a few things that are still not ideal and that I'm hoping to fix next time. One is that there is no easy way to give students feedback on their reviewing skills, explaining exactly why they got the reviewing score they did and how to improve. The only thing I could come up with was adding comments to the reviewing grades in the Blackboard gradebook, which is a pretty clunky workaround. There was also some confusion in the back evaluations about which review students were evaluating. Since the reviewers were all anonymous, when students went into PeerMark to view their reviews, all they saw was 'Reviewer 1', 'Reviewer 2', etc. But in some cases, the order of those reviewers was not the same as what I was seeing, so I could not be 100% sure who Reviewer 1 or 2 really was. Next time, I may ask students to include some sort of identifier in their reviews, like a pseudonym or the last four digits of their student ID number.

But overall, things were definitely smoother this year. That was partly due to lessons I learned last year about the reviewing process in general (for example, I think my prompts for the reviews were clearer). But there was also less confusion about what was due when, thanks to everything being in Blackboard and my being able to set deadlines that corresponded to class times, and I think reviewers gave better feedback because they could use the in-text comment tool to pinpoint specific spots that needed work. All in all, I think PeerMark is an excellent tool for facilitating peer review.
