Use of PeerWise in a not-so-common context

February 1, 2013 in Use cases

Many of the courses that have implemented PeerWise have tended to focus on the Sciences, Engineering or Health Sciences. However, it is certainly the case that the system can be incorporated into a far wider range of courses. Recently, I made contact with Adrian Renzo, formerly at the University of Auckland, to talk to him about his experiences of using PeerWise in his Anthropology course: “Issues and History in Popular Music”.

We recorded the conversation and you can view it by clicking on the screenshot or following this link: Interview with Adrian Renzo

The original session was recorded in Blackboard Collaborate. You can access the recording of the session here: (Note that the Blackboard Collaborate environment might require changes to your browser security settings. In case of difficulties, the Bb Collaborate online support portal may provide some helpful diagnostics).


Scoring: for fun and extra credit!

January 3, 2013 in Uncategorized

PeerWise includes several “game-like” elements (such as badges, points and leaderboards) which are designed primarily for fun and to inject a bit of friendly competition between students.  As an example, students accumulate points as they make their contributions and their score is displayed near the top right corner of the main menu.

In fact, if you have participated in your own course, perhaps you have noticed your score increasing over time?

Of course, not all students are motivated by such things, but a quick search of recent Twitter posts reveals that some students really seem to enjoy earning the various virtual rewards that are on offer:

Some instructors have even considered using these elements to award “bonus marks” or “extra credit” as a way of motivating their students.  Obtaining the data to verify that students have met certain goals is trivial – for example, instructors can view the scores of all of their students in real-time by selecting “View scores of all students” from the Administration menu:

However, one difficulty with using the score for awarding such credit is that coming up with realistic target scores is complicated by the way the scoring algorithm works.  The algorithm has previously been discussed in detail, but basically it rewards students for making contributions that are valued by their peers.  In order to achieve the highest possible score, a student must make regular contributions and:

  • author questions that their peers rate highly
  • answer questions correctly before their peers
  • rate questions as they are subsequently rated by their peers
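The exact scoring formula isn't documented here, but the dependence on all three components can be illustrated with a toy model (the additive form, names and numbers below are assumptions for illustration, not the actual PeerWise algorithm):

```python
def total_score(authoring: float, answering: float, rating: float) -> float:
    """Toy model: the total is the sum of three component scores,
    so a zero in any one component caps what a student can earn."""
    return authoring + answering + rating

# A student active in all three areas can outscore one who only answers,
# even if the latter submits far more answers.
balanced = total_score(authoring=800, answering=900, rating=700)
answers_only = total_score(authoring=0, answering=1500, rating=0)
print(balanced, answers_only)  # 2400 1500
```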

What this means is that the total number of points a student can earn depends on how often their classmates endorse their contributions, which in turn depends not only on the size of the class but also on the “requirements” placed on the activity by the course instructor.  This makes it tricky to set reasonable score targets for students to reach.  Recently on the PeerWise-Community forum, member Brad Wyble raised exactly this point:

I’m interested in linking the PeerWise score to extra credit points but I’m a little stuck on how to proceed without an idea of what the range of possible values might be.  I don’t need to know an exact number, but is it possible to provide a rough estimate given a course of size 60? And how would this estimate change for a size of 150?

So, what is the typical range of PeerWise scores for a class of a given size?  Let’s start with a class of 50 students.  It turns out the range can be quite wide, as exemplified by the two extreme cases in the figure below (each line represents the set of scores for a single class of 50 students – the average number of questions authored and answers submitted by students in each class are shown in the legend):

Not only were students in the “blue” class all highly active, but almost all of them made contributions in each of the three areas required for maximising the score: question authoring, answering questions, and rating questions.  On the other hand, students in the “red” class were all quite active in answering questions, but only a few were active in all three areas.  In fact, only the first 12 students had non-zero scores for each of the three components.  The remaining students scored 0 for the question authoring component (most likely because they chose not to author any questions).  Students 13-24 in the figure had component scores only for answering questions and rating questions, whereas the remaining students (all below student 25) had a single component score (for answering questions).  These students chose not to evaluate any of the questions they answered, and ended up with very low total scores (even though in some cases they may have answered many questions).

To calculate a “typical” range of scores for classes of varying sizes, we can average the class scores over a number of classes.  For example, to calculate the typical range for classes of approximately 200 students, a set of 20 classes was selected (with class sizes ranging from 185 to 215) and the student scores in each class were listed in descending order.  The average “top score” was then computed by averaging the top score in each of the 20 classes; likewise for the second top score, and so on down the rankings for all remaining scores.  The figure below plots the resulting average score profiles for classes of approximately 50, 100, 150 and 200 students (in each calculation, between 15 and 20 classes were examined).
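The averaging procedure described above can be sketched as follows (the function and variable names are my own; this is a sketch of the described method, not the code actually used to produce the figures):

```python
def rankwise_average(class_scores):
    """Average the k-th highest score across several classes.

    class_scores: list of lists, one list of student scores per class.
    Returns the "typical" descending score profile, truncated to the
    smallest class so that every rank is present in every class.
    """
    ranked = [sorted(scores, reverse=True) for scores in class_scores]
    n_ranks = min(len(scores) for scores in ranked)
    return [
        sum(scores[k] for scores in ranked) / len(ranked)
        for k in range(n_ranks)
    ]

# e.g. average "top score", "second top score", ... across three classes
profile = rankwise_average([[900, 400, 100], [1100, 600, 50], [1000, 500, 150]])
print(profile)  # [1000.0, 500.0, 100.0]
```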

Brad also makes the following point in his forum post:

I suppose that another option would be to compute the grading scheme at the end of the semester once we see what the distribution of point values are.

This is an excellent idea – in many cases, the range of scores for a given course appears to be fairly consistent from one semester to the next (assuming the class size and participation requirements do not vary greatly).  The figure below plots the set of PeerWise scores for one particular course over 6 different semesters.  The class size was fairly consistent (around 350-400 students) and although the scores do vary, there is probably enough consistency to give instructors in future semesters some idea of what to expect (which may help them define targets for awarding bonus marks or extra credit).

In this class, only the very top few students achieve scores above 6000.  It is interesting to note that, towards the right-hand edge of the chart, the very sharp drops in the curves correspond to students who have not made contributions in each of the three areas.  Earning points in each of the question authoring, answering questions, and rating questions components is critical to achieving a good score, and it is probably important for instructors to emphasise this to students (although this information is shown when students hover their mouse over the score on the main menu).
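If an instructor does wait for the end-of-semester distribution, one simple way to turn scores into credit is a percentile mapping.  To be clear, this is an entirely hypothetical scheme of my own, not a PeerWise feature, and the names and weights below are illustrative:

```python
def bonus_marks(all_scores, student_score, max_bonus=5.0):
    """Award bonus marks in proportion to a student's percentile rank
    within the final class score distribution (hypothetical scheme)."""
    below = sum(1 for s in all_scores if s < student_score)
    percentile = below / len(all_scores)
    return round(max_bonus * percentile, 2)

scores = [6200, 4100, 3500, 2800, 900]
print(bonus_marks(scores, 4100))  # 3.0  (3 of the 5 scores are lower)
```

A percentile-based mapping sidesteps the problem of predicting absolute score targets, since it adapts automatically to class size and participation requirements.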

Has anyone tried using the points (or the badges) as a way of rewarding students with extra credit or bonus marks?  It would be interesting to hear of your experience – please share!

MCQ writing: a tricky business for students?

November 12, 2012 in Use cases

When I talk with colleagues about introducing PeerWise with their students they invariably ask: “How can students write good questions when they’ve never done it before?”

My answer to this has two parts. Firstly, we are teaching students who are used to being examined. In the UK, for example, students take exams at ages 4/5, 7, 11, 14, 16, 17 and 18, as well as all the end of module/year tests they are required to take to show how well they are progressing. Now I may be wrong, but I would hazard a guess that some of these exams involve multiple choice elements. If so, then I contend that these hugely examined students are very familiar with multiple choice style questions and many will know a poor one when they see it, even if they don’t know that they know it (more on this in a minute).

Secondly, we can share with our students the pitfalls of writing MCQs and how they can avoid them, as well as showing them examples of well written questions and explaining why this is so. Contrary to some people’s concern, this isn’t teaching our students how to cheat (although it might give them a distinct advantage if they are faced with badly written multiple choice questions in an exam, however that’s our look-out, not theirs) but instead it is about scaffolding students’ introduction to PeerWise in a way that allows them to concentrate on authoring correct and challenging questions, rather than getting caught up in the process of question writing.

To scaffold PeerWise to our students I gave them a one hour introductory session (a PDF of most of this session can be found here), using a short quiz (to engage students in thinking about what makes poor MCQs), examples of subject-specific bad, average and good questions (with explanations), screen shots of the software (to show them what PeerWise looks like) and some of the feedback from the year before.

These suggestions are not rocket science. However, it is the quiz that seems to illuminate the basics of MCQ writing for many. Originally written by Phil Race and Roger Lewis, the quiz was introduced to me by Nora Mogey as part of an MCQ writing workshop for staff members. (Note: try as I might, I am unable to find a reference for the quiz, BUT it came from these people and I am not claiming any ownership of the original!) The questions are written in nonsense English and, as you can see, at first glance the answers are not obvious. However, if you relax, stop trying to figure them out and instead allow your brain to listen to your gut response, you are likely to reach the correct answers. This is when students discover that they know a poorly written MCQ when faced with one, even if they don’t know why. Even those who don’t understand at first glance groan with recognition, understanding and wry smiles by the time we’ve been through the answers! Then we show students bad, average and good subject-specific questions, and explain what we expect and why.

We also show students what their peers from the year before said about PeerWise, and use comments such as ‘all questions should be checked by a Professor’ and ‘the comments didn’t help me at all’ to explain why repository quality and relevance is their responsibility, constructive commenting is essential, and that each person’s effort impacts upon others. My experience with our students is that if we make our expectations clear, and provide them with good reasons for those expectations, our students rise to the challenge and are more than capable of writing MCQs that are at least as intricate and challenging as the best MCQs we can write on a good day.

Moderate, monitor or leave well alone?

October 22, 2012 in Talking point

Talking point: When you’re using a PeerWise activity in your course, what do you do in terms of intervening in the students’ questions, answers and discussions?

When I tell someone about PeerWise for the first time, more often than not their first question is “But what if the students submit something that’s wrong?” My answer? “I don’t worry about it.”

As instructors, it seems to me that there are broadly three approaches we can take to administering our PeerWise repositories: you can moderate them, checking every question for correctness; you can monitor them, not explicitly checking everything but keeping a close eye out and intervening if you spot something egregiously incorrect; or you can leave well alone, and let the students look after themselves. My personal preference is for the last of these options, but plenty of people seem to recoil in horror when I tell them that. However, I would contend that not only is it legitimate not to intervene at all, but that we definitely shouldn’t. Here’s why:

An instructor-free PeerWise space explicitly belongs to the students, and they have full responsibility for its contents. If they do spot an error, it’s up to them to resolve it: the teacher isn’t going to come along and fix it for them. I think this gives a much greater sense of ownership to the students, with corresponding greater commitment. Plus, deciding whether something really is an error or not, and why, can spawn some great, in-depth discussions in the comments threads, which I would argue are some of the most potent learning opportunities offered by PeerWise. This would be lost if we swept in, helicopter-like, to rescue the students all the time.

My experience in courses I have run is that less than a tenth of the submitted material has obvious errors, and from what has been reported this ratio seems to be broadly similar in other courses elsewhere. A good number of these problems do get identified and fixed by the students. “But not all,” I hear you say. True, not all. A small proportion of incorrect content does persist. But I’ve made my peace with that: students are smart people, and they understand the context very well – they know it’s a student-generated resource, and not everything in it is guaranteed to be correct. Besides, it’s not like PeerWise is the only place students might come across incorrect ideas: informal study groups in the library, the cafe or on Wikipedia are widespread, and no-one is moderating those…

I believe that as instructors we should be more relaxed about this sort of thing: any potential disadvantages of student mistakes are outweighed by the intellectual challenges of self and peer assessment, and taking responsibility for their own learning. PeerWise is a great space for students to make their own contributions to their course, and some of them really push the boundaries of creativity and sophistication. Don’t inhibit them by peering over their shoulder, virtual ‘red pen’ at the ready.

So, I’ve set my stall out. Are you nodding in agreement, or pounding the keyboard in rage? Vote in the poll (in the sidebar on the right), and use the comments below to tell me why I’m right or wrong!

So…what do you do in your courses? Have you changed what you do as a result of something that has happened? Would you prefer to do something different but have concerns or reservations? Join the debate!

What does a typical PeerWise course look like?

October 12, 2012 in Uncategorized

If you have ever wondered whether your class is too small (or too big) to use a tool like PeerWise, you may be interested in the following data.  To get a sense for both the typical size of a class on PeerWise, and the typical number of contributions made by students in each class, data from the last 1000 courses was examined.

While there are many examples of very large classes (>300 students), and even a few extremely large ones (>800 students), the majority of classes have fewer than 50 students.  The breakdown is given in the chart below.

In terms of student participation, it is quite common for instructors to award a small fraction of course credit to students who contribute at least a certain number of questions and answers (for example, in the most recent class I taught, students were required to author 2 questions and answer 20).

The table below gives the average overall participation (in terms of questions and answers) for classes of different sizes.  It also shows the average contributions per student in each of those classes.

The rightmost columns of this table are perhaps the most interesting – the chart below plots these figures, that is, the average number of questions authored and the average number of questions answered by students in classes of various sizes.

If you ignore the very small (<20) and very large (>800) classes, the average number of answers submitted by students in all courses falls in quite a narrow range – from just below 30 to just above 40 answers per student.  Unsurprisingly, the average number of answers drops in very small courses, as students in these courses are unlikely to have a large number of questions available to answer.  Conversely, very large courses sometimes end up with very large banks of questions (often many thousands), enabling the most enthusiastic students to answer (almost) as many questions as they would like.

So, where does your class fit in, and could the contributions of your students be described as “typical”?

Just how big is this thing…?

October 11, 2012 in Uncategorized

Immersed in numbers

From time to time, I email Paul Denny (he’s here – @paul), creator of PeerWise, to ask for up to date usage figures for the student-facing PeerWise website, to put into a talk or presentation that I am giving. I am giving one of these early next week to the assembled crew of the Carl Wieman Science Education Initiative here at UBC, so thought I would share some figures that Paul sent over…. some of them may surprise you!

  • Institutions:  308
  • Creators:  1796
  • Courses: 1905
  • Users: 94961
  • Questions: 379464
  • Answers: 8172405

Yes, you did read that correctly – getting on for half a million questions and ten million answers!

Small print on the data:
“Institutions” only counts institutions for which there has been at least some activity; “creators” are instructors/teachers with the ability to create new courses. “Courses” includes all repositories created (even those for which there is no associated content). “Questions” only includes active questions (i.e. not deleted or archived versions).

Sample Coursework Instructions & Marking Regime

October 10, 2012 in Announcements, Use cases

This is my 3rd year of using PeerWise to teach & evaluate students taking my Underground Construction course.  The students have to set 5 questions for their peers and have to solve 25 set by their peers.  I award 5% of the module assessment for setting (good!) questions and 5% for answering questions.  Last year our Malaysia campus joined in with our UK campus on a common PeerWise coursework (so the students couldn’t easily tell whether they were answering questions from another continent).  This year our China campus should join in.  The big benefit for me is speed of assessment – I can mark about 60 students an hour (provided I choose my on-line time well!).  The big benefit for the students is that they enjoy it!

I have posted sample coursework instructions and my notes to markers in the Resources section of this site.  If you have any queries, I’ll do my best to answer them.

Using PeerWise to assign homework

October 10, 2012 in Use cases

This is a cross-post from my blog:

This past quarter I used a new tool called PeerWise in a physical chemistry course I co-taught.  The main idea behind PeerWise, which is a free web-based service developed by Paul Denny at the University of Auckland, is that students write multiple choice questions for each other.  I didn’t quite have the courage to say “Want homework? Write your own” so instead I used it as a supplement to the normal homework assignments, i.e. I reformulated most of the homework assignments as multiple choice questions and posted them on PeerWise, in addition to the usual homework assignment.

Some information about the course
The course is attended by about 170 chemistry and biochemistry majors and covers thermodynamics and kinetics – more specifically chapters 13-20 in Atkins’ Quanta, Matter, and Change.  It’s taught in a 9-week quarter and the students take one other course at the same time.  There are four other faculty members involved who also pick homework problems associated with their lectures.  The students were divided into six teams of roughly 30, and each team has six contact hours with TAs who can help them with the assignments.

Advantages of PeerWise
1. Feedback to the students.  PeerWise allows you to give extensive explanations (you can even use videos: example 1 and example 2) of how to solve the problem, shown immediately after the student has picked an answer.  The quality of the help provided by the TAs tends to vary greatly, and I hoped this would help even things out a bit.
2. Breaking complex problems up into more manageable pieces.  A standard question usually asks you to compute x given some conditions.  In PeerWise this can be broken up into, for example, a) which of the following equations would you use to compute x? and then b) what is the value of x given these conditions?  Answering the first question lets people know whether they are on the right track before proceeding.
3. Feedback to me.  Students can leave questions or comments for each problem, which often helps me improve the explanation or identify errors in the problem.  Students can also rate the quality and difficulty of each question, which is very valuable.  I also get all sorts of detailed data including how many students answered the question correctly (more on that below).

Some data
I put 120 questions up on PeerWise and 119 students answered at least one question.  From what I understand, many students worked in pairs and only one would answer on PeerWise.  39 students answered 100 questions or more.  For almost all problems, the majority of students picked the right answer, so they are not just clicking randomly to get the explanation, as one could have feared.  For example, the question with the most answers (86) was answered correctly by 57%, while the question voted the hardest was answered correctly by 47% of the 76 students who answered it.

While I encouraged students to write their own questions, only 9 questions were written.  One of these questions was “Do you find PeerWise helpful?”  41 students answered, 38 said yes.  Also several students commented positively on PeerWise with one caveat (see below).

Clearly, many students used PeerWise as a study aid before the exam, as seen in this figure.

One main problem: errors
The main complaint from the students was the many errors that crept into the questions, multiple choice answers, and solutions.  The multiple choice format exacerbates the problem, since many students will work for hours trying to reproduce one of the answers.  This is of course impossible if there is an error, and very frustrating for the student!

There are several sources of errors.
1. Several errors were introduced by me when I typed in the multiple choice answers (for example, using joules when I meant kilojoules) and explanations.
2. Several errors were already in the problems and were copied into PeerWise.
3. Several errors were in the answers provided in the solution manual of the text book and were copied into PeerWise by me.  Often you would simply get a different numerical answer based on the numbers provided.

The error rate was about 1-2 questions per week and all (hopefully) will be fixed next year.

A few practical tips:
1. By default the questions are listed in the reverse order of creation (i.e. newest first), so in cases where the order was important I would write the first question last, e.g. exercise 1b and then exercise 1a.
2. Each question comes with a one line preview, so the first sentence in the question should be the name of the exercise, e.g. Exercise 1a.
3. Each question can also be labelled by subject, so I would use labels like “week 1”.  With one click one can then display the questions for a single week only.
4. Many of the comments left by the students said “I punch the solution into my calculator and get a different answer”.  It’s of course hard to know what the problem is, but in these cases I leave a link to the worked-out solution on Wolfram|Alpha (example), which is basically an on-line calculator.
5. You add equations through one of several editors (I recommend the LaTeX editor) that create an image of the equation, which you can’t edit.  If you view the page in HTML format there is LaTeX code you can copy, paste into the editor and change, rather than typing everything in from scratch.

Changes for next year (When was the last time you learned anything by doing it once?)
Overall PeerWise was a hit, and I will use it next year for this course.  I plan to make the following changes:
1. I will add 10-20 math and conceptual training questions every week.  Questions one should be able to answer in 1 second: What is e−0?  If ΔG⊖ < 0, what can you say about the equilibrium constant?  Some of these will show up every week!
2. I will add 1-2 “regular” questions based on topics from previous weeks.
3. I will try to have the explanations include a screenshot of the solution done in Maple.  For three reasons: a) it will make me double check the answer in the solution manual, b) it will remove one source of error if I don’t have to retype something, and c) it will encourage the students to use Maple (students who use Maple efficiently are done in a fraction of the time compared to others).

Finally, I am racking my brain on how to get more students to write their own questions.


October 2, 2012 in Announcements

Welcome to this online community for instructors using PeerWise in support of their teaching!

Please take some time to explore what we’ve assembled here: you’ll find a space for discussion of ideas or issues in the Forum section, a chance to (virtually!) meet the other members of the community, and a selection of resource videos about PeerWise that you may find useful. Additionally, there’s a collection of links to published material that either discusses implementations of PeerWise in various classes, or is relevant to the pedagogy of student-generated content.

Finally, probably the most important thing that you can do is… contribute! To post to the community and to be able to add resources, you’ll need to be registered, but that only takes a minute or two of your time. We’ve all kinds of plans for how we can get the community discussing various aspects of their use of PeerWise, and we’ll post details of these in due course. For now, come on in, have a look around and make yourself at home!

Welcome again

PeerWise-Community admins

Paul Denny, University of Auckland

Ross Galloway, Judy Hardy, Keith Brunton, University of Edinburgh

Simon Bates, University of British Columbia