
MCQ writing: a tricky business for students?

November 12, 2012 in Use cases

When I talk with colleagues about introducing PeerWise to their students, they invariably ask: how can students write good questions when they've never done it before?

My answer to this has two parts. Firstly, we are teaching students who are used to being examined. In the UK, for example, students take exams at ages 4/5, 7, 11, 14, 16, 17 and 18, as well as all the end-of-module and end-of-year tests they are required to take to show how well they are progressing. I may be wrong, but I would hazard a guess that some of these exams involve multiple choice elements. If so, I contend that these heavily examined students are very familiar with multiple choice questions, and many will know a poor one when they see it, even if they don't know that they know it (more on this in a minute).

Secondly, we can share with our students the pitfalls of writing MCQs and how to avoid them, as well as showing them examples of well-written questions and explaining what makes them good. Contrary to some people's concern, this isn't teaching our students how to cheat (although it might give them a distinct advantage if they are faced with badly written multiple choice questions in an exam, but that's our look-out, not theirs). Instead, it is about scaffolding students' introduction to PeerWise in a way that allows them to concentrate on authoring correct and challenging questions, rather than getting caught up in the mechanics of question writing.

To scaffold our students' introduction to PeerWise, I gave them a one-hour introductory session (a PDF of most of this session can be found here), using a short quiz (to engage students in thinking about what makes a poor MCQ), examples of bad, average and good subject-specific questions (with explanations), screenshots of the software (to show them what PeerWise looks like), and some of the feedback from the year before.

These suggestions are not rocket science. However, it is the quiz that seems to illuminate the basics of MCQ writing for many. Originally written by Phil Race and Roger Lewis, the quiz was introduced to me as part of an MCQ writing workshop for staff members by Nora Mogey. (Note: try as I might, I am unable to find a reference for the quiz, BUT it came from these people and I am not claiming any ownership of the original!) The questions are written in nonsense English and, as you can see, at first glance the answers are not obvious. However, if you relax, stop trying to figure them out and instead allow your brain to listen to your gut response, you are likely to reach the correct answers. This is when students discover that they know a poorly written MCQ when faced with one, even if they don't know why. And even if at first glance they don't understand, by the time we've been through the answers they groan with recognition, understanding and wry smiles! Then we show students bad, average and good subject-specific questions, and explain what we expect and why.

We also show students what their peers from the year before said about PeerWise, and use comments such as 'all questions should be checked by a Professor' and 'the comments didn't help me at all' to explain why repository quality and relevance are their responsibility, why constructive commenting is essential, and how each person's effort impacts upon the others. My experience is that if we make our expectations clear, and give good reasons for them, our students rise to the challenge and are more than capable of writing MCQs that are at least as intricate and challenging as the best we can write on a good day.

Moderate, monitor or leave well alone?

October 22, 2012 in Talking point

Talking point: When you’re using a PeerWise activity in your course, what do you do in terms of intervening in the students’ questions, answers and discussions?

When I tell someone about PeerWise for the first time, more often than not their first question is “But what if the students submit something that’s wrong?” My answer? “I don’t worry about it.”

It seems to me that, as instructors, there are broadly three approaches we can take to administering our PeerWise repositories: moderate them, checking every question for correctness; monitor them, not explicitly checking everything but keeping a close eye out and intervening if you spot something egregiously incorrect; or leave well alone, and let the students look after themselves. My personal preference is for the last of these, but plenty of people seem to recoil in horror when I tell them that. However, I would contend that not intervening at all is not merely legitimate; it is the better choice. Here's why:

An instructor-free PeerWise space explicitly belongs to the students, and they have full responsibility for its contents. If they do spot an error, it's up to them to resolve it: the teacher isn't going to come along and fix it for them. I think this gives the students a much greater sense of ownership, with a correspondingly greater commitment. Plus, deciding whether something really is an error or not, and why, can spawn some great, in-depth discussions in the comment threads, which I would argue are some of the most potent learning opportunities PeerWise offers. This would be lost if we swept in, helicopter-like, to rescue the students all the time.

My experience in the courses I have run is that less than a tenth of the submitted material contains obvious errors, and from what has been reported this proportion seems to be broadly similar in other courses elsewhere. A good number of these problems do get identified and fixed by the students. "But not all," I hear you say. True, not all. A small proportion of incorrect content does persist. But I've made my peace with that: students are smart people, and they understand the context very well – they know it's a student-generated resource, and that not everything in it is guaranteed to be correct. Besides, PeerWise is hardly the only place students might come across incorrect ideas: informal study groups in the library or the cafe, and resources like Wikipedia, are widespread, and no-one is moderating those…

I believe that as instructors we should be more relaxed about this sort of thing: any potential disadvantage of student mistakes is outweighed by the intellectual challenge of self and peer assessment, and by students taking responsibility for their own learning. PeerWise is a great space for students to make their own contributions to their course, and some of them really push the boundaries of creativity and sophistication. Don't inhibit them by peering over their shoulders, virtual 'red pen' at the ready.

So, I’ve set my stall out. Are you nodding in agreement, or pounding the keyboard in rage? Vote in the poll (in the sidebar on the right), and use the comments below to tell me why I’m right or wrong!

So…what do you do in your courses? Have you changed what you do as a result of something that has happened? Would you prefer to do something different but have concerns or reservations? Join the debate!

Just how big is this thing…?

October 11, 2012 in Uncategorized

Immersed in numbers

From time to time, I email Paul Denny (he's here – @paul), creator of PeerWise, to ask for up-to-date usage figures for the student-facing PeerWise website, to put into a talk or presentation. I am giving one of these early next week to the assembled crew of the Carl Wieman Science Education Initiative here at UBC, so I thought I would share some of the figures that Paul sent over… some of them may surprise you!

  • Institutions: 308
  • Creators: 1,796
  • Courses: 1,905
  • Users: 94,961
  • Questions: 379,464
  • Answers: 8,172,405

Yes, you did read that correctly – nearly 400,000 questions and more than eight million answers!!
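A quick bit of back-of-the-envelope arithmetic on those figures: that works out at an average of roughly 21 answers to every question, around four questions and 86 answers per user, and close to 200 questions per course.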

Small print on the data:
"Institutions" only counts institutions for which there has been at least some activity; "creators" are instructors/teachers with the ability to create new courses; "Courses" includes all repositories created (even those with no associated content); "Questions" only includes active questions (i.e. not deleted or archived versions).