
If at first you don’t succeed, answer again!

March 28, 2013 in Features

A small milestone was reached at the start of this week when a student (from Central Queensland University in Australia) submitted the 10 millionth answer to a PeerWise question. That’s quite a few answers, but of course, many of them are incorrect.  In fact, around 3.5 million of them do not match the answer as specified by the question author.  So what happens to all these wrong answers?

Well, up until now, students have only been able to attempt a question once.  While they have been able to review their answers, there has not been a way for them to hide earlier attempts and submit new answers.  In other words, wrong answers have stayed wrong!  On the one hand, it is valuable to preserve the originally submitted answers as these can provide useful feedback to the instructor about how well their students are coping with course concepts.  On the other hand, the ability to re-answer questions has been a very common request – not only from students:

“It would be better if you could answer questions you’ve already answered a second time. Some questions I answered a long time ago and would not remember the answer from the first time around but it would be helpful to re-test myself.”

but also from instructors:

“Some of my students requested the option of redoing questions after they’ve answered them.  This would be particularly useful for questions they missed.”

and from members of the PeerWise Community.

As a result of this feedback, a new feature has been introduced which allows students to submit new answers to any question and, having reviewed all of the feedback, to indicate which answer they believe is the correct one.  This blog post briefly describes this new feature and presents some very early data showing how students are making use of it.

Colour-coded answers

The first thing that students will notice is that the answers they submit are now colour-coded.  The screenshot below shows a typical view of the “Answered questions” page.

Answers that appear to be incorrect are highlighted in red, answers that appear to be inconclusive are highlighted in orange, and answers that appear to be correct are not coloured.  This colour coding helps students to locate questions that they have answered incorrectly, and therefore might like to attempt again.  The column that displays the answer is now sortable – so that all incorrect answers appear at the top of the table.

A column labelled “Answer again?” has also been added to the table.  Clicking the link in this column will present the question to the student again, without showing any information about their previous attempts.
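As a rough illustration of the table behaviour described above, the colour coding and sort order might be sketched like this.  The status names, colours, and sort ranks here are assumptions for illustration only – PeerWise’s actual implementation is not public:

```python
# Hypothetical sketch of the "Answered questions" table behaviour.
# Status names and colours are assumptions, not PeerWise internals.

# Highlight applied to each answer in the table.
HIGHLIGHT = {
    "incorrect": "red",        # answer appears to be incorrect
    "inconclusive": "orange",  # answer appears to be inconclusive
    "correct": None,           # apparently correct answers are not coloured
    "confirmed": "green",      # student has confirmed the answer as correct
}

# Sorting the answer column puts incorrect answers at the top of the
# table and confirmed answers at the bottom.
SORT_RANK = {"incorrect": 0, "inconclusive": 1, "correct": 2, "confirmed": 3}

def sort_answers(answers):
    """Order answers so those most worth re-attempting come first."""
    return sorted(answers, key=lambda a: SORT_RANK[a["status"]])

answers = [
    {"question": 12, "status": "confirmed"},
    {"question": 7, "status": "incorrect"},
    {"question": 3, "status": "correct"},
]
print([a["question"] for a in sort_answers(answers)])  # [7, 3, 12]
```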

Confirming and changing answers

You may have noticed in the earlier screenshot that below the correctly answered questions (which are not coloured) some of the entries are coloured in green.  These are “confirmed” answers – questions for which the student has indicated they are sure their answer is actually correct.  Let’s take a look at how this happens.

As soon as a student submits an answer to a question, they are shown a variety of feedback.  This includes the explanation for the answer as written by the question author, any improvements to the explanation written by their classmates, and any comments written about the question.  In addition, they are shown each of the question options and the number of times each was selected by their peers.  Immediately beneath these options, the student now has a choice:

If, after reviewing all of this feedback, the student believes their submitted answer is not correct, they can simply change their answer by immediately attempting the question again.  Note that the original, or “first attempt”, answers to a question are preserved and always appear in the column labelled “First answers”.

However, if the student is certain that their answer is correct after reviewing the feedback, they can “confirm” their answer.  This does two things: their confirmed choice is displayed in the column labelled “Confirmed answers”, and the corresponding entry in the “Answered questions” table is highlighted in green (and appears at the bottom of the list when sorted).
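The answer-tracking just described – first attempts preserved, later submissions replacing the current answer, and an optional confirmation – could be sketched roughly as follows.  The class and field names are hypothetical:

```python
# A minimal sketch, under assumed names, of how one student's first,
# latest, and confirmed answers to a question might be tracked.

class AnswerRecord:
    """Tracks one student's answers to one question."""

    def __init__(self, first_answer):
        self.first_answer = first_answer   # preserved; never overwritten
        self.latest_answer = first_answer
        self.confirmed_answer = None

    def answer_again(self, new_answer):
        """Submit a new answer; the first attempt is kept for instructors."""
        self.latest_answer = new_answer
        self.confirmed_answer = None       # re-answering clears any confirmation

    def confirm(self):
        """Mark the latest answer as the one the student believes is correct."""
        self.confirmed_answer = self.latest_answer

record = AnswerRecord("B")   # first attempt: option B
record.answer_again("D")     # after reviewing feedback, change to D
record.confirm()             # certain that D is right: confirm it
print(record.first_answer, record.confirmed_answer)  # B D
```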

The screenshot below shows a set of options for a question where option D has been confirmed by four students.

Note that in this case, although there was considerable disagreement over the correct answer following students’ first attempts, only option D appears as a confirmed answer.  In a small way, this “confirmation” process mirrors an important element of the Peer Instruction pedagogy – after individually committing to an answer, students have an opportunity to reach consensus by reflecting on feedback from their peers.

Reaching consensus

The ability to re-answer questions and to “confirm” answers is only a few days old, yet we can take a peek at how students are beginning to make use of it.  In particular, to investigate whether students are effectively reaching consensus, we can compare the typical spread of “first” answers with the typical spread of “confirmed” answers.

Students have submitted 74,357 new answers since the feature was released.  Of these, 13,904 (almost 20%) have been “confirmed” as correct (in some cases this is after changing the original answer).  The chart below illustrates the typical spread seen across the “first” answers submitted to a question.  The options are ordered by popularity – so the most popular option, on average, is selected around 71% of the time.  The second most popular option is selected around 18% of the time, and so on.  Only questions that had received 10 or more responses following the release of this feature were considered – in this case a total of 1,351 questions.
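The “typical spread” calculation described above can be sketched as follows: for each question with at least 10 responses, the option counts are sorted from most to least popular and normalised, and the resulting proportions are then averaged position by position across questions.  The function name and the sample data below are illustrative only:

```python
# A hedged sketch of the popularity-ordered "typical spread" analysis.
# The data here is made up for illustration.

def typical_spread(questions, min_responses=10):
    """Average popularity-ordered answer distribution across questions."""
    spreads = []
    for counts in questions:              # counts: answers per option
        total = sum(counts)
        if total < min_responses:
            continue                      # too few responses to include
        ordered = sorted(counts, reverse=True)
        spreads.append([c / total for c in ordered])
    if not spreads:
        return []
    width = max(len(s) for s in spreads)
    # Pad questions with fewer options with zeros, then average by position.
    return [
        sum(s[i] if i < len(s) else 0.0 for s in spreads) / len(spreads)
        for i in range(width)
    ]

questions = [
    [14, 3, 2, 1],   # 20 responses: most popular option gets 70%
    [9, 1, 0, 0],    # 10 responses: most popular option gets 90%
    [3, 1, 0, 0],    # only 4 responses: excluded by the filter
]
print([round(x, 3) for x in typical_spread(questions)])  # [0.8, 0.125, 0.05, 0.025]
```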

As a comparison, the chart below illustrates the typical spread seen across the “confirmed” answers submitted to a question.  Once again, the options are ordered by popularity – in this case, the most popular “confirmed” answer is selected 99% of the time.

Only questions with 10 or more confirmed answers were included in this analysis – giving a total of 56 questions.  This is a small data set, as the feature is very new; still, there is virtually no disagreement about which answer is correct.  In fact, 45 of the 56 questions had perfect agreement and, at least so far, no question has had more than two different confirmed answers.

It will be interesting to see how this progresses, but if this early data is any indication, “confirmed” answers may provide an effective new way for students to verify whether they are right or wrong.