
Introducing the “answer score”

July 31, 2013 in Features

If you have viewed one of your courses in the last day or so you may have noticed a small addition to the main menu.  A new score, called the “answer score”, now appears (see the screenshot on the right).

The original score and its associated algorithm remain unchanged, except that the score has been renamed the “reputation score”, which more accurately reflects its purpose.  A number of instructors have been using this original score to assign extra credit to their students – as outlined in the following blog post, which also describes the algorithm in detail:

http://www.peerwise-community.org/2013/01/03/scoring-for-fun-and-extra-credit/

However, this “reputation score” was a source of some confusion for students who did not read the description of how it is calculated (this description appears when the mouse hovers over the score area).  This is exemplified by the following comment submitted via the PeerWise feedback form:

This is a great tool. I love it. The only criticism is the slow update on the score. You need to wait 30min+ to see what score you have after a session.

This confusion stems from the fact that a student’s “reputation score” only increases when other students participate as well (and indicate that the student’s contributions have been valued).  By contrast, the new “answer score” updates immediately when a student answers a question, and this immediate feedback may alleviate some of these concerns while providing students with another form of friendly competition.  As soon as an answer is selected, the number of points earned (or, in some cases, lost) is displayed, as shown in the screenshots below.

More importantly, the new “answer score” provides another measure of student participation within PeerWise.  Instructors may like to use this to set targets for students to reach.  A detailed description of how this works follows, but in short, for most students the “answer score” will be close to 10 multiplied by the number of questions they have answered “correctly” (where a correct answer is one that either matches the question author’s suggested answer or is the most popular answer selected by peers).  For example, the chart below plots the number of answers submitted that agreed with the author’s answer (note that this is a lower bound on the number of “correct” answers as just defined) against the new “answer score” for all students in one course who were active in the 24 hours after the new score was released.  The line is almost perfectly straight and has a slope close to 10.

A few students fall marginally below the imaginary line with a slope of 10.  This is because every “correct” answer earns a maximum of 10 points, whereas a small number of points is lost for each “incorrect” answer.  The number of points deducted for an incorrect answer depends on the number of alternative answers associated with the multiple-choice question – for example, questions with 5 options carry a lower penalty than questions with 2 options.  If a large number of questions are answered by randomly selecting answers (a behaviour we would obviously want to discourage students from adopting), the “answer score” should generally not increase.
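
The exact penalty formula is not spelled out here, but one simple scheme consistent with both of these properties – a smaller deduction for questions with more options, and no expected gain from random guessing – deducts 10 / (k − 1) points for an incorrect answer to a k-option question.  The sketch below illustrates this assumed scheme; it is not PeerWise’s published algorithm:

```python
# A minimal sketch of an answer-score model consistent with the behaviour
# described above.  The penalty 10 / (k - 1) is an assumption, chosen so
# that the expected score of a random guess on a k-option question is zero.

CORRECT_POINTS = 10.0

def penalty(num_options: int) -> float:
    """Points deducted for an incorrect answer to a question
    with `num_options` alternatives (assumed formula)."""
    return CORRECT_POINTS / (num_options - 1)

def expected_random_score(num_options: int) -> float:
    """Expected points earned by guessing uniformly at random."""
    k = num_options
    return (1 / k) * CORRECT_POINTS - ((k - 1) / k) * penalty(k)

if __name__ == "__main__":
    for k in (2, 3, 4, 5):
        print(f"{k} options: penalty = {penalty(k):.2f}, "
              f"expected random-guess score = {expected_random_score(k):.2f}")
```

Under this assumed scheme, a 2-option question costs 10 points when answered incorrectly while a 5-option question costs only 2.5, and the expected score from pure guessing is exactly zero in every case – matching the behaviour described above.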

So, how can you now begin to make use of this?

Judging by how a range of instructors describe implementing PeerWise in their own classrooms, it appears to be quite common to require students to answer a minimum number of questions per term.  To make things a little more interesting, you can now set an “answer score” target instead.  This is practically the same thing but a bit more fun (just remember to multiply by 10) – if you normally ask your students to answer 50 questions, try setting an “answer score” target of 500!

There is a new student leaderboard for the top “answer scores”, and as an instructor you can download a complete list of answer scores at any time (from the Administration section).

If at first you don’t succeed, answer again!

March 28, 2013 in Features

A small milestone was reached at the start of this week when a student (from Central Queensland University in Australia) submitted the 10 millionth answer to a PeerWise question. That’s quite a few answers, but of course, many of them are incorrect.  In fact, around 3.5 million of them do not match the answer as specified by the question author.  So what happens to all these wrong answers?

Well, up until now, students have only been able to attempt a question once.  While they have been able to review their answers, there has not been a way for them to hide earlier attempts and submit new answers.  In other words, wrong answers have stayed wrong!  On the one hand, it is valuable to preserve the originally submitted answers as these can provide useful feedback to the instructor about how well their students are coping with course concepts.  However, the ability to re-answer questions has been a very common request – not only from students:

“It would be better if you could answer questions you’ve already answered a second time. Some questions I answered a long time ago and would not remember the answer from the first time around but it would be helpful to re-test myself.”

but also from instructors:

“Some of my students requested the option of redoing questions after they’ve answered them.  This would be particularly useful for questions they missed.”

and from members of the PeerWise Community:

http://www.peerwise-community.org/forums/topic/reset-questions/

As a result of this feedback, a new feature has been introduced which allows students to submit new answers to any question and, having reviewed all of the feedback, to indicate which answer they believe is the correct one.  This blog post briefly describes this new feature and presents some very early data showing how students are making use of it.

Colour-coded answers

The first thing that students will notice is that the answers they submit are now colour-coded.  The screenshot below shows a typical view of the “Answered questions” page.

Answers that appear to be incorrect are highlighted in red, answers that appear to be inconclusive are highlighted in orange, and answers that appear to be correct are not coloured.  This colour coding helps students locate questions that they have answered incorrectly, and might therefore like to attempt again.  The column that displays the answer is now sortable, so that all incorrect answers can be brought to the top of the table.

A column labelled “Answer again?” has also been added to the table.  Clicking the link in this column will present the question to the student again, without showing any information about their previous attempts.

Confirming and changing answers

You may have noticed in the earlier screenshot that below the correctly answered questions (which are not coloured) some of the entries are coloured in green.  These are “confirmed” answers – questions for which the student has indicated they are sure their answer is actually correct.  Let’s take a look at how this happens.

As soon as a student submits an answer to a question, they are shown a variety of feedback.  This includes the explanation for the answer as written by the question author, any improvements to the explanation written by their classmates, and any comments written about the question.  In addition, they are shown each of the question options and the number of times each was selected by their peers.  Immediately beneath these options, the student now has a choice:

If, after reviewing all of this feedback, the student believes their submitted answer is not correct, they can simply change their answer by immediately attempting the question again.  Note that the original, or “first attempt”, answers to a question are preserved and always appear in the column labelled “First answers”.

However, if the student is certain that their answer is correct after reviewing the feedback, they can “confirm” their answer.  This does two things: the confirmed choice is displayed in the column labelled “Confirmed answers”, and the corresponding result in the “Answered questions” table is highlighted in green (and appears at the bottom of the list when sorted).
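
To summarise, the colour coding and sorting behaviour described so far can be sketched as follows; the status names and data layout are illustrative only, not PeerWise’s actual implementation:

```python
from enum import Enum

class AnswerStatus(Enum):
    INCORRECT = "incorrect"        # answer appears to be wrong
    INCONCLUSIVE = "inconclusive"  # no clear correct answer yet
    CORRECT = "correct"            # answer appears to be right
    CONFIRMED = "confirmed"        # student has confirmed their answer

# Row highlighting as described above (None means no highlight).
ROW_COLOUR = {
    AnswerStatus.INCORRECT: "red",
    AnswerStatus.INCONCLUSIVE: "orange",
    AnswerStatus.CORRECT: None,
    AnswerStatus.CONFIRMED: "green",
}

# Sort order for the answer column: incorrect answers rise to the top
# of the table, confirmed answers sink to the bottom.
SORT_ORDER = {
    AnswerStatus.INCORRECT: 0,
    AnswerStatus.INCONCLUSIVE: 1,
    AnswerStatus.CORRECT: 2,
    AnswerStatus.CONFIRMED: 3,
}

def sort_rows(rows):
    """Sort (question_id, status) rows so incorrect answers come first."""
    return sorted(rows, key=lambda row: SORT_ORDER[row[1]])
```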

The screenshot below shows a set of options for a question where option D has been confirmed by four students.

Note that in this case, although there was considerable disagreement over the correct answer following students’ first attempts, only option D appears as a confirmed answer.  In a small way, this “confirmation” process mirrors an important element of the Peer Instruction pedagogy – after individually committing to an answer, students have an opportunity to reach consensus by reflecting on feedback from their peers.

Reaching consensus

The ability to re-answer questions and to “confirm” answers is only a few days old, yet we can take a peek at how students are beginning to make use of it.  In particular, to investigate whether students are effectively reaching consensus, we can compare the typical spread of “first” answers with the typical spread of “confirmed” answers.

Students have submitted 74,357 new answers since the feature was released.  Of these, 13,904 (almost 20%) have been “confirmed” as correct (in some cases this is after changing the original answer).  The chart below illustrates the typical spread seen across the “first” answers submitted to a question.  The options are ordered by popularity – so the most popular option, on average, is selected around 71% of the time.  The second most popular option is selected around 18% of the time, and so on.  Only questions that had received 10 or more responses following the release of this feature were considered – in this case, a total of 1,351 questions.

As a comparison, the chart below illustrates the typical spread seen across the “confirmed” answers submitted to a question.  Once again, the options are ordered by popularity – in this case, the most popular “confirmed” answer is selected 99% of the time.

Only questions with 10 or more confirmed answers were included in this analysis – giving a total of 56 questions.  This is a small data set, as the feature is very new; still, there is virtually no disagreement about which answer is correct.  In fact, 45 of the 56 questions had perfect agreement and, at least so far, no question has had more than two different confirmed answers.
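
As an aside, here is a minimal sketch of how a popularity-ordered spread like this could be computed, assuming the answers are available as (question_id, selected_option) records; the function name and data layout are illustrative:

```python
from collections import Counter, defaultdict

def popularity_spread(answers, min_responses=10):
    """Average the popularity-ordered option shares across questions.

    `answers` is an iterable of (question_id, selected_option) pairs.
    Only questions with at least `min_responses` answers are included,
    mirroring the threshold used in the analysis above.
    """
    by_question = defaultdict(Counter)
    for question_id, option in answers:
        by_question[question_id][option] += 1

    spreads = []
    for counts in by_question.values():
        total = sum(counts.values())
        if total < min_responses:
            continue
        # Option shares for this question, most popular first.
        shares = sorted((n / total for n in counts.values()), reverse=True)
        spreads.append(shares)

    if not spreads:
        return []
    # Average the k-th most popular share across questions; questions
    # with fewer distinct options contribute 0 beyond their last option.
    width = max(len(s) for s in spreads)
    padded = [s + [0.0] * (width - len(s)) for s in spreads]
    return [sum(col) / len(padded) for col in zip(*padded)]
```

Applying such a function once to the “first” answers and once to the “confirmed” answers would produce the two spreads plotted above: roughly 71% and 18% for the two most popular first-attempt options, versus a spread dominated by a single option (around 99%) for confirmed answers.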

It will be interesting to see how this progresses, but if this early data is any indication, “confirmed” answers may provide an effective new way for students to verify whether they are right or wrong.