
Correlation between authoring questions and understanding of threshold concepts in PeerWise

March 17, 2014 in Use cases

PeerWise is used in the Chemistry Department at the University of Liverpool as a student contributed assessment system in the “Chemical Engineering for Chemists” module. The aim of this module is to give chemistry students an insight into the world of chemical engineering and to enhance their understanding of the fundamental/threshold concepts in chemical engineering. The continuous assessments in this module play an important role in enhancing student understanding of chemical engineering concepts which are entirely foreign to most chemists.

PeerWise was used to enhance chemistry students’ understanding of threshold concepts in chemical engineering. Our theory was that students have to understand the fundamental/threshold concepts in order to author good quality questions. Although answering peers’ questions in PeerWise provides good revision material that strongly supports learning, this research focused on the effect that authoring questions has on students’ understanding of challenging concepts.

The PeerWise scores for authoring questions on mass and energy balances (fundamental operations in process analysis) were compared with exam marks on the related topics. The data were evaluated to find the correlation between PeerWise scores and exam marks. The positive correlation between PeerWise scores for authoring questions and exam marks indicates the value of using a student-contributed assessment system to enhance understanding of threshold concepts.
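For anyone wishing to run the same kind of check on their own module data, the analysis comes down to a simple correlation test, for example in R (a minimal sketch; the vector names below are illustrative, not the ones used in the study):

# Pearson correlation between PeerWise question-authoring scores and exam marks.
# "authoring_score" and "exam_mark" are illustrative numeric vectors of equal length.
cor.test(authoring_score, exam_mark, method = "pearson")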

PeerWise – Experiences at University College London

September 13, 2013 in Uncategorized, Use cases

PeerWise – Experiences at University College London

by Sam Green and Kevin Tang

Department of Linguistics, UCL

Introduction

In February 2012, as part of a small interdisciplinary team, we secured a small grant of £2500 from the Teaching Innovation Grant fund to develop and implement the use of PeerWise within a single module in the Department of Linguistics at University College London (UCL). The team was made up of advisory staff from the Centre for Applied Learning and Teaching and from the Division of Psychology and Language Sciences (PALS), together with lecturers and Post-Graduate Teaching Assistants (PGTAs) in the Department of Linguistics. The use of the system was monitored, and students’ participation made up 10% of their grade for the term.

In the subsequent academic year we extended its use across several further modules in the department, supported by an e-Learning Development Grant at UCL.

Overall aims and objectives

The PGTAs adapted the material developed in the second half of the 2011/12 term to provide guidelines, training, and further support to new PGTAs and academic staff running modules using PeerWise. The experienced PGTAs were also involved in disseminating the project outcomes and sharing good practice.

Methodology – Explanation of what was done and why

Introductory session with PGTAs:

A session run by the experienced PGTAs was held prior to the start of term for PGTAs teaching on modules utilising PeerWise. It covered the structure and technical aspects of the system, the implementation of the system in their module, and, importantly, marks and grading. It also highlighted the importance of team-work and the necessity of participation. An introductory pack was provided so that new PGTAs could quickly adapt the system for their respective modules.

Introductory session with students:

Students taking modules with a PeerWise component were required to attend a two-hour training and practice workshop, run by the PGTAs teaching on their module. After being given log-in instructions, students participated in the test environment set up by the PGTAs. These test environments contained a range of sample questions (written by the PGTAs) relating to the students’ modules, which demonstrated the quality of questions and level of difficulty required. More generally, students were given instructions on how to provide useful feedback and how to create educational questions.

Our PGTA – Thanasis Soultatis giving an introductory session to PeerWise for students

Our PGTA – Kevin Tang giving an introductory session to PeerWise for students

Course integration

In the pilot implementation of PeerWise, BA but not MA students were required to participate. BA students showed more participation than MA students, but the latter nevertheless showed engagement with the system. Therefore, it was decided to make PeerWise a compulsory element of the module to maximise the efficacy of peer-learning.

It was decided that students should work in ‘mixed ability’ groups, due to the difficult nature of creating questions. However, to effectively monitor individual performance, questions were required to be answered individually. Deadlines situated throughout the course ensured that students engaged with that week’s material, and spread out the workload.

Technical improvement

The restriction on image size and the lack of an ability to upload or embed audio files (useful for phonetic/phonological questions in Linguistics) were circumvented by using a UCL-wide system which allows students to host these sorts of files. This system (MyPortfolio) allows users to create links to stored media. It also allows the students to effectively anonymise the files, thus keeping them secret for the purpose of questioning.

Project outcomes

Using the PeerWise administration tools, we observed student participation over time. Students met question-creation deadlines as required, mostly by working throughout the week to complete the weekly task. In addition, questions were answered throughout the week, suggesting that students did not see the task purely as a chore. Further, most students answered more than the required number of questions, again showing their willing engagement. The final point on deadlines is that MA students used PeerWise as a revision tool entirely of their own accord. Their regular creation of questions built up a repository of revision topics with questions, answers, and explanations.

Active Engagement

The Statistics

PeerWise provides a set of scores for each student. To increase the total score, one needs to achieve good scores for each component.

The students were required to:

  • write relevant, high-quality questions with well thought-out alternatives and clear explanations
  • answer questions
  • rate questions and leave constructive feedback
  • use PeerWise early (after questions are made available) as the score increases over time based on the contribution history

Correlations between the PeerWise scores and the module scores were computed to test the effect of PeerWise on students’ learning. A nested model comparison was performed to test whether the PeerWise grouping helped predict the students’ performance. The pattern of performance in Term 1 differs somewhat between the BA and MA students, but not in Term 2, after the PeerWise grouping of the BAs had been reorganised.

Term 1:

The BA students showed no correlation at all, while the MAs showed a strong correlation (r = 0.49, p < 0.001***).

MA Students – Term 1 – Correlation between PeerWise Scores and Exam Scores

In light of this finding, we attempted to identify the reasons behind this divergence in correlations. One potential reason was that the grouping of the BAs was done randomly, rather than by mixed ability, while the grouping of the MAs was done by mixed ability. We hypothesized that mixed-ability grouping is essential to the successful use of the system. To test this hypothesis, we asked the PGTA for the BAs to regroup the PeerWise groups in the second term based on mixed ability. This PGTA did not have any knowledge of the students’ PeerWise scores in Term 1, while the PeerWise grouping for the MAs largely remained the same.

Term 2:

Assessment in Term 2 was based on three assignments spread out over the term. The final PeerWise score (taken at the end of Term 2) was tested for correlation with each of the three assignments.

With the BAs, the PeerWise score correlated with all three assignments with increasing levels of statistical significance – Assignment 1 (r = 0.44, p = 0.0069**), Assignment 2 (r = 0.47, p = 0.0040**) and Assignment 3 (r = 0.47, p = 0.0035**).

With the MAs, the findings were similar, with the difference that Assignment 1 was not significant, with a borderline p-value of 0.0513 – Assignment 1 (r = 0.28, p = 0.0513), Assignment 2 (r = 0.46, p = 0.0026**) and Assignment 3 (r = 0.33, p = 0.0251*).

A further analysis was performed to test whether PeerWise grouping has an effect on assignment performance. This consisted of a nested-model comparison with PeerWise score and PeerWise group as predictors and the mean assignment score as the outcome. The lm function in the R statistical package was used to build two models: the superset model with both PeerWise score and PeerWise group as predictors, and the subset model with only the PeerWise score as a predictor. An ANOVA was used to compare the two models; while PeerWise score and PeerWise grouping were each significant predictors on their own, adding PeerWise grouping made a significant improvement in prediction, with p < 0.05 * (see Table 1 for the nested-model output).

Table 1: ANOVA results

Analysis of Variance Table

Model 1: Assignment_Mean ~ PW_score + group
Model 2: Assignment_Mean ~ PW_score
  Res.Df    RSS Df Sum of Sq      F  Pr(>F)  
1     28 2102.1                              
2     29 2460.3 -1   -358.21 4.7713 0.03747 *
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
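In R, the comparison above amounts to something like the following (a minimal sketch; the data frame name pw is illustrative, while the variable names match the model formulas in Table 1):

# Nested-model comparison: does PeerWise group improve prediction of
# mean assignment score beyond PeerWise score alone?
# "pw" is an illustrative data frame with columns Assignment_Mean,
# PW_score (numeric) and group (a factor).
m_full    <- lm(Assignment_Mean ~ PW_score + group, data = pw)
m_reduced <- lm(Assignment_Mean ~ PW_score, data = pw)
anova(m_full, m_reduced)  # F-test comparing the two nested models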

The strong correlation found with the BA group in Term 2 (but not in Term 1) is likely to be due to the introduction of mixed-ability grouping. The group effect suggests that the students performed at a similar level as a group, which implies group learning. This effect was found with the BAs but not with the MAs; the difference could be attributed to the quality of the mixed-ability grouping, since the BA (re)grouping was based on Term 1 performance, while the MA grouping was based on the impression of the students that the TA had formed in the first two weeks of Term 1. For both the BAs and the MAs, there was a small increase in correlation and significance level over the term; this might suggest that increasing use of the system assists with improving assignment grades over the term.

Together these findings suggest that mixed-ability grouping is key to peer learning.

Evaluation/reflection:

A questionnaire was completed by the students about their experience with our implementation of PeerWise. The feedback was on the whole positive, with a majority of students agreeing that:

  1. Developing original questions on course topics improved their understanding of those topics
  2. Answering questions written by other students improved their understanding of those topics.
  3. Their groups worked well together

These responses highlighted the key concept of PeerWise: peer learning.

[Figures: student feedback on understanding, on answering written questions, and on group work]

Our objective statistical analyses together with the subjective feedback from the students themselves strongly indicated that the project enhanced student learning and benefitted their learning experience.

E-learning awareness

One important experience was the recognition that peer learning, delivered through e-learning, can be a highly effective method of learning for students, even with little regular, direct contact between PGTAs and students regarding their participation.

It was necessary to be considerate of the aims of the modules, to understand the capabilities of PeerWise and its potential for integration with the module, and, importantly, to plan in detail the whole module’s use of PeerWise from the beginning. Initiating this type of e-learning system required this investigation and planning in order for students to understand the requirements and the relationship of the system to their module. Without explicit prior planning, with teams working in groups and remotely from PGTAs and staff (at least with regard to their PeerWise interaction), any serious issues with the system and its use might not have been spotted and/or might have been difficult to counteract.

As mentioned, the remote nature of the work meant that students might not readily inform PGTAs of issues they may have been having, so any small comment was dealt with immediately. One issue that arose was group members’ cooperation; this required swift and definitive action, which was then communicated to all relevant parties. In particular, any misunderstandings with the requirements were dealt with quickly, with e-mails sent out to all students, even if only one individual or group expressed concern or misunderstanding.

Dissemination and continuation

A division-wide talk (video recorded as a ‘lecturecast’) was given by Kevin Tang and Sam Green (the original PGTAs working with PeerWise) introducing the system to staff within the Division of Psychology and Language Sciences. This advertised the use and success of PeerWise to several interested parties, as did a subsequent lunchtime talk to staff at the Centre for the Advancement of Teaching and Learning. As the experienced PGTAs documented their experiences in detail, created a comprehensive user-guide, included presentations for students and new administrators of PeerWise, and made this readily available for UCL staff and PGTAs, the system can capably be taken up by any other department. Further, within the Department of Linguistics there are several ‘second-generation’ PGTAs who have learned the details of, and used, PeerWise for their modules. These PGTAs will in turn pass on use of the system to the subsequent year, should PeerWise be used again; they will also be available to assist any new users of the system. In sum, given the detailed information available, the current use of the system by the Department of Linguistics, and the keen use by staff in the department (especially given the positive results of its uptake), it seems highly likely that PeerWise will continue to be used by several modules, and will likely be taken up by others.

Acknowledgements

 

Use of PeerWise in a not-so-common context

February 1, 2013 in Use cases

Many of the courses that have implemented PeerWise have tended to focus on the Sciences, Engineering or Health Sciences. However, it is certainly the case that the system can be incorporated into a far wider range of courses. Recently, I made contact with Adrian Renzo, formerly at the University of Auckland, to talk to him about his experiences of using PeerWise in his Anthropology course: “Issues and History in Popular Music”.

We recorded the conversation and you can view it by clicking on the screenshot or following this link: Interview with Adrian Renzo

The original session was recorded in Blackboard Collaborate. You can access the recording of the session here: http://goo.gl/GG7k7 (Note that the Blackboard Collaborate environment might require changes to your browser security settings. In case of difficulties, the Bb Collaborate online support portal may provide some helpful diagnostics).

 

MCQ writing: a tricky business for students?

November 12, 2012 in Use cases

When I talk with colleagues about introducing PeerWise with their students they invariably ask: how can students write good questions when they’ve never done it before?

My answer to this has two parts; firstly, we are teaching students who are used to being examined. In the UK, for example, students take exams at 4/5, 7, 11, 14, 16, 17 and 18 as well as all the end of module/year tests they are required to take to show how well they are progressing. Now I may be wrong, but I would hazard a guess that some of these exams will involve multiple choice question elements? If this is so then I contend that these hugely examined students are very familiar with multiple choice style questions and many will know a poor one when they see it, even if they don’t know that they know it (more on this in a minute).

Secondly, we can share with our students the pitfalls of writing MCQs and how they can avoid them, as well as showing them examples of well written questions and explaining why this is so. Contrary to some people’s concern, this isn’t teaching our students how to cheat (although it might give them a distinct advantage if they are faced with badly written multiple choice questions in an exam, however that’s our look-out, not theirs) but instead it is about scaffolding students’ introduction to PeerWise in a way that allows them to concentrate on authoring correct and challenging questions, rather than getting caught up in the process of question writing.

To scaffold PeerWise for our students, I gave them a one-hour introductory session (a PDF of most of this session can be found here), using a short quiz (to engage students in thinking about what makes poor MCQs), examples of subject-specific bad, average and good questions (with explanations), screenshots of the software (to show them what PeerWise looks like) and some of the feedback from the year before.

These suggestions are not rocket science. However, it is the quiz that seems to illuminate the basics of MCQ writing for many. Originally written by Phil Race and Roger Lewis, the quiz was introduced to me as part of an MCQ writing workshop for staff members by Nora Mogey. (Note: try as I might, I am unable to find a reference for the quiz, BUT it came from these people and I am not claiming any ownership of the original!) The questions are written in nonsense English and, as you can see, at first glance the answers are not obvious. However, if you relax, stop trying to figure them out and instead allow your brain to listen to your gut response, you are likely to reach the correct answers. This is when students discover that they know a poorly written MCQ when faced with one, even if they don’t know why. Even if at first glance they don’t understand, by the time we’ve been through the answers they groan with recognition, understanding and wry smiles! Then we show students bad, average and good subject-specific questions, and explain what we expect and why.

We also show students what their peers from the year before said about PeerWise, and use comments such as ‘all questions should be checked by a Professor’ and ‘the comments didn’t help me at all’ to explain why repository quality and relevance are their responsibility, why constructive commenting is essential, and how each person’s effort impacts upon others. My experience with our students is that if we make our expectations clear, and provide them with good reasons for those expectations, our students rise to the challenge and are more than capable of writing MCQs that are at least as intricate and challenging as the best MCQs we can write on a good day.

Sample Coursework Instructions & Marking Regime

October 10, 2012 in Announcements, Use cases

This year is my 3rd year of using PeerWise to teach & evaluate students taking my Underground Construction course.  The students have to set 5 questions for their peers and have to solve 25 set by their peers.  I award 5% of the module assessment for setting (good!) questions and 5% for answering questions.  Last year our Malaysia campus joined in with our UK campus on a common PeerWise coursework (so the students couldn’t easily tell whether they were answering questions from another continent).  This year our China campus should join in.  The big benefit for me is speed of assessment – I can mark about 60 students an hour (provided I choose my on-line time OK!).  The big benefit for the students is that they enjoy it!

I have posted sample coursework instructions and my notes to markers in the Resources section of this site.  If you have any queries, I’ll do my best to answer them.

Using PeerWise to assign homework

October 10, 2012 in Use cases

This is a cross-post from my blog: http://proteinsandwavefunctions.blogspot.dk/2012/07/using-peerwise-to-assign-homework.html

PeerWise
This past quarter I used a new tool called PeerWise in a physical chemistry course I co-taught.  The main idea behind PeerWise, which is a free web-based service developed by Paul Denny at the University of Auckland, is that students write multiple choice questions for each other.  I didn’t quite have the courage to say “Want homework? Write your own” so instead I used it as a supplement to the normal homework assignments, i.e. I reformulated most of the homework assignments as multiple choice questions and posted them on PeerWise, in addition to the usual homework assignment.

Some information about the course
The course is attended by about 170 chemistry and biochemistry majors and covers thermodynamics and kinetics – more specifically chapters 13-20 in Atkins’ Quanta, Matter, and Change.  It’s taught in a 9-week quarter and the students take one other course at the same time.  There are four other faculty members involved who also pick homework problems associated with their lectures.  The students were divided into six teams of roughly 30, and each team has six contact hours with TAs who can help them with the assignments.

Advantages of PeerWise
1. Feedback to the students.  PeerWise allows you to give extensive descriptions (you can even use videos: example 1 and example 2) on how to solve the problem immediately after the student has picked an answer.  The quality of the help provided by the TAs tends to vary greatly, and I hoped this would help even things out a bit.
2. Breaking complex problems up into more manageable pieces.  A standard question usually asks you to compute x given some conditions.  In PeerWise this can be broken up into, for example, a) which of the following equations would you use to compute x? and then b) what is the value of x given these conditions?  Answering the first question lets people know whether they are on the right track before proceeding.
3. Feedback to me.  Students can leave questions or comments for each problem, which often helps me improve the explanation or identify errors in the problem.  Students can also rate the quality and difficulty of each question, which is very valuable.  I also get all sorts of detailed data including how many students answered the question correctly (more on that below).

Some data
I put 120 questions up on PeerWise and 119 students answered at least one question.  From what I understand many students worked in pairs and only one would answer on PeerWise.  39 students answered 100 questions or more.  For almost all problems, the majority of students picked the right answer, so they are not just clicking randomly to get the explanation, as one could have feared.  For example, the question with the most answers (86) was answered correctly by 57%, while the question voted the hardest was answered correctly by 47% of the 76 students who answered it.

While I encouraged students to write their own questions, only 9 questions were written.  One of these questions was “Do you find PeerWise helpful?”  41 students answered, 38 said yes.  Also several students commented positively on PeerWise with one caveat (see below).

Clearly, many students used PeerWise as a study aid before the exam, as seen in this figure.

One main problem: errors
The main complaint by the students was the many errors that crept into the questions, multiple choice answers, and solutions.  The multiple choice format exacerbates the problem, since many students will work for hours trying to reproduce one of the answers.  This is of course impossible if there is an error, and this is very frustrating for the student!

There are several sources of errors.
1. Several errors were introduced by me when I typed in the multiple choice answers (for example, using Joules when I meant kiloJoules) and explanations.
2. Several errors were already in the problems and were copied into PeerWise.
3. Several errors were in the answers provided in the solution manual of the text book and were copied into PeerWise by me.  Often you would simply get a different numerical answer based on the numbers provided.

The error rate was about 1-2 questions per week and all (hopefully) will be fixed next year.

Tips 
1. By default the questions are listed in the reverse order they are created (i.e. newest one first), so in cases where the order was important I would write the first question last, e.g. exercise 1b and then exercise 1a.
2. Each question comes with a one line preview, so the first sentence in the question should be the name of the exercise, e.g. Exercise 1a.
3. Each question can also be labelled by subject, so I would use labels like “week 1”.  With one click one can then display the questions for a single week only.
4. Many of the comments left by the students said “I punch the solution into my calculator and get a different answer”.  It’s of course hard to know what the problem is, but in these cases I leave a link to the worked-out solution on Wolfram|Alpha (example), which is basically an on-line calculator.
5. You add equations through one of several editors (I recommend using the LaTeX editor) that create an image of the equation, which you can’t edit. If you view the page in HTML format, there is LaTeX code that you can copy, paste into the editor, and change, rather than typing everything in from scratch.
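For instance, pasting a line of LaTeX such as the following into the editor produces an image of the familiar Gibbs energy relation (an illustrative equation of my own, not one of the actual course questions):

\Delta G = \Delta H - T\Delta S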

Changes for next year (When was the last time you learned anything by doing it once?)
Overall PeerWise was a hit, and I will use it next year for this course.  I plan to make the following changes:
1. I will add 10-20 math and conceptual training questions every week.  Questions one should be able to answer in 1 second: What is e−0?  If ΔG⊖ < 1 what can you say about the equilibrium constant?  Some of these will show up every week!  (The standard relation behind the second question is recalled just after this list.)
2. I will add 1-2 “regular” questions based on topics from previous weeks.
3. I will try to have the explanations include a screenshot of the solution done in Maple.  For three reasons: a) it will make me double check the answer in the solution manual, b) it will remove one source of error if I don’t have to retype something, and c) it will encourage the students to use Maple (students who use Maple efficiently are done in a fraction of the time compared to others).
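For reference, the standard relation behind the equilibrium-constant question in point 1 is ΔrG⊖ = −RT ln K, i.e. K = exp(−ΔrG⊖ / RT), which is what connects the sign and size of ΔG⊖ to the magnitude of K.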

Finally, I am racking my brain on how to get more students to write their own questions.