Last changed 17 May 2017. Length about 900 words (9,000 bytes).
(Document started on 7 May 2017.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/talks/bath.html. You may copy it.


Formative MCQs, Peer Interaction, and Deep Learning

This page was updated after the talk.

Title: Formative MCQs, Peer Interaction, and Deep Learning
Date/time: Tuesday 9 May 2017. Session: 1:45pm - 4pm (my own slot: 3:15pm - 4:15pm).
Occasion: MCQs and Deep Learning Workshop: A Possible Fix?
Place: 3 East 2.1, University of Bath
Presenter Steve Draper,   School of Psychology,   University of Glasgow.

Slides: PDF
Handout: PDF file
Related material:

Abstract

This talk discusses the relationships between formative feedback, peer interaction, and deep learning, with a view to improving practice in HE learning and teaching, and with a particular focus on the uses of MCQs for this purpose.

There are three "pillars", or underlying principles, from which a number of different successful, evidence-based techniques derive.

  1. When learners realise (notice) that they are not sure of something they had assumed they knew, this frequently prompts them to make a point of resolving that uncertainty when they can.
    Three catalysts for this are: simply being asked to say how confident you are about an answer; getting immediate feedback on whether your answer is right; and being asked questions, not about facts but about which reason is the correct justification for a fact or theory.
  2. Interaction with peers (as opposed to experts) is a particularly good trigger for this, since we generally accept what experts say without much thought and dismiss those who know less than us, but peers make us genuinely uncertain.
    Three ways of using peers as a prompt to re-evaluate one's confidence on a topic are: getting learners to design an MCQ; getting learners to critique a peer's work; and getting peers to discuss which answer for an MCQ is correct.
  3. There are two types of peer interaction relevant to learning:
    1. Peers create a joint product (e.g. coursework, a poster, or a single agreed decision).
    2. Peers discuss a single topic, with a common interest in understanding it better, but with no necessity to agree a single common opinion.
    In the former some learning may occur, but in fact the most efficient way to run a concrete project is by specialisation of labour, where each person works on what they know best and has only a secondary interest in learning from others. Any learning is then an extra cost with no extra benefit in terms of the overt task (the product). Thus most groupwork in HE militates not for but against learning. However, there are learning designs that avoid a joint product and succeed in prompting academic discussion ("constructive interaction"): the kind of thought-provoking discussion that moves our understanding forward and deepens it.

Bloom's "Mastery learning" is a technique demonstrating how powerful true formative assessment (perhaps with MCQs) can be. A first test is diagnostic; each learner then immediately works on the weaknesses identified, i.e. on the questions they got wrong. After that remedial study, re-testing demonstrates to the learner (and to everyone else) how effective this is.

Deep learning has been important to the education literature in that work on it has demonstrated to teachers how little their students have understood what they have learned (and passed tests on), in terms of relating the new material to everyday applications. What most think is desirable is that learners process the new material more deeply by making connections from it to various things they already know. (In fact one effect of surface, or imperfect, learning is that learners do not even recognise that two MCQs are about the same concept if they are not worded identically. So MCQs may be quite capable of testing for deep learning by using subsets of questions which to the teacher are redundant and repetitious but to weak learners appear unrelated.)

If we want to test whether learners understand the reasons for a fact or concept, then simply asking them can be done with MCQs, in the form of assertion-reason questions.

Another technique is Reciprocal Peer Critiquing (RPC), where students are required to judge their peers' work. This helps them by exercising their ability to judge as a reader, instead of only as an author writing their own essays. And disagreements with peers provoke more thinking about whether the judgements are right or not. There is successful software (Aropä) to help manage such exercises in big classes: not only doing the admin, but also collecting subsequent ratings from the recipients on how helpful they judged each critique to be.

Mazur's "Peer instruction" uses MCQs of a special kind, brain teasers, to provoke discussion amongst peers within large classes about which answer is right. This has been widely successful, with large effects. Without explicit directions, peers naturally express reasons for the answer they favour. The disagreement in the class about the right answer provokes uncertainty and a desire to resolve it; the discussion, by eliciting reasons, provokes deeper learning that links answers to reasons.

Getting learners to "teach" peers is another long-established tactic. One version of this is to require students (usually in small groups) to author MCQs for the rest of the class, complete with built-in feedback explaining why each option is correct or wrong. Again, the need to link reasons to "facts" provokes deep learning. The "PeerWise" software manages the use of this in large classes, and supports not only student use of the questions peers have designed but also student ratings of the quality of each question. Again, this approach induces deep learning partly by engaging peer interaction, and it also provides material for formative feedback to members of the class who use the created resource (a bank of MCQs) to test themselves.


Workshop announcement and sign-up

Contact: Matteo di Tina
