Last changed 12 May 2005. Length about 800 words (8,000 bytes).
(This document started on 8 May 2005.)
This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/rap/elist.html.
You may copy it. How to refer to it.
List of possibly relevant e-technologies
By
Steve Draper,
Department of Psychology,
University of Glasgow.
This page lists e-learning technologies that might possibly be relevant to
redesigning the assessment on your courses. The order here is partly random,
and partly reflects my personal obsessions. For your purposes, ignore the
order and ruthlessly skim for whatever might help you in your situation.
EVS
(Electronic voting systems)
(also known as Classroom Communication Systems etc.):
refers to technologies that facilitate interaction in large lecture classes.
Tests are presented in class and students respond using electronic handsets.
These technologies have great potential to provide immediate feedback, and they
support self and peer assessment, and small group discussion in large classes.
- Class tests done with EVS mean not only that the marking is done and returned
on the spot, but also that the formative feedback and explanations are, delivered
interactively so that students can ask for clarification on the spot rather than
relying on the written feedback being so perfect that it is always
understood.
- Revision lectures could use "contingent teaching": be guided entirely by the
audience's performance on diagnostic questions.
- Dynamically allocated tutorials. Alternatively, a diagnostic quiz could be
administered online (rather than in class), tutorial sessions scheduled for the
leading problems, and students recommended to attend these depending on their
individual performance. This abandons the idea of tutorials being organised
for small groups with social relationships built on continuity, but makes the
sessions much more personal in the sense of being tailored to individual
conceptual need at the time.
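The allocation step just described could in principle be automated. The sketch
below is only illustrative: the data format, topic names, and minimum group size
are assumptions for the sketch, not features of any particular quiz or EVS
product. It simply groups students by the diagnostic topics they got wrong and
recommends a tutorial for each topic with enough demand.

from collections import defaultdict

# Illustrative input: each student's online diagnostic result, recorded as the
# set of topics they answered incorrectly. (Hypothetical data for this sketch.)
quiz_results = {
    "student_a": {"topic A", "topic B"},
    "student_b": {"topic B"},
    "student_c": {"topic A"},
    "student_d": set(),              # no problems detected: nothing recommended
}

MIN_GROUP_SIZE = 2                   # only run a tutorial if demand is high enough

# Group students by the topics ("leading problems") they got wrong.
demand = defaultdict(set)
for student, wrong_topics in quiz_results.items():
    for topic in wrong_topics:
        demand[topic].add(student)

# Schedule a tutorial per leading problem and recommend attendance only to the
# individual students whose diagnostic answers showed that need.
for topic, students in sorted(demand.items()):
    if len(students) >= MIN_GROUP_SIZE:
        print(f"Tutorial on {topic}: recommend {sorted(students)}")

In practice the same grouping could be done by hand from a mark sheet; the point
is only that the recommendation is driven by each student's own answers rather
than by pre-existing tutorial groups.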
E-portfolios:
electronic portfolios support personal development planning and self-regulated
learning by students (they reflect on and select learning outputs to record)
and monitoring of work by staff.
There are several largely contradictory functions for e-portfolios: before
adopting software it's best to be clear which of these you want.
- Support students in presenting themselves better to potential employers: archiving
work and evidence, and then helping students make claims about their knowledge and
skills, and supporting those claims with evidence.
"Showcasing" work done.
- Managing and recording accreditation (marks), including the marking of
entries students make in the software. This is about giving staff more control
over students' learning. Summative assessment. About product.
- To support reflection and self-management of employability skills, personal
needs, and other learning outcomes.
This is about giving students more control over their learning. About process.
- I.e. "personal development planning".
- Reflection.
Simulations and games: provide intrinsic/dynamic feedback to students,
often embedded in real-life examples (e.g. problem solving, decision-making in
business). Simulations help integrate knowledge from different disciplines and
invariably enhance motivation.
Online exemplars and models of written work (essays, reports) with
feedback and/or level statements. Students can use these to help understand the
task and what counts as "good performance". They might be asked to compare
their work with exemplars to encourage self-assessment and self-correction.
Frequently Asked Questions: a form of self-assessment with feedback.
Students select the questions they want answered and receive the answers as
feedback.
Answer Gardens: a way of building up answers to questions previously
asked by students and formulating these into online reusable resources.
Discussion boards: can be used to create peer discussion around online
submissions, and to assess the quality of the resulting student discussion.
Online questions posted by students. If done before lectures or
tutorials, this form of feedback helps staff to tailor the teaching to students'
needs. Scaled up, this is the heart of "just in time teaching", where students
are required to read the material before class, and "lectures" become entirely
devoted to addressing issues raised by students about it.
Online diagnostic tests: short tests used to gauge classroom
understanding at key points during the course. There is a great deal of
research on this form of innovative assessment in the USA but little work on
how this might be translated into online contexts.
Online tests: provide immediate feedback, repetition and
reinforcement. Useful in skills learning where practice is essential (e.g.
problem solving) and as a self-assessment task to help develop learner
responsibility.
Databanks of feedback comments: can be used by teachers to respond to
students' written work more efficiently.
Peer marking and assignment distribution management software: helps
teachers manage peer-marking processes. It supports anonymous sharing of
students' work amongst peers and the collation and distribution of peer
feedback. [Free software from Dundee.]
Plagiarism detection software.
Such software can automate some of the work required by staff to ensure that
assignments submitted by students are actually produced by them.
VLEs (Virtual learning environments) / portals: support the management
of assignments and, when integrated with student records systems, also help
teachers monitor students' progress and identify those in difficulty.
Questionmark
and their software "Perception" are mainly about computerised assessment
(testing students). So for authoring questions in many formats (not just
basic MCQs), they may have something to offer. Possibly, such questions could
then be asked using an EVS.