Last changed 20 Mar 2000. Length about 1,500 words (10,000 bytes).
This is a WWW document maintained by
Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/talks/errframe.html.
Discussion on error types
Led by
Steve Draper,
Department of Psychology,
University of Glasgow.
[suitable for emailing out]
Steve Draper will lead a GIST session concerned with classifying errors
especially in accidents.
The seminar is to GIST at 4pm Thur 13th April 2000,
room F121, Computing Science. More details at:
http://www.psy.gla.ac.uk/~steve/talks/errframe.html
I will propose briefly what seems to me a new idea, then invite the audience
to discuss it, particularly by offering cases that either fit or undermine
the proposed idea(s). As many of the audience will know the field better
than I do, this is crucial. In fact I have two such topics, but will only get
to the second one if there is little discussion of the first.
The first idea is to apply the 4-way classification of individual errors into
slips, lapses, rule-based mistakes, and knowledge-based mistakes to 4 levels: the
individual, the team or group, the organisation, and society. Management
errors can perhaps be understood as appearing in several places in this
16-type scheme.
The concept of error only makes sense as relative to some violated norm.
Individual errors are thought of as contradictions within an individual's
mind: they do something inconsistent with what, in some other sense, they
know and want.
Furthermore, in most (though not all) cases we have not only a mental mechanism
for acting correctly but also extra mechanisms that check on this, so that an error
indicates more than one failure at once, i.e. a failure of the checking as well as
of the basic production of the action.
Both of these features are strongly suggestive of how it might be interesting
to analyse team error and organisational error, where the separate parts or
aspects of a person's mind might map on to separate individuals in a team or
separate units in an organisation.
A 4-way classification of individual human error is now widely used:
- Slips: Actual behaviour fails to conform to the
intention/plan (wrong action).
- Lapses: Actual behaviour fails to conform to the
intention/plan (omitted action, memory failure).
- Rule-based mistake: wrong rule selected for action,
i.e. behaviour conforms to the immediate intention, but the intention, while
consistent with a viable rule for action, is inconsistent in this case with
wider knowledge.
- Knowledge-based mistake: error in generating a novel
plan for a novel situation.
My proposal is to cross (multiply) this 4-way classification with 4
levels:
- The individual. As above. This is in effect analysing the interaction
(and inconsistencies) of parts or aspects of a person's mind with each other.
- The team or group. Group error. I.e. problems arising essentially from
the interaction of individuals.
- The organisation. Problems arising from the interaction of parts of the
organisation, and perhaps more importantly, from different goals and policies
within an organisation.
- Society as a whole. Here, we can analyse shortcomings of the
configurations of institutions and organisations in our society: do we have a
wholly useful accident analysis procedure? Is government playing too great a
role, too small a role, or the wrong one? Are legal procedures in fact making things
less safe by obstructing the free cooperation that is important to
understanding what went wrong?
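For concreteness, the crossing can be pictured as a small grid. What follows is a
minimal sketch in Python (purely illustrative and not part of the proposal itself;
the label strings are just my shorthand) that simply enumerates the cells:

    # Illustrative only: enumerate the cells of the proposed scheme,
    # i.e. the 4-way error classification crossed with the 4 levels.
    from itertools import product

    ERROR_TYPES = ["slip", "lapse", "rule-based mistake", "knowledge-based mistake"]
    LEVELS = ["individual", "team or group", "organisation", "society"]

    # Each (level, error type) pair is one cell of the proposed scheme.
    scheme = list(product(LEVELS, ERROR_TYPES))
    assert len(scheme) == 16  # 4 levels x 4 error types

    # A given error can then be filed under one (or more) cells, e.g. a
    # management error might sit at ("organisation", "rule-based mistake")
    # when the wrong standard procedure is acted on, or at
    # ("organisation", "knowledge-based mistake") for a misjudged policy.

The only point of such a sketch is to emphasise that the error types and the
levels are independent dimensions, so any error type can in principle occur at
any level.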
Notes:
- By "team" I'm thinking of a small group e.g. of 6 people who interact
personally even if they never meet face to face (e.g. pilot and air traffic
controller, train driver and signalman);
- while by "organisation" I'm thinking of big entities like Railtrack or
an airline, not small companies that operate as a team.
- Should there be a further "top" level of error due to ignorance?
Certainly, not knowing something leads to mistakes. But we do not feel that is
an error, as in general that would imply a norm of omniscience. On the other
hand, ignorance is often regarded as culpable (e.g. in law). A car driver
drives into you, not through malice, but because they weren't looking. And
many/most errors by physicians (as opposed to surgeons and nurses) are due to
not getting the right information, rather than drawing the wrong conclusion
from the available information. This could be seen as culpable due to the
omitted information-getting action, thus making it an error after all.
- One can talk of management of oneself, one's time, a team, or an
organisation. Thus "management errors" can appear at several levels here.
Of interest here is the possibility of understanding something about
management errors as conflicts between different aspects of knowledge and
intentions, analogous to individual errors, at higher levels.
For instance, rule-based mistakes at the organisational level might be cases
where the wrong rule or SOP (standard operating procedure) is acted on; while knowledge-based mistakes
correspond to policy errors: where senior management think out and enact a
policy that turns out not to further the company's interests taken as a whole.
- A number of errors seem to derive from the faulty equating (i.e.
conflation) of two things in a person's mind; but this cuts across categories.
E.g. slips where you press a key adjacent to the proper one; capture errors
of the kind where you assimilate the wrong object to a rule, e.g. putting the
salad in the oven; Abermule, where both stationmaster and driver take the
token but don't check if it is the right one; Welwyn, where the signalman
seems to have done one or both of: sent 'train out of section' on the wrong
instrument (and so referring to the wrong line), or confounded the first and
second expresses so that he attributed his memory of the first passing to the
second.
- Slips and mistakes: is this basic distinction in fact clear and tenable?
I've always had intermittent trouble with this.
In both cases there is a real sense in which the person has a goal/intention
which is incorrectly developed (otherwise neither we nor he could recognise it
later as an error, i.e. a contradiction between what he wanted/wished/meant to do
and what he did do). In any simple program, or model of the mind as software,
there is no particular dividing line, just a big hierarchy of goals being subdivided
into subgoals. In both cases, the low level actions don't properly match the
high level goals they were meant to accomplish. How can you draw a line? One
attempt would be pre-stored plans vs. new ones (mistakes are errors in
calculating new plans; slips in executing old ones). Another is basically
relative to consciousness: in slips the person is at no time aware of a wrong
intention, but in mistakes they are.
- There is a distinction in Organisational Psychology (and perhaps
management studies) between 3 ways individuals interact, and hence ways in
which individuals may be organised:
- Clan: individuals take on the goals of the leader, as opposed to plans,
rules, or anything else. They do what she or he says and/or wants. They act
to please the person, taking all their interests into account.
- Bureaucracy: individuals follow the ordained rules. This means they don't
do what the head says on a whim, but equally, they are liable to follow the
rules even in cases where the outcome clearly conflicts with the apparent goal
of (the reason for) the rules.
- Market: individuals take no account whatsoever of the others' interests,
but negotiate each action (purchase) separately, on the basis of it being in
both parties' interest on this occasion.
This of course may be illuminating to bear in mind in considering how things
can go wrong both in teams and organisations (and societies as a whole).
(This distinction maps, apparently, on to the distinction in an individual's
mind between goal, plan of action for that goal, and single executed actions.)
The second topic expands on the implications of these fundamental
points:
- Causes have no part in pure science (which describes relationships), only
in engineering. Causes are inherently relative to human capacity for action
and hence intentions and wishes, not to facts about the world.
- All events have multiple (actually, infinite) causes. Talking about "the
cause" is a childish, and sometimes dangerous, logical error.
- We select out from this infinite set only those causes we think
we can and want to change.
- In analysing the cause of an accident, we compare what happened against
norms, selecting out deviations from these expectations. "Human error" refers
to when a person acts against what was expected of them; "act of god" refers
to a natural process that went against expectation.
- Recommendations, however, are about changing what is done. These again
must be selected for what can be done; and in fact, for what can be afforded.
It follows that accident reports can only be sensibly written by those with an
equal knowledge of safety issues and industry costs.