About the Book
What links the frustrations of daily life, like VCR clocks and voicemail systems, to airplane crashes and a staggering “hidden epidemic” of medical error?
Kim Vicente is a professor of human factors engineering at the University of Toronto and a consultant to NASA, Microsoft, Nortel Networks and many other organizations; he might also be described as a “technological anthropologist.” He spends his time in emergency rooms, airplane cockpits and nuclear power station control rooms -- as well as in kitchens, garages and bathrooms -- observing how people interact with technology.
In the first chapter of The Human Factor, Kim Vicente sets out the disturbing pattern he’s observed: from daily life to life-or-death situations, people are using technology that doesn’t take the human factor into account. Technologies as diverse as stove tops, hospital work schedules and airline cockpit controls lead to ‘human error’ because they neglect what people are like physically, psychologically, and in more complex ways. The results range from inconvenience to tragic loss of life.
How has this situation come about? The root cause of the problem, Vicente explains in the second chapter, is a “two cultures” issue. There is a divide in the world of technological design -- just as there is in the world more generally -- between humanistic and mechanistic world-views. The humanistic view (in, say, cognitive psychology) deals with people in the abstract, ignoring that using tools is an integral human activity. The mechanistic view, on the other hand, forgets that it is real people who have to use the tools engineers develop. The two groups aren’t talking to each other: as the author puts it, “our traditional ways of thinking have ignored -- and virtually made invisible -- the relationship between people and technology.”
As is often the case in human factors engineering, the solution is both revolutionary and, on the surface, simple: we have to focus on the relationship between people and technology. Taking a cue from systems thinking, Kim Vicente argues that we should concentrate not just on better products or better practices, but on the fit between them. What this means is not the development of more high-tech or low-tech devices, but a Human-tech revolution, in which the human comes before the technological but the two are always linked.
In some areas the revolution is already at work: not all technology ignores the human factor. When technology does take it into account -- as in the Reach toothbrush, the Palm Pilot, or the “critical incident” reporting method developed at the Philadelphia Children’s Hospital -- it succeeds. The Fender Stratocaster guitar became the favourite of musicians around the globe because it was designed with the needs of guitarists in mind, in everything from its overall shape to the position of its controls. The Aviation Safety Reporting System, which lets pilots confidentially report near misses, is another Human-tech success that has made air travel dramatically safer.
Technology as Kim Vicente understands it isn’t just the physical “stuff” we use. In The Human Factor the word is used in a much broader sense, to include the physical and non-physical elements of complex systems. Information, teamwork, organizational structures and political decisions play a crucial role in determining how well a technological system as a whole functions. The “Human-tech ladder” sets this out in more detail, and also provides the structure for the rest of the book. Design should begin by understanding a human or societal need, and then tailoring the technology to reflect what we know about human nature at the physical, psychological, team, organizational and political levels.
Kim Vicente offers a host of examples of technology meeting human needs poorly and well at each level. The physical is perhaps the easiest to understand: a toothbrush that fits into hard-to-reach parts of the human mouth is better tailored to the human body than one that does not. At the psychological level, technology has to take into account how people process and remember information, whether in designing voicemail systems or airport baggage screening. Poor Human-tech can be devastating: awkwardly placed and uninformative gauges in the control room at the Three Mile Island nuclear power station left even highly trained operators uncertain about the status of the reactor, contributing to the infamous accident there.
At the team level, the Cockpit Resource Management system trains pilots to communicate and share responsibilities effectively. The way people work together is itself a form of technology, and it needs to run smoothly to avoid disastrous accidents -- such as the Eastern Airlines jet that crashed in Florida because the entire crew was distracted by the condition of an unimportant light bulb and no one attended to flying the plane.
Kim Vicente discusses the human factor at the organizational level in chapter seven of The Human Factor. “Soft” technology such as staffing levels and corporate culture can be designed so that an organization learns from its front-line staff. The medical community, for instance, traditionally holds individual doctors and nurses responsible for mistakes. When things go wrong we tend to blame people -- when in fact they may have made heroic efforts to use poorly designed technology. Errors in hospitals are more often the result of systemic flaws: no single flaw is wholly at fault, but together they interact to cause accidents. At the Philadelphia Children’s Hospital, the Human-tech solution is a system that encourages staff to make full reports on near misses and to tell managers about potential dangers, so that the hospital as a whole can institute protective measures. This critical incident technique led to a 90% reduction in medical mistakes at the hospital.
The final level of human nature which The Human Factor addresses is the political. Here, a Human-tech perspective shows us that when political elements -- laws, funding, regulations -- ignore what we know about human nature, dangers arise. In the case of the E. coli tragedy in Walkerton, Ontario, Kim Vicente uncovers a host of “system design” elements at the political level -- policy aims, legal regulations, budget allocations -- which interacted with environmental factors and staff incompetence to kill seven people and make thousands of others sick.
In conclusion, Kim Vicente argues that our civilization is at a crossroads: we have to change our relationship with technology to bring an end to technology-induced death and destruction, and to start improving the lives of everyone on the planet. The final chapter of The Human Factor sets out the ways we can regain control of our lives. As consumers, we can recognize better-designed products and buy the more Human-tech ones. By participating actively in society, we can remind people that ignoring the human factor, as happened at Walkerton, has terrible consequences. In our workplaces we can all ensure that more human-friendly technologies, hard and soft, predominate. Companies need to take a Human-tech approach to the rules and practices they institute, and design soft systems to guarantee that their employees have the competencies, information, goals and commitment to do their jobs. Other bodies, from the media to engineering schools, can all play their part in making technology with a close affinity to human nature the norm rather than a rarity: a better world will be the inevitable result.
About the Author
In 1999, Kim Vicente was one of twenty-five Canadians under the age of forty chosen by TIME magazine as a “Leader for the 21st Century who will shape Canada’s future.” A professor of engineering at the University of Toronto, he lectures widely around the world and has acted as a consultant to, amongst others, NASA, NATO, the US Air Force, the US Navy, Microsoft and Nortel.
He lives in Toronto.