Few non-fiction authors have generated as much interest with multiple books on information technology as Nicholas Carr. I reviewed two of his three previous books, The Big Switch and The Shallows, for this blog. Carr published The Glass Cage last year, and I finally pulled it off the shelf, determined to read it over the holiday weekend.
The Glass Cage builds on the premise of The Shallows: that technology is changing the way we humans use our brains. Citing the impact of automation on airline pilots (who become less able to handle stress and emergencies requiring manual flight) and of GPS devices on drivers (and our sense of direction), Carr argues that we need to be more careful about which tasks we hand off to computers and which we choose to keep doing ourselves.
Carr surveys the impact of automation over the past two hundred-plus years and notes that after each round of new automation, job growth rebounded. Since 2000, however, that rebound hasn’t occurred. Carr cites the work of Erik Brynjolfsson and Andrew McAfee (my review of their book, The Second Machine Age), who make the case that technology is eroding many jobs, particularly middle-class jobs, and that workers without appropriate education and training will be forced into lower-paying jobs or pushed from full-time to part-time employment.
Automation doesn’t just substitute for some job roles, Carr writes; it alters the entire task, changing the attitudes and skills of the people it affects. He describes two cognitive conditions: automation complacency, which sets in when computers create a false sense of security, and automation bias, in which people give undue weight to the results their monitors report.
Carr notes that systems that produce errors fairly frequently force users to remain alert, rather than lapsing into the passive assumption that the computers are doing their jobs properly. Automation tends to turn us from actors into observers, and that may diminish our skills or keep us from learning deeply. He writes, “when automation reaches its highest level, when it takes command of the job, the worker, skillwise, has nowhere to go but down.”
Carr offers examples from medicine, where extensive databases of medical conditions have been built with the goal of providing predictive patterns for evidence-based practice. He warns that as these databases grow in size and decision-making capability, doctors who rely on them will lose their ability to make confident personal judgments.
“For all their gifts, computers still display a frightening lack of common sense.” Carr writes that there is no substitute for human judgment. Yet engineers and programmers compound the problem of automation eroding judgment when they hide the workings of their software from those who operate it, turning the system into a black box. Human operators can make decisions grounded in ethical judgment, something computers have not yet mastered. Carr points out that in the business world, no stable division of labor between humans and computers appears to be in the offing.
With continuous improvements in automation, we’ve placed power in the hands of those who own the robots, not the human laborers. This cycle of automation “is either virtuous or vicious.” Carr suggests that we humans need to reclaim our automated tools as “instruments of experience rather than production.” If we don’t, we’ll close off the freedoms of human experience.
I recommend this thought-provoking book for information technology decision makers and corporate strategists alike. Understanding the downside of automation may help both groups achieve their goals and differentiate their companies or institutions from the competition. It will also benefit many readers personally, both economically and in preserving their capacity to learn.