
Replacing Professionals With a Computer

October 27, 2013 • Innovation, News

Ah, my old nemesis!  Nick Carr, the same one who caused a furor by insisting that “IT Doesn’t Matter”, meaning IT is a commodity of no strategic value to a business, now believes that IT is taking over so much of what we do that we’re in danger of losing the skills we need to function ourselves.  It sounds, from his article in The Atlantic, like we’re on the road to “Idiocracy“.

Notwithstanding the irony of who is making the argument, it’s an interesting one.  He cites airline crashes caused by pilots who, needing to take over from the autopilot in an emergency, failed to respond instinctively as any pilot would have in times past and, lacking the reflexes or time to think through the problem, crashed, killing all aboard.  Dramatic, but it’s not hard to think of functions that used to be handled by trained and experienced professionals and are now being delegated more and more to a computer.  Stock trading is now done in high volumes by rules-based systems; all kinds of technicians, doctors, and executives rely on decision-support software; without automated accounting and auditing functions, quarterly results would take years to produce.

In my business I’ve advocated that rules-based AI systems would allow underwriters to quote ten times more policies.  But even as I make that case, it occurs to me that success depends on having an experienced cadre of underwriters to compose the rules.  So what happens when those people retire and the underwriters behind them have been relying on the AI for years?  Where will they have gathered their experience?  Anyone who has been doing a job for years will fantasize about a computer system that can take care of their routine and repetitive tasks, but there’s usually a lot of baby in that bathwater.
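To make the idea concrete, here is a minimal sketch of what a rules-based quoting engine looks like. Every field, threshold, and rule below is hypothetical, invented purely for illustration; the point is that the rules themselves encode underwriter experience, which is exactly the dependency in question.

```python
# Minimal sketch of a rules-based underwriting quote engine.
# The policy fields and thresholds are hypothetical, invented for
# illustration -- real rules would be composed by experienced
# underwriters, which is exactly the dependency discussed above.

def quote(application, rules):
    """Apply each rule in order; a rule returns a premium multiplier,
    or None to decline and refer the case back to a human underwriter."""
    premium = application["base_premium"]
    for rule in rules:
        result = rule(application)
        if result is None:
            return None  # declined: needs human judgment
        premium *= result
    return round(premium, 2)

# Example rules encoding (hypothetical) underwriter experience:
rules = [
    lambda app: None if app["claims_last_5yr"] > 3 else 1.0,    # too many claims: refer
    lambda app: 1.25 if app["claims_last_5yr"] > 1 else 1.0,    # surcharge repeat claims
    lambda app: 0.9 if app["years_in_business"] >= 10 else 1.0, # discount established firms
]

print(quote({"base_premium": 1000.0, "claims_last_5yr": 2,
             "years_in_business": 12}, rules))  # 1000 * 1.25 * 0.9 = 1125.0
```

A human can quote only so many cases a day; a rule set like this runs in microseconds, which is where the "ten times more policies" claim comes from. But notice that nothing in the code learns: when the people who wrote the thresholds retire, the system stops improving.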

If you’re considering automating a sophisticated function, you should build in a human dependency mechanism that keeps the skills accessible to human operators.  Not just in case of failure, but so that the system can develop and advance.  Carr’s article suggests that autopilot systems should shut off from time to time, or be constrained in scope, so that humans are required to perform certain tasks and stay sharp.  This may be hard to commit to in practice.  Having human traders put in their own orders from time to time will not change the outcome should the company’s high-frequency trading system go off the rails (as famously happened in 2010).  Self-driving cars might prove to offer such improved safety that we’ll opt to accept the occasional cost of a sudden failure.  But wherever possible we should design our systems so that we don’t simply hand the keys to the kingdom over to Skynet.
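The "human dependency mechanism" could be as simple as a dispatcher that deliberately routes a fraction of routine cases to a human operator. This is a sketch of that one idea, not a recommendation; the 10% rate and the handler names are arbitrary choices made for illustration.

```python
import random

# Sketch of a human dependency mechanism: the automation deliberately
# hands a fraction of routine cases back to a human operator so that
# skills stay current. The 10% practice rate is an arbitrary,
# illustrative choice, not a recommendation.

HUMAN_PRACTICE_RATE = 0.10

def handle(task, automated_handler, human_handler, rng=random.random):
    """Route most tasks to the automation, but a sampled fraction
    to the human handler purely to keep the human in practice."""
    if rng() < HUMAN_PRACTICE_RATE:
        return human_handler(task)
    return automated_handler(task)

# Usage: with the default rng, roughly 1 in 10 tasks goes to the human.
result = handle("review claim #1", lambda t: ("auto", t), lambda t: ("human", t))
```

The design choice worth noting is that the sampling lives in the dispatcher, not in the operators' discretion, so the practice requirement can't quietly erode when the team is busy.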
