Blind faith in the algorithm

It’s fairly safe to say Apple’s ditching of Google Maps for its own navigation system has proved not to be the company’s smartest move.

Apple’s humiliation was complete when Victoria Police issued a warning against using the iPhone’s map application after motorists following its faulty directions to the town of Mildura became stranded in the desert.

Mapping is a complex task and it’s not surprising these mistakes happen, particularly given the dynamic nature of road conditions and closures. It’s why GPS and mapping systems incorporate millions of hours of input into the databases underlying these services.

Glitches with GPS navigation and mapping applications aren’t new. Some of the most notorious have occurred in the UK, where huge trucks have been directed down small country lanes only to find themselves stuck in medieval villages far from their intended destinations.

While those mishaps make for good reading, there are real risks in these misdirections. One of the best publicised tragedies of misreading maps was the death of James Kim in 2006.

Kim, a well-known US tech journalist, was driving with his family from Portland, Oregon to a hotel on the Pacific coast in November 2006 when they tried to take a shortcut across the mountains.

After several hours of driving, the family became lost and stuck in snowdrifts, and James died while hiking out to find help. His wife and two children were rescued after a week in the wilderness.

Remarkably, despite warnings of the risks, people still get stuck on that road. The local newspaper describes the annual ritual of finding a stranded tourist in the snow season.

Partly this irresponsibility is due to our modern inability to assess risk, but a deeper problem is blind faith in technology and the algorithms that decide what is good and bad.

A blind faith in algorithms is a risk to businesses as well – Facebook shuts down accounts that might be showing nipples, Google locks people out of their Places accounts, and PayPal freezes tens of thousands of dollars of merchants’ funds. All of this happens because their computers say there is a problem.

Far more sinister is the use of computer algorithms to determine who is a potential terrorist, as many people who’ve inadvertently found themselves on the US government’s No Fly List have discovered.

As massive volumes of information are gathered on individuals and businesses, it’s tempting for all of us to rely on computer programs to tell us what is relevant and to join the dots between various data points.

While the computer is often right, it is sometimes wrong as well, and that’s why proper supervision, and an understanding of what the system is telling people, is essential.

If we blindly accept what the computer tells us, we risk being stuck in our own deserts or a snowdrift as a result.

By Paul Wallbank

Paul Wallbank is a speaker and writer charting how technology is changing society and business. Paul has four regular technology advice radio programs on ABC, a weekly column on the smartcompany.com.au website and has published seven books.