Tag: algorithms

  • Guessing ethnic affinity

    What’s your ethnic affinity? Apparently Facebook thinks its algorithm can guess your race based upon the nature of your posts.

    This application is an interesting, and dangerous, development, although there’s no reason to expect it to be any more accurate than the plethora of ‘guess your age/nationality/star sign’ sites that trawl through Facebook pages.

    Guessing a user’s race is clumsy and obvious, but it’s clear that services like Google, LinkedIn and Facebook hold a mass of data on each of their millions of users that lets them crunch some big numbers and come up with all manner of conclusions.

    Some of those conclusions will be useful to governments, marketers and businesses, and in some cases they may lead to unforeseen consequences.

    The truth may lie in the data, but if we don’t understand the questions we’re asking, we risk creating a whole new range of problems.

  • The limitations of algorithms

    ‘Are algorithms getting too complex?’ asks Forbes Magazine’s Kalev Leetaru in an examination of how the formulas that increasingly govern our lives have grown beyond the understanding of their creators.

    With computer code now controlling most of the devices and processes we rely on in daily life, understanding the assumptions and limitations of those programs and formulas becomes essential for designers, managers and users.

    Leetaru cites the Apollo 13 malfunction and Volvo’s recent embarrassment, where a self-driving car nearly ran over a group of journalists. There’s no shortage of more tragic consequences of software design decisions, however; the crash of Air France 447 over the Atlantic Ocean with the loss of 228 lives, after two pilots stalled their plane through misunderstanding the characteristics of their cockpit, is one recent sad example.

    As business and government become more dependent on software, more risks will arise from managers not understanding the limitations of the algorithms they use in their businesses.

    Similarly, a range of industries is developing to exploit the quirks of algorithm-driven markets. The Search Engine Optimisation business, built around the quirks of Google’s search algorithm, is an established example, but more will come to the fore as people find ways to profit by anticipating price movements.

    However, algorithms have a way to go before they fully take over. As Salon’s examination of Facebook’s news feed reveals, a key part of how the social media service decides what appears on users’ screens is the judgement of around a thousand ‘power users’.

    The news feed algorithm had blind spots that Facebook’s data scientists couldn’t have identified on their own. It took a different kind of data—qualitative human feedback—to begin to fill them in.

    While Facebook falls back on large focus groups to fill in the algorithm’s gaps, Uber has found a different problem in estimating driver arrival times, where it’s currently not possible to accurately calculate estimated times of arrival in real time.

    “The best way to minimise time differential issue is to communicate statistically expected time, which will result in almost always being different than actual (i.e. wrong), but will be less different/wrong on average,” says Uber CEO Travis Kalanick.
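
    Kalanick’s point, that quoting the statistically expected time is almost always wrong for any one trip but least wrong on average, can be sketched in a few lines of code. The toy simulation below is a minimal illustration under invented assumptions: the trip-time distribution and the figures are hypothetical, not Uber’s actual model.

    ```python
    import random

    # Hypothetical historical durations (minutes) for one route; a real ETA
    # system would condition on traffic, time of day, driver position, etc.
    random.seed(42)
    trip_times = [max(1.0, random.gauss(12.0, 3.0)) for _ in range(10_000)]

    def average_error(quoted_eta, outcomes):
        """Mean gap between a quoted ETA and the actual arrival times."""
        return sum(abs(quoted_eta - t) for t in outcomes) / len(outcomes)

    expected = sum(trip_times) / len(trip_times)  # statistically expected time
    best_case = min(trip_times)                   # an optimistic quote

    # The expected-time quote is 'wrong' for almost every individual trip,
    # but it is far less wrong on average than the optimistic one.
    print(f"quoting the mean:      off by {average_error(expected, trip_times):.1f} min on average")
    print(f"quoting the best case: off by {average_error(best_case, trip_times):.1f} min on average")
    ```

    Under these invented figures the mean quote misses by only a couple of minutes on average, while the best-case quote misses by several times that, which is exactly the trade-off Kalanick describes.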

    Uber’s and Facebook’s challenges with their algorithms illustrate there’s some way to go before all critical business functions can be handed over to software. But as automation becomes standard in many areas, not least autonomous vehicles, the limitations of programs and the assumptions of programmers will become increasingly apparent.

  • King Canute and Google: When the algorithm is wrong

    As society and business drown in big data, we’re relying on algorithms and computer programs to help us wade through the flood of information. Could that reliance be a weakness?

    British archaeology site Digital Digging discusses how Google displays Manchester United winger Ryan Giggs in the search results for Cnut, the eleventh-century king of Denmark better known in the English-speaking world as King Canute.

    Apparently Giggs appears in the search results for Canute because of the footballer’s futile attempt to hold back a tide of information about his love life.

    While Google’s algorithm seems to have made a mistake, it’s only doing what it’s been programmed to do. A lot of trusted websites have used the term ‘Canute’ or ‘Cnut’ in relation to Giggs, so the machine presents his picture as being relevant to the search, as the sketch below shows.
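
    The mechanics are easy to illustrate. The toy ranker below scores an entity for a query simply by counting pages that mention the two together; the ‘pages’ are invented and this is nothing like Google’s real algorithm, but it shows how heavy co-occurrence alone can put a footballer at the top of a search for a Danish king.

    ```python
    # Invented page snippets; in reality these would be crawled documents.
    pages = [
        "Ryan Giggs likened to Canute trying to hold back a tide of information",
        "Cnut or Canute the king of Denmark and England",
        "Giggs injunction Canute tide comparison in the press",
        "Canute and the tide the Giggs affair revisited",
    ]

    def cooccurrence_score(entity: str, query: str) -> int:
        """Count pages that mention the entity and the query term together."""
        return sum(1 for page in pages if entity in page and query in page)

    for entity in ("Giggs", "Cnut"):
        print(entity, cooccurrence_score(entity, "Canute"))
    # Giggs co-occurs with 'Canute' on three pages, Cnut on only one, so a
    # naive relevance ranking surfaces the footballer for the king's name.
    ```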

    Confusing Ryan Giggs with King Canute is mildly amusing until we consider how critical algorithms like Google Search have become to decision making; there is no shortage of stories about people being wrongly billed, detained or even gaoled on the basis of bad information from computers.

    The stakes in making mistakes based on bad information are being raised all the time as processes become more automated. A chilling technology roadmap for the US military in Vice Magazine describes the future of ‘autonomous warfare’.

    By the end of 2021, just eight years away, the Pentagon sees “autonomous missions worldwide” as being one of its objectives.

    Autonomous missions mean local commanders and drones making decisions to kill people or attack communities based on what their computers tell them. The consequences of a bad result from a computer algorithm suddenly become very stark indeed.

    While most decisions based on algorithms won’t have the life or death consequences of a computer-ordered drone strike on a family picnic, mistakes could still cost businesses money and individuals much inconvenience.

    So it’s worthwhile considering how we build cultural and technological checks and balances into the way we use big data and the algorithms necessary to analyse it, so that we minimise mistakes.

    Contrary to legend, King Canute didn’t try to order the tide not to come in. He was trying to demonstrate to his obsequious court that he was fallible and as subject to the laws of nature and God as any other man.

    Like the court of King Canute, we should be aware of the foibles and weaknesses of the technologies that increasingly guide us. The computer isn’t always right.
