Following last week’s US election, attention has turned to Facebook’s role in shaping public opinion and in spreading rumours and fake news.
Facebook’s CEO, Mark Zuckerberg, says claims that his company’s news feed influenced the US election are nonsense but, as Zeynep Tufekci writes in the New York Times, the platform’s own experiments have shown that the service does influence voters.
Sadly, misinformation is now the norm on the web, given that anyone can start a blog and post ridiculous and outlandish claims. If that misinformation fits a group’s beliefs, it may be shared millions of times across social media services, particularly Facebook.
Facebook’s filter bubbles exacerbate that problem, as each person’s news feed is determined by what the company’s algorithm thinks the user will ‘like’ rather than by what will inform or enlighten them.
Those ‘filter bubbles’ tend to reinforce our existing biases and prejudices, and when fake news sites are injected into our feeds, Facebook becomes a powerful way of confirming our beliefs. This is made worse by friends gleefully posting fake quotes or false news that happens to fit their world views. Click ‘Like’ and you’ll get more of the same.
Over time, Facebook risks becoming irrelevant if the news it feeds to users comes to be perceived as unreliable.
For Facebook, and for other algorithm-driven services like Google, the risks of fake news don’t lie only in a loss of credibility; there is also the risk of regulatory intervention when news manipulation starts affecting markets, commercial interests or established power bases.
The fake news problem affects the entire web and its users, but for Facebook and Google it is becoming a serious issue.