In fake news we trust

Ten days before the US Presidential election, Donald Trump lashed out at Facebook, Google and Twitter, claiming these ‘dishonest media’ companies were burying the FBI’s investigation of his Democratic rival Hillary Clinton. Then after Trump’s victory, ‘The Donald’ claimed instead that his millions of followers on Facebook, Twitter and Instagram had helped him win.

So do social media networks now qualify as news organisations? Many Twitter users don’t think so. ‘They’re technology companies,’ was one reply to Trump’s Twitter account. ‘Wow! You think Twitter, Google and Facebook are media companies?’ asked another.

A third replied: ‘Social media is made up of members sharing info.’ Facebook agreed, asserting that it is a technology group, not a media company. And Twitter stated after the vote that it was incorrect to ‘scapegoat’ social media for an election result.

Yet Trump’s comments and Facebook’s subsequent admission that fake news was posted on its site during the election campaign are timely reminders that the platforms on which news is being delivered are undergoing fundamental change.

President Obama has stated that social media creates a ‘dust cloud of nonsense’, bemoaning ‘an age where there’s so much active misinformation and it’s packaged very well and it looks the same when you see it on a Facebook page or you turn on your television’.

Facebook chief executive Mark Zuckerberg meanwhile asserted on his timeline that ‘more than 99 per cent of what people see is authentic’, but added: ‘I believe we must be extremely cautious of becoming arbiters of truth ourselves.’

At the heart of this matter is Facebook’s news feed algorithm: automated software that ranks posts and serves each user those it deems most likely to engage them, according to a closely guarded and constantly changing formula.

Facebook initially used a system called EdgeRank as part of this process but five years ago it began using machine learning technology to determine what appears in users’ feeds. It is a gigantic, powerful machine, given Facebook’s global user base of 1.65 billion people – around a quarter of the world’s population – and two million advertisers paying for slots on its site.

An estimated 1,500 pieces of content might be displayed in a member’s news feed, driven by more than 100,000 unique variables including affinity, time, relationship settings, post types and user actions such as hiding posts, clicking on adverts and viewing others’ timelines. Zuckerberg has pledged to ‘always update users on how Facebook’s news feed evolves’.
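To make the idea concrete, here is a toy sketch of the kind of weighted ranking described above. The real system and its 100,000-plus variables are proprietary; the feature names, weights and decay rate below are invented purely for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class Post:
    author_affinity: float   # hypothetical 0-1 closeness to the post's author
    post_type_weight: float  # hypothetical boost, e.g. photos over plain links
    age_hours: float         # time since the post was published

def score(post: Post) -> float:
    # Illustrative formula: affinity and post type push a post up the feed,
    # while an exponential time-decay factor pushes older items down.
    decay = math.exp(-0.1 * post.age_hours)
    return post.author_affinity * post.post_type_weight * decay

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest-scoring posts are shown first -- not the newest ones.
    return sorted(posts, key=score, reverse=True)

feed = rank_feed([
    Post(author_affinity=0.9, post_type_weight=1.2, age_hours=10.0),
    Post(author_affinity=0.4, post_type_weight=1.0, age_hours=1.0),
])
```

Even in this crude sketch, a ten-hour-old post from a close friend can outrank a fresh post from a weak connection, which is exactly the engagement-first behaviour the article goes on to discuss.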

However, communicators remain concerned that turning social media sites into algorithm-controlled intermediaries for news content has transformed the business model of traditional news publishers.

‘Their business models have completely changed in the last few years from what was a fairly simple model of creating and selling content direct to their readers,’ states Jim Hawker, co-founder of digital marketing and PR agency Threepipe.

‘Publishers are constantly playing catch up to the ever increasing demands of Facebook, Google and other social platforms that have become the conduit for news and long-form content.’

That still should not give the likes of Facebook the status of a news organisation, insists Chad Latz, president of the digital transformation group of PR agency Cohn & Wolfe. ‘If one considers how these properties function, reference to social media channels and search engines as “news organisations” would be both misleading and a miscategorisation,’ he states, adding that he prefers the term ‘social news recommendations channels’ for social media sites.

‘True news organisations apply an editorial and often curatorial approach to what content gets developed and served to their audiences,’ he continues. ‘There is also the matter of journalistic standards of objective reporting, versus the statement of mere opinion.

‘While some content on the websites of news organisations might be determined by algorithms for the purposes of showing adjacent or related content, based on related terms or apparent user interest, it does still exist in the context and editorial choices of what the news site itself publishes. Facebook has no inherent editorial feature.’

The issue is made worse, argues Hawker, by the trust that readers are placing in their social media sites, with people increasingly reading all of a news item on such a website, rather than clicking through to the original provider.

Indeed, a recent survey by Pew Research claims that Facebook and Twitter were ‘central in shaping American voters’ perceptions’ at the Presidential election, with one in five social media users saying they modified their views about a political or social issue because of a social media post.

It also found that two thirds of Facebook’s 156 million members in the US claim they get news from the site. On Election Day, Twitter users sent more than 75 million election-related tweets worldwide, while Facebook said 115 million users ‘liked’, posted, commented and shared content related to the election about 716 million times.

Lewis Webb, associate director for creative strategy at FleishmanHillard Fishburn, believes such influence puts some onus on social networks to be responsible about how they disseminate news, though there are limits.

‘A huge part of Facebook’s value is being able to sift through all of the content in your network and then serve you what is important, interesting or emotional,’ he says. ‘So they voluntarily cross the line of controlling what we see, but want to draw their own lines on how far they will go.

‘Of course they have some responsibility for what is on their network, ensuring that users are not exposed to illegal content. But asking them to govern content beyond what is legal is a messy suggestion.’

Luke Peters, head of content and social media at public relations agency Text100, argues that the controversy over fake news shows that algorithms are not the issue with news on social media websites.

‘The issues raised by the US election don’t necessarily boil down to flaws in algorithms. Facebook and Google do a very good job of reacting to their audience’s needs – tailoring their software to highlight what they think readers will click on,’ he says.

‘Rather, it’s a case of vetting, or lack thereof: there’s simply not enough intervention from either company when patently false news begins to distribute. Google has a set of criteria that sites must meet in order to be included in Google News searches which, in theory, encourage transparency, but when “false” news networks start appearing in results, or other mainstream sites start linking to them, clearly there’s a problem.

‘Facebook’s difficulties meanwhile are amplified by how easy it is for anyone in its audience to create and distribute false reports or memes. It’s very easy to share sentiments you want to believe are true, and so people do.’

Indeed, Paul Horner, who runs a network of fake news sites in the US, told the Business Insider online news site that he believes the failure to fact-check such stories won Trump the election.

‘My sites were picked up by Trump supporters all the time,’ he said. ‘I think Trump is in the White House because of me. His followers don’t check anything. They’ll post anything, believe anything. His campaign manager posted my story about a protester [against Trump] getting paid $3,500 as a fact. Like, I made that up. I posted a fake ad on Craigslist.’

How should communicators adapt to this new world? It starts, argues Wayne Guthrie, founder of technology consultancy Fearlessly Frank, with an acceptance of the new reality. ‘Facebook is already a news organisation. Checking Facebook has replaced reading the newspaper for more than half the world’s literate population,’ he says.

‘But, unlike traditional news media, Facebook’s “friends” structure means the content rarely challenges the reader’s beliefs, biases and prejudices. Facebook cannot come even close to being a new fourth estate. An algorithm has no interest in broadening anyone’s horizon or offering a fundamentally different point of view.’

One radical solution Guthrie has advised on is the setting up of a new type of social network. Vero, co-founded by Ayman Hariri, the billionaire son of the assassinated former Lebanese prime minister Rafik Hariri, has pledged to do things differently.

‘An algorithm does whatever it is programmed to do,’ says Hariri. ‘So if it’s programmed to be malicious, that’s what it will be like. I don’t believe that Google or Facebook are sitting there doing that.

‘But there’s a lot of pressure on both of them because they have to keep up with expectations on Wall Street. Given that they have an advertising model, they have to sell more ads and therefore they have to continuously grow their algorithms with big systems and quantum computing.

‘We don’t use algorithms and we don’t want your big data. We’re not going to measure people’s lives ever. We believe in a chronological feed. We’re always going to be thinking about ways to enhance the user experience and algorithms don’t do that.’
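The contrast with the ranking approach sketched earlier is stark: a chronological feed of the kind Hariri describes simply orders posts by publication time, with no engagement scoring at all. The post structure and names here are invented for illustration.

```python
from datetime import datetime

# Hypothetical posts, each with a publication timestamp.
posts = [
    {"author": "alice", "published": datetime(2016, 11, 8, 9, 30)},
    {"author": "bob",   "published": datetime(2016, 11, 8, 12, 0)},
    {"author": "carol", "published": datetime(2016, 11, 8, 7, 15)},
]

# A chronological feed: newest first, nothing inferred about the reader.
chronological_feed = sorted(posts, key=lambda p: p["published"], reverse=True)
```

Every user who follows the same accounts sees the same ordering, which is precisely what removes the network’s commercial lever of deciding what is ‘engaging’.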

What else can be done? Hawker believes the onus is on traditional news publishers to think digitally before they consider their print operations. He argues that they must find ways of working within these platforms, but also think more creatively about how to monetise their content outside of them.

Computational linguistics expert Jason Baldridge meanwhile suggested at a recent Cohn & Wolfe event that social media websites might benefit from the creation of ‘anti-filter bubbles’ that introduce bias correction into slanted posts.

‘This may be an important step that can help us be exposed to, and understand, the perspectives of those who differ from ourselves,’ comments Latz. ‘In the truest sense, social news feeds are just that. They represent displayed content driven by your own platform behaviour, language choice and publishing activity, interest graph, alleged preferences, or those of your friends, often with priority given to those closest and most similar to yourself – regardless of whether the content constitutes ‘news’ or not.’

Peters believes social media sites will eventually introduce more human intervention to police news feeds, while Webb would like more transparency around algorithms. In the meantime, Webb believes communicators can help shape a story’s life beyond its publication by gently encouraging media contacts to push stories through social channels, though they should be wary of trying to play the social networks at their own game.

‘Attempting to influence social algorithms by using popular topics or memes is a very short-term tactic,’ he warns. ‘Unless your story or content is genuinely relevant to your brand, this approach is at best a distraction, and at worst it is something that your audience will see through as a cynical and insincere attempt to be popular. Becoming a trending topic isn’t a strategy.’

Unless you are Donald J Trump, of course. However, even he may find during his four years in office that keeping the social media machine oiled with the right kind of stories is a tough ask.

His Facebook page will be well worth watching.