Faking it: All the news that’s fit to share
Mark Frankel
BBC News social media editor. Twitter: @markfrankel29

A story about payment to Trump protesters was one of many fakes shared on social media
One fall-out from the US election has been the complaint that Facebook allowed its users to be influenced by what’s become known as ‘fake news’ - sites that looked like news but were really just propaganda. In this case, the problem seems to have been almost exclusively in a direction that favoured Trump over Clinton.
Facebook isn’t denying the existence of the problem but finds itself in an awkward position because it’s reluctant to take on the responsibilities of a news organisation. After all, it’s never claimed to be one.
So where does the ‘fake news’ controversy leave news organisations like the BBC? Feeling smugly content that our whole operation is dedicated to ‘true news’? Well, it’s not quite that simple. Large numbers of people now see BBC content, for instance, on platforms like Facebook. We may find ourselves rubbing shoulders with the fakers, which is far from ideal.
To some extent, this is nothing new. There's always been fake news on the internet. But not all fake news is of the same order of magnitude or significance. It’s worth separating out the parodies that get shared on social media from the articles whose readers never get further than a misleading headline yet share them anyway - and so become part of the fake agenda. That’s just the way people behave.
What makes today's debate different and more serious is that the admittedly unquantifiable impact of fake news has coincided with an extremely bitter election campaign and an extremely close result.
To see that fake news happened before, during and after the campaign - and not just in the service of Mr Trump - take a look at the controversy over the photograph of White House staff, ashen-faced as they allegedly watched Donald Trump on his first visit to the White House. It circulated on social media, but it later emerged that the staff were actually watching President Obama acknowledge Trump’s victory a couple of days earlier: same emotions, but the different context takes the edge off the power of the photograph.

The moment NBC News challenged Rudy Giuliani over tweets from a parody account
Facebook has been singled out in the current debate on fake news. It’s far from perfect, but you don't have to look far to see much of the same fakery on Twitter, Instagram, Google and YouTube. Witness the recent problem of Google search ranking a fake story about Trump winning the popular vote.
Facebook stands out because it's the largest social sharing beast in the jungle, with a large slice of the world actively participating in the circulation of content. What’s more, Facebook’s algorithms are geared towards emotion and content that resonates on a personal level - rather different from Google, with its mission to “organise the world’s information”.
To be fair, there are already ways you can flag up fake news on Facebook, as well as the usual privacy mechanisms to mute or block content you don’t want to see. To most users, though, these controls are very much secondary to the familiar ones that allow them to share, like and comment.
And here’s the challenge: if there is to be real change, any debunking needs to happen in real time, at the point where the fake stuff’s being shared. "Donald Trump Protestor Speaks Out: 'I Was Paid $3,500 To Protest Trump’s Rally'" needs to be nipped in the bud while it’s trending, not in a considered debunk two days later that won’t appear in anyone’s news feed.
News organisations need to be part of the debate in real time, and for that to happen the key social networks and their algorithms would have to give factual debunks the same prominence as the feed-topping fake sensations. That requires a high level of buy-in.
Mark Zuckerberg has said, rather belatedly, that Facebook wants to deal with the problem of fake news, although the company doesn’t want to land itself with a new role as the arbiter of truth. (That isn’t our job either, by the way. Dedicated news publishers like the BBC are in the business of presenting facts and context in an objective and impartial way.)
While there may be an opportunity opening up here, we shouldn't be blind to the fact that social media companies’ primary interest is in bringing an audience to their platforms and chasing advertising revenue.
And in truth, there may be a degree of self-interest on both sides. We, the serious news providers, won’t achieve anything without the co-operation of Facebook et al. They build the technology. So if they are prepared to look at filters and warning systems, and to act more resolutely against known transgressors, then I think we should collaborate. If we’re serious about an independent press, and realistic about the limits of what a tech company can do, we can and should work together.
The argument over whether social media companies should employ editors feels a little limited. We all benefit, to a greater or lesser degree, from the resonance of our content on Facebook, so why not collaborate on fact-checking and share our experiences of what works? (Journalist and academic Jeff Jarvis has some interesting ideas here.)
In the end, self-policing by social media companies can only take us so far. We all need to grow up and accept that fake news is a by-product of the internet and human nature, but it's not hard for the social media giants to build a few more filters to enable us to trust a little more of what we read and share at source on their platforms. The tech expertise is already out there. It just needs integrating.
