Has Facebook failed in tackling its content issues?

Facebook has a problem today. Or to be more precise, the social media platform has two problems – both are being played out in the media, and one of them was entirely avoidable.

Websites where the content is user-generated will always present a problem for the people who run them. For newspapers, the risk is that comments posted by readers under stories are offensive or libellous. They monitor for this, and a reader complaint routinely results in action such as deletion.

Facebook has a more complex problem. 350 million photographs are added by users every day – many shared in closed groups, some of which, the BBC reported last year, are run by and for people with a sexual interest in images of children. The pages of these secret groups included obscene posts and images, and images that should have remained private.

Facebook has established two ways of spotting and dealing with problems in such a vast amount of data: programmes that try to spot inappropriate content, and a reporting function through which flagged material is judged algorithmically.

The BBC came back to the story – of course they did. Its journalists found a 90-strong sample of posts and images in apparent breach of Facebook’s policies and reported them via the site. Of those, 82 remained up after adjudication.

What followed made a bad situation worse for Facebook. Contacted by the BBC, the company asked the journalists to send it the images referred to – and on receipt of them, reported the BBC to the police for sending indecent images. This is now leading the headlines, and will likely broaden criticism of Facebook and keep the story in the media for longer.

Let’s leave aside the fact that a secret group called ‘Teenage fantasy’s [sic]: secret group’ should ring alarm bells for the right algorithm… this is a post about PR, not computing.

There are lessons to be learnt here in how a media crisis should be handled. If Facebook learns those lessons, in fact, it will also come a bit closer to its stated goal of protecting children.

How could Facebook have done better?

  1. Recognise a crisis: any organisation should have a way of spotting a problem that has the potential to get big. Facebook either isn’t very good at this, or doesn’t care, as the words ‘journalist’, ‘investigation’, ‘paedophile’ and ‘photographs’ in conjunction should see a matter passed up the chain to someone who can assess the seriousness of what’s unfolding.
  2. Have a team: any crisis, even a mini-crisis, needs a team that includes the right people – here, you’d expect that to include a technical person (who understands the IT), a lawyer and an experienced PR professional. I’d expect the PR to be the one counselling against reporting the BBC to the police.
  3. Know the BBC were always going to come back: ‘A year later, has anything changed?’ is a classic and legitimate journalistic device. The Beeb were looking at Facebook last February – the fact interest in the story had faded by March never meant it had gone away for good.
  4. Tell the truth: sounds obvious, doesn’t it? Following last year’s story, Facebook gave reassurances about the systems it had put in place to guard against inappropriate content. The urge to say ‘all sorted!’ is strong when you’re under pressure. If that’s not true, you prolong the story and destroy trust in what you say next time.
  5. Work out if the press have everything: the BBC presented 90 images – remember 350m are added to Facebook every day. What do you reckon – likely to be more than 90 problem pics out there?
  6. Show by its actions that it takes this seriously: some tangible action – or action that will lead to a tangible action – is important. And once again, no, that’s not reporting to the police the person who brought the problem to your attention.

Finally, Facebook needs to know what the ‘end’ of this looks like. The ‘end’ is not when it drops out of the news – it’s when Facebook knows the problem that led to the story has been substantively sorted. Otherwise – and imagine the surprise on the average Facebook exec’s face when this happens – 2018 will see a report along the lines of: ‘Two years later, has anything changed?’

And it’s when an issue keeps coming back that real damage is done.

If you’d like to talk about making a plan for a crisis – or have a situation that’s ringing alarm bells for you – I hope you’ll get in touch.
