Facebook has a problem today. Or to be more precise, the social media platform has two problems – both are being played out in the media, and one of them was entirely avoidable.
Websites where the content is user-generated will always present a problem for the people who run them. For newspapers, the risk is that comments posted by readers beneath stories are offensive or libellous. They monitor for this, and a reader complaint routinely results in action such as deletion.
Facebook has a more complex problem. 350 million photographs are added by users every day – many shared in closed groups, some of which, the BBC reported last year, are run by and for people with a sexual interest in images of children. The pages of these secret groups included obscene posts and images, as well as images that should have remained private.
Facebook has established two ways of spotting and dealing with problems in such a vast amount of data: programmes that try to spot inappropriate content, and a reporting function through which flagged material is judged by an algorithm.
The BBC came back to the story – of course they did. Its journalists found a 90-strong sample of posts and images that were in apparent breach of Facebook’s policies and reported them via the site. Of these, 82 remained up after that adjudication.
What followed made a bad situation worse for Facebook. Contacted by the BBC, the company asked the journalists to send it the images in question – and on receipt of them, reported the BBC to the police for sending indecent images. This is now leading the headlines, and will likely broaden criticism of Facebook and keep the story in the media for longer.
Let’s leave aside the fact that a secret group called ‘Teenage fantasy’s [sic]: secret group’ should ring alarm bells for the right algorithm… this is a post about PR, not computing.
There are lessons to be learnt here about how a media crisis should be handled. If Facebook learns those lessons, in fact, it will also come a bit closer to its stated goal of protecting children.
How could Facebook have done better?
Finally, Facebook needs to know what the ‘end’ of this looks like. The ‘end’ is not when the story drops out of the news – it’s when Facebook knows the problem that led to it has been substantively sorted. Otherwise – and imagine the surprise on the average Facebook exec’s face when this happens – 2018 will see a report along the lines of: ‘Two years later, has anything changed?’
And it’s when an issue keeps coming back that real damage is done.