Facebook grows up

Any powerful institution has enemies. Facebook has more than most. The platform has multiplied its critics both by its inability to talk clearly about its own power and by its sluggish and uncertain response to alarm about the harms it can cause. But, all the same, the social network with 2.7bn members (and 1.5bn daily users) is doing a little growing up.

Signs of the social network’s greater self-awareness have been accumulating. Its founder Mark Zuckerberg began, gingerly, to talk about power when acknowledging Facebook’s public responsibilities in front of American legislators last year. I attended the last of Facebook’s workshops on its planned ‘Content Oversight Board’ in Berlin last week, and Facebookers there talked quite unselfconsciously about the network’s ‘editorial control’ – not a phrase that they would have been allowed to use in front of strangers a year ago.

Facebook is gradually abandoning the idea that it is a sovereign kingdom which does everything by itself and accepts as few restraints on its freedom of action as possible. We do not yet know exactly how the content oversight board will work, but Facebook has said that it will accept the board’s judgements as binding on whether disputed content should stay up or be taken down. (In Berlin, for example, a majority of those attending said that the board should be financed through something like a trust, which would deal with the costs of staffing it.) Zuckerberg’s newest wheeze, a cyber-currency called Libra, has been designed from the get-go as a banking association with Facebook as only one member.

Another instance of greater maturity: Facebook is managing to say that on issues such as freedom of speech, existing international human rights law might have some useful ideas. This was a recurrent theme of the content oversight discussions in Berlin, but in truth we were pushing at a door which was already open. Last summer, Richard Allan, Facebook’s policy head for Europe, the Middle East and Africa, said that they look “for guidance in documents like Article 19 of the International Covenant on Civil and Political Rights (ICCPR), which sets standards for when it’s appropriate to place restrictions on freedom of expression.” Most people (me included) had missed this clue.

These moves may not sound like much, but they mark a welcome change of attitude.

So Facebook is beginning to adapt to what it has become. It began as a student social game, morphed into the most lucrative advertising engine the world has ever seen, and did good as a new way of distributing free speech, but it is also a multinational source of viral outrage and poison. Its creator is now trying to retrofit guardrails on to his behemoth.

A Facebook team has been holding workshops and conversations across the world for the past six months. The Berlin meeting was the last of these and a compendium of all the advice they heard in the past few months was released just afterwards. The Berlin discussions (held under the ‘Chatham House rule’ of not attributing views to individuals) uncovered some of the dilemmas not yet solved:

  1. Facebook insists not only that it wants to ‘give people a voice’ but also that it is a ‘global community’. The proposed content oversight board is supposed to give reasoned judgements about decisions taken by tens of thousands of content moderators, and to do so on a consistent basis for the whole planet. Where human rights are at issue, the rights should be universal. But the context of content disputes is often local. Reluctant though Facebookers are to admit this, a global board is going to need local help. One suggestion raised in Berlin: regional advisers acting like ombudsmen or ‘advocates-general’ (an adviser to a court in continental Europe and the EU).
  2. The global-vs-local issue above is tied in with the problem of sifting and selecting cases for the board to decide. A billion photographs are posted on Facebook each day. Facebookers suggest that the board (or presumably a panel of its members) should decide whether something comes down or stays up within 24 hours, with reasons given within two weeks. That means that the filtering system deciding which cases ever reach the board is not a technical, procedural question but a sizeable hard question which outranks many others. Stanford law professor Nate Persily has just tried gaming how the board will work with a group of students and concluded that the board will have to be larger than the suggested 40 members – or it will have to be full-time.
  3. I’d guess that the board will find it almost impossible to judge individual cases unless and until they have been selected on the basis that a single case is emblematic of a wider contentious issue, so that the board’s decision can guide moderators in a whole category of cases.
  4. Facebook has begun to talk about human rights law, but that is of course not the same as working out the detail of how such law would apply. Rights, law and democracy are interconnected. Facebook can outsource judgement about content, but it has to refine the rules first. Its community standards on hate speech, for example, are vague compared to other codes which try to deal with the same problem (several experts offer advice on refining them here). David Kaye, the UN’s special rapporteur on free speech, who has just published a short book on the social networks, offers the example of how human rights law would help platforms deal with the anti-vaxxing movement. This is exactly the kind of problem which is toughest for social platforms: communication which can do great harm but which is not illegal. Under the ICCPR (Article 19.3), protecting public health can be an acceptable ground for restricting free speech. Given the looming infectious disease crisis caused by falling vaccination rates, I can see Facebook, perhaps urged by its content board, getting tougher on activists peddling medical myths. Choices about standards are moral, democratic, political or social. They are not legal or procedural.
  5. There was very little discussion in Berlin of how much information the oversight board would be entitled to get when researching the background to a content dispute. This is liable to become important, and perhaps disputed, quite quickly.

Eighteen months ago, I wrote this in the context of yet another row between Facebook and the news media:

“Everyone – journalists and publishers included – has a duty to help Facebook through the philosophical, political and moral issues it has landed in….Beating up Facebook and gloating over its difficulties is not going to make it go away. These questions of colliding rights (e.g. free expression vs privacy vs right to know) have been giving editors and lawmakers migraines for centuries.”


Although it is bound to be criticised for almost every aspect of the content oversight board, Facebook has taken a step forward by taking advice. But every practical detail will matter. The free speech scholars Kate Klonick and Evelyn Douek wonder if Mark Zuckerberg realised exactly what he was plunging into when he announced the content oversight board last year:

“All they needed to figure out was how such an adjudicative body could govern a global community, what it would look like, what the scope of its review and jurisdiction would be, whose norms it would reflect, the composition of the board, what it would mean for it to be a diverse body, the values it would reflect, and how it would be independent, accountable, and transparent. You know, the little things.”

