09 Apr 18

The anatomy of Facebook’s ‘huge mistake’ (or what Zuckerberg could say to the US Congress)

This week, Mark Zuckerberg is due to appear before the US Congress. There’s no shortage of people offering hard questions which should be fired in his direction. Here’s what I think he ought to say before the interrogation begins:

“I know that members of the committee will want to know about attempts to interfere in American elections, whether social media ruins childhood or whether democracy is damaged by digital communication. We’ll get to those issues, but in opening I’d like to draw the committee’s attention to some of the bigger questions below the surface of the recent controversies.

Apologies first. We at Facebook have made many avoidable mistakes. We were blind to the scale of civic responsibilities inherent in what we do. We were warned and we didn’t want to hear. When I said the other day that in retrospect ‘we clearly should have been doing more all along’, I was drawing attention to our repeated failures to be imaginative in seeing our responsibilities to many different societies, and not just to this one.

I’m going to stop senior people at Facebook saying that they were surprised when our network turned out not to be the uncomplicated force for good which we said it was. Quite a number of us have said that one failure or another was ‘unforeseen’ or that we were ‘caught out’. You can occasionally say ‘we didn’t see that coming’. But you can’t say it all the time. Truth is, we’re up to our necks in problems and we should have foreseen at least some of them.

25 Jan 18

Dear news publishers: how to sift signals from noise about Facebook

Facebook’s two announcements about its ‘news feed’ – that it would make news a lower priority and let users determine quality rankings – triggered an extraordinary explosion of self-pity on the part of the news media. Given Facebook’s reach (2bn users) and the quantity of advertising it has removed from established media, that’s hardly surprising. But much of this indignation is short-sighted. Some advice to newsrooms and those who run them.

  1. Don’t say you weren’t warned. Facebook never guaranteed – as far as I’m aware – any particular income stream to any publisher, or that it would not switch its policies and algorithms. As ‘Instant Articles’ came on the scene, plenty of wise voices said to publishers: ‘By all means experiment with this, but don’t rely on it. Ever.’ Don’t pretend you didn’t hear this.
  2. But unpredictability is now Facebook’s greatest threat to news media. Instead of trying to blackmail Facebook into returning money you think should be going to you, try something more likely to work. If Facebook wants to make nice to news (and at least part of the company seems to want to), get them to understand that early warning of decisions which will affect news publishers’ revenue would be wise.
  3. Beware of striking private deals with platforms. Wider and deeper transparency requirements for platforms (almost certainly enforced by legislation) are the key to making sense of, and improving, the slew of issues around misinformation, election manipulation and other dark arts to which Facebook has lent itself, both willingly and unknowingly. The relationship between a citizen and her/his devices will be central to twenty-first-century democracies. How we know what we know (and can trust) is an issue which goes wider than the business agonies of news media. Facebook is first and foremost a gigantic advertising machine, but it is also now a society machine. Or a politics machine if you prefer. Politics is about how power is allocated in a society. The devices we use to connect and to collect data play an ever-larger role in that distribution of power. And on the subject of power: try to help Mark Zuckerberg talk about Facebook’s power. The word never comes up in his bland ramblings about ‘community’ and ‘connection’.
  4. Forget any idea that platforms can be forced to pay to prop up mainstream media. Rupert Murdoch took advantage this week of the Facebook fuss to suggest that online platforms should pay news publishers for their content in the same way that cable TV companies pay programme-makers for rights to air their work. Murdoch should send whoever drafted that statement to run something obscure in Tasmania: the parallel is nonsense. Cable companies have nothing to sell to consumers until they have content. Social networks sell advertising space by leveraging network effects between friends. They have no need of news. Given that news has never been more than a low single-digit percentage of Facebook’s total activity and that it has set light to a firestorm of controversy, I can easily imagine its executives arguing that they should leave it alone for good.
  5. But that’s not possible. Facebook can’t now avoid being entangled with news. So people in Menlo Park who dream of a news-free network will be disappointed. For now, the network is simply too big and too widely used to sidestep the dilemmas which come with news and all the strong tensions and emotions it provokes. Facebook is part of the infrastructure of free speech. Period.
  6. Let’s bury the phrase ‘fake news’. As an example, British government spokesmen made two announcements this week of initiatives to ‘combat’ (as headline writers like to say) fake news. One was actually new machinery for detecting misinformation spread by states, a quite different thing. Even making allowance for the fact that the current (Conservative) government is in slow-motion and terminal decline, it is clear that the announcement-makers have no idea what they’re talking about. Quite apart from the problem of defining as ‘fake’ the news a government doesn’t like, this sort of knee-jerk reaction is liable to reverse long traditions of protecting free speech. A professor of artificial intelligence who goes by the splendid name of Yorick Wilks nailed it: “Someone in Whitehall has lost all sense of what a free society is about if they think government should interfere in determining what is true and false online.” One simple test for evaluating policies about misinformation: is the proposed measure an exact remedy for a specific harm?
  7. Everyone – journalists and publishers included – has a duty to help Facebook through the philosophical, political and moral issues it has landed in. Treating the platforms as merely technical means of transmission, open to exploitation by bad actors, ignores what Facebook, Google and Twitter have themselves begun (at differing speeds) to acknowledge: that they are more than that. Beating up Facebook and gloating over its difficulties is not going to make it go away. These questions of colliding rights (e.g. free expression vs privacy vs right to know) have been giving editors and lawmakers migraines for centuries. Similar countries – across Europe for example – take radically different approaches based on history, culture and experience. Example: the contrasting approaches to privacy rights in Spain or France compared with Scandinavia.
  8. Keep repeating: news isn’t nice. Facebook’s hard problem is the tension between its business model and the public interest value of news. The business model relies on interaction between users and time spent on the network (which Mark Zuckerberg now wants to be ‘time well spent’). That gives priority to emotion and ‘shareability’. Indignation and outrage go viral easily. News published and distributed in the public interest may appeal more to reason than to emotion. It may be unpleasant, complicated. Worse still (from Facebook’s point of view) news may be best conveyed by people who know more than other people, thus undermining any idea that everyone in a social network knows as much as everyone else. News may even insist that you learn what you may not want to know. None of this aligns easily with ‘community’ or ‘connection’, which are so central to Facebook. Tough problem, but not insoluble.
  9. Two things you can repeat to Facebook as often as you like: be transparent and take advice. The network is doing all sorts of research (psychological effects of social media, manipulation risks etc); it shares frustratingly little detail. It astounds me that neither Facebook nor Twitter has ever set up groups of independent experts to advise them on hard public-interest problems. (Google has done so, to good effect.) Even better, try to persuade Facebook to combine the transparency and the advice. Allow experts and scholars outside the company to be involved in the research and allow them to talk and write about it. If these questions are really Facebook’s tests for quality of news, they need help to strengthen them. (Powerful academic version of this case from the Dutch scholar Natali Helberger here.)
  10. Lastly, stop treating the future of news media as if it’s a zero-sum game between journalism on the one hand and platforms on the other. The public don’t see the distinction that clearly (although mainstream media are more trusted). What everyone – users, platforms, news media – has to worry about is how to tell what is reliable from what is not, how to verify and authenticate. New problems, like the easier faking of video, arise all the time. Solving that kind of stuff will take cooperation.

09 Jan 18

A short handbook on opening up the hi-tech giants

During the final months of 2017 a lot of public and private attention was being directed at opening up the secrets of the algorithms used by social networks and search engines such as Facebook and Google. They have edged cautiously towards opening up, but too little and too late. The attacks on their carelessness have mounted as their profits have climbed.

The public pressure came from voices (including mine) arguing that inquiries into misinformation/disinformation in news were all very well but missed the main point. Attention is also being paid to this in private negotiations between the social networks and news publishers.

These discussions have included the suggestion that the networks might make much more detailed data on how they operate available to the publishers, but not to the public. This kind of private deal won’t work if it’s tried. The functioning of the networks is crucial to publishers, but it matters to a lot of other people as well.

You may think that your elected representatives are on the case: there’s an inquiry into disinformation in news in the UK parliament. German and French politicians are bearing down on the online giants. But not much will change until these legislators and pundits look at the detail of how social networks function. I suspect that the German and French attempts to regulate these platforms will, however well-intentioned, misfire. Regulation of self-expression is inherently difficult because of the collision with rights of free speech.

02 Nov 17

Facebook has hit a wall – the people running the company don’t know it yet

16 Oct 17

Curb your enthusiasm for hi-tech giant-killing: start with transparency

Demands to regulate hi-tech companies like Google, Facebook and Apple are being heard at deafening pitch almost every day. This rush by the political herd on both sides of the Atlantic to make new laws (or to enforce the breakup of these corporations) is no better focussed or thought-out than the extraordinary degree of latitude which the same political classes were prepared to allow the same online platforms only a couple of years ago.

The cry for regulation and the laissez-faire inertia of the recent past have a common origin: ignorance. The cure for ignorance is knowledge. And knowledge of exactly what these companies do and don’t do must be the foundation of any further action to get them to shoulder their moral and civic responsibilities. If laws are needed to prevent harm, let them first compel transparency. Any politician pushing that line has my vote.

When Mark Zuckerberg of Facebook rejected claims of Russian online interference in the US presidential election as ‘pretty crazy’, he was either lying or ignorant of what had been happening on Facebook. He has of course admitted he was wrong since (awesomely well-researched narrative by Alexis Madrigal of The Atlantic here).

But suppose that Facebook were open to inspection by national agencies or commissions which supervise elections. That would not necessarily mean open to public inspection, but perhaps open to bodies whose duty is to check electoral fairness and compliance with the law. Why would that be so hard?

25 Sep 17

Facebook: reactive apology and re-inventing the wheel

Watching Facebook wrestle with ageless questions about privacy, free speech and fair play rules for democratic elections is a little bit like watching a group of students produce an occasional essay on political philosophy – without the benefit of any reading or teaching on the subject. The Facebook executives struggling with these questions want to start over without the clutter of received ideas.

Mark Zuckerberg’s latest post, on dealing with the problems of Facebook being used for electoral interference, gives us a lot of sensible changes which the network will make. It also hands down the great man’s definition of freedom. Given that Zuckerberg is a de facto electoral commission for many states on the planet, this is a statement of some importance for civil society.

The key sentences are:

Freedom means you don’t have to ask permission first, and that by default you can say what you want. If you break our community standards or the law, then you’re going to face consequences afterwards.

26 Jun 17

What editors worry about today (notes from Newsgeist)

I went at the weekend to a Google Newsgeist conference in Copenhagen. The discussions are under ‘Chatham House’ rules (views can’t be attributed to individuals) but here are some quick and selective notes on what I learnt as a couple of hundred people from all over Europe (and a few from the US) chewed over journalism, technology and news.

  • Top shock value. When online news platforms try out several headlines on the first version of a story to see which one works best, bots can be used to ‘game’ the test, distorting the result and delivering – say – a headline more sympathetic to an individual featured in the story. The bots were originally developed to twist these experiments in favour of advertisers, but can just as easily be used to bend headlines in favour of anyone with the clout or expertise to deploy them – if news sites aren’t savvy and careful. (A rough sketch of how such a test can be tipped follows this list.)
  • Man with numbers. An expert on news consumption on Facebook made this simple point. The news people are shown on Facebook is more balanced than a lot of people imagine. What is unbalanced is people’s consumption of the stories they are shown. They simply ignore what they think they aren’t going to like.
  • Big underlying fear. No one quite knows what to do about this, but they’re afraid of it. Societies which can’t agree about what might or might not be true are at risk. That isn’t the same as societies disagreeing about stuff: democracies do that ceaselessly. But if a society is fundamentally divided on how to establish truth (with evidence) and how to recognise it, trouble follows. America in the age of Trump was on everyone’s mind. ‘We are losing our sense of collective reality,’ as one participant put it. (My views on this in a review of Cass Sunstein’s book #Republic are here.) There was no consensus on the causes: many people think social networks cause or aggravate the problem; others think it has deeper causes such as social, demographic and geographic segregation.
  • New trend. Journalists, and particularly those with technology backgrounds, are beginning to think harder about how algorithms surface information and about the long-term, cumulative implications. Algorithms which lean particularly heavily on emotion (Facebook) were contrasted with Upday (a news app produced by Samsung and Axel Springer), whose designers claim to be balancing the emotion with some reason and use of public interest criteria. Are we witnessing the first attempts to design an editor-in-chief with machine learning? (A second sketch after this list gives a toy illustration of that balancing act.)
  • To be continued. Online, disguising information is easier, cheaper and almost frictionless. Huge debunking and factchecking efforts have responded to new fears about ‘fake news’. The Google people present agreed that more technical inquiry was needed into whether there might be better ways to ‘hard code’, label or tag information to make it harder to distort or misuse. I talked about the INJECT project (with which I’m involved) which aims to produce software which, among other things, will help journalists add references and backing to what they write with minimum hassle.
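
Here, as promised in the first bullet, is a minimal sketch of the headline-testing mechanism being described and of how it can be gamed. It is written in Python purely for illustration: the traffic numbers, click-through rates and function names are all invented, and real headline-testing systems are considerably more sophisticated than a bare ‘highest click-through rate wins’ rule.

```python
import random


def pick_winning_headline(clicks_a, views_a, clicks_b, views_b):
    """Naive A/B rule: whichever headline has the higher click-through rate wins."""
    ctr_a = clicks_a / views_a
    ctr_b = clicks_b / views_b
    winner = "A" if ctr_a >= ctr_b else "B"
    return winner, round(ctr_a, 3), round(ctr_b, 3)


def simulate_headline_test(true_ctr_a=0.050, true_ctr_b=0.045,
                           views_per_variant=2000, bot_clicks_for_b=0, seed=1):
    """Simulate an early test of two headlines, optionally injecting bot clicks for B."""
    rng = random.Random(seed)
    clicks_a = sum(rng.random() < true_ctr_a for _ in range(views_per_variant))
    clicks_b = sum(rng.random() < true_ctr_b for _ in range(views_per_variant))
    # Bot traffic: scripted visits that always 'click' headline B.
    views_b = views_per_variant + bot_clicks_for_b
    clicks_b += bot_clicks_for_b
    return pick_winning_headline(clicks_a, views_per_variant, clicks_b, views_b)


if __name__ == "__main__":
    print("Honest test:       ", simulate_headline_test())
    print("Plus 50 bot clicks:", simulate_headline_test(bot_clicks_for_b=50))
```

The point is not the specific numbers, which are made up, but that a naive ‘higher click-through rate wins’ rule has no defence at all against traffic that is cheap to fake.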
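
And here, referred to in the ‘new trend’ bullet, is a toy illustration of the contrast between purely engagement-driven ranking and ranking that blends in public-interest criteria. The field names, weights and example stories are all hypothetical; neither Facebook nor Upday publishes its actual ranking formula, so this is only a sketch of the idea.

```python
from dataclasses import dataclass


@dataclass
class Story:
    title: str
    predicted_engagement: float  # 0 to 1: modelled chance of a share, comment or reaction
    public_interest: float       # 0 to 1: an editorial or machine-learned civic-value estimate


def engagement_only_score(story: Story) -> float:
    """Ranking that optimises purely for interaction and time spent."""
    return story.predicted_engagement


def blended_score(story: Story, interest_weight: float = 0.5) -> float:
    """Ranking that trades some engagement for public-interest value."""
    return ((1 - interest_weight) * story.predicted_engagement
            + interest_weight * story.public_interest)


stories = [
    Story("Outrage-bait celebrity feud", predicted_engagement=0.9, public_interest=0.1),
    Story("Dry but important budget analysis", predicted_engagement=0.3, public_interest=0.9),
]

print(max(stories, key=engagement_only_score).title)  # the celebrity feud surfaces first
print(max(stories, key=blended_score).title)          # the budget analysis surfaces first
```

Changing a single weight changes which kind of story surfaces first, which is why choices like these are, in effect, editorial decisions even when they are written as code.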

30 May 17

A little (election) manifesto for Facebook

When an issue starts to surface at literary festivals, you can tell that it’s gone middle-class mainstream. Over the weekend, TV celebrity Stephen Fry recommended that Facebook be classified ‘as a publisher’ in order to tackle ‘fake’ news and online abuse.

With an election on in the UK right now, Facebook is the weapon of choice for political parties wanting to spring surprises. The network is being credited with almost magical powers of persuasion and manipulation – largely because few people understand how it works. I live in a marginal constituency in London and printed political nonsense comes through our letterbox every day; I’ve no need to look at Facebook to see people fiddling with the truth.

So how should worry about Facebook be usefully focussed? In a public exchange with Facebook’s Adam Mosseri recently I urged Facebook to think much wider and to worry about helping to rebuild faith in truth in societies where that trust has begun to break down. Unsurprisingly, someone researching platforms asked me what on earth I meant. What follows is based on my reply.

When I talked about a ‘reconstruction effort in civil society’, I was compressing too much into a short phrase. Many journalists are thinking very narrowly about fake news, factchecking and misinformation. The concerns are perfectly real (as are the solutions being tried out, which are multiplying), but the issue goes wider than that.
