16 Oct 17

Curb your enthusiasm for hi-tech giant-killing: start with transparency

Demands to regulate hi-tech companies like Google, Facebook and Apple are being heard at deafening pitch almost every day. This rush by the political herd on both sides of the Atlantic to make new laws (or to enforce the breakup of these corporations) is no better focussed or thought-out than the extraordinary degree of latitude which the same political classes were prepared to allow the same online platforms only a couple of years ago.

The cry for regulation and the laissez-faire inertia of the recent past have a common origin: ignorance. The cure for ignorance is knowledge. And knowledge of exactly what these companies do and don’t do must be the foundation of any further action to get them to shoulder their moral and civic responsibilities. If laws are needed to prevent harm, let them first compel transparency. Any politician pushing that line has my vote.

When Mark Zuckerberg of Facebook rejected claims of Russian online interference in the US presidential election as ‘pretty crazy’, he was either lying or ignorant of what had been happening on Facebook. He has of course since admitted that he was wrong (an awesomely well-researched narrative by Alexis Madrigal of The Atlantic is here).

But suppose that Facebook is open to inspection by national agencies or commissions which supervise elections. That would not necessarily mean open to public inspection, but perhaps to bodies whose duty is to check electoral fairness and compliance with the law. Why would that be so hard?

Continue reading →


25 Sep 17

Facebook: reactive apology and re-inventing the wheel

Watching Facebook wrestle with ageless questions about privacy, free speech and fair play rules for democratic elections is a little bit like watching a group of students produce an occasional essay on political philosophy – without the benefit of any reading or teaching on the subject. The Facebook executives struggling with these questions want to start over without the clutter of received ideas.

Mark Zuckerberg’s latest post, on dealing with the problems of Facebook being used for electoral interference, gives us a lot of sensible changes which the network will make. It also hands down the great man’s definition of freedom. Given that Zuckerberg is a de facto electoral commission for many states on the planet, this is a statement of some importance for civil society.

The key sentences are:

Freedom means you don’t have to ask permission first, and that by default you can say what you want. If you break our community standards or the law, then you’re going to face consequences afterwards.

Continue reading →


26 Jun 17

What editors worry about today (notes from Newsgeist)

I went at the weekend to a Google Newsgeist conference in Copenhagen. The discussions are under ‘Chatham House’ rules (views can’t be attributed to individuals) but here are some quick and selective notes on what I learnt as a couple of hundred people from all over Europe (and a few from the US) chewed over journalism, technology and news.

  • Top shock value. When online news platforms try out several headlines on the first version of a story to see which one works best, bots can be used to ‘game’ the results of the experiment, distorting the outcome and delivering – say – a headline more sympathetic to an individual featured in the story. The bots were originally developed to twist these experiments in favour of advertisers, but they can just as easily be used to bend headlines in favour of anyone with the clout or expertise to deploy them, at least if news sites aren’t savvy and careful. (A simple sketch of how such a test can be skewed follows this list.)
  • Man with numbers. An expert on news consumption on Facebook made this simple point: the news people are shown on Facebook is more balanced than a lot of people imagine. What is unbalanced is people’s consumption of the stories they are shown. They simply ignore what they think they aren’t going to like.
  • Big underlying fear. No one quite knows what to do about this, but they’re afraid of it. Societies which can’t agree about what might or might not be true are at risk. That isn’t the same as societies disagreeing about stuff: democracies do that ceaselessly. But if a society is fundamentally divided on how to establish truth with evidence, and how to recognise it, trouble follows. America in the age of Trump was on everyone’s mind. ‘We are losing our sense of collective reality,’ as one participant put it. (My views on this, in a review of Cass Sunstein’s book #Republic, are here.) There was no consensus on the causes: many people think social networks cause or aggravate the problem; others think it has deeper roots, such as social, demographic and geographic segregation.
  • New trend. Journalists, particularly those with technology backgrounds, are beginning to think harder about how algorithms surface information and about the long-term, cumulative implications. Algorithms which lean particularly heavily on emotion (Facebook) were contrasted with Upday (a news app produced by Samsung and Axel Springer), whose designers claim to balance emotion with some reason and the use of public-interest criteria. Are we witnessing the first attempts to design an editor-in-chief with machine learning?
  • To be continued. Online makes the disguise of information easier, cheaper and almost frictionless. Huge debunking and factchecking efforts have responded to new fears about ‘fake news’. The Google people present agreed that more technical inquiry was needed into whether there might be better ways to ‘hard code’, label or tag information to make it harder to distort or misuse. I talked about the INJECT project (with which I’m involved), which aims to produce software that will, among other things, help journalists add references and backing to what they write with minimum hassle.
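
To make the first note above more concrete, here is a minimal sketch of how a naive headline test, one that simply keeps the variant with the highest click-through rate, can be tipped by scripted clicks. It is not any platform’s actual system; the function, headlines and numbers are invented for illustration.

```python
# A deliberately naive headline test: show each candidate headline to a
# share of readers, measure click-through rate (CTR), keep the winner.
# All names and numbers here are invented for illustration.

import random

def pick_winning_headline(headlines, organic_readers=10_000, bot_clicks=None):
    """Return (winner, CTR per headline) from a simulated test."""
    bot_clicks = bot_clicks or {}
    ctr = {}
    for headline, true_appeal in headlines.items():
        impressions = organic_readers // len(headlines)
        clicks = sum(random.random() < true_appeal for _ in range(impressions))
        # Scripted traffic clicks every time it 'reads' the story, so it
        # drags the measured CTR towards 1.0 for the headline it targets.
        clicks += bot_clicks.get(headline, 0)
        impressions += bot_clicks.get(headline, 0)
        ctr[headline] = clicks / impressions
    return max(ctr, key=ctr.get), ctr

headlines = {
    "Minister criticised over contract": 0.05,   # genuinely more clicked
    "Minister defends record on contract": 0.03,
}

# Left alone, the more critical headline tends to win ...
print(pick_winning_headline(headlines))

# ... but a few hundred scripted clicks on the friendlier wording flip it.
print(pick_winning_headline(
    headlines, bot_clicks={"Minister defends record on contract": 400}))
```

The point is only that a winner-takes-all metric rewards whoever can generate traffic: a site that does not filter automated clicks is, in effect, letting whoever runs the bots choose its headlines.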

30 May 17

A little (election) manifesto for Facebook

When an issue starts to surface at literary festivals, you can tell that it’s gone middle-class mainstream. Over the weekend, TV celebrity Stephen Fry recommended that Facebook be classified ‘as a publisher’ in order to tackle ‘fake’ news and online abuse.

With an election on in the UK right now, Facebook is the weapon of choice for political parties wanting to spring surprises. The network is being credited with almost magical powers of persuasion and manipulation – largely because few people understand how it works. I live in a marginal constituency in London and printed political nonsense comes through our letterbox every day; I’ve no need to look at Facebook to see people fiddling with the truth.

So how should worry about Facebook be usefully focussed? In a public exchange with Facebook’s Adam Mosseri recently I urged Facebook to think much wider and to worry about helping to rebuild faith in truth in societies where that trust has begun to break down. Unsurprisingly, someone researching platforms asked me what on earth I meant. What follows is based on my reply.

When I talked about a ‘reconstruction effort in civil society’, I was compressing too much into a short phrase. Many journalists were thinking very narrowly about fake news, factchecking and misinformation. Those concerns are perfectly real (as are the solutions being tried out, which are multiplying), but the issue goes wider.

Continue reading →


20 Apr 17

Two faces of Facebook

This morning’s headlines are about Facebook’s progress in connecting your brain to their social network. Their scientists, led by the ex-head of the American defence research agency Darpa, foresee the day when you won’t even have to lift a finger to press a ‘Like’ button. You’ll think it and it’ll happen.

The focus on Facebook’s announcement at their F8 developers conference in California is understandable. But my eye was caught by something quite different in what the network’s founder Mark Zuckerberg said: something which shines a light on the split personality Facebook is developing over the issue of its effect on human society.

Zuckerberg talked about pictures and how much easier Facebook would make it to edit them. There’s a coffee cup in a picture of you: the touch of a key will add a wisp of steam or a second cup. You could make it look, he said, as if you’re not drinking coffee alone. Facebook will help us be ‘better able to reflect and improve our life experiences.’

New products would focus on the visual. And that, Zuckerberg said, is ‘because images created by smartphone cameras contain more context and richer information than other forms of input like text entered on a keyboard.’ Boring old words: so tiresome, so time-consuming, so slow.

Continue reading →


22 Mar 17

Cut clutter, clarify and care about every word: Robert Silvers, RIP

Much has been written about Robert Silvers, one of two founders of the New York Review of Books, who died the other day. I never met Silvers but almost feel as if I knew him, despite the fact that he almost never wrote in the NYRB as an author. To read the NYRB was to read the minds of many knowledgeable people; one was also reading the mind of Silvers. His mark was on every paragraph.

That was of course because he edited every word. People who create ideas which last simplify and clarify. Listen to Silvers in this interview from four years ago. I can imagine that his voice might, to some, sound ‘elitist’ and arrogant in its certainty. But hear the clarity of purpose – and the watchful care to have that expressed in every word and comma. Lazy writing is lazy thinking, and vice versa.

The only statement of editorial mission the NYRB ever needed appeared in the first issue in 1963:

“This issue … does not pretend to cover all the books of the season or even all the important ones. Neither time nor space, however, have been spent on books which are trivial in their intentions or venal in their effects, except occasionally to reduce a temporarily inflated reputation, or to call attention to a fraud.”

Continue reading →


06 Mar 17

Dear Google, your algorithm went walkabout

In the past couple of years Google has moved more and more openly into creating editorial content, albeit material assembled by computers and not by people. One algorithm experiment in this line reveals a terrible muddle about truth.

The version of machine-created material most often seen in a Google search is the box which flips up on the right-hand side of the screen to summarise what Google knows about the main subject of a search. I asked Google for the nearest branch of the restaurant chain Wahaca to my home in London.

For this kind of search, such panels work just fine. I get links to Wahaca locations on the left and a summary of the things I’m most likely to want to know about Wahaca neatly laid out on the right. This is the sort of thing search does well, delivering what the early pioneers of the online world called ‘ease of do’: exact factual information, in a split second.

Continue reading →


20 Feb 17

Mr Zuckerberg’s education has further to go

Continue reading →
