What editors worry about today (notes from Newsgeist)

At the weekend I went to a Google Newsgeist conference in Copenhagen. The discussions are held under the Chatham House Rule (views can’t be attributed to individuals), but here are some quick and selective notes on what I learnt as a couple of hundred people from all over Europe (and a few from the US) chewed over journalism, technology and news.

  • Top shock value. When online news platforms try out several headlines on the first version of a story to see which one works best, bots can be used to ‘game’ the experiment, skewing the outcome towards – say – a headline more sympathetic to an individual featured in the story. The bots were originally developed to twist these tests in favour of advertisers, but they can just as easily be used to bend headlines in favour of anyone with the clout or expertise to deploy them, unless news sites are savvy and careful. (There is a toy sketch of how this works after this list.)
  • Man with numbers. An expert on news consumption on Facebook made this simple point. The news people are shown on Facebook is more balanced than a lot of people imagine. What is unbalanced is people’s consumption of the stories they are shown. They simply ignore what they think they aren’t going to like.
  • Big underlying fear. No one quite knows what to do about this, but they’re afraid of it. Societies which can’t agree about what might or might not be true are at risk. That isn’t the same as societies disagreeing about stuff: democracies do that ceaselessly. But if a society is fundamentally divided on how to establish truth with evidence, and how to recognise it when it sees it, trouble follows. America in the age of Trump was on everyone’s mind. ‘We are losing our sense of collective reality,’ as one participant put it. (My views on this, in a review of Cass Sunstein’s book #Republic, are here.) There was no consensus on the causes: many people think social networks cause or aggravate the problem; others think it has deeper roots in social, demographic and geographic segregation.
  • New trend. Journalists, particularly those with technology backgrounds, are beginning to think harder about how algorithms surface information and about the long-term, cumulative implications. Algorithms which lean particularly heavily on emotion (Facebook’s) were contrasted with Upday (a news app produced by Samsung and Axel Springer), whose designers claim to balance emotion with some reason and the use of public-interest criteria. Are we witnessing the first attempts to design an editor-in-chief with machine learning? (A second toy sketch after this list illustrates the contrast.)
  • To be continued. Online, disguising information is easier, cheaper and almost frictionless. Huge debunking and factchecking efforts have responded to new fears about ‘fake news’. The Google people present agreed that more technical inquiry was needed into whether there might be better ways to ‘hard code’, label or tag information to make it harder to distort or misuse. I talked about the INJECT project (with which I’m involved), which aims to produce software that, among other things, will help journalists add references and backing to what they write with minimum hassle.
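
Since the headline-testing point is easiest to see with numbers, here is a minimal Python sketch of the idea in the first bullet. It is an illustration, not any platform’s real system: the headlines, click-through rates and bot volumes are invented, and real tests are more sophisticated, but it shows how a modest burst of automated ‘clicks’ can flip which headline appears to win.

```python
import random

# Toy illustration only: genuine readers slightly prefer headline A,
# but a burst of bot "clicks" aimed at headline B flips the measured winner.
# All figures here are invented for the sketch.

random.seed(1)

TRUE_CTR = {"A": 0.06, "B": 0.05}  # real readers' click-through rates

def run_test(real_readers_per_variant=5000, bot_clicks_on_b=0):
    shown = {"A": 0, "B": 0}
    clicks = {"A": 0, "B": 0}
    for variant, ctr in TRUE_CTR.items():
        for _ in range(real_readers_per_variant):
            shown[variant] += 1
            if random.random() < ctr:
                clicks[variant] += 1
    # Bots load the story and "click" only the headline they want to win.
    shown["B"] += bot_clicks_on_b
    clicks["B"] += bot_clicks_on_b
    rates = {v: clicks[v] / shown[v] for v in shown}
    return rates, max(rates, key=rates.get)

print(run_test())                     # honest test: A almost always wins
print(run_test(bot_clicks_on_b=150))  # gamed test: B is pushed ahead
```

The ‘savvy and careful’ response is, in principle, just as simple: discount traffic that does not behave like a human reader before declaring a winner.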
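
And here is an equally hypothetical sketch of the contrast in the ‘new trend’ bullet: ranking stories purely on predicted emotional engagement versus blending that signal with an editorial public-interest score. The stories, scores and weights below are invented for illustration; they are not Facebook’s or Upday’s actual criteria.

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    emotion: float          # predicted emotional engagement, 0..1 (invented)
    public_interest: float  # editorially assigned importance, 0..1 (invented)

STORIES = [
    Story("Celebrity feud erupts online", emotion=0.9, public_interest=0.2),
    Story("City council passes budget",   emotion=0.3, public_interest=0.8),
    Story("New health guidelines issued", emotion=0.5, public_interest=0.7),
]

def rank(stories, public_interest_weight=0.0):
    """Order stories by a blend of emotion and public interest.

    With weight 0.0 this is a pure engagement ranking; raising the weight
    moves it towards the 'machine editor-in-chief' idea discussed above.
    """
    w = public_interest_weight
    return sorted(stories,
                  key=lambda s: (1 - w) * s.emotion + w * s.public_interest,
                  reverse=True)

print([s.title for s in rank(STORIES)])                              # engagement only
print([s.title for s in rank(STORIES, public_interest_weight=0.6)])  # blended
```

Even this toy version makes the editorial question visible: someone has to choose the weight, which is exactly why the ‘editor-in-chief with machine learning’ framing came up.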