In the past couple of years Google has moved more and more openly into creating editorial content, albeit material assembled by computers and not by people. One algorithm experiment in this line reveals a terrible muddle about truth.
The version of machine-created material most often seen in a Google search is the box which pops up on the right-hand side of the screen to summarise what Google knows about the main subject of a search. I asked Google for the branch of the restaurant chain Wahaca nearest to my home in London:
For this kind of search, such panels work just fine. I get links to Wahaca locations on the left and a summary of the things I’m most likely to want to know about Wahaca neatly laid out on the right. This is the sort of thing that search does well, delivering what the early pioneers of online called ‘ease of do’: exact factual information in a split second.
But as Adrianne Jeffries of The Outline reports, Google’s answers box will try to reply to a question typed into a search, drawing on some weird sources. Four or five US presidents who were apparently members of the Ku Klux Klan; there’s no evidence that any of those named were in the KKK. President Obama intending to declare martial law. Medically indefensible claims about monosodium glutamate from an alternative-health content farm.
And so on. This kind of rubbish is now so familiar from fake news controversies that you will have the picture. Google have a decent record of adapting or withdrawing products that don’t work (the company have faced a lot of legal pressure over libellous terms popping up in ‘auto-complete’), so this one will probably get worked over.
But this algorithm car crash points to a larger question. Truth isn’t absolute. I have a high degree of confidence (close to 100%) that what Google tells me about Wahaca will be true. I have a little less confidence (maybe 80%) that a rating on TripAdvisor will be accurate, because such sites can be manipulated and gamed. But if I were to ask whether Barack Obama had been planning a ‘communist coup’ and the answer came from the ‘Western Center for Journalism’, my confidence level would go to zero. Everything else I know about Obama tells me that can’t be the case, and the source sounds like a fake news pump.
The larger problem for search engines is that truth is iterative: a higher degree of certainty emerges over time as people test statements and claims, sifting the more-likely-to-be-true from the obviously false. Facts may be instantly accepted as facts, or they may face a long struggle to move from claim to fact. The issue of fake news has reminded people that in democracies this messy filtering is important. Sites, startups and new techniques for dealing with fake news are popping up all over.
A search engine is designed to serve the individual and her or his needs. But serving the individual self is not the highest value we have, and it ought not to exclude other values. Information is an element in the glue which sticks societies together; it shapes the many-sided relationships which, in total, make a society. Free societies try to ensure that as much information as possible circulates and that restrictions are kept to a minimum, usually by laws protecting free speech.
But there is a public benefit in seeking truth, even if it is contested and may never be fully agreed. Anyone raising their voice in the public sphere takes part, journalists included. Such a search is always imperfect. But that does not render it valueless. No one should ask high-tech giants or search engines to be courts arbitrating truth. But I think they have a moral, democratic obligation to support truth-seeking, and certainly one not to obstruct it. A knotty challenge for the algorithm makers.
Tags: Adrianne Jeffries, Barack Obama, Google, The Outline, Wahaca