Pariser’s argument is that the world wide web isn’t what he thought it was. The search engines and social networks manipulate what you see in ways they don’t tell you about and which make them money. Algorithms which sift for “relevance” create a personal information world for you: a filter “bubble” screens you off from wider, richer possibilities. The new giants which dominate the information networks, such as Google and Facebook, should be regulated so that they can do better for society.
Pariser is right to draw attention to the major, barely-announced shift in the way that Google adjusts search results to suit an individual (although there’s dispute about the extent to which it happens). But his worry is the latest chapter in a long debate over the “echo chamber” effects of the internet. Does the availability of so much information deliver the paradox of people less well-informed because they can choose only to consume material which supports their existing beliefs and opinions? There is at least one piece of recent research which casts doubt on this widely-held belief.
My own sense, unsupported by scientific inquiry, is that “echo chamber” tendencies are probably more than offset by the internet’s ability to allow instant, rich, serendipitous exploration of the world’s digital library. When was the last time you sat down at the screen to check closing time at Waitrose and, before you knew where you were and after several sideways jumps, found yourself browsing, via a signpost in Arts & Letters Daily, a piece in Lapham’s Quarterly on diets which include earth, chalk and hair?
So Pariser worries about the search engines and social networks giving people too much of what they want and not enough of what they ought to have. But he is very vague about how this might be remedied. The heading “What Governments and Citizens Can Do” appears on page 237 of his 243-page book. At his talk today, Pariser talked about a “regulatory reset” or a “reset in the ownership of information”. There are two small snags with this: it would restrict free choice and it wouldn’t work. Regulations obliging people to work for the civic good are worthy. But social engineering of this type is no more likely to succeed than attempts to make moonbeams out of cucumbers.
What we should be concentrating on here is the quality of public reason, the guidelines and standards applied to society-wide battles of ideas. That search engines, platforms and networks are now frequently making choices of social and political importance about what information circulates seems indisputable (see here for a good collection of examples and links).
Rules should be used, sparingly, to stop bad things happening. This is where I think Pariser is missing a trick. He’s right to be concerned about what kind of public sphere the digital giants are helping to build: that should be society’s concern. But the key here is transparency: if we know exactly how search engines filter and how Facebook tunes the use of the “Like” button, we have the information we need to choose. We can make use of Google and Facebook rather than being used by them. The problem lies in the current concealment of the exact ways in which information utilities manage information, not in the fact that they are doing it. If we know what they are up to, we can choose whether to mind or not. Only then should we worry about whether something is bad enough to be stopped by law.
Over the next ten and more years, wired societies are going to have to discuss and decide important constitutive choices about how information circulates. Should society require the information giants to disclose how their search works? (Google did in fact disclose that it was “personalising” search, but in the most discreet fashion imaginable.) The technical details of algorithms and cookies are not just geeky trade secrets: they are part and parcel of the evolving public sphere. As such, they are everybody’s business, and there’s a case for saying they should be public information, available to all.