The filter bubble and public reason

I went today to listen to Eli Pariser, author of “The Filter Bubble: What the Internet Is Hiding from You”. I wasn’t convinced, in several ways.

Pariser’s argument is that the world wide web isn’t what he thought it was. The search engines and social networks manipulate what you see in ways they don’t tell you about and which make them money. Algorithms which sift for “relevance” create a personal information world for you: a filter “bubble” screens you off from wider, richer possibilities. The new giants which dominate the information networks, such as Google and Facebook, should be regulated so that they can do better for society.

Pariser is right to draw attention to the major, barely-announced shift in the way that Google adjusts search results to suit an individual (although there’s dispute about the extent to which it happens). But his worry is the latest chapter in a long debate over the “echo chamber” effects of the internet. Does the availability of so much information deliver the paradox of people less well-informed because they can choose only to consume material which supports their existing beliefs and opinions? There is at least one piece of recent research which casts doubt on this widely-held belief.

My own sense, unsupported by scientific inquiry, is that “echo chamber” tendencies are probably more than offset by the internet’s ability to allow instant, rich, serendipitous exploration of the world’s digital library. When was the last time you sat down at the screen to check closing time at Waitrose and, before you knew where you were and after several sideways jumps, found yourself browsing, via a signpost in Arts & Letters Daily, a piece in Lapham’s Quarterly on diets which include earth, chalk and hair?

So Pariser worries about the search engines and social networks giving people too much of what they want and not enough of what they ought to have. But he is very vague about how this might be remedied. The heading “What Governments and Citizens Can Do” appears on page 237 of his 243-page book. At his talk today, Pariser spoke of a “regulatory reset” or a “reset in the ownership of information”. There are two small snags with this: it would restrict free choice and it wouldn’t work. Regulations obliging people to work for the civic good are worthy. But social engineering of this type is no more likely to succeed than attempts to make moonbeams out of cucumbers.

What we should be concentrating on here is the quality of public reason: the guidelines and standards applied to society-wide battles of ideas. That search engines, platforms and networks now frequently make choices of social and political importance about what information circulates seems indisputable (see here for a good collection of examples and links).

Rules should be used, sparingly, to stop bad things happening. This is where I think Pariser is missing a trick. He’s right to be concerned about what kind of public sphere the digital giants are helping to build: that should be society’s concern. But the key here is transparency: if we know exactly how search engines filter and how Facebook tunes the use of the “Like” button, we have the information we need to choose. We can make use of Google and Facebook rather than being used by them. The problem lies in the current concealment of the exact ways in which information utilities manage information, not in the fact that they are doing it. If we know what they are up to, we can choose whether to mind or not. Only then should we worry about whether something is bad enough to be stopped by law.

Over the next ten or more years, wired societies are going to have to discuss and decide important constitutive choices about how information circulates. Should society require the information giants to disclose how their search works? (Google did in fact disclose that it was “personalising” search, but in the most discreet fashion imaginable.) The technical details of algorithms and cookies are not just geeky trade secrets: they are part and parcel of the evolving public sphere. As such, they are everybody’s business, and there is a case for saying they should be public information, available to all.
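To make the stakes concrete, here is a toy sketch of how personalised re-ranking works in principle. Everything in it is invented for illustration – the scoring formula, the weights and the site names are mine, not Pariser’s, and certainly not Google’s actual algorithm, which is precisely what remains undisclosed:

```python
# Toy model of personalised search. The "hidden boost" each user profile adds
# to a result's base score is the part a real engine does not disclose.

BASE_RESULTS = {
    # url: base relevance score for the query (invented numbers)
    "climate-sceptic-blog.example": 0.70,
    "ipcc-report.example": 0.68,
    "celebrity-gossip.example": 0.65,
}

# Hypothetical per-user weights inferred from past clicks, never shown to the user.
PROFILES = {
    "alice": {"climate-sceptic-blog.example": 0.30},  # tends to click sceptic pieces
    "bob": {"ipcc-report.example": 0.30},             # tends to click science pieces
}

def rank(user: str) -> list[str]:
    """Order results by base relevance plus the user's hidden personal boost."""
    boosts = PROFILES.get(user, {})
    scores = {url: base + boosts.get(url, 0.0) for url, base in BASE_RESULTS.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Same query, different worlds: each user gets a differently ordered web,
# and nothing on the results page tells them why.
print("alice:", rank("alice"))
print("bob:  ", rank("bob"))
```

The toy makes the transparency point above: Alice and Bob type the same query and silently receive different orderings. Publish the boost table – or at least the fact and the broad mechanics of boosting – and users can at least decide whether they mind.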

Update 24/6/11: here is some research (short version, long version) which looks in detail at how Google’s personalisation actually works. An extract from Pariser’s Royal Society of Arts talk is here.


2 comments

  1. I was in the audience at the RSA earlier today as well and enjoyed the chance to hear Eli Pariser set out his argument in person.

    I must confess that I haven’t had an opportunity to read ‘The Filter Bubble’ yet but – based on Mr Pariser’s address – I formed a similar impression to yours.

    It struck me that, by fixing his gaze on the visible symptoms of unchecked innovation (like the presentation of personalised search results), Mr Pariser was undermining his own case. In fact, I formed the distinct impression that his motive is to plead for a pause in the relentless application of new technology simply because it exists.

    Perhaps his chief dilemma in framing the issue has been that a debate about the ethics of innovation is far less likely to influence the right crowd; jamming a spanner into the works of code development – by picking on the algorithms of the Big Guns – may, however, cause its momentum to stutter, if only momentarily.

    Judging from the coverage of his argument in both the traditional and online media over the past week or so, he may have had some success. He has, at least, caused indignant ripples among voices in the developer and wider web community.

    Casting Google and Facebook as the villains is fine but – as you appear to suggest – the focus on the algorithm tends to detract from the more significant questions he seemed to be posing in his talk: to what extent is society concerned about the intrusiveness of technology? What checks and balances should be available to consumers to protect their privacy? And to what extent should we actively consent to the use of data in order to enjoy the apparent benefits that personalisation offers?

    In that respect, Mr Pariser’s contribution is timely. Heading into the talk today, I was sceptical about the EU’s proposals to insist upon explicit disclosure of the use of cookies on websites and the requirement for users to actively opt in to the activation of cookies.

    Like many marketing people, I’d instinctively felt that such a move would run counter to good user experience.

    However, I’m rapidly coming round to the point of view – not least because of contributions like Mr Pariser’s – that the EU may be right.

    In fact, on reflection, brands, charities and whoever else implements cookies (and who doesn’t these days?) are being remarkably presumptuous about their rights to data compared to their users’ right to privacy; even dismissive of the importance of those rights.

    That feeling has been further underlined by the revelation (obtained, ironically, via a Freedom of Information request) that, since the UK’s Information Commissioner’s Office implemented a live example of how the EU’s guidelines might be applied on its own website (see http://www.ico.gov.uk), approximately 90% of visitors have refused to give permission. (See the source article at http://econsultancy.com/uk/blog/7692-ico-follows-ico-rules-cookie-usage-drops-by-90-percent.)

    That suggests to me that, if a user doesn’t perceive that there is an essential need for a brand or organisation to access personal data, then they won’t give their consent to it. This conclusion tends to support your view that transparency is essential. If brands or organisations explain precisely how they use cookies and why, then users are likely to be more trusting.

    Of course this is why the EU’s intended approach is causing such alarm to the SEO and advertising industries – it’s likely that, in most cases, there’s very little consumer benefit at all.

    Which leads me to conclude that the question Eli Pariser ought to have asked – and is really asking – is ‘Who gains most from personalisation?’

    Thanks for posting a really interesting perspective on the debate.

  2. I’ve written an alternative view on the subject, available here: http://technorati.com/technology/article/why-we-need-bubbles/