Two faces of Facebook

This morning’s headlines are about Facebook’s progress in connecting your brain to their social network. Their scientists, led by the ex-head of the American defence research agency Darpa, foresee the day when you won’t even have to lift a finger to press a ‘Like’ button. You’ll think it and it’ll happen.

The focus on Facebook’s announcement at their F8 developers conference in California is understandable. But my eye was caught by something quite different in what the network’s founder Mark Zuckerberg said. Something which shines a light on the split personality Facebook is developing over its effect on human society.

Zuckerberg talked about pictures and how much easier Facebook would make it to edit them. There’s a coffee cup in a picture of you: the touch of a key will add a wisp of steam or a second cup. You could make it look, he said, as if you’re not drinking coffee alone. Facebook will help us be ‘better able to reflect and improve our life experiences.’

New products would focus on the visual. And that, Zuckerberg said, is ‘because images created by smartphone cameras contain more context and richer information than other forms of input like text entered on a keyboard.’ Boring old words: so tiresome, so time-consuming, so slow.

Not a single reference to any risk of bad consequences. Do humans lie, manipulate and deceive? They surely do, and were doing it long before online social networks came along. Do humans aim to present the best picture of themselves to others? Yes.

But making fiction easier by making picture-faking frictionless runs risks. Societies rely, perhaps to a greater extent than we realize, on a shared grasp of reality and truth. That does not mean that we don’t argue about it; healthy societies fight verbal battles about the details and interpretation of reality all the time. But most of those arguments rest on the shared assumption that someone is right and someone is wrong. Time, events or evidence will eventually tilt understanding towards one version or another. Truth is iterative, but it is mostly discoverable. That is the legacy of the Enlightenment, which superseded a monopoly of truth previously held by kings and priests.

There are three dangers in the route Facebook is taking:

  • Preferring pictures to words. Much of the information we consume is now audio-visual, and it is generally more attractive than the written word. But words usually encode and transmit more complex meanings than images and sound. If words become less important, keeping an eye on awkward, complex truth becomes harder.
  • We have been accustomed to being able to tell from the source whether something is true or false. Recent discoveries about ‘fake news’ have taught us that this isn’t the reliable distinction we might have imagined it to be. The root of the problem is that the authenticity and origin of online information are so easy to manipulate.
  • Boosting the ability to blur the truth will, over the long term, weaken respect for what is accurate, reliable and true. Societies in which this happens will be the losers.

Facebook gives signals that it is waking up to the long-term dangers it may be creating. Only days before the F8 conference, a sizeable team led by the network’s news executives Adam Mosseri and Campbell Brown had been in Perugia for the International Journalism Festival, trying to make clear that they were listening (an account by Buzzfeed’s Craig Silverman here).

Information is being created and shared in transformative new ways. Facebook’s excitement about the technological possibilities hasn’t yet been matched by its grasp of the political, ethical and psychological challenges those possibilities keep throwing up.
