No great surprise that the election of Donald Trump was a tipping point for opinion about Facebook. Now people are really asking the questions about the influence of social networks and the mix of human intervention and algorithms that power their selection of news.
This is not a post about the causes of the American election surprise and its implications for journalism (there’s an informative survey of opinions here). This is another bulletin on the progress that Facebook is making in absorbing and acting on the fact that it has moral and democratic responsibilities which stem from its colossal informational power.
At the weekend, Facebook’s chief honcho Mark Zuckerberg responded to charges that Facebook had influenced the election outcome, in particular by circulating fake news stories. No surprise either that Zuckerberg guesses not. But he is guessing. And I’d guess that subsequent research may show influence. We’ll see.
Fake news is an issue, but it is not the heart of the question. The question which matters is how Facebook – the techies, the software and your community – decides what to show you. Anyone with a smartphone can now distribute information, true, false or debatable. The group of people who used to try to sift the truth of information likely to matter to society (aka journalists) no longer control the distribution of what they produce. Facebook is the first news distribution platform which operates at scale across the whole planet. Plainly that gives it power and influence; we just don’t yet know precisely how that works. Facebook’s responses to the dilemmas raised by this have been hesitant, crabwise, half-admissions that it may have some ‘editorial’ responsibilities and is not only a big, neutral tech-only company.
But Zuckerberg and his lieutenants are still thrashing around trying to find words to describe what Facebook’s news feed does without conceding that they are editing. Zuckerberg’s weekend post said that news and opinion shared on Facebook ought to be ‘authentic’ and ‘meaningful’. Zuckerberg says, quite reasonably, that being arbiters of truth isn’t straightforward. The key passage is this:
I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves.
Decoding this latest text from the mountaintop, I’d reckon:
- Zuckerberg is Facebook’s editor-in-chief. His colleagues say regularly that they don’t want or need an editor. They’ve already got one. By one account he has a top-level editorial committee to help him. They’re already arbitrating some truth/untruth.
- Someone at Facebook HQ needs to tell the editor-in-chief that big words like ‘authentic’ and ‘meaningful’ only carry much meaning if qualified by interpretation or evidence. Authentic or meaningful to whom? By what measure?
- Zuckerberg needs to be open about what he and his colleagues are doing. That will bring two advantages: (1) they will be able to take a full and significant part in a major debate now kicking off about the public responsibilities and free speech obligations of networks; (2) it will be easier for Facebook to think ahead about these questions, something which it still seems to find tough to do.