No great surprise that the election of Donald Trump was a tipping point for opinion about Facebook. Now people are really asking the questions about the influence of social networks and the mix of human intervention and algorithms that power their selection of news.
This is not a post about the causes of the American election surprise and its implications for journalism (there’s an informative survey of opinions here). This is another bulletin on the progress that Facebook is making in absorbing and acting on the fact that it has moral and democratic responsibilities which stem from its colossal informational power.
At the weekend, Facebook’s chief honcho Mark Zuckerberg responded to charges that Facebook had influenced the election outcome, in particular by circulating fake news stories. No surprise either that Zuckerberg guesses not. But he is guessing. And I’d guess that subsequent research may show influence. We’ll see.
Fake news is an issue, but it is not the heart of the question. The question which matters is how Facebook – the techies, the software and your community – decides what to show you. Anyone with a smartphone can now distribute information, true, false or debatable. The group of people who used to sift the truth of information likely to matter to society (aka journalists) no longer control the distribution of what they produce. Facebook is the first news distribution platform which operates at scale across the whole planet. Plainly that gives it power and influence; we just don’t yet know precisely how that works. Facebook’s responses to the dilemmas raised by this have been hesitant, crabwise, half-admissions that it may have some ‘editorial’ responsibilities and is not just a big, neutral tech-only company.