Facebook has been trying for some time now to clean up the News Feed by removing things it defines as “low quality,” and it announced another effort along those lines on Thursday, saying it will reduce the visibility of “like-bait” and content that gets posted too often. But all of these efforts share a dilemma at their core: how will Facebook differentiate between what it calls low-quality content and what users really want to see?
In its blog post on the announcement, Facebook says that like-bait is content that “explicitly asks News Feed readers to like, comment or share the post in order to get additional distribution beyond what the post would normally receive.” And how do we know whether a given post actually got more distribution than it otherwise would have? The short answer is that we don’t; only Facebook knows, based on its black-box algorithms.
The network posted an example of what it means by like-bait: photos of a baby rabbit, a kitten, dolphins and a mosquito, posted by an account whose name is “When your teacher accidentally scrapes her nails on the chalkboard and you’re like whaaaaaat” (which would seem to break Facebook’s rules on real names, if nothing else). The post asks users to like, share or comment, or to ignore it.
There’s no question that many, perhaps even most, Facebook users would dislike this content intensely and vote to have it removed from their News Feed (except perhaps younger users, who often enjoy that sort of thing, in part because it irritates adults). But I can think of other content that might be considered like-bait yet that friends of mine willingly shared, including photos of people fighting cancer who were trying to reach a certain number of likes, and so on.
It might be spam, but I still like it
That kind of thing may not be “high quality” content, but some people clearly enjoy it. Part of Facebook’s dilemma can be seen in the blog post itself, when the company describes the gap between what people say when they fill out a survey and what they actually do on the site: they click, share and comment on these posts, but when asked, they say they don’t like them.
“People often respond to posts asking them to take an action, and this means that these posts get shown to more people, and get shown higher up in News Feed. However, when we survey people and ask them to rate the quality of these stories, they report that like-baiting stories are, on average, 15% less relevant than other stories with a comparable number of likes, comments and shares.”
This is a little like the old days of TV ratings, when people would tell Nielsen that they watched only PBS and nature shows. But when Nielsen moved from surveys to meters that recorded what people actually watched, it found that viewing habits were dramatically different: many watched the same brainless sitcoms and goofy specials they claimed to have no interest in when filling out the survey.
Facebook says that the changes won’t affect pages that are “genuinely trying to encourage discussion among their fans.” But how will it distinguish between those pages and the ones that are just posting like-bait? That’s not an easy question to answer, even with the click habits of a billion users to study, as the toy sketch below suggests. And Facebook is essentially saying: “We’re not going to pay attention to what you do; we’re going to purify your News Feed for your own good.”
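To see why that distinction is so hard, here is a deliberately naive sketch, purely my own illustration and nothing like Facebook’s actual (and non-public) system: a simple keyword rule that flags posts asking for likes, shares or comments will also flag a page genuinely asking its fans a question, while missing bait that avoids the obvious phrasing.

```python
# A deliberately naive illustration, NOT Facebook's actual method,
# which has never been made public. A keyword rule for "asks for
# engagement" can't separate like-bait from a genuine discussion prompt.

LIKE_BAIT_PHRASES = [
    "like if", "share if", "like this post", "share this post",
    "comment below if", "like and share",
]

def looks_like_bait(post_text: str) -> bool:
    """Flag posts that explicitly ask readers to like, share or comment."""
    text = post_text.lower()
    return any(phrase in text for phrase in LIKE_BAIT_PHRASES)

# Classic like-bait is flagged, as intended...
print(looks_like_bait("LIKE this post if you love kittens!"))            # True

# ...but so is a page genuinely trying to encourage discussion...
print(looks_like_bait("Comment below if you tried this recipe"))         # True

# ...while bait that skips the magic words sails right through.
print(looks_like_bait("Only true fans will know what this photo means")) # False
```

Any real system would presumably lean on behavioral signals, the survey-versus-click gap described above, rather than text alone, which is exactly where the stated-versus-revealed-preference problem bites.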
As I’ve tried to point out before, this is part of what makes life a lot harder for Facebook than it is for Twitter. The latter might get complaints about the stream being too noisy, but users know that for the most part, they are seeing the content they choose to see from the users they choose to follow. Not so on Facebook.
Facebook is much more interventionist, because it is chasing the Platonic ideal of a “digital newspaper” that CEO Mark Zuckerberg seems to have in mind. And so it removes content it thinks might bother you (whether photos of violence in Syria or of breastfeeding) and chooses the rest of your content based on secret algorithms you can only guess at, algorithms that content owners have criticized as a bait-and-switch. And that is a much harder job.