Facebook seems to be trying to become more transparent about how the algorithms behind the social-networking site function, with a statement on Monday about how it is cracking down on “clickbait.” And just like the last time Facebook tweaked its algorithm (and the time before that), everyone is trying to figure out whether it will help or hurt sites like Upworthy or BuzzFeed — or their own site. But despite the attempts at openness, the bottom line remains the same: Facebook is a black box. No one really has any clue why the site chooses to show or hide certain content.
A lot of the attention around this latest change, like the one before it, has been focused on trying to decipher what Facebook means by the term “clickbait.” But as John Herrman points out at The Awl — and as I tried to argue the last time the site started fiddling with the algorithm — clickbait is one of those things everyone thinks they know when they see it, but that no one can really define. The truth is that whether it’s a slideshow or a listicle with GIFs, one person’s clickbait is another person’s fascinating and shareable content. Even Facebook can’t seem to define exactly what it is trying to stamp out:
“Click-baiting is when a publisher posts a link with a headline that encourages people to click to see more, without telling them much information about what they will see. Posts like these tend to get a lot of clicks, which means that these posts get shown to more people, and get shown higher up in News Feed.”
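To make the feedback loop in that description concrete, here is a minimal Python sketch of how a ranker that rewards raw clicks keeps promoting curiosity-gap posts, and how one hypothetical adjustment (weighting clicks by the ratio of post engagement to clicks) could dampen them. This is purely illustrative: the signal names, weights and numbers are assumptions for the sake of the example, not Facebook’s actual code.

```python
# Illustrative sketch of the feedback loop Facebook describes: a naive
# ranker that rewards raw clicks keeps promoting posts that get clicked
# but not engaged with (the clickbait pattern), while a ratio-based
# adjustment dampens them. All signals and weights are invented.

from dataclasses import dataclass

@dataclass
class Post:
    headline: str
    clicks: int      # people who clicked through to the link
    likes: int       # engagement signals after (or despite) the click
    comments: int
    shares: int

def naive_score(post: Post) -> float:
    """Clicks alone drive distribution: more clicks -> shown higher."""
    return float(post.clicks)

def adjusted_score(post: Post) -> float:
    """Dampen posts whose clicks far outstrip likes/comments/shares,
    the pattern a curiosity-gap headline tends to produce."""
    engagement = post.likes + post.comments + post.shares
    # Ratio of meaningful engagement to clicks; near zero for clickbait.
    ratio = engagement / max(post.clicks, 1)
    return post.clicks * min(ratio, 1.0)

posts = [
    Post("You won't believe what happened next", 10_000, 40, 10, 5),
    Post("City council approves new transit budget", 1_200, 300, 150, 90),
]

for p in posts:
    print(f"{p.headline!r}: naive={naive_score(p):.0f}, "
          f"adjusted={adjusted_score(p):.0f}")
```

Under the naive score, the curiosity-gap post wins on clicks alone; under the adjusted score, the accurately headlined post outranks it, because the readers who click it actually engage with it. Which is roughly the behavior the next section suggests Facebook is after.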
Is Facebook partly to blame?
From that description, what the site seems to be targeting isn’t so much clickbait-style stories or posts as clickbait headlines — mostly the so-called “curiosity gap” headlines that promise some earth-shattering revelation and then fail to deliver (Techmeme often rewrites headlines for the same reason). In other words, some of the same posts that seem to bother people so much will likely continue to exist and be ranked highly, provided the people posting them describe them accurately in the headline. Even the example that Facebook used in its post could continue to be shared and highly ranked.
A related question is whether Facebook itself is partly — or even largely — responsible for the state of affairs it is now trying to correct by tweaking the algorithm. When product manager Mike Hudack complained earlier this year about the decline of online media into an orgy of shameless clickbait, he was hit by a barrage of criticism about Facebook’s role in promoting exactly that kind of content, since media outlets see the social network as the holy grail of traffic (for his part, Hudack said the network was trying to help in that regard — efforts that may have resulted in the latest algorithm change).
The site’s algorithm is a black box
In any case, the bottom line for media sites and users alike is that Facebook’s algorithm, much like Google’s, is a black box whose inner workings are almost totally inscrutable. Just as sites like MetaFilter occasionally find that their traffic has fallen off a cliff because of some mysterious change in Google’s ranking methods, Facebook routinely elevates or smothers certain types of content — a good example being the “social readers” that media outlets such as The Guardian launched in 2012, only to see their usefulness evaporate overnight after Facebook tweaked its algorithm.
Facebook downgrades click bait http://t.co/1fBlvINutJ Unchanged: Maybe it will send your post to people, maybe it won't. You can't know.
— Jay Rosen (@jayrosen_nyu) August 25, 2014
But the reasons why Facebook chooses to highlight specific types of content and hide others remain completely hidden from users and publishers — and explanations like the one it gave for the latest change don’t really help that much, in part because they raise almost as many questions as they answer.
The practical impact of this algorithm-driven filtering becomes obvious during an event like the demonstrations in Ferguson, Mo. While Twitter was filled with live reporting about the incident and its aftermath, many users complained that Facebook was almost silent on the news. Was that because of the way certain stories were shared? Was it because people didn’t click on Ferguson headlines? Or was it because Facebook chose to highlight uplifting personal stories instead of depressing and violent news events? No one knows.
Like a newspaper publisher with editors who choose which stories are important and which aren’t, Facebook decides what to show based on its own criteria — criteria that are largely withheld from the outside world, and can therefore only be inferred from external signals. Is that a good thing or a bad thing? Ultimately, that’s for each user to determine for themselves. But the fact remains that what comes through Facebook is the site’s version of what you should be reading — and what you need to know about the world — not your version.