31 July 2014 by Hal Hodson
No one really knows exactly how Facebook decides what we see when we log in. Reverse-engineering the algorithm behind it could help us find out
WHO controls your Facebook News Feed? We are fed a specially selected diet of jokes, photos and gossip from our Facebook friends, but not by a person. Instead, an algorithm does the work – giving it the power to influence us.
The furore over an experiment in which Facebook researchers attempted to manipulate users’ emotions via their News Feed, albeit only slightly, highlighted the extent of that power.
Facebook’s algorithms are a closely guarded secret. “These are black boxes,” says Christo Wilson of Northeastern University in Boston. “In many cases the algorithms are held up as trade secrets, so there’s a competitive advantage to remaining non-transparent.”
For Karrie Karahalios and Cedric Langbort at the University of Illinois and Christian Sandvig at the University of Michigan, Facebook’s influence is out of balance with our understanding of how its algorithm works. So they are carrying out what they call a collaborative audit, looking at the Facebook experiences of thousands of people to work out the underlying algorithmic rules.
To do this they have created an app called FeedVis, which creates a stream of everything that your friends are posting. When I tried it, I saw an endless stream of comments, likes and posts by friends I’d forgotten I had. To the right I saw my standard News Feed, which was empty by comparison.
In their first, small study using FeedVis, the team found that most people – 62 per cent – didn’t know that the News Feed is automatically curated. People were shocked that they weren’t seeing everything their network posted. In cases where posts of close friends or family were excluded, many became upset.
The team is starting to understand some of the basic rules that govern what people see. “We know that if you comment on someone’s wall, you’re more likely to see a post from them than if you just like something,” says Karahalios. “And if you go to a person’s timeline you’re more likely to see content from them later.” The work was presented at the Berkman Center at Harvard University last week.
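As a rough illustration of what rules like these might look like, consider the toy sketch below. It is not Facebook's actual algorithm, nor code from FeedVis; the field names and weights are invented purely to mirror the pattern the researchers describe, in which commenting on someone's wall counts for more than liking their posts, and visiting their timeline also boosts their content.

```python
# Purely illustrative toy model of interaction-weighted feed ranking.
# This is NOT Facebook's algorithm or FeedVis code; the weights and
# field names are invented to mirror the rules described above:
# commenting on someone's wall counts for more than liking their posts,
# and recent timeline visits boost their content too.

from dataclasses import dataclass

@dataclass
class Interactions:
    comments: int = 0         # times you commented on this friend's wall
    likes: int = 0            # times you liked their posts
    timeline_visits: int = 0  # times you recently visited their timeline

def affinity(i: Interactions) -> float:
    """Score how likely this friend's posts are to surface in the feed."""
    return 3.0 * i.comments + 1.0 * i.likes + 2.0 * i.timeline_visits

friends = {
    "Alice": Interactions(comments=4, likes=1, timeline_visits=2),
    "Bob": Interactions(comments=0, likes=6, timeline_visits=0),
    "Carol": Interactions(comments=0, likes=0, timeline_visits=5),
}

# Rank friends: heavier interaction types push their posts up the feed.
for name, inter in sorted(friends.items(),
                          key=lambda kv: affinity(kv[1]), reverse=True):
    print(f"{name}: affinity {affinity(inter):.1f}")
```

In this sketch, a friend whose wall you comment on regularly outranks one whose posts you merely like, even if the likes are more numerous – the kind of pattern a collaborative audit could surface without ever seeing the real weights.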
But Facebook’s algorithms change constantly. “Even if I figure it out today, that doesn’t necessarily mean it’ll be like that tomorrow,” says Wilson.
To expand the experiment, the team will recreate a person’s profile based on their likes, comments and other Facebook activity and then see if they can detect patterns in what their News Feed shows them.
Already, Facebook appropriates its users’ profiles to create adverts on their friends’ feeds that look like normal content. There are other tricks, too. “I could share a link to the McDonald’s website, commenting that a McLobster sounds disgusting,” says Sandvig. If you like that link, Facebook registers that you like McDonald’s. “It doesn’t appear on your feed, but your friends will get ads that say ‘Hal likes McDonald’s’,” he says.
Understanding these dynamics is crucial, as Facebook is increasingly the tool that people use to communicate and find out about their world. “In the history of mass media, there have been channels with huge reach, but it’s typically a human in the apex of the control loop,” says Wilson. “That’s just not true any more.”
This article appeared in print under the headline “Facebook’s biggest secret”