As we move closer to an official release, more details are slowly emerging about Facebook's new dedicated news section, which will showcase content from a selection of 'trusted' publishing partners, in an effort to boost engagement and discussion across The Social Network.
News has long been a sticky subject for Facebook. Largely blamed for fueling movements based on misinformation and political manipulation, Facebook got rid of its 'Trending News' section last year. Now, however, Zuck and Co. see a new way forward, using the model of Facebook Watch as a template for an improved news section.
But how, exactly, Facebook goes about 'improving' news coverage will be closely watched, and scrutinized, by users and publishers alike.
This week, The Information got hold of an internal memo which provides some insight into how the new section will operate, including notes on how Facebook will choose what content to display, and the role that human editors will be playing in that chain.
According to the report:
- Facebook will utilize human editors to curate its displayed news headlines, which is the same approach it once had in place for Trending News, before insider reports revealed that those editors had been manipulating the results in order to hide or demote certain stories (including those critical of Facebook). Facebook then switched to a purely algorithmic feed, which was arguably worse, before cutting the Trending section altogether. Realistically, Facebook will likely need human editors, but it will also need strict policies and guidelines to avoid a repeat of those same issues.
- Facebook's editors will “seek to promote the media outlet that first reported a particular story, and additionally prioritize stories broken by local news outlets.” This could provide a big boost for smaller outlets, giving them increased visibility to billions of Facebook users. How, exactly, Facebook will be able to do this in practice will be another challenge to consider, but it could provide more 'on-the-ground' perspective in certain events, and maximize the work of dedicated journalists.
- Facebook's editors will seek to avoid content that's “constructed to provoke, divide, and polarize", while also prioritizing stories with "on-the-record sources". This is a big one - Facebook's distribution algorithms have essentially changed the way that news is reported in many respects, due to the focus on engagement over all else. If a story provokes a response, it gets more comments and Reactions, and thus, more reach. Given this, it's often far more beneficial for a publisher to take a hard angle on a story, as opposed to simply covering the facts. As a basic example, a story with a headline like "Man Kills Neighbor" is not going to spark as much response as "Black Man Kills White Neighbor". The specific details of the case may well be irrelevant, but the more emotive framing will generate increased Facebook response, and benefit that publisher - while also fueling divides. If Facebook can do anything to limit this, by demoting content clearly shaped around the algorithm, that will be beneficial.
In terms of Facebook's arrangements with publishers, Nieman Lab has reported that Facebook has been offering publishers three-year licensing deals in which it would pay them as much as $3 million per annum. The actual numbers in play are believed to be lower than that, but even so, if the deals are even within that ballpark, it would underline just how serious Facebook is about this new push. Facebook wouldn't be paying so much for no reason - which raises another question, aside from how exactly Facebook plans to use such content: how will Facebook decide which outlets to partner with? Who chooses, amid ongoing accusations of misinformation, which publishers are 'trustworthy'?
This will be the crux of Facebook's refreshed news approach. If The Social Network can establish a new, trusted panel of news providers, and then promote them to its billions of users, that could have a significant impact on the broader news cycle. Previous research has shown that some 43% of Americans get at least some of their news content from The Social Network. If Facebook can use its news push to promote more accurate sources, and force out some of the junk, that could go a long way to re-establishing trust in media organizations more broadly.
It's a big hill to climb, and Facebook will seemingly be looking to navigate it in the lead-up to a new Presidential election. But this could be a massive change for Facebook, with broad societal implications. If it can get it right.