As part of its ongoing effort to bring more diversity of content into your News Feed - and avoid creating echo chambers where all users see is information that reinforces their existing beliefs - Facebook has announced another update to Trending Stories, while also adding a new way to showcase trending news in your region.
The latest change to Trending Stories will see Facebook present a new carousel layout for each topic, highlighting a set of stories - from a range of publications covering the issue - that you can flip through.
As per Facebook:
"By making it easier to see what other news outlets are saying about each topic, we hope that people will feel more informed about the news in their region."
Essentially, this is another way for Facebook to expose users to more perspectives, without intruding on the personalization of their Facebook experience. Facebook notes that there's no predetermined list of publications that are eligible to appear in Trending, and the stories that are featured are chosen based on engagement with each post, engagement with each publisher, and the number of other articles linking to each story.
This is The Social Network's latest effort to improve Trending Stories after much criticism of the module.
As you might recall, last March, Facebook came under fire after several former staffers from their Trending news team spoke out about questionable practices, including manual interference with trending topics - essentially angling the discussion to their benefit.
Facebook denied the claims and conducted an investigation, which, they say, found no systematic bias in their process. But in the wake of these accusations, Facebook announced a range of updates to their Trending News approach, including the removal of manual curators and a switch to a purely algorithm-fuelled feed, with descriptions shortened to Twitter-esque, basic topic summaries to further reduce the need for manual interference or editors.
That new approach had its own problems (and was updated to include publisher headlines in January) - but then came the next wave of criticism for Facebook, that it was facilitating the spread of fake news and filter bubbles, which left many users ill-informed about the real issues at hand heading into the 2016 US Presidential Election.
Given that around 44% of the entire US population now gets at least some of their news from the site, Facebook's influence over the modern day news cycle is greater than many suspect. You might not think you rely on Facebook for news content, but all those links shared, all those news stories highlighted in your feed, all the time you spend on the platform - all of these factors mean that you are being influenced, in some form, by the content presented there.
This new approach is part of Facebook's reluctant acknowledgement of that fact, and their efforts to help facilitate greater understanding by exposing more users to more news sources - though, really, it's probably not going to work.
The same impetus is behind the second element of their new approach - Facebook's also looking to add a new Trending News module into some users' News Feeds to better showcase news content.
As explained by Facebook:
"One of the things we regularly hear from people who use Trending is that it can be difficult to find in the Facebook mobile app. We're soon beginning a test in News Feed that will show people the top three Trending stories, which they can click on to see the full list of Trending topics and explore what people are discussing on Facebook."
Facebook says that only a small number of users will be included in the test pool, and if you are, you'll be able to remove it via the drop-down menu.
Again, it's interesting - and as noted by TechCrunch, Facebook's been keen to note in the past that the platform can actually increase diversity of perspectives (23% of users' friends are of the opposite political affiliation, while 29% of stories in the News Feed display views that disagree with a user's ideology). But it doesn't. Here's why.
Facebook's algorithm-defined News Feed is designed to show you more of what you want to see - more of what you engage with through comments, Likes, and shares. That inevitably surfaces more content from the people you like, who are also more likely to be people whose opinions, and particularly political affiliations, you agree with. If someone posts a lot of pro-Trump messages that go against what you believe, you'll probably unfollow them - an important distinction to make within the context of the above stats.
For example, Facebook says that 23% of users' friends are of the opposite political affiliation - but just because you're friends with someone, doesn't mean you're still seeing their updates. People are more likely to unfollow someone than unfriend them - most want to avoid unnecessary offense - so even if your friends are of the opposite opinion, that doesn't mean you're hearing them.
In essence, Facebook is designed to reinforce your pre-existing beliefs. That's not necessarily deliberate, but Facebook knows that if they show you more of what you like and agree with, you'll spend more time on the platform. Sure, they could show farmers who are struggling to make ends meet stories about how inner-city folk are complaining about pumpkin spice lattes, but those users wouldn't read them. It's therefore in Facebook's interests to show these users content that aligns with their experience, which, in turn, can help fuel political division.
As such, it's hard to see raising awareness of trending stories, and of more diverse viewpoints, having any significant impact. It's definitely worth Facebook trying, but the main cause of the issue appears to be Facebook's system itself, which they can't change without losing user engagement.
On one hand, social media is hugely positive, as it gives everyone the opportunity to connect with people like them, to reach out across geographical borders and build stronger communities. But the opposite is also true - it also gives counter movements the chance to connect and fuel their own communities. This is an inevitable side-effect, and there may not be a solution. At least, not an easy one.