Get ready for the next wave of changes to Facebook’s News Feed algorithm.
Today Facebook has outlined how it seeks to improve its feed ranking processes by implementing a range of new rating approaches, including updated user surveys, seeking different types of feedback, and weighting posts based on ‘angry’ reactions.
These new elements could ultimately have a big impact on what billions of people see in their Facebook feeds, and, in turn, on how Pages optimize their approaches to maximize reach.
Here’s a look at the various ways Facebook wants to change its approach.
First, Facebook is launching a new set of user surveys and response queries to get a better insight into what people actually want to see in their feeds.
User surveys have long been part of Facebook’s feed ranking process – you’ve probably noticed prompts like this in your own Facebook feed, asking for feedback on your experience.
Facebook says ‘tens of thousands’ of these feedback surveys are filled out on a daily basis, providing the company with a range of expanded insights to work with.
As Facebook explains:
“While engagement with a post – or how often people like, comment on, or share it – can be a useful indicator that people find it interesting, this survey-based approach, which largely occurs outside the immediate reaction to a post, gives a more complete picture of the types of posts people find most valuable, and what kind of content detracts from their News Feed experience.”
Part of Facebook’s recent approach in this regard has been asking users whether a post is “worth your time”:
“In 2019, we introduced surveys to ask people, ‘Is this post worth your time?’, and we use that feedback to inform how we rank posts in News Feed going forward. For example, if people say a post is worth their time, we’ll aim to show posts like it higher in News Feed, and if they say it isn’t, we’ll aim to show such posts lower.”
This approach has given Facebook new insight into what users find valuable, as opposed to merely engaging, and now it wants to roll out a new set of survey questions focused on what drives ‘value’ in the user experience, as a means of improving its algorithmic sorting.
It’s an interesting approach, because what you find “valuable” is probably a lot different from what you find “interesting” or even what you “like”.
Do you find sports posts ‘valuable’? What about your friends’ posts?
And in the context of brand content, would people describe your Page updates as “valuable”?
This could become more of a focus for your approach going forward.
Along similar lines, Facebook is also running a new series of tests to determine what types of posts people find “inspiring” in their feeds.
This seems less likely to yield significant insight – but again, it’s another consideration. If Facebook wants to surface more uplifting, inspiring content, it may be worth factoring that into your Facebook posting process.
Rethinking political content
Facebook is also seeking to better understand the broader impacts of political content on the platform, with user feedback suggesting that many people have simply had enough of divisive, aggressive, finger-pointing political debate.
Facebook CEO Mark Zuckerberg made a specific point of this on Facebook’s most recent earnings call, noting that:
“One of the top pieces of feedback we’re hearing from our community right now is that people don’t want politics and fighting to take over their experience on our services.”
Indeed, after a period of divisive politics around the world, it can sometimes feel overwhelming, with friends and family often split along political lines purely because of Facebook posts and discussions.
Facebook users seem to have had enough, and Facebook is trying to address this in its News Feed.
“While your News Feed contains posts from the friends, Groups, and Pages you’ve chosen to follow, we know that sometimes even your closest friends and family share posts about topics that aren’t really of interest to you, or that you don’t want to see. To address this, we’ll ask people whether they want to see more or fewer posts about a certain topic, such as cooking, sports, or politics, and based on their collective feedback, we’ll aim to show people more content about the topics they’re more interested in, and show them fewer posts about topics they don’t want to see.”
Although politics is only one of the three topics mentioned here, it seems likely to be the key focus. I mean, cooking and sports content can get a little tiresome, but it’s unlikely to be as divisive as politics-related updates.
Facebook also calls out politics specifically in its next point:
“Increasingly, we’re hearing feedback from people that they’re seeing too much content about politics, and too many other kinds of posts and comments that detract from their News Feed experience. This is a sensitive area, so we’ll work over the next few months to better understand what kinds of content are linked with these negative experiences.”
In addition to asking for direct feedback, Facebook says it will also look at reaction signals, such as ‘Angry’ reactions, to help it gauge what users don’t want to see.
That could be a positive step. Facebook has often been blamed for fueling broader social division, due to its seemingly blinkered focus on maximizing user engagement above all else.
Engagement is driven by emotional response – people comment on a post because it elicits a strong emotional reaction, share it for the same reason, and react because the post moves them in some way. The most powerful emotional drivers in this respect are joy and anger, and that’s largely reflected in what you’ll see in your Facebook feed.
This has also led more publishers to shape their content around these triggers, in order to maximize reach and clicks. It’s arguable, for example, that partisan outlets like Fox News are encouraged, or even incentivized, by internet algorithms like the one powering the Facebook News Feed, which essentially reward partisan approaches to shared content by prompting audience reactions that then see that content get more comments and more shares – and thus drive more clicks back to their websites.
After the Trump presidency, and more specifically the Capitol riots, Facebook now seems to be looking more closely at this element, and this new approach to assessing the impact of political content could reduce negative experiences on the platform by limiting how much divisive content is shared into user feeds.
A more direct response
Finally, Facebook is also testing a new option to make it easier for people to hide posts they don’t care about in their feed, which will then inform the algorithm about what they don’t want to see.
“If you come across something you find irrelevant, problematic, or irritating, you can tap the X in the upper-right corner of the post to hide it from your News Feed, and see fewer posts like it in the future.”
It could be an easy way to gather more direct user feedback, while also helping users build new habits to better control their feeds, with the understanding that tapping that X will inform the algorithm of their preferences.
As with any Facebook News Feed update, there are significant implications here, and each of these changes could lead to new distribution shifts that change what people see – and the way Page managers approach their Facebook strategy.
The key areas of focus, based on these announcements, are:
- More valuable content
- More ‘inspirational’ posts
- Less divisive political content
The feedback Facebook gathers through these new tests could shift these elements, but this appears to be the direction Facebook is leaning.
This is worth factoring into your Facebook marketing and outreach planning going forward.
Also, keep an eye on your Facebook reach statistics for any significant changes.