Facebook is Making Big Changes: Will it Change Anything?

Facebook clearly wants to make major changes for 2018. What those changes look like, and why Facebook wants to change, is far less clear. Is it all about money? Do they want to genuinely improve the lives of their users? Or do they just want to avoid making truly difficult choices?

Jan 19, 2018

On January 11, Mark Zuckerberg posted an announcement on his Facebook page. The news? You’ll see less news on Facebook. Along with seeing fewer articles from blogs and news outlets in your feed, you’ll also be getting less promotional material and fewer posts from businesses. According to the post, the purpose of the change is to reduce the amount of time people spend idly scrolling through Facebook, and to increase the percentage of posts that users interact with – by liking, commenting on, or sharing the content. Idle time, Zuckerberg notes, is not good for users and can negatively impact their lives and health. Spending time connecting with friends and family, in contrast, is fulfilling and good for one’s physical and mental health.

But does that explanation hold up to scrutiny? Are people more fulfilled on Facebook looking at their friends and family versus Tasty videos and Breitbart articles?

Facebook researchers say so, pointing to evidence from several university studies showing that passively scrolling or “liking” posts decreased well-being, while actively commenting on and sharing posts increased it. By that logic, promoting more online interaction between individuals should increase the well-being of Facebook’s users.

However, there is also evidence that interacting with friends’ social media personas also decreases happiness and health. Dr. Holly B. Shakya of UC San Diego conducted a longitudinal study that compared health and life-satisfaction outcomes based on how much time and interaction individuals had with posts on Facebook. Across almost all actions that you could take on Facebook, including posting to the respondent’s own wall, the more time spent on the social network, the less happy and less healthy respondents were.

Researchers looking at social media from a public health perspective have hypothesized that the reason for lowered well-being is that people put great care and attention into making their lives look great on social media. This causes users looking at their friends and families to think their lives are far worse by comparison, which affects their own well-being. Facebook’s researchers also admit that “negative social comparison” is part of why Facebook may be negatively impacting people.

But reducing posts from businesses and news outlets and increasing posts from your Facebook friends isn’t going to change the fact that seeing your friends’ perfectly curated lives depresses you. In fact, it will likely exacerbate “negative social comparison.”

Besides, why does Facebook claim to care about the well-being of its users anyway? If mindlessly scrolling through our news feeds were making Facebook money, why would they change their behavior?

Jaron Lanier, an early inventor of “virtual reality” technology and an advocate for better use of the internet, has often compared “engagement,” which social media platforms base their business models on, to “addiction.” The point is ultimately to addict us to Facebook and keep us coming back for more. Facebook makes money off our addiction to social media, so do they just want to start selling us a less depressing drug?

Maybe. The volume of engagement (or the number of addicts) on Facebook’s News Feed has decreased in the last few years. Why? Likely for the same reason people try to cut themselves off from other addictions: they no longer want the negative impacts on their lives.

Similar trends have occurred with tobacco and smoking addictions over the years. Users start out with a couple of cigarettes; it makes them feel good and lets them join in with the cool kids. Over the years, they can’t reach that same buzz anymore, and they need more and more of the addictive product just to feel anything. Eventually, the addict is facing terrible health consequences, financial trouble, and low self-esteem. They decide it’s time to quit.

But the tobacco industry doesn’t want to give up on you as a customer like that. They have a whole backup plan for addicts who burn out. Cigarette ads are much rarer these days because of legislation and backlash against tobacco companies. Cigarette ads that do exist now focus more on social smoking and “smoking responsibly.” The message is clear, “don’t quit cold turkey. Just enjoy a couple cigarettes a day with friends the way you used to – back when cigarettes were cool.”

The same thing happened with Facebook. At first, Facebook was just a place to keep up with friends or distant relatives that you didn’t get to see that often. Then people started to foster friendships that originated online and never spilled over into the physical world. Then came advertisements and groups and pages. Then memes took over Facebook. Now it’s a place for you to engage in groupthink with like-minded people from across the world who share your tribal allegiance, read news that confirms your biases, and antagonize others for holding differing opinions.

The addiction to being part of an online community exploded into a vicious pack mentality, and users are cutting down or leaving Facebook altogether in response. That’s a problem for Facebook’s profits, so they are instead racing to offer users the same carefree experience of connecting with friends that the platform started off as. Back when Facebook was cool.

At least that’s what I figured when I started writing this article. But then, Mr. Zuckerberg released a second post on his Facebook page. This time he addressed new steps Facebook would take to ensure the news articles left on the updated Facebook would be from trusted sources. Facebook has been receiving criticism for the way it handles sensationalist and false news stories ever since it came out that Russian operatives used the Facebook platform to build a disinformation campaign for the 2016 election.

The new plan is to ask Facebook users to decide “which [news] sources are broadly trusted.” Now, to anyone who genuinely cares about factual accuracy in news stories, the fatal flaw in this plan should be quite clear. Everyone is just going to vote for the news that confirms their biases. As I’ve said before in my article about Roy Moore’s run for Senate in Alabama, truth does not get to be voted on, it just is.

Facebook doesn’t want to endure the criticism and blame that come with making the editorial decisions and judgment calls expected of organizations that distribute factually correct, well-researched information. They are afraid of being accused of bias or censorship. It’s fine to have that impulse, but passing the buck onto users instead is where I take issue.

Personally, I think that if Facebook doesn’t want to take responsibility for the news posted across its website, then it should take the percentage of news on Facebook from 4% or 5% down to 0%. Then there’s no way Facebook can be accused of bias.

But of course, that’s bad for Facebook’s profits too. There are still plenty of addicts who crave the daily outrage and trolling associated with the majority, or at least the plurality, of the news on Facebook. Separating those users from their Occupy Democrats or Breitbart articles will absolutely cause a decrease in usage.

And ultimately, I think this is the reasoning behind all the new changes that Facebook is bringing to the news feed. They are being pressured from a number of different sides, and are trying to split the difference between them.

For those who have burned out on their Facebook addictions, they are trying to rein in the addictiveness of Facebook. But at the same time, they want to keep the SJWs and Info Warriors as addicted as ever. They fear being regulated the way most media companies are, with disclosure and due diligence required for advertisers and content. But they also don’t want to be accused of bias. Instead, they’ll just pretend that voting on the truth works, and still allow each tribe to see the “news” it prefers to see.

I can’t be certain what the effect of Facebook’s changes on society will be. But it definitely won’t improve users’ well-being overall, and it won’t reduce the amount of biased and inaccurate news on the site.