Frances Haugen: Facebook harms children and stokes division – video https://t.co/9YnIPL8hGc
— The Guardian (@guardian) October 5, 2021
Apparently, Facebook puts profits before people’s well-being, but the hearing also revealed things we didn’t know before.
Many of us already had a feeling that Facebook doesn’t care as much about democracy and our well-being as they want us to believe.
Some of us saw the documentary “The Social Dilemma” and found out about the connection between the rise of politicians like Donald Trump and the almighty algorithms on social media.
But now, we got firsthand information from a former employee of Facebook. Frances Haugen explained at a congressional hearing how all of this actually works.
Mark Zuckerberg already started defending his multibillion-dollar platform by accusing Haugen of mischaracterizing the intentions behind algorithms and business practices.
As users, clients, and maybe even victims of social media, we can look at recent developments and decide for ourselves.
Haugen claims that Facebook algorithms push content based on popularity. The more people click or comment on something, the more people will get to see it. Facebook and Instagram do not show us all the content our friends share in chronological order (that’s what they used to do) but instead decide for us what we care about.
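To make that contrast concrete, here is a deliberately simplified sketch in Python. Nothing in it is Facebook’s actual code; the post fields and the weights are invented for illustration, but it shows the difference between a chronological feed and an engagement-ranked one:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    posted_at: datetime
    clicks: int     # invented engagement counters, for illustration only
    comments: int

def chronological_feed(posts: list[Post]) -> list[Post]:
    """The old model: newest first, and every post gets shown."""
    return sorted(posts, key=lambda p: p.posted_at, reverse=True)

def engagement_ranked_feed(posts: list[Post]) -> list[Post]:
    """The model Haugen describes: popularity decides visibility.
    The weights are made up; comments weigh more than clicks here
    because they signal stronger (often more heated) engagement."""
    return sorted(posts, key=lambda p: p.clicks + 5 * p.comments, reverse=True)
```

In the ranked version, a post that nobody clicks or argues about quietly disappears from the feed, no matter how recent it is.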
And it’s shocking how well these algorithms know our behavior and preferences. We might see ourselves as folks who care about others, the planet, and equality, but artificial intelligence also knows that we like gossip, controversy, and the feeling of knowing more than anyone else.
If you really want to get to know someone, ask them to show you their newsfeed on social media.
We decide what we post, but we do not pick what we see. Algorithms deliver content similar to what we cared about in the past, and they are nonjudgmental in the worst possible way. They don’t judge us for liking cat videos more than climate change articles, and they don’t judge our preference for misinformation over actual facts. These algorithms only want us to keep scrolling.
The more time we spend scrolling, the more money they make.
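Here is a toy model of that feedback loop, again with made-up numbers rather than anyone’s real system: every second we spend on a topic raises its weight, and its weight decides how much more of that topic we are shown.

```python
from collections import defaultdict

class ToyPersonalizer:
    """Toy feedback loop: engaging with a topic raises its weight,
    a higher weight surfaces more of that topic, and more exposure
    invites more engagement. The objective is time spent, nothing else."""

    def __init__(self) -> None:
        self.affinity: defaultdict[str, float] = defaultdict(lambda: 1.0)

    def record_engagement(self, topic: str, seconds_spent: float) -> None:
        # Nonjudgmental by design: a cat video and a piece of
        # misinformation raise their topic's weight the same way.
        self.affinity[topic] += 0.1 * seconds_spent

    def score(self, topic: str, popularity: float) -> float:
        # What we lingered on yesterday is what we'll see more of today.
        return self.affinity[topic] * popularity
```

Run that loop long enough and the feed converges on whatever keeps us scrolling, which is exactly the dynamic Haugen describes.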
Haugen also stated that Facebook developers are well aware that these dynamics not only cause teenagers to spend far too much time online but also shape their perception of the world based on what they see on social media. This can lead to depression, anorexia, and constant anxiety.
And we are not even talking about bullying yet. It sounds cynical, but triggering teenagers to hate each other, themselves, and the world actually keeps them online even longer, and that’s how Facebook makes money.
The toxic mix of fitness influencers posting photoshopped images and bullies who make fun of every average person has the potential to harm the mental health of our youth (and everyone else).
And if that wasn’t already shocking enough, Haugen also told us that social media is even more harmful in developing countries.
As we saw on January 6th, misinformation on the internet has the power to threaten even established democracies like the United States. Things are even worse for developing nations in Africa, Asia, and South America. Countries with authoritarian leaders often lack independent journalism, and social media actively fills that gap.
Many of us were celebrating the so-called “Arab Spring” 10 years ago. These protests were mainly organized with the help of social media. Unfortunately, most of the countries affected do not find themselves in a better situation than before these protests. Social media helped activists to take down authoritarian leaders, but guess who took their place? Other authoritarian leaders.
While the Arab Spring was based on positive intentions like fighting for human rights, we also see dictators take advantage of social media and use it to back up their cruel policies. Again, algorithms don’t judge the intentions behind the content; they only care about reach and how much time people spend online.
As mentioned in the beginning, Frances Haugen told us her side of the story, and Zuckerberg shared his. But we can speak from our own experiences as users of social media and make up our minds.
I often wonder how it is possible that I don’t see content from authors I follow, yet I never miss any heated discussion between folks I haven’t seen in years and don’t actually care about. Why is it that I am almost thankful for everyone who insults me on Twitter, because I know it will help me reach more readers?
I can’t deny having a tendency to join endless discussions and stir up controversy with my articles, but I often ask myself whether social media amplifies this behavior. And even more important: at what point does it become a threat to my personal well-being?
The bottom line is that there are three possible ways to move forward after hearing Haugen testify before Congress. We could keep things as they are, we could force social media platforms to drop engagement-based ranking and go back to distributing content chronologically, or we could make Zuckerberg change the algorithms so they show us only the good stuff.
But that’s where the problem starts: what is good stuff? Who is going to decide that? And isn’t that even more dangerous than nonjudgmental algorithms that only care about numbers?
I believe that it would be best to return to how it worked before 2012 and show us all the content posted in chronological order. I want to see everything that my friends post and decide for myself what to click, like, or share.
I don’t need Zuckerberg to weed out content that he thinks I won’t like. Maybe I do care about the pottery project of a friend, even though I usually don’t care much about pottery? Maybe my friends would like to read my articles on politics? Not because they care much about politics, but because they want to hear what I have to say about it?
Who are you, Mr. Zuckerberg, to decide what I care about? Why do you even spend so much money on finding out? Why don’t you let users determine what’s popular and what isn’t by their choices instead of choosing for them?
I want to see everything my friends share and decide for myself. Is that really asking too much? Do we really need politicians to regulate social media, or can we just go back to the status quo before 2012?
The biggest social media company started changing its algorithms about 10 years ago—and we all saw what happened.
Why not make social media social again? Why not trust society to organically decide what to click on? Why not let the invisible hand of the market take over again? Isn’t that what capitalists usually ask for?
Dear Mr. Zuckerberg, please stop this irresponsible experiment in social engineering that you call algorithms and go back to what social media was meant for.
No more “move fast and break things.” Let’s switch to “slow down and fix things.”