Guest: Frances Haugen | Facebook Whistleblower
Category: 🌐 Digital | Social Media
Original: 27 min | Time Saved: 25 min
Podcast’s Essential Bites:
[3:10] FH: "I was working on how Facebook was influencing the information environment in some of the most vulnerable places in the world. Most people are not aware that Facebook went into countries around Africa, South America, and Southeast Asia and said, use our products, the internet is free. And Facebook became the internet for at least a billion or a billion and a half people. The version of Facebook that we use, of Instagram that we use, is the safest, cleanest version of Facebook in the world. And I genuinely fear that there are tens of millions of lives on the line in places like African countries and Southeast Asia."
[4:20] FH: "Facebook had a team called the social cohesion team, which was the team responsible for fighting ethnic violence. And we had a thing called virality review every week or every other week, where they'd bring in translators and explain to us the 10 most popular posts in each of the [...] tier one at-risk countries. And every single post would be a beheading video, or an accusation that the opposition was molesting children, that this child was kidnapped and tortured, truly horrific misinformation. So there's this question of why was this the most popular content in those countries, while we don't see that here? Facebook was spending 87% of its operational budget for misinformation on English, even though only 9% of users spoke English."
[4:45] FH: "The reason why the most viral content [...] is these horrific pieces of content is that the algorithms push us towards extremism. When you have algorithms that say content is better if it gets more clicks, that ignores the fact that the shortest path to a click is hate, anger, division. [...] Their business model is ads, so the more attention you give, the more ads they get."
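The incentive FH describes can be made concrete with a toy example. The sketch below is purely illustrative; the names and weights are assumptions, not Facebook's actual ranking system. It shows a feed scored solely on predicted engagement, an objective with no term for accuracy or harm.

```typescript
// Toy engagement-only feed ranking (hypothetical; all names and weights
// are illustrative assumptions, not Facebook's actual system).
interface Post {
  id: string;
  predictedClickRate: number; // model estimate: clicks per impression
  predictedReshares: number;  // model estimate: expected reshares
}

// The objective rewards attention and nothing else: a post accusing the
// opposition of a crime and a post debunking it are scored by the same
// single metric, and the more provocative one usually wins.
function rankFeed(posts: Post[]): Post[] {
  const score = (p: Post) => p.predictedClickRate + 0.5 * p.predictedReshares;
  return [...posts].sort((a, b) => score(b) - score(a));
}
```

Nothing in the scoring function distinguishes informative from inflammatory content; that gap is the "shortest path to a click" problem.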
[7:26] TH: "Each of these [social media] companies has an EGO, or embedded growth obligation. [...] They also have to grow the amount of attention that they get from humanity. And if I don't get that attention, the other one will. And so if Instagram doesn't add a beautification filter to match TikTok in the arms race for teenagers' attention, Instagram is just going to lose the arms race. And so it's pretty simple game theory. [...] The reason that I came out [...] is that we can predict the future. I can tell you exactly what society is going to look like if you let this race continue: population-centric information warfare, weakening teenage mental health, shortening attention spans, beautification filters, unrealistic standards of beauty for teenagers, more polarizing extreme content, more conspiracy theories."
[9:20] FH: "These are private companies that we're asking to run critical public infrastructure in a completely [non-]transparent way. We're asking them to maintain public safety, to maintain national security, when those are cost centers, not profit centers. And so you end up in a situation where they may want to do better, but because they have to meet these market incentives each year, it's hard for them to get there."
[9:51] TH: "We have to notice, [...] per the E. O. Wilson quote, [...] that the fundamental problem of humanity is we have paleolithic emotions and brains, medieval institutions, and accelerating godlike technology. [...] Part of the medieval institutions is that law always lags the new tech. We didn't need a new conception of privacy until ubiquitous cameras started getting rolled out in the 1900s. We didn't need a right to be forgotten until 21st century technology could remember you forever."
[13:14] TH: "In Facebook's own research, simply taking away the share button and having you [...] copy and paste the text manually, [...] so adding that one piece of friction, where I have to [...] intentionally do it, [...] we're talking about a tiny change, something that a JavaScript engineer could spend a day on and it's done, [...] would be more effective than [...] a billion dollars spent on content moderation and all the other trust and safety initiatives."
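For concreteness, here is a minimal sketch of the kind of friction TH describes; the depth threshold and names are hypothetical assumptions, not Facebook's implementation. Past a couple of reshare hops, the one-click share button is withheld, so spreading a post further requires deliberately copying and reposting it.

```typescript
// Hypothetical reshare-friction rule (illustrative only, not Facebook's code).
// Assumption: friction kicks in past friends-of-friends, i.e. depth 2.
const MAX_ONE_CLICK_DEPTH = 2;

interface ShareContext {
  reshareDepth: number; // share hops between this viewer and the original poster
}

// Past the threshold the instant reshare button is hidden; the user can
// still spread the post, but only by intentionally copying and reposting it.
function canOneClickShare(ctx: ShareContext): boolean {
  return ctx.reshareDepth < MAX_ONE_CLICK_DEPTH;
}
```

This is the kind of small, content-neutral change the #OneClickSafer campaign (linked below) advocates.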
[16:38] TH: "Our society runs on trust, and Facebook and Twitter are trust-degrading machines, because they reward those who are more and more innovative and come up with new species of cynicism and distrust. You will be paid 100x more likes, followers, rewards, and reach than if you do not do that. [...] We've been living in this 24/7 magic trick that everyone's cynical about everything, when really there's a handful of people who are cynical, and we gave them 100x reach."
[23:05] TH: "In the long term [...], I think what we need to do is get from [...] 'how do we get to less toxic social media?' to instead 'tech plus democracy equals stronger democracy.' Because right now, [...] the CCP is employing the full suite of exponential technologies to make a new form of 21st century authoritarianism. [...] When we look at democracies, we're not employing the full suite of 21st century technologies to make new and upgraded forms of democracy. We're instead allowing private business models to profit from the degradation of democracies. So the long term has to be: what is a vision for a digital democratic future?"
Rating: ⭐⭐⭐⭐
Additional Links:
The Council for Responsible Social Media (CRSM)
Beyond the Screen
#OneClickSafer Campaign