7 October, 2022

Facebook’s algorithm monetizes misery

Tags: Algorithm, Facebook, Section 230
Facebook's algorithm monetizes misery to sell more ads

"British Ruling Pins Blame on Social Media for Teenager’s Suicide" - NYT

An inquest ruled that harmful online content contributed to the 14-year-old’s death. Ian Russell accused Meta, the owner of Facebook and Instagram, of guiding his daughter on a “demented trail of life-sucking content”, after the landmark ruling raised the regulatory pressure on social media companies. - The Guardian

Molly Russell’s father accuses Facebook of “monetizing misery” after an inquest ruled that harmful online content contributed to his 14-year-old daughter’s death.

How do social media algorithms work? How do Facebook and YouTube profit by sharing disturbing content? How do social media firms fan political violence? Should social media companies be held responsible for the harm they cause by spreading hate and misinformation? How does their greed divide people and polarize politics? What is blood money? What is Section 230? What's the case before the Supreme Court?

How Facebook's algorithm is designed to push angry posts so that users spend more time online and share those angry posts with others.

Profiting from anger and misery

Algorithms are designed to maximize the time users spend on the platform because that increases the number of ads they can be shown. More ads, more profits. And as users share posts with others, it creates more opportunities to sell ads. The algorithms don't pay much attention to the harm the content being shared might cause, just as long as it keeps users on the platform.
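This kind of engagement-driven ranking can be sketched in a few lines. Everything below is hypothetical and illustrative (the field names and weights are invented, not Facebook's actual system), but it shows the core problem: when a feed is sorted purely by predicted engagement, no term in the score accounts for harm.

```python
# Hypothetical sketch of engagement-driven feed ranking.
# Weights and field names are invented for illustration,
# not taken from Facebook's real system.

def engagement_score(post):
    """Score a post by predicted engagement -- nothing penalizes harm."""
    return (
        2.0 * post["predicted_shares"]      # shares expose the post to new users
        + 1.5 * post["predicted_comments"]
        + 1.0 * post["predicted_likes"]
        + 0.5 * post["predicted_watch_seconds"]
    )

def rank_feed(posts):
    # Sort purely by engagement; harmful-but-gripping content rises.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "calm", "predicted_shares": 1, "predicted_comments": 2,
     "predicted_likes": 30, "predicted_watch_seconds": 10},
    {"id": "outrage", "predicted_shares": 20, "predicted_comments": 50,
     "predicted_likes": 15, "predicted_watch_seconds": 40},
]
print([p["id"] for p in rank_feed(posts)])  # the "outrage" post ranks first
```

Note that making the feed safer would require adding a penalty term for predicted harm, which directly trades off against ad revenue; that tension is the subject of this post.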

Andrew Walker, the senior coroner, said algorithms that curate a social media user's experience had pushed harmful content to Molly that she had not requested. He said some of the content "romanticized" acts of self-harm and sought to discourage users from seeking professional help. Concluding that it would not be safe to rule Molly's cause of death as suicide, Walker said some of the sites she had viewed were "not safe" because they allowed access to adult content that should not have been available to a 14-year-old.

"It is likely that the above material viewed by Molly, already suffering with a depressive illness and vulnerable due to her age, affected her in a negative way and contributed to her death in a more than minimal way," Walker said, delivering his findings of fact at north London coroner's court. "It's time to protect our innocent young people, instead of allowing platforms to prioritize their profits by monetizing their misery," Ian Russell said. - The Guardian

What is blood money?

Blood money is money obtained at the cost of another's life. - Merriam-Webster Dictionary

Anger sells ads

Researchers analyzed 70 million tweets from 200,000 users and separated them into four categories of emotion: anger, joy, sadness, and disgust. They wanted to see how posts containing certain emotions spread from one user to the next. They found anger was the most viral emotion of all. Angry posts caused a ripple of more posts up to three degrees of separation from the original message, the researchers found. The spread of anger was particularly high when the rage originated from a user with a large follower base. - Digital Trends
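A toy model makes the study's finding intuitive: if each exposed user reshares an angry post a bit more readily than a joyful one, the gap in reach compounds at every degree of separation. The reshare probabilities and follower counts below are made up for illustration; only the qualitative pattern (anger spreads furthest) comes from the study.

```python
# Toy cascade model: a post spreads hop by hop, and each exposed user
# reshares with an emotion-dependent probability. All numbers here are
# invented for illustration; the study only ranked anger as most viral.

RESHARE_PROB = {"anger": 0.30, "joy": 0.15, "disgust": 0.10, "sadness": 0.08}

def expected_reach(emotion, followers_per_user=100, degrees=3):
    """Expected audience size over a few degrees of separation."""
    p = RESHARE_PROB[emotion]
    reach, exposed = 0.0, float(followers_per_user)  # degree-1 audience
    for _ in range(degrees):
        reach += exposed
        # Each resharer exposes their own followers at the next degree.
        exposed *= p * followers_per_user
    return reach

for emotion in RESHARE_PROB:
    print(emotion, round(expected_reach(emotion)))
```

Because each hop multiplies by the reshare rate, even a modest edge in how often anger gets reshared translates into a many-fold advantage in total reach by the third degree.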


Who's responsible for the harm from posts? What's Section 230?

Section 230 of the Communications Decency Act of 1996 was enacted before the rise of today's major social media companies. It protects "interactive computer services" by ensuring they cannot be treated as the "publisher or speaker" of any information provided by other users. The lawsuit before the Supreme Court argues that such immunity should not apply when the company's platform recommends certain content via algorithms that identify and display content most likely to interest users, based on how people use the service. - Reuters

Supreme Court case

The U.S. Supreme Court will hear a challenge to federal protections for internet and social media companies freeing them of responsibility for content posted by users in a case involving an American student fatally shot in a 2015 rampage by Islamist militants in Paris. The justices took up an appeal by the parents and other relatives of Nohemi Gonzalez, a 23-year-old woman from California who was studying in Paris, of a lower court's ruling that cleared Google LLC-owned YouTube (both part of Alphabet Inc GOOGL.O) of wrongdoing in a lawsuit seeking monetary damages that the family brought under a U.S. anti-terrorism law. - Reuters

The Supreme Court decides whether the First Amendment protects Big Tech’s editorial discretion or forbids its censorship of unpopular views. The stakes are high not just for government and the companies, but because of the increasingly dominant role platforms such as Twitter and Facebook play in American democracy and elections. Social media posts have the potential to amplify disinformation or hateful speech, but removal of controversial viewpoints can stifle public discourse about important political issues. Governments that say conservative voices are the ones most often eliminated by the decisions of tech companies scored a major victory Friday, when a divided panel of the U.S. Court of Appeals for the 5th Circuit upheld a Texas law barring companies from removing posts based on political ideology. - WaPo

What happens when extreme right-wing groups can spread threats and racism through online platforms, unrestrained by any checks? How do you balance the need for such checks with the need to hold social media firms accountable? What happens when an extremist group's post results in a killing? Who is responsible: the person who posted it, or the social media platform that distributed it?


How online posts cause real harm

  • Inquest looks into social media algorithms' role in Molly's death - Daily Mail
  • In the last six months of her life, Molly was engaging with Instagram posts around 130 times a day on average.
  • During that period she viewed online content which 'raised concerns' before her death, including 3,500 shares, 11,000 likes and 5,000 saves.
  • An inquest is looking into social media algorithms and their role in her death. But lawyers said Facebook has redacted information and has been 'unwilling'
  • The family's lawyers have not been provided with the content, just links, some of which do not work or have been deactivated, Mr Sanders said.

After her death, Ian Russell discovered that Molly had been interacting with social media accounts that promoted self-harm. He recalled seeing an account Molly followed that featured an image of a blindfolded girl hugging a teddy bear, with the caption "This world is so cruel, and I don't wanna see it anymore". Most of the accounts Molly followed were of people who were depressed, self-harming, or suicidal. However, there was some content that was positive, perhaps from people trying to help each other, find ways to remain positive, and refrain from self-harming. - JakeBugs


Facebook fans political extremism and conspiracy theories

Facebook’s news feed algorithm has been blamed for fanning sectarian hatred, steering users toward extremism and conspiracy theories, and incentivizing politicians to take more divisive stands. It’s in the spotlight thanks to waves of revelations from the Facebook Papers and testimony from whistleblower Frances Haugen, who argues it’s at the core of the company’s problems. - WaPo

Facebook has previously fought efforts to hold Zuckerberg personally accountable. In 2019, as the company was facing a record-breaking $5 billion fine from the Federal Trade Commission for privacy violations related to Cambridge Analytica, a political consultancy that abused profile data from tens of millions of Facebook users, Facebook negotiated to protect Zuckerberg from direct liability.

In 2018, Zuckerberg defined a new metric that became his "north star," according to a former executive. That metric was MSI — "meaningful social interactions" — named because the company wanted to emphasize the idea that engagement was more valuable than time spent passively scrolling through videos or other content. For example, the company's algorithm would now weight posts that got a large number of comments as more "meaningful" than likes, and would use that information to inject the comment-filled posts into the news feeds of many more people who were not friends with the original poster, the documents said. - WaPo
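The MSI reweighting described in that reporting can be sketched as follows. The specific weights, threshold, and function names are invented for illustration (the actual values were internal to Facebook), but the two reported behaviors are shown: comments count for much more than likes, and a high-MSI post gets injected into the feeds of people who are not the poster's friends.

```python
# Hypothetical sketch of "meaningful social interactions" (MSI) scoring
# as described in the Facebook Papers reporting. All weights and the
# threshold below are invented; only the two behaviors are from the report:
# comments outweigh likes, and high-MSI posts spread beyond friends.

MSI_WEIGHTS = {"comment": 5.0, "reshare": 5.0, "like": 1.0}

def msi_score(post):
    """Weighted sum of interactions: comments count far more than likes."""
    return sum(MSI_WEIGHTS[kind] * count
               for kind, count in post["interactions"].items())

def distribution_audience(post, friends, non_friends, threshold=100.0):
    """High-MSI posts are injected into non-friends' feeds too."""
    audience = list(friends)
    if msi_score(post) >= threshold:
        audience += non_friends  # comment-filled posts reach strangers
    return audience

heated = {"interactions": {"comment": 40, "like": 10}}  # argument thread
quiet = {"interactions": {"comment": 2, "like": 60}}
friends, strangers = ["f1", "f2"], ["s1", "s2", "s3"]
print(len(distribution_audience(heated, friends, strangers)))  # 5: reaches strangers
print(len(distribution_audience(quiet, friends, strangers)))   # 2: friends only
```

The perverse incentive falls out of the structure: a post that provokes forty argumentative comments outranks one that quietly pleases sixty people, and only the provocative one escapes the poster's friend circle.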




TakeAway: Hold Facebook and YouTube responsible for the harm they cause. Enough of blood money profits.

Deepak
DemLabs

DISCLAIMER: ALTHOUGH THE DATA FOUND IN THIS BLOG AND INFOGRAPHIC HAS BEEN PRODUCED AND PROCESSED FROM SOURCES BELIEVED TO BE RELIABLE, NO WARRANTY EXPRESSED OR IMPLIED CAN BE MADE REGARDING THE ACCURACY, COMPLETENESS, LEGALITY OR RELIABILITY OF ANY SUCH INFORMATION. THIS DISCLAIMER APPLIES TO ANY USES OF THE INFORMATION WHETHER ISOLATED OR AGGREGATE USES THEREOF.
