Meta’s Section 230 Turns Out as Useless as ‘P’ in Psychology

Meta was sued over its algorithm, which plaintiffs allege has contributed to mental health issues.

Meta, the parent company of Instagram and Facebook, has been served with eight different lawsuits that contend the company deliberately adjusted its algorithm to hook young people. Focusing on Instagram in particular, one lawsuit contends that Meta is using the insecurities of teens and young people for profit by addicting them to remain engaged and causing depression, anxiety, and eating disorders. The eight lawsuits come just six months after a mother sued Meta for allegedly playing an active part in her 11-year-old daughter’s suicide. In that particular case, the mother said that the social media site is designed to hook younger users into repeated use, making it extremely difficult to stay connected to real life.

One of the recent lawsuits takes the addiction argument even further. The parents of a 19-year-old woman state that a similar addiction to Instagram caused their daughter to develop an eating disorder. The lawsuit contends that the girl became addicted to the app at a young age and shortly thereafter began to exhibit signs of “addiction, anxiety, depression, self-harm, eating disorders, and, ultimately, suicidal ideation.”

A federal appeals court last year ruled that Snap could be held accountable for its speed filter, which allegedly encouraged reckless driving and caused a fatal car crash in 2017. That ruling opened the door for lawsuits like the ones filed against Meta, said Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University. The plaintiffs in the Snap case argued that the speed filter wasn’t third-party content but rather a design choice Snap itself had made. Because the court ruled that Snap wasn’t protected by Sec. 230 in that instance, other plaintiffs are attempting to work around the law in a similar fashion.

But, Goldman argued, the Snap ruling and the cases against Meta are “qualitatively different,” because the algorithm and the content it serves are “all the same thing.”

“This idea that we can distinguish between dangerous software and dangerous third-party content on software is in my mind an illusion,” he told Protocol. “The algorithm only directs people to see the content. Ultimately, it’s the content that’s the problem. Then we’re back to the fact that that’s a Sec. 230 lawsuit.”

“Social media use among young people should be viewed as a major contributor to the mental health crisis we face in the country,” said Andy Birchfield, an attorney with the Beasley Allen Law Firm, which is leading the cases, in a statement.

“These applications could have been designed to minimize any potential harm, but instead, a decision was made to aggressively addict adolescents in the name of corporate profits. It’s time for this company to acknowledge the growing concerns around the impact of social media on the mental health and well-being of this most vulnerable portion of our society and alter the algorithms and business objectives that have caused so much damage.”

The lawsuits have been filed in federal courts in Texas, Tennessee, Colorado, Delaware, Florida, Georgia, Illinois, and Missouri, according to Bloomberg.

“Meta has invested billions of dollars to intentionally design their products to be addictive,” the lawsuit states, “and encourages use they know will be problematic and highly detrimental to their users’ mental health.” Meta says it has developed parental controls for minors on its platforms and warns against prolonged use of its products at a young age.