Meta and Snap sued by mother over alleged role in her daughter’s suicide
The 11-year-old had developed an “extreme addiction” to social media prior to taking her life.
A Connecticut mother has brought a lawsuit against Facebook and Instagram parent company Meta, as well as Snap, claiming the platforms caused the sort of addiction her late daughter suffered before taking her own life at age 11 last July.
Social media companies have been the target of various lawsuits over the years related to alleged harm to minors — oftentimes for failing to adequately prevent that harm, as in the case of a teen who was bullied via an anonymous messaging app within Snapchat, leading to his eventual suicide. Tammy Rodriguez is instead making the case that the sort of "stickiness" these platforms are built to engender is inherently harmful, especially to young users like her late daughter Selena.
Selena “struggled for more than two years with an extreme addiction to Instagram and Snapchat,” the suit notes, a claim apparently backed by an outpatient therapist who had “never seen a patient as addicted to social media” during their evaluation. Although technically too young to be on either platform per their terms of service — Instagram and Snapchat state their minimum age for account creation is 13 — the mother points to the absence of parental controls, as well as the lack of strong age verification checks, which made policing her daughter’s access to the services nearly impossible. “The only way for Tammy Rodriguez to effectively limit access to Defendants’ products would be to physically confiscate Selena’s internet-enabled devices,” the suit claims, “which simply caused Selena to run away in order to access her social media accounts on other devices.”
Use of the services, Rodriguez alleges, caused her daughter to suffer from depression, sleep deprivation, school absences, eating disorders, self-harm and led to her eventual suicide.
Rodriguez argues that Snapchat’s “unknown and changing rewards” are “akin to a slot machine but marketed toward teenage users who are even more susceptible than gambling addicts.” Similarly, Instagram’s design decisions “seek to exploit users’ susceptibility to persuasive design and unlimited accumulation of unpredictable and uncertain rewards,” in the form of likes and followers. These features, it’s argued, are highly detrimental to teen and pre-teen users whose brains are still not fully developed, particularly in the realms of “impulse control and risk evaluation.”
The claim mirrors, as well as quotes from, some of the concerns voiced by whistleblower Frances Haugen. Among the tranche of documents released to news organizations by Haugen was internal research showing that Instagram might be harmful to the well-being of users, especially young girls, as well as internal documents describing the loss of this user cohort as an "existential threat" to the business. The effects of Instagram on children's well-being are also the subject of a current investigation by a bipartisan coalition of Attorneys General.
We’ve reached out to Snap and Meta for comment and will update if we hear back.
Update 1/21/22 5:17pm ET: “We are devastated to hear of Selena’s passing and our hearts go out to her family,” a Snap spokesperson told Engadget. “While we can’t comment on the specifics of active litigation, nothing is more important to us than the wellbeing of our community.”
In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. Crisis Text Line can be reached by texting HOME to 741741 (US), 686868 (Canada), or 85258 (UK). Wikipedia maintains a list of crisis lines for people outside of those countries.