Jamie Nguyen’s use of Instagram began innocuously in seventh grade. There were group chats for coordinating with her volleyball team, and she enjoyed finding sports-related posts to share with her friends.
But very quickly, Nguyen, now 16, began spending many evenings of the week scrolling through Instagram, TikTok or YouTube. She sought validation from people who liked her posts and was obsessed with viewing an endless loop of photos and videos that entered her feed based on her search history. Disturbingly, some posts made her think she’d look better if she followed their advice on how to “get slim” or develop rock-hard abs in two weeks.
“Eventually I was on Instagram and TikTok so many hours a day that it became addictive,” said Nguyen, a junior at Evergreen Valley High School in San Jose. Over time, she found it difficult to focus on homework and became irritable around her parents.
Experiences like these, and the potentially harmful consequences of teenagers spending ever more time online, are at the center of a national debate over whether the government should require social media companies to protect the mental health of children and teens.
On August 1, California lawmakers will resume debate on AB2408, a closely watched bill that would hold Facebook, Snapchat and other large social media companies liable for using algorithms and other features designed to keep minors on their platforms for as long as possible. The bill passed the Assembly in May, and an amended version cleared the Senate Judiciary Committee unanimously on June 28.
Experts and industry insiders say these companies knowingly design their platforms to be especially addictive for young users, contributing to a growing youth crisis of depression, anxiety, eating disorders, insomnia, self-harm and suicidal ideation. The bill would allow the state attorney general and county district attorneys to sue major social media companies for up to $250,000 if their products are found to be addictive.
The tech industry opposes AB2408 for several reasons. The bill offers an “over-the-top solution” to a very complex public health issue, said Dylan Hoffman, executive director for California and the Southwest at TechNet, a group of tech CEOs and senior executives. Many other factors affect teenagers’ mental health, he said.
But Leslie Kornblum, formerly of Saratoga, doesn’t buy the idea that there’s no connection between her 23-year-old daughter’s teenage bout with anorexia and her immersion in “thin” culture on Instagram and Pinterest. Her daughter, who is now in recovery, was inundated with dieting advice about how to fill up on water or get by on egg whites, Kornblum said.
Meta, the parent company of Facebook and Instagram, is facing a growing number of lawsuits from parents who blame the social networks for their children’s mental health struggles. In one suit, filed against Meta and Snapchat in U.S. District Court in Northern California, the parents of Selena Rodriguez, a Connecticut girl who took her own life in July 2021, allege that her use of Instagram and Snapchat led to repeated psychiatric treatment. The parents said the platforms didn’t provide enough controls to monitor her social media use, and that she ran away when her phone was taken from her.
The debate over AB2408, known as the Social Media Platform Duty to Children Act, highlights a long-standing conflict between the ability of tech companies to grow and profit and the safety of individual users.
In December, the U.S. Surgeon General issued an advisory urging social media companies to take more responsibility for creating safe digital environments, noting that 81 percent of 14- to 22-year-olds said in 2020 that they used social media “every day” or “almost constantly.” Between 2009 and 2019, a period that coincided with the public’s widespread adoption of social media, the share of high school students reporting sadness or hopelessness increased by 40 percent, and the share who considered suicide increased by 36 percent, the advisory found.
AB2408 is similar to bills recently proposed in Congress and other states. Assemblyman Jordan Cunningham (R-San Luis Obispo), who co-authored the bill with Buffy Wicks (D-Oakland), said he was “horrified” by the growing evidence, notably from Facebook whistleblower Frances Haugen, that social media platforms push products they know are harmful.
“We’ve learned that (social media companies) are hiring some of the smartest software engineers in the world, the kind of talent that put men on the moon two generations ago, and putting them to work designing ever-better features to keep kids engaged and hooked on their platforms,” said Cunningham, a father of three, including a 7-year-old.
But TechNet’s Hoffman said the threat of civil penalties under AB2408 could push some companies to ban minors from their platforms entirely. If that happened, young people, especially those from marginalized communities, could lose access to online networks they rely on for social connection and support.
Hoffman also argued that AB2408 is unconstitutional because it violates the First Amendment rights of publishers to choose the types of content they share and promote to their audiences.
Cunningham counters that AB2408 has nothing to do with regulating content. He said the bill targets “kids’ mind-controlling gadgets and gizmos.”
Jamie Nguyen was able to step back from social media after her parents expressed concern, but only by deleting Instagram and TikTok from her phone. Now it is up to lawmakers to decide whether the government should step in.
“There’s nothing in the code of any of the 50 states, or in the federal code, that says you can’t design a product to be addictive with kids in mind,” Cunningham said. “I think we need to change that.”