Connecticut Lawmakers Mull Parental Controls for Social Media
Bill Aims to Give Parents and Guardians More Control Over Children’s Online Activities
For the second time in less than two weeks, lawmakers in Connecticut have discussed giving parents and guardians more control over how their children use social media. The General Law Committee sought comment on requiring parental permission before social media platforms can target minors and limiting when those sites can send notifications.
Attorney General’s Concerns
“All my fellow colleagues and I, states attorneys general, Democrats and Republicans, are all really concerned that this is the next major public health crisis in America,” said Attorney General William Tong (D-Connecticut).
Similar Bill Introduced in Children’s Committee
Monday’s public hearing came after a similar bill was heard during a Children’s Committee public hearing last Thursday. That proposal sets a higher age threshold, requiring social media companies to obtain a parent’s permission before targeting anyone 18 or younger with content.
Critics Warn of Harmful and Addictive Content
Platforms use algorithms to select content based on a user’s interactions and preferences. Critics have warned that this content can be harmful and addictive, especially for children. “We have this industry out there that’s kind of been the wild west,” said Center for Children’s Advocacy Executive Director Sarah Eagan.
Research on Children’s Use of Social Media
She said research has found that children spend hours a day on social media, time not spent building relationships or developing the skills that could help them cope with its effects. Some experts have also warned of possible links between social media use and rising rates of depression and other mental health issues among adolescents.
Legislators’ Concerns
“Just like we wouldn’t allow access to alcohol, cigarettes or cannabis to a minor, we think that social media has some of the same addictive properties,” said Rep. David Rutigliano (R-Trumbull).
Social Media Company Response
Social media companies have argued such measures aren’t needed. They offered little comment during Monday’s hearing, but TechNet, an industry lobbying group, told the Children’s Committee that other states have faced legal challenges when trying to regulate content. The group also said social media companies already offer protections. “Our industry has a longstanding commitment to provide parents and guardians with resources to help ensure a safe online experience for their children, and the industry has been at the forefront of educating parents and guardians about online safety,” the group wrote in written testimony.
Parents Struggle With Existing Tools
But Eagan said parents have struggled with those tools. “A lot of people’s kids know more about how to manage technology than their parents do,” she said. “As the parent of a high school-aged child, I can attest to that.”
Conclusion
The proposal is part of a broader effort to address concerns about social media’s impact on children’s mental health and well-being. While social media companies argue that their existing tools already protect young users, lawmakers and experts believe more regulation is needed to protect children.
FAQs
* What is the proposal about?
The proposal would require social media companies to obtain parental permission before targeting minors with content and would limit when those companies can send notifications.
* What is the purpose of the proposal?
The proposal is part of a broader effort to address concerns about social media’s impact on children’s mental health and well-being.
* What do social media companies say about the proposal?
Social media companies argue that the proposal is not needed because they already offer protections and resources to help ensure a safe online experience for children.