Mark Zuckerberg’s Instagram unveiled a major overhaul of safety features for kids on Tuesday – a move that online watchdogs quickly blasted as a bid to avoid a looming congressional crackdown on the social media giant.

Instagram said it will automatically place users under age 18 into “teen accounts” and block people who do not follow them from viewing their content or interacting with them.

It will also mute Instagram app notifications for teen users between 10 p.m. and 7 a.m. and send “time limit reminders” urging teens to close the app after 60 minutes per day.

Parents will be able to view which accounts their kid has recently messaged, set daily time limits and block teens from using the app during specific time periods.

Additionally, users under the age of 16 will need parental permission to make changes to their account safety settings.

The overhaul was announced as the bipartisan Kids Online Safety Act – a landmark bill that would impose a legal “duty of care” on Instagram parent Meta, TikTok and other social media firms to protect kids from online harm – gains momentum in Congress.

In July, the Senate passed KOSA and another bill called COPPA 2.0 in an overwhelming 91-3 vote. COPPA 2.0 would ban targeted advertising to minors, bar data collection without their consent and give parents and kids the option to delete their information from social media platforms.

The House Energy and Commerce Committee is set to mark up the bills on Wednesday – a key procedural step that would clear the way for a floor vote in the near future.

Fairplay for Kids, one of the groups leading the charge for KOSA’s passage, decried Meta’s announcement as an attempt to skirt a meaningful legislative crackdown.

“Default private accounts for minors and turning off notifications in the middle of the night are safeguards Meta should have implemented years ago,” Fairplay executive director Josh Golin said. “We hope lawmakers will not be fooled by this attempt to forestall legislation.”

“The Kids Online Safety Act and COPPA 2.0 will require companies like Meta to ensure their platforms are safe and privacy-protective for young people at all times, not just when it’s politically expedient,” Golin added.

Alix Fraser, director of the Council for Responsible Media, took a similar view of the announcement.

“The simple fact is that this announcement comes as Congressional pressure is mounting and support for the bipartisan Kids Online Safety Act continues to build,” Fraser said. “It wouldn’t be the first time Meta made a promise to avoid Congressional action and then never followed through or quietly backed away.”

Policymakers have singled out Meta for failing to protect kids from “sextortion” scams and other forms of online sexual abuse.

Critics have also accused apps like Instagram of fueling a youth mental health crisis with negative outcomes ranging from anxiety and depression to eating disorders and even self-harm.

Last fall, a coalition of state attorneys general sued Meta, alleging the company has relied on addictive features to hook kids and boost profits at the expense of their mental health.

In January, Zuckerberg issued a stunning apology to the families of victims of online abuse during a tense hearing on Capitol Hill. 

Despite its easy passage in the Senate, KOSA’s final prospects in the House remain uncertain, with some critics on both sides of the aisle raising concerns about the impact on online free speech.

In July, US Surgeon General Vivek Murthy called for the implementation of a tobacco-style “warning label” for social media apps to raise awareness of their potential mental health risks, including depression and anxiety.
