On Tuesday, Instagram announced a major overhaul aimed at enhancing privacy and reducing the negative impact of social media on users under 18. The move comes in response to mounting pressure over children’s online safety.

The platform, owned by Meta, revealed that in the coming weeks, accounts of users under 18 will be private by default, meaning only approved followers can view their posts.

Additionally, Instagram plans to pause notifications for minors between 10 p.m. and 7 a.m. to promote healthier sleep habits. New supervision tools will also be introduced, allowing parents to see the accounts their teenager has recently messaged.

In June, U.S. Surgeon General Dr. Vivek Murthy suggested adding cigarette-like warning labels to social media, cautioning users about potential mental health risks.

The Senate followed suit in July by passing the bipartisan Kids Online Safety Act, aimed at enforcing safety and privacy standards for young users. Some states have even introduced restrictions on social media usage for minors.

Mark Zuckerberg, Meta’s CEO, has been a particular target for criticism due to social media’s impact on young people. Several state attorneys general have sued Meta, accusing it of knowingly hooking children on its platforms while downplaying the associated risks.

During a Congressional hearing in January, lawmakers urged Zuckerberg to apologize to families whose children had died by suicide following online abuse. In response, Zuckerberg expressed his condolences: “I’m sorry for everything you have all been through.”

Protection for children, peace of mind for parents

Adam Mosseri, Instagram’s head, explained that the new features address parents’ key concerns: inappropriate contact, inappropriate content, and excessive screen time.

He emphasised that parents, rather than tech companies or lawmakers, are best suited to determine what is appropriate for their children. The changes, dubbed “Teen Accounts,” aim to ensure that minors have age-appropriate experiences on the platform.

This move is one of the most comprehensive efforts by a social media platform to protect young users. In recent years, parents and advocacy groups have raised alarms about platforms like Instagram, TikTok, and Snapchat exposing young users to harmful content, including bullying, sexual predators, and material promoting self-harm or eating disorders.

‘A good step forward, but’

Despite Instagram’s latest changes, questions remain about their effectiveness. Meta has been making promises to protect minors since 2007, yet concerns persist. In 2021, Instagram introduced private-by-default settings for users under 16, but allowed younger teens to switch their accounts to public if they chose to. This time, while 16- and 17-year-olds will still have the option to opt out of the privacy defaults, those under 16 will need parental permission to make their accounts public.

Dr. Megan Moreno, a professor of pediatrics at the University of Wisconsin, praised Instagram’s updated youth settings as “significant,” noting that they set a higher standard for privacy and shift some responsibility away from teens and their parents.

However, a key issue remains unresolved: teenagers who lie about their age when registering for Instagram. The new settings will apply only to those who self-identify as minors, and while Instagram’s terms prohibit users under 13, the platform does not actively seek out or remove underage users. Instagram said it would require users to verify their ages if they tried to bypass the new privacy settings by creating accounts with a false birth date. The company is also developing technology to detect teens who pose as adults on the platform.

Some children’s advocacy groups criticised Instagram’s latest move as an attempt to preempt new federal regulations on children’s online safety.

Jim Steyer, CEO of Common Sense Media, remarked, “These are long overdue features that Instagram should have put in place years ago to keep young people safe online. They’re only acting now because they’re under pressure from lawmakers and advocates.”

While parents may welcome the changes, teenagers, particularly influencers who rely on public profiles to grow their following, may be less enthusiastic. According to a Pew Research survey, nearly half of U.S. teens between 13 and 17 use Instagram daily, making it the fourth most popular social media platform among young Americans.

The safety measures could impact Meta’s business in the short term, as the company depends on growing its user base, particularly among younger users, to remain competitive. Mosseri acknowledged that the changes might hurt Instagram’s teen engagement but said the company is committed to taking risks to improve user safety.

Other platforms, like TikTok, have also implemented changes to protect younger users. In 2021, TikTok made accounts private by default for users aged 13 to 15 and disabled direct messaging for younger teens.

Instagram’s new settings and features will begin rolling out on Tuesday, with new accounts registered by minors automatically set to private. The app will also begin converting existing teen accounts in the U.S., Canada, Australia, and the U.K. to private mode.

Meta will continue restricting teens from sending direct messages to people they don’t follow and will limit their exposure to content from strangers.

Parents will also gain more tools to monitor their children’s Instagram activity, including a feature that shows the topics their child has expressed interest in and the accounts they have recently messaged. To protect user privacy, however, parents will not be able to view the content of those messages.

While these changes aim to improve safety, experts warn that the new supervision tools could spark tensions between parents and teens, particularly if their personal identities or political views conflict with those of their families.