Meta blocks under-16s from livestreaming on Instagram

Meta is expanding its safety measures for teenagers on Instagram with a block on livestreaming, as the social media company extends its under-18 safeguards to the Facebook and Messenger platforms.

Late last year, the Australian Senate passed a law prohibiting children under 16 from using social media.

Meta will now bar under-16s from using Instagram’s Live feature unless they have parental permission. Under-16s will also need parental permission to turn off a feature that blurs images containing suspected nudity in their direct messages.

The changes were announced alongside the extension of Instagram’s teen accounts system to Facebook and Messenger. Teen accounts, introduced last year, place under-18s by default into a setting that gives parents the ability to set daily time limits for using the app, block teenagers from using Instagram at certain times and see the accounts with which their child is exchanging messages.


Facebook and Messenger teen accounts will be rolled out initially in the US, UK, Australia and Canada. As with the Instagram accounts, users under the age of 16 will need parental permission to change the settings, while 16- and 17-year-olds defaulted into the new features will be able to change them independently.

Meta said Instagram teen accounts were used by 54 million under-18s around the world, with more than 90% of 13- to 15-year-olds keeping their default restrictions in place.

The NSPCC, a leading child protection charity, said it welcomed extending the measures to Facebook and Messenger, but said Meta had to do more work to prevent harmful material appearing on its platforms.

“For these changes to be truly effective, they must be combined with proactive measures so dangerous content doesn’t proliferate on Instagram, Facebook and Messenger in the first place,” said Matthew Sowemimo, the associate head of policy for child safety online at the NSPCC.

The announcement was made as the UK implements the Online Safety Act. Since March, every site and app within the scope of the legislation, which covers more than 100,000 services from Facebook, Google and X to Reddit and OnlyFans, has been required to take steps to stop illegal content such as child sexual abuse, fraud and terrorism material from appearing, or to take it down if it goes online.

The act also contains provisions for protecting children from harm and requires tech platforms to shield under-18s from damaging material such as suicide and self-harm-related content. Reports last week that the act could be watered down as part of a UK-US trade deal were met with protests from child safety groups, which said any compromise would be an “appalling sellout” that would be rejected by voters.
