An inevitable appeal —

Ireland fines Instagram 405M euro for failing to protect children’s data

Meta is reviewing the decision.

Ireland’s data regulator has fined Instagram 405 million euro for violating the EU’s General Data Protection Regulation and failing to safeguard children’s information.

The fine from the Data Protection Commission followed a two-year investigation into the Meta-owned social media platform. The investigation covered complaints that Instagram defaulted the accounts of all users, including those under the age of 18, to public settings. It also related to how the contact information of children using business accounts on the platform was publicly available.

Instagram, which requires users to be at least 13, said the fine related to old settings that were updated more than a year ago. It said it had released features to keep teenagers’ information private, including, since July last year, automatically setting children’s accounts to private when they sign up.

Teens with business accounts are now also warned that their contact information will be displayed publicly unless they choose to remove it.

“While we’ve engaged fully with the DPC throughout their inquiry, we disagree with how this fine was calculated and intend to appeal it. We’re continuing to carefully review the rest of the decision,” the company said.

The fine is one of the largest under GDPR and the third the Irish regulator has handed to Meta, which also owns Facebook and WhatsApp. Full details of the regulator’s decision on Instagram will be published next week.

Meta was fined 17 million euro in March by the Irish regulator following an investigation into data breach notifications on Facebook. Last year, it was fined 225 million euro for violating privacy laws on WhatsApp.

Meta is appealing against the WhatsApp ruling but has accepted the Facebook decision.

The platform last year paused plans to launch Instagram Kids, a bespoke version of the app for users under the age of 13, in response to global government scrutiny and concerns from child safety campaigners. It is unclear when the app will launch; the company confirmed the project remains paused.

In the UK, changes to social networks were introduced last year to protect children’s privacy when the Children’s Code, or age-appropriate design code, came into force.

The regulations, which demand stricter requirements to collect and process children’s data, have inspired other countries, including Ireland, Australia, and Canada, to draw up similar rules.

Last week, California lawmakers in the state senate approved their age-appropriate design code, which could come into force in 2024.

“There is an urgent priority for a universal settlement for children’s privacy so that children across the globe are protected,” said Lady Beeban Kidron, who proposed the Children’s Code and is chair of children’s digital rights charity 5Rights.

“No environment is 100 percent safe, but social media companies have been slow to act and cavalier about accepting the negative impacts of their products on children. Safety-by-design is not an aspiration, it is the minimum we should expect,” she added.

Additional reporting by Jude Webber in Dublin.

© 2022 The Financial Times Ltd. All rights reserved. Not to be redistributed, copied, or modified in any way.
