Meta fined over $400 mn for failing to protect children's privacy on Instagram
Meta has been fined 405 million euros ($402 million) by Ireland's Data Protection Commission (DPC) for its handling of children's privacy settings on Instagram, which violated Europe's General Data Protection Regulation (GDPR). It is the second-biggest fine issued to a tech company under the regulation, the largest being an $887 million fine issued to Amazon in July 2021.
The long-running complaint concerned Instagram's failure to protect children's privacy on its platform. The fine was based partly on the fact that Instagram had allowed children to operate business accounts, which displayed the account holder's phone number and email address, exposing that data publicly. The DPC also found that the accounts of 13- to 17-year-olds were set to "public" by default.
“This is the third fine handed to the company by the regulator,” the DPC said. Under the European Union’s (EU) data privacy rules, the Dublin-headquartered Irish watchdog is the lead regulator for several US tech companies.
Last year, the watchdog fined WhatsApp 225 million euros ($267 million) for breaching rules on transparency about sharing people’s data with other Meta companies.
Meta, which also owns Instagram, said that it planned to appeal against the decision. In a statement, the company said, “This inquiry focused on old settings that we updated over a year ago, and we've since released many new features to help keep teens safe and their information private.”
"Anyone under 18 automatically has their account set to private when they join Instagram, so only people they know can see what they post and adults can't message teens who don't follow them,” Meta officials said, adding that they've been "engaged fully with the DPC throughout their inquiry" but "disagree with how this fine was calculated and intend to appeal it”.
Andy Burrows, head of child safety online policy at the National Society for the Prevention of Cruelty to Children (NSPCC), told the BBC, "This was a major breach that had significant safeguarding implications and the potential to cause real harm to children using Instagram. The ruling demonstrates how effective enforcement can protect children on social media and underlines how regulation is already making children safer online."
The fine comes as Instagram has faced intense scrutiny over its handling of child safety issues. The company was forced to halt work on an Instagram Kids app last year following a whistle-blower's claims that the main app could have a negative impact on some teens’ mental health. Since then, Instagram has added more safety features, including setting teen accounts to private by default.