The Federal Trade Commission is looking to crack down on Meta’s use of young users’ data after an independent assessor found the company failed to comply with a 2020 privacy order.
The FTC on Wednesday said it is proposing changes to the privacy order that would prohibit Meta, Facebook’s parent company, from profiting from data from users under 18. The action comes after an assessor found “several gaps and weaknesses” in the company’s privacy program that “pose substantial risks to the public.”
“Facebook has repeatedly violated its privacy promises,” Samuel Levine, director of the FTC’s bureau of consumer protection, said in a news release. “The company’s recklessness has put young users at risk, and Facebook needs to answer for its failures.”
Meta spokesperson Thomas Richards argued that the company has “spent vast resources building and implementing an industry-leading privacy program under the terms of our FTC agreement.”
What does the FTC’s proposal say?
The proposed changes to the 2020 order would apply to various Meta brands, including Facebook, Instagram, WhatsApp and Oculus, and would:
- Bar Meta and its entities from monetizing data from children and teens under 18.
- Pause the launch of new products and services without written confirmation from an assessor that Meta’s privacy program is in full compliance with the order.
- Require any companies Meta acquires or merges with to comply with the order.
- Require Meta to obtain user consent for any future uses of facial recognition technology.
- Strengthen privacy program provisions from the 2020 order.
The FTC has also asked Meta to respond to allegations that the company from late 2017 through mid-2019 misled parents about who their children can communicate with through the Messenger Kids app.
While parents were told children using Messenger Kids could only communicate with contacts approved by their guardian, the FTC alleges that users could communicate with unapproved contacts in group text chats and group video calls. The FTC says this misrepresentation violates a previous FTC order from 2012, the FTC Act and the Children’s Online Privacy Protection Act, which says entities that operate websites and online services directed to children under 13 must obtain parental consent before collecting children’s personal information.
Meta’s history with the FTC
This is the FTC’s third time taking action against Meta for allegedly failing to protect users’ privacy.
- In 2012, the commission secured an order prohibiting the company from misrepresenting its privacy practices.
- Meta agreed to a second FTC order in 2019 following the FTC’s investigation around the Cambridge Analytica data scandal. The order, which took effect in 2020, required the company to pay a record-breaking $5 billion penalty to settle charges that it violated a 2012 FTC order by deceiving users about their ability to control the privacy of their personal information.
- The latest action from the FTC alleges that Meta violated the 2020 order and the COPPA Rule.
Meta was also fined about $400 million by Ireland’s data privacy regulator last year for mishandling children’s data on Instagram.
Meta spokesperson Richards called the proposal a political stunt.
“Despite three years of continual engagement with the FTC around our agreement, they provided no opportunity to discuss this new, totally unprecedented theory,” Richards said. “Let’s be clear about what the FTC is trying to do: usurp the authority of Congress to set industrywide standards and instead single out one American company while allowing Chinese companies, like TikTok, to operate without constraint on American soil.”
Richards also noted that Commissioner Alvaro Bedoya released a statement questioning the FTC’s authority to take this action.
What happens next?
The agency has given Meta 30 days to respond to the proposed findings. The commission will decide whether to modify the 2020 order after considering the company’s arguments.
Richards said Meta plans to “vigorously fight” the proposal.