The Federal Trade Commission (FTC) has proposed changes to its 2020 privacy order with Meta. The changes come after the FTC alleged that the company failed to fully comply with the order, misled parents about their ability to control with whom their children communicated through its Messenger Kids app, and misrepresented the access it provided some app developers to private user data.

As part of the proposed changes, Meta would be prohibited from profiting from data it collects, including through its virtual reality products, from users under the age of 18. It would also be subject to other expanded limitations, including in its use of facial recognition technology, and required to provide additional protections for users.

This is the third time the FTC has taken action against Meta for allegedly failing to protect users’ privacy. The FTC first filed a complaint against Meta (then Facebook) in 2011, and secured an order in 2012 barring the company from misrepresenting its privacy practices. But according to a subsequent complaint filed by the FTC, Meta violated the first FTC order within months of its being finalized, engaging in misrepresentations that helped fuel the Cambridge Analytica scandal. In 2019, Meta agreed to a second order — which took effect in 2020 — resolving claims that it violated the FTC’s first order. Today’s action alleges that Meta has violated the 2020 order, as well as the Children’s Online Privacy Protection Act Rule (COPPA Rule).

The 2020 privacy order required Meta to pay a $5 billion civil penalty. The 2020 order also expanded the required privacy program, as well as the independent third-party assessor’s role in evaluating the effectiveness of Meta’s program. For example, the 2020 order required Facebook to conduct a privacy review of every new or modified product, service, or practice before implementation and document its risk mitigation determinations. The order also required Facebook to implement greater security for personal information, and imposed restrictions on the use of facial recognition and telephone numbers obtained for account security.

In addition, the FTC has asked the company to respond to allegations that, from late 2017 until mid-2019, Facebook misrepresented that parents could control whom their children communicated with through its Messenger Kids product. According to the FTC, despite the company’s promises that children using Messenger Kids would only be able to communicate with contacts approved by their parents, children in certain circumstances were able to communicate with unapproved contacts in group text chats and group video calls.

The FTC says these misrepresentations violated the 2012 order, the FTC Act and the COPPA Rule. Under the COPPA Rule, operators of websites or online services that are directed to children under 13 must notify parents and obtain their verifiable parental consent before collecting personal information from children.

The proposed changes to the 2020 order, which would apply to Facebook and Meta’s other services such as Instagram, WhatsApp and Oculus, include:

Blanket prohibition against monetizing data of children and teens under 18: Meta and all its related entities would be restricted in how they use the data they collect from children and teens. The company could only collect and use such data to provide the services or for security purposes, and would be prohibited from monetizing this data or otherwise using it for commercial gain even after those users turn 18.

Pause on the launch of new products, services: The company would be prohibited from releasing new or modified products, services or features without written confirmation from the assessor that its privacy program is in full compliance with the order’s requirements and presents no material gaps or weaknesses.

Extension of compliance to merged companies: Meta would be required to ensure compliance with the FTC order for any companies it acquires or merges with, and to honor those companies’ prior privacy commitments.

Limits on future uses of facial recognition technology: Meta would be required to disclose and obtain users’ affirmative consent for any future uses of facial recognition technology. The change would expand the limits on the use of facial recognition technology included in the 2020 order.

Strengthening existing requirements: Some privacy program provisions in the 2020 order would be strengthened, such as those related to privacy review, third-party monitoring, data inventory and access controls, and employee training. Meta’s reporting obligations also would be expanded to include its own violations of its commitments.