The Federal Trade Commission is proposing changes to the rule implementing the Children’s Online Privacy Protection Act, or COPPA, that would restrict how online sites and services covered by the rule, such as social media platforms and learning apps, use and profit from children’s data.
Key Facts
The COPPA rule, which first went into effect in 2000, requires online sites and services that collect personal data from children under 13 to provide notice and obtain verifiable parental consent, and limits the use of the data by sites.
Proposed Changes
The proposed changes would expand the rule to require sites and services to obtain parental consent before sharing children’s personal information with third parties, such as third-party advertisers, unless that data is necessary for the site’s or service’s functioning.
Proposed Rules
The proposed rules would prevent sites and apps from using contact information, such as phone numbers, to send push notifications to children encouraging them to use the site or service more frequently.
School Exception
The exception for schools in the rule will be extended, allowing them to give consent for learning apps or educational technology providers to collect, use, and disclose students’ personal information for educational purposes, not for commercial purposes.
Updating Data Retention Limits
The proposed changes would also update data retention limits, allowing online sites and services to retain children’s data only as long as necessary to fulfill the purpose for which it was collected, and not for other purposes or indefinitely.
Expanding the Definition of “Personal Information”
The Federal Trade Commission proposes to expand the definition of “personal information” to include biometric identifiers, such as data used for facial and voice recognition.
What to Watch For
The public has 60 days to comment on the proposed changes to the COPPA rule before the commission votes on it, according to the Federal Trade Commission’s announcement.
Major Critics
Some critics of COPPA argue that the rules are insufficient to protect young users, including Common Sense Media founder Jim Steyer, who called COPPA rules “desperately outdated,” and journalist Kara Swisher, who said the rules are “ineffective” because they only protect children under 13. Meanwhile, California and the European Union have enacted laws that protect children up to 16 years old.
Major Background
In 2019, the Federal Trade Commission conducted its latest review of COPPA, receiving over 175,000 comments about what changes should be made to the rule from members of the public, technology and advertising industry groups, academics, and members of Congress. Comments included calls to amend the COPPA definition of “online site or service directed to children” to include websites that do not necessarily target children but have a certain percentage of child users or include “child-attractive content.” The last changes to COPPA were made in 2013, amid rising smartphone and social media use among children. Apps like Instagram and TikTok restrict the creation of accounts for children under 13 in compliance with the law. However, the Federal Trade Commission and other government regulators have pursued major tech companies that fail to comply with laws protecting young users, including Google, which was fined $170 million for violating children’s privacy and profiting from children’s data by targeting them with ads, according to a New York Times report.
Looking Ahead
The proposed changes come amid a clash between tech companies and state governments over protecting children online during a mental health crisis for teenagers. In October, Meta, the parent company of Facebook and Instagram, was sued by a coalition of 33 states for allegedly misleading the public about harmful content and addictive features that target younger users to keep them on the platforms longer and generate profits. In response to the lawsuit, a Meta spokesperson said the company shares “the attorneys general’s commitment to providing safe, positive online experiences for teens” and has provided tools to support younger users on the platforms, according to a report from The Verge. The video-sharing platform TikTok has also been sued by U.S. states, including Arkansas and Utah, over addictive features that could be harmful to children’s mental health. Last month, YouTube announced restrictions on recommended videos related to topics like body weight that target younger users, after being named in multiple lawsuits alongside other social media platforms for being “addictive and harmful” and for changing how younger individuals think, feel, and behave. YouTube and TikTok have responded to the accusations, stating that protecting younger users is a “priority” for the platforms.