
The Privacy Supply Chain: Managing Children's Data

May 18, 2022


This piece is part of our ongoing series on navigating privacy for data-enabled businesses. To learn more about how PDL manages privacy across every link of the data supply chain, see our first post.

Privacy is top of mind for anyone building products with data in 2022. An increasingly diverse ecosystem of state and federal regulations, as well as professional ethical norms, demands that engineers and developers do their best to protect personal privacy, especially for vulnerable populations. Few populations are more vulnerable than children. Most states consider children’s data to be in a class by itself, subject to special rules and protections to ensure that it is handled safely and ethically, if it is collected at all.

What are the rules today? 

There are a number of regulations that govern how children’s data is handled by U.S. businesses and that number is growing fast. Each covers different types of data at the state and federal levels. 

The Children’s Online Privacy Protection Act (COPPA) - Passed in 1998, COPPA is a federal law that limits the data websites and other digital businesses can gather on children, requires parental consent before that data is collected, and restricts how and when businesses can market to children. The rule prohibits most forms of digital marketing to children under 13, and while children under 13 may share their data with websites and other digital services, they must first receive parental consent. This requirement is so onerous that most digital businesses, including social media platforms, simply prohibit children under 13 from using their services rather than manage the logistical difficulties.

California Privacy Rights Act (CPRA) - Though it applies only to California residents, the CPRA is generally followed by most businesses operating within the U.S., lest they be unable to do business in the nation’s most populous state. The CPRA divides children into two groups: those under 13 and those 13-15. Both groups require consent before their data can be sold or shared; children under 13 need consent from a parent or guardian, while those 13-15 can give consent themselves. Children over 15 are treated as adults for the purposes of data collection and sharing.
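The CPRA's age tiers reduce to a simple decision rule. The sketch below is a hypothetical illustration of that rule, not legal advice; the function name and return labels are our own, while the thresholds come straight from the description above (under 13 needs a parent or guardian, 13-15 can self-consent, 16 and older is treated as an adult).

```python
def cpra_consent_requirement(age: int) -> str:
    """Illustrative only: map an age to the CPRA consent tier described above."""
    if age < 13:
        return "parental-consent"  # a parent or guardian must opt in
    elif age <= 15:
        return "self-consent"      # the minor may give consent themselves
    else:
        return "adult"             # treated as an adult for data collection
```

For example, `cpra_consent_requirement(12)` returns `"parental-consent"`, while `cpra_consent_requirement(16)` returns `"adult"`.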

Emerging State Regulations - A number of additional states, including Colorado and Virginia, will also regulate children’s data starting in 2023, but each will take a slightly different approach than COPPA or the CPRA. For example, in Virginia, all data concerning children under 13 will be considered sensitive personally identifiable information. (We’ll address PII in greater detail in an upcoming post.) For now, children’s data is managed state-by-state using a combination of COPPA and local regulations, which may supersede the federal law or expand on its protections.

What the future holds

The future of privacy regulation is changing quickly as states move to implement their own policies where the federal government has not. Following Virginia’s lead, new laws taking effect in 2023 may expand on COPPA by classifying all children’s data as sensitive PII, requiring an explicit opt-in to collect.

A rapidly growing and complicated patchwork of regulatory constraints means a company selling children’s data will have to track each individual child and their parent’s consent, apply sensitive PII restrictions, do both differentially across changing ages and geographies, and secure new consent when the child turns 13, 16, or 18 - or all three, depending on where they live.
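The re-consent bookkeeping described above can be sketched as follows. This is a hypothetical illustration under assumed rules: the jurisdiction names and threshold ages in the table are examples drawn from the regimes discussed in this post, not an authoritative legal mapping, and real compliance logic would need far more nuance.

```python
# Illustrative threshold ages at which fresh consent might be required.
# These entries are assumptions for the sketch, not a legal reference.
RECONSENT_AGES = {
    "federal-coppa": [13],            # COPPA's under-13 parental-consent line
    "california-cpra": [13, 16],      # CPRA's two minor tiers
    "hypothetical-under-18-rule": [13, 16, 18],  # a possible future regime
}

def reconsent_due(jurisdiction: str, previous_age: int, current_age: int) -> list:
    """Return the threshold ages a person crossed since consent was last captured."""
    thresholds = RECONSENT_AGES.get(jurisdiction, [])
    return [t for t in thresholds if previous_age < t <= current_age]
```

A child in California last seen at age 12 who is now 13 would trigger `reconsent_due("california-cpra", 12, 13)` returning `[13]` - one more consent event the seller must capture and record.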

This is likely to create new risks for large platforms and other sites that still hold data from young children and still haven’t secured the required parental consent. These risks are inherited by anyone who builds products on top of that data.

It’s also possible that next year will see new regulations that prohibit platforms from monetizing data from any users under 18. This would be an expansion of the current high-water mark of the CPRA, which treats anyone 16 or older as an adult for data collection purposes. Such a move would also create significant risk for large platforms and for anyone who holds data on 16-to-18-year-olds who have not explicitly consented.

The risk to business is that, inevitably, some companies will be left unknowingly holding or buying non-compliant data simply because keeping track of changing permissions and regulations on children’s data is so complicated.

The PDL Approach

When it comes to data compliance, especially with children, business models matter. Children’s data is not part of our data sourcing strategy: People Data Labs does not collect, retain, or sell data about persons under the age of 18. We believe the easiest way to avoid the complications and serious risks associated with children’s data is to exclude it entirely from our dataset. Fortunately, our business model focuses on solving problems rooted in professional background, and virtually no one under 18 would appear in the types of data sources PDL uses to build our dataset.

However, we don’t rely only on our data sourcing requirements to protect children’s data. We also proactively check every record for a date of birth and exclude any records for people under the age of 18. As required by a growing number of regulations, PDL then retains only enough data to prevent those profiles from ever being inadvertently recreated, ensuring that any product built with PDL data is not at risk from any current or near-future data protection policies covering children and minors.
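A pipeline step like the one described above might look like the following sketch. This is not PDL's actual implementation: the field names (`date_of_birth`, `record_id`) are hypothetical, and the one-way hash is just one plausible way to keep enough information to suppress a profile without retaining the underlying personal data.

```python
import hashlib
from datetime import date

LEGAL_ADULT_AGE = 18

def age_on(dob: date, today: date) -> int:
    """Whole years elapsed between a date of birth and a reference date."""
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

def filter_minors(records, today):
    """Drop any record whose date of birth indicates a person under 18.

    For each dropped record, keep only a one-way hash of its identifier
    so the same profile can be suppressed if it ever arrives again,
    without retaining the minor's personal data itself.
    """
    kept, suppression_hashes = [], set()
    for rec in records:
        dob = rec.get("date_of_birth")
        if dob is not None and age_on(dob, today) < LEGAL_ADULT_AGE:
            suppression_hashes.add(
                hashlib.sha256(rec["record_id"].encode()).hexdigest()
            )
        else:
            kept.append(rec)
    return kept, suppression_hashes
```

Note the asymmetry in what survives: compliant records pass through intact, while excluded records leave behind only an irreversible fingerprint used for future suppression checks.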

The easiest way to solve the problem of children’s data is to never hold that data in the first place.

You can get started with our data today with a free API key, or speak to one of our consultants to learn more!


Like what you read? Scroll down and subscribe to our newsletter to receive monthly updates with our latest content.

Steve Lappenbusch

Dr. Steven Lappenbusch is the Head of Privacy at People Data Labs, leading the ongoing development and implementation of our privacy policy. Prior to joining People Data Labs he held senior roles at several Fortune 500 companies where he used identity analysis to create solutions that prevented millions in tax fraud, debt evasion, Medicaid fraud, and welfare fraud. Dr. Lappenbusch holds a Ph.D. in Human-Centered Design & Engineering from the University of Washington, College of Engineering. He has also been involved in user research at IBM and Microsoft and conducted independent research funded by the National Science Foundation.