On September 15, 2022, Governor Gavin Newsom signed AB 2273, the California Age-Appropriate Design Code Act (the “Act”).1 The Act is intended to protect children’s privacy and data online, but it represents a considerable and potentially burdensome shift from existing requirements under the federal Children’s Online Privacy Protection Act (“COPPA”). In particular, the Act requires any website “likely to be accessed by children” to include “strong privacy protection-by-design and by default,” and it imposes restrictions on certain uses of, and practices involving, children’s information. Because the Act’s reach is broad and many of its requirements are new and vague, it will likely have significant implications for a much wider swath of businesses than might be expected, including many that operate websites or apps that are not directed to children.
Scope
The Act applies to businesses2 that provide an online service, product, or feature “likely to be accessed by children” under the age of 18 (each, a “Covered Business”). An online service is “likely to be accessed by children” if:
- it is “directed to children,” as defined by COPPA;3
- it is determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of children or substantially similar to, or the same as, an online service, product, or feature routinely accessed by a significant number of children;
- it has advertisements marketed to children; or
- a significant amount of the audience of the online service, product, or feature is determined, based on internal company research, to be children.
Although the Act’s criteria aim to capture those websites most likely to be accessed by children, some of the standards – such as “routinely accessed by a significant number of children” – are sufficiently vague to arguably include most major websites.
Key Obligations
Covered Businesses that are swept up by the Act’s broad scope are responsible for discharging numerous obligations. In particular:
- Default Settings Configuration. Privacy settings offered by the Covered Business must be configured to offer the highest level of privacy by default, unless the Covered Business can demonstrate a compelling reason that a different setting is in the best interests of children (for one way this might look in practice, see the first sketch following this list).
- Age-Appropriate Disclosures. Privacy information, terms of service, policies, and community standards must be provided using clear language suited to the age of children likely to access the online service, product, or feature. Accordingly, website privacy notices, which are already required to be simplified and clear under the CCPA, may need further simplification.
- Age Verification. The Act requires Covered Businesses either to “estimate the age of child users with a reasonable level of certainty appropriate to the risks” or to “apply the privacy and data protections afforded to children to all consumers.” This creates a significant conundrum for Covered Businesses. On the one hand, if a Covered Business cannot reliably determine which of its users are under 18, it must treat all of its users as children, restricting content and limiting functionality (this fallback is reflected in the first sketch following this list). On the other hand, age and identity verification methods may require Covered Businesses to collect additional data about the user that they otherwise would not collect – from driver’s licenses and government IDs to school records and birth certificates. That additional collection increases the privacy risk to the consumer of having more copies of sensitive data in circulation, while increasing the complexity and cost of compliance for the Covered Business, which must store and protect that information.
- Data Protection Impact Assessment. The Act requires Covered Businesses to complete a data protection impact assessment before offering new online services, products, or features likely to be accessed by children. The assessment must identify (i) the purpose of the online service, product, or feature, (ii) how it will use children’s personal information, and (iii) the potential risks of harm to children that might arise from the service, product, or feature (the second sketch following this list shows these elements as a simple record). Covered Businesses must also review these assessments biennially, document any material risks that arise and their plans for mitigating those risks, and provide copies of the assessments to the California Attorney General upon written request.
- Easy User Reporting. Covered Businesses must provide prominent, accessible, and responsive tools to help children or their parents/guardians to exercise their privacy rights and report concerns.
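To make the default-settings and age-estimation obligations concrete, below is a minimal sketch of how a covered web application might wire the two together. It is illustrative only: the type and function names are hypothetical constructs of ours, the under-18 threshold comes from the Act’s definition of a child, and the treatment of verified adults is a product decision the Act leaves to the business.

```typescript
// Illustrative sketch only: names and types are hypothetical and are not
// drawn from the Act or from any real compliance framework.

// Hypothetical privacy-impacting settings for a covered online service.
interface PrivacySettings {
  personalizedAds: boolean;    // behavioral advertising
  profiling: boolean;          // automated profiling of the user
  preciseGeolocation: boolean; // precise location collection
  publicProfile: boolean;      // profile visible to other users
}

// "Highest level of privacy": every privacy-impacting feature off by default.
const HIGHEST_PRIVACY: PrivacySettings = {
  personalizedAds: false,
  profiling: false,
  preciseGeolocation: false,
  publicProfile: false,
};

// The Act defines a child as a consumer under 18 years of age.
const CHILD_AGE_THRESHOLD = 18;

// If age cannot be estimated with reasonable certainty (estimatedAge is null),
// the Act's fallback is to extend child-level protections to all users.
function defaultSettingsFor(estimatedAge: number | null): PrivacySettings {
  if (estimatedAge === null || estimatedAge < CHILD_AGE_THRESHOLD) {
    return { ...HIGHEST_PRIVACY };
  }
  // Verified adults: the business may choose different defaults, subject to
  // other applicable law (e.g., the CCPA). This choice is ours, not the Act's.
  return { ...HIGHEST_PRIVACY, personalizedAds: true };
}
```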
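The data protection impact assessment requirement lends itself to a similar treatment: the elements the Act says an assessment must identify map naturally onto a structured record that can be tracked against the biennial review cycle. Again, the field names below are hypothetical, not the Act’s.

```typescript
// Illustrative sketch only: this record type is hypothetical and simply
// mirrors the elements the Act says an assessment must identify.
interface DataProtectionImpactAssessment {
  service: string;           // the online service, product, or feature assessed
  purpose: string;           // (i) the purpose of the service
  childDataUses: string[];   // (ii) how it will use children's personal information
  risks: Array<{
    description: string;     // (iii) a potential risk of harm to children
    mitigationPlan: string;  // documented plan for mitigating that risk
  }>;
  completedOn: Date;         // completed before the offering is launched
  nextReviewDue: Date;       // the Act requires biennial review
}
```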
Prohibited Activities
In addition to imposing the obligations described above, the Act prohibits certain practices. Specifically, these prohibitions include:
- Profiling by Default. Covered Businesses are prohibited from profiling a child unless (i) profiling is necessary to provide an online service, product, or feature or (ii) the Covered Business can demonstrate a compelling reason that profiling is in the best interests of children.4
- Purpose Limitation. Covered Businesses are prohibited from using a child’s personal information for any reason other than the one for which it was collected, unless there is a “compelling reason” that the use is in the best interests of the child.
- Limitations on Geolocation Information. Covered Businesses are prohibited from collecting, selling, or sharing any precise geolocation information regarding a child unless the collection of that information is strictly necessary for the business to provide the service, product, or feature and lasts only for a limited time (see the sketch following this list).
- Using Dark Patterns. Covered Businesses are prohibited from using dark patterns to lead or encourage children to provide personal information beyond what is reasonably expected to provide that online service, product, or feature.5
- Health of the Child. Covered Businesses are prohibited from using a child’s personal information in a way known or believed to be “materially detrimental” to a child’s physical or mental health and well-being.
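As a rough illustration of the geolocation restriction, a covered service might gate the collection of precise location behind a check like the one below. This is a sketch under assumed parameters: the request shape and the five-minute window are hypothetical stand-ins for whatever a business concludes “strictly necessary” and “a limited time” mean for its feature.

```typescript
// Illustrative sketch only: the request shape and time window are hypothetical.
interface GeolocationRequest {
  featureId: string;           // the feature requesting precise location
  strictlyNecessary: boolean;  // is precise location strictly necessary for it?
  requestedDurationMs: number; // how long collection would run
}

// Assumed stand-in for "a limited time"; the Act does not fix a number.
const MAX_COLLECTION_WINDOW_MS = 5 * 60 * 1000; // five minutes

// Permit precise-geolocation collection from a child user only when it is
// strictly necessary to provide the feature and bounded in time.
function mayCollectPreciseLocation(req: GeolocationRequest): boolean {
  return (
    req.strictlyNecessary &&
    req.requestedDurationMs <= MAX_COLLECTION_WINDOW_MS
  );
}
```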
Children’s Data Protection Working Group
The Act also establishes the California Children’s Data Protection Working Group (“Working Group”), which is tasked with developing recommendations and best practices, as well as clarifying or providing guidance on some of the Act’s provisions, including (i) identifying the types of online services, products, or features likely to be accessed by children; and (ii) ensuring that the age assurance methods used by Covered Businesses are “proportionate to the risks arising from data management practices, privacy-protective and minimally invasive.” The Working Group will consist of 10 appointed Californians with expertise in the areas of children’s data privacy and children’s rights. The California Governor, the President Pro Tempore of the Senate, the Speaker of the Assembly, the Attorney General, and the California Privacy Protection Agency are each responsible for the appointment of two members of the Working Group.
Enforcement
The California Attorney General is authorized to enforce the Act and may seek an injunction or civil penalties of up to $2,500 per affected child for each negligent violation and up to $7,500 per affected child for each intentional violation. Because penalties scale with the number of affected children, exposure can mount quickly: a single negligent violation affecting 10,000 children could, for example, support penalties of up to $25 million. Covered Businesses that are in substantial compliance with the Act will receive written notice from the Attorney General before any action is initiated against them and will have 90 days to cure any noticed violation and to provide the Attorney General a written statement that the violations have been cured and that sufficient measures have been implemented to prevent future violations. There is no private right of action.
Important Dates
- January 1, 2024: The Children’s Data Protection Working Group must deliver its report to the California Legislature.
- July 1, 2024: The Act takes effect.
Conclusion
Although the Act’s stated purpose is to increase protections for children’s online activity, it imposes requirements that may just as easily increase risk to children, because compliance will result in covered websites collecting more children’s data. In addition, its broad applicability to any website “likely to be accessed by children” means that a vast number of businesses may be required to comply with the Act’s onerous requirements. With less than two years before the law takes effect, companies should promptly assess whether any of their activities and associated platforms might be covered by the Act and, if so, begin building a compliance program.
1 Assembly Bill No. 2273, “The California Age-Appropriate Design Code Act.” Although the Act breaks new ground in the United States, it is modeled on, and borrows its name from, the U.K. Age Appropriate Design Code, which came into force in September 2020.
2 The Act borrows the definition of “business” from the California Consumer Privacy Act (“CCPA”). The CCPA defines a business as a for-profit entity that does business in California and that (i) has annual gross revenues of more than $25 million; (ii) buys, receives, or sells the personal information of 50,000 or more California residents, households, or devices; or (iii) derives 50% or more of its annual revenue from selling California residents’ personal information. Assembly Bill No. 375, “the California Consumer Privacy Act,” Sec. 1798.140(c).
3 Children’s Online Privacy Protection Act, 15 U.S.C. Sec. 6501 et seq.
4 “Profiling” means any form of automated processing of personal information that uses such personal information to evaluate certain aspects relating to a natural person, including analyzing or predicting aspects concerning a natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements. See Assembly Bill No. 2273, supra note 1, at Sec. 1798.99.30(b)(6).
5 Dark patterns are defined by the California Privacy Rights Act as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision making, or choice.” For instance, dark pattern designs might mislead or pressure users into taking certain actions, such as providing consent. See the California Privacy Rights Act of 2020 (Proposition 24), Sec. 1798.140(l).