
8 May 2023 | 12 minute read

California’s Age-Appropriate Design Code Act – and the looming state patchwork of online child protection laws

Since its passage in September 2022, the California Age-Appropriate Design Code Act (CAADCA or Act) has generated considerable controversy. Modeled on the United Kingdom’s Age-Appropriate Design Code (UK AADC), the Act imposes far broader obligations on a far broader range of businesses than those required by the US Children’s Online Privacy Protection Act of 1998 (COPPA) and contains stiff penalties for noncompliance.

With congressional efforts to extend targeted privacy protections to teens and children still up in the air, and despite a pending suit in California to block the CAADCA on First Amendment and other grounds, lawmakers in several states have already proposed similar legislation of their own.

Covered entity

The CAADCA applies to companies that (1) meet the definition of a “business” under the California Consumer Privacy Act (CCPA) and (2) develop and provide an “online service, product, or feature” (Online Service) that is “likely to be accessed” by consumers who are under 18 years of age. It specifically exempts providers of broadband internet access services, telecommunications services, and physical delivery services.

Under the CCPA, a “business” is any for-profit entity that does business in California, collects and processes the personal information of California residents, and (1) had annual gross revenues in excess of $25 million in the preceding calendar year, (2) alone or in combination, annually buys, sells, or shares the personal information of 100,000 or more California residents or households, or (3) derives 50 percent or more of its annual revenues from selling or sharing the personal information of California residents.

Under the Act, “likely to be accessed by children” means it is reasonable to expect children to access the Online Service based on “indicators” that it is:

  • Directed to children as defined by COPPA
  • Determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of children
  • Marketing advertisements to children
  • Utilizing design elements known to be of interest to children (including, but not limited to, games, cartoons, music, and celebrities who appeal to children) and
  • Reaching an audience that internal company research has determined contains a “significant amount” of children.

Whereas COPPA applies only to those websites and online services that collect, use, or disclose personal information from children under 13 years of age, the Act applies more broadly, including to companies that do not specifically target children, but might have a significant number of children under 18 years of age accessing their Online Service.

Data Protection Impact Assessment requirement

Before offering an Online Service to the public, a business must complete, document, and biennially review a Data Protection Impact Assessment (DPIA) for such Online Service.

The DPIA must address:

  • The purpose of the Online Service, how it uses children’s personal information, and the risks of material detriment to children that arise from the business’s data management practices
  • Whether the design of the Online Service could harm children – including by exposing them to (1) harmful content, or (2) targeting or exploitation by harmful contacts
  • Whether algorithms and/or targeted advertising systems used by the Online Service could harm children
  • Whether and how the Online Service uses system design features to increase, sustain, or extend use of the Online Service by children, including the automatic playing of media, rewards for time spent, and notifications, and
  • Whether, how, and for what purpose the Online Service collects or processes sensitive personal information of children.

A business must document any risk of material detriment to children that arises from its DPIA and create a timed plan to mitigate or eliminate such risk before children access the Online Service. In response to written requests from the Attorney General, the business must provide within three business days a list of all DPIAs it has completed, and within five business days the DPIAs themselves.

Age estimation requirement

A business must either (1) estimate the age of child users of its Online Service with a reasonable level of certainty appropriate to the risks that arise from its data management practices or (2) apply the Act’s privacy and data protections to all consumers who use the Online Service.

Default privacy settings

A business must configure its default privacy settings for children to offer a high level of privacy unless the business can demonstrate a compelling reason as to why a different setting is in the best interests of children.

Transparency requirement

A business must provide any privacy information, terms of service, policies, and community standards concisely, prominently, and using clear language suited to the age of children likely to access that Online Service.

Monitoring signal

If a business’s Online Service allows the child’s parent, guardian, or any other consumer to monitor the child’s online activity or track the child’s location, the business must provide an obvious signal to the child when the child is being monitored or tracked.

User reporting

A business must provide prominent, accessible, and responsive tools to help children (or their parents or guardians) exercise their privacy rights and report concerns.

Prohibited acts

A business that provides an Online Service likely to be accessed by children is prohibited from several acts:

  • Using child health information: A business may not use the personal information of any child in a way it knows, or has reason to know, is materially detrimental to the physical health, mental health, or well-being of the child.
  • Profiling a child by default: A business may not engage in any form of automated processing of a child’s personal information to evaluate certain personal aspects relating to the child unless it can demonstrate that it has appropriate safeguards in place and either (1) the profiling is necessary to provide the aspects of the Online Service with which the child is actively and knowingly engaged or (2) the business can demonstrate a compelling reason why profiling is in the best interests of children.
  • Collecting, selling, sharing, and retaining personal information: A business may not collect, sell, share, or retain any personal information that is not necessary to provide an Online Service unless it can demonstrate a compelling reason that such activity is in the best interests of children likely to access the Online Service.
  • Collecting, selling, or sharing precise geolocation information: A business may not collect, sell, or share children’s precise geolocation information by default unless strictly necessary for the business to provide the Online Service (and then only for the limited time that such activity is necessary). In addition, it may not collect any precise geolocation information of a child without providing an obvious sign to the child for the duration of that collection.
  • Using dark patterns: A business may not use dark patterns to lead or encourage children to provide personal information beyond what is reasonably expected to provide its Online Service or to take any action that it knows, or has reason to know, is materially detrimental to the child’s physical health, mental health, or well-being.
  • Estimating age: A business may not use any personal information collected to estimate age or age range for any other purpose or retain that personal information longer than necessary to estimate age. Age assurance must be proportionate to the risks and data practices of the Online Service.

Effective date

The Act takes effect July 1, 2024.

Enforcement and penalties

The CAADCA is enforceable by the California Attorney General. Before initiating any action, the Attorney General must provide a business in “substantial compliance” with the Act a 90-day period during which to cure any alleged violation. The Act does not include a private right of action.

Remedies for violations include injunctive relief and civil penalties ($2,500 per affected child for each negligent violation and $7,500 per affected child for each intentional violation).

Legal challenge

In December 2022, a tech industry trade association filed suit against the California Attorney General seeking an order declaring the CAADCA invalid and enjoining its enforcement. The complaint asserts that, given the Act’s “draconian penalties,” businesses will face overwhelming pressure to over-moderate content to avoid liability for content the state deems “harmful.” As alleged, such over-moderation will restrict the availability of information for users of all ages.

In addition, the suit claims that the Act will require businesses to verify the ages of their users, which – to the extent it can even be done to the state’s satisfaction – “will frustrate anonymous and casual browsing, magnify privacy concerns, and wrest control over minors’ online activities from parents and their children.”

In short, the plaintiff argues that the CAADCA is unconstitutional on at least four grounds and is preempted by two federal statutes – it violates the First Amendment, Fourth Amendment, and the Due Process and Commerce Clauses of the United States Constitution; violates Article I, Sections 2(a) and 7(a) of the California Constitution; and is preempted by COPPA and Section 230 of the Communications Decency Act.

On April 28, 2023, advocacy groups and a bipartisan group of former elected and appointed federal and state officials filed an amicus brief in support of the CAADCA. Contending that the Act is consistent with Section 230 and the First Amendment, the groups wrote that platforms “would not have to remove or even demote any content to comply with these requirements” and “could show users whatever content they like – as long as the companies ensure that they are not using children’s data to target information to them in violation of the law.”

Other state initiatives

Despite the current legal challenge in California, several other states have introduced legislation similar to the CAADCA, including:

  • Connecticut: In January 2023, the Connecticut Age-Appropriate Design Code bill (HB6253) was introduced in the House of Representatives.
  • Maryland: In February 2023, the Maryland Age-Appropriate Design Code bills (HB901) and (SB844) were cross-filed in the House of Delegates and the Senate.
  • Minnesota: In February 2023, the Minnesota Age-Appropriate Design Code bill (HF2257) was introduced in the House of Representatives. It passed the House in April and will go to conference committee for reconciliation with a Senate companion bill (SF2810) introduced in March.
  • Oregon: In January 2023, the Oregon Age-Appropriate Design Code bill (SB196) was introduced in the Senate.
  • New Jersey: In December 2022, the New Jersey Age-Appropriate Design Code bill (A4919) was introduced in the General Assembly. In January 2023, a companion bill (S3493) was introduced in the Senate.
  • New Mexico: In February 2023, the New Mexico Age-Appropriate Design Code Bill (SB319) was introduced in the Senate.
  • Nevada: In March 2023, the Nevada Age-Appropriate Design Code bill (AB320) was introduced in the Assembly.

Key takeaways

In view of the CAADCA’s July 1, 2024 effective date and potentially significant penalties, companies should review their information practices to determine whether they could be subject to the Act (and similar legislative proposals in other states). In so doing, they should consider the following:

  • Breadth of businesses affected: The CAADCA applies to any business that provides an Online Service “likely to be accessed by children” under 18 years of age based on certain indicators. Along with social media services, the Act on its face sweeps in an extremely broad range of general audience products and services, including but not limited to connected devices (eg, smartphones, smartwatches), connected cars, virtual assistants, online video games, video and music streaming services, online education programs, blogs and discussion forums, and online self-help services. Moreover, terms relating to these indicators are not defined (eg, “significant number of children”).
  • DPIA challenges: Similarly, the CAADCA does not delineate key terms concerning the DPIA that companies must complete before offering a new Online Service to the public. For example, the DPIA must (1) describe “the risks of material detriment to children that arise from the data management practices” related to the Online Service and (2) state whether the Online Service could “harm” children. However, neither “material detriment” nor “harm” is defined.
  • Age estimation requirement: Likewise, the Act requires companies to “[e]stimate the age of child users with a reasonable level of certainty appropriate to the risks that arise from the data management practices of the business or apply the privacy and data protections afforded to children to all consumers” – but it does so without defining “reasonable level of certainty appropriate to the risks.”
  • Foreign guidance: Remarkably, the Act explicitly points companies to the UK Information Commissioner’s Office for guidance. It states that “[i]t is the intent of the Legislature that businesses covered by the California Age-Appropriate Design Code may look to guidance and innovation in response to the Age-Appropriate Design Code established in the United Kingdom when developing online services, products, or features likely to be accessed by children.” Looking to a foreign regulator charged with enforcing the UK’s implementation of the General Data Protection Regulation – a law that differs significantly from the CCPA – is all but certain to introduce further complexity, uncertainty, and confusion into the mix for companies trying to comply with the Act.

While its future is still unclear, the CAADCA affords yet another example of the “California Effect,” whereby California promulgates laws and regulations that help shape the standard for other states. Companies subject to the Act should determine whether it makes sense to adopt separate processes for California children or to universally follow the CAADCA in order to simplify their business processes.

For more information on the shifting regulatory landscape in the United States around online child protection, please contact the authors or your regular DLA Piper contact.
