As more children spend time online exploring and learning, government bodies in the United States and abroad have enacted policies to ensure safer spaces, privacy, security, and protection for children online. The California Senate Judiciary Committee recently voted to advance two bills to protect children’s online activities.

Closely modeled after the UK’s Children’s Code, California’s bill, AB 2273, the Age-Appropriate Design Code Act, was drafted with the intent that “[c]hildren should be afforded protections not only by online products and services specifically directed at them, but by all online products and services they are likely to access.” AB 2408, the Social Media Platform Duty to Children Act, was drafted with the intent that “California should take reasonable, proportional, and effective steps to ensure that its children are not harmed by addictions of any kind.”

Both bills are now pending in the Senate Appropriations Committee and will be heard when the Legislature returns from its summer recess on Aug. 1. The Legislature will have until Aug. 31, 2022, to present the bills to the governor, and the governor will have until Sept. 30 to sign or veto the legislation.

AB 2273, Age-Appropriate Design Code Act

The enactment of AB 2273 would require businesses that provide an online service, product, or feature likely to be accessed by a child to comply with specified requirements, including:

  • configuring all default privacy settings offered by the online service, product, or feature to settings that offer a high level of privacy protection;
  • providing privacy information, terms of service, policies, and community standards concisely and prominently; and
  • using clear language suited to the age of children likely to access that online service, product, or feature.

The bill defines “likely to be accessed by a child” to mean that children would reasonably access the online service, product, or feature if:

  • it is directed to children (as defined by the Children’s Online Privacy Protection Act, 15 U.S.C. Sec. 6501 et seq.);
  • it is determined, based on academic, market, or internal company research, to be routinely accessed by children;
  • it advertises to children; or
  • it has design elements that are known to be of interest to children, including, but not limited to, games, cartoons, music, and celebrities who appeal to children.

If passed, the Age-Appropriate Design Code Act would prohibit a business subject to the Act from “using the personal information of a child for any reason other than the reason or reasons for which the personal information was collected.” It also seeks to prevent the use of dark patterns that encourage children to divulge excessive personal information, and, similar to the federal Children’s Online Privacy Protection Act (COPPA), it would require that websites provide privacy notice information in clear language suited to the age of children likely to access the online service.

The Act would also create the California Children’s Data Protection Task Force to provide recommendations on best practices. The Attorney General would be authorized to seek an injunction or civil penalties of not more than $2,500 per affected child for negligent violations and not more than $7,500 per affected child for intentional violations.

AB 2408, Social Media Platform Duty to Children Act

AB 2408 was proposed to address studies finding that children are at risk of addiction to social media. Noting that “addictions . . . have had a demonstrable negative effect on state economies,” the drafters of AB 2408 assert that California has a compelling interest in protecting the mental health of its children from foreseeable addiction to social media platforms.

If passed, the Act would add section 17052 to the California Business and Professions Code, which would declare that “a social media platform shall not use a design, feature, or affordance that the platform knew, or which by the exercise of reasonable care should have known, causes child users to become addicted to the platform.” The proposed law would apply to businesses generating more than $100 million in gross revenue and, as currently drafted, excludes video game companies. The Act would also authorize public prosecutors to “bring an action to recover or obtain certain relief, including a civil penalty of up to $250,000 for a knowing and willful violation, and an award of litigation costs and attorneys’ fees.”

The current bill shields a social media platform from civil penalties when it can demonstrate that it instituted and maintained a program of at least quarterly audits of its practices, designs, features, and affordances to detect any with the potential to cause or contribute to the addiction of child users, and that it corrected, within 30 days of completing each audit, any practice, design, feature, or affordance the audit found to present more than a de minimis risk of violating the statute.

The current bill also carves out exceptions to liability for a social media platform in relation to: (a) content that is generated by a user of the service, or uploaded to or shared on the service by a user of the service; (b) passively displaying content that is created entirely by third parties; (c) information or content for which the social media platform was not, in whole or in part, responsible for creating or developing; or (d) any conduct by a social media platform involving child users that would otherwise be protected by 47 U.S.C. § 230, by case law interpreting the First Amendment to the U.S. Constitution, or by Article I, Section 2 of the California Constitution.

* Special thanks to Roya Linda Butler for her valuable contributions to this GT blog post.