California lawmakers Buffy Wicks (D-Oakland) and Jordan Cunningham (R-Templeton) introduced a bill on Wednesday evening that seeks to protect children's data online. The California Age-Appropriate Design Code Act would require businesses that create goods or services for children to comply with specified standards, including considering the best interests of children when designing, developing and providing those goods and services.
The proposed bill, if approved, would also limit the data that online companies can collect from children under 18, including precise geolocation or data that is not necessary to provide the service. Additionally, the bill would prohibit using a child's personal information in a way that could be harmful to the child, and bar techniques such as “dark patterns” that encourage users to provide more information than necessary.
The bill, which would enter into force in July 2024, would also create the California Children's Data Protection Taskforce to evaluate best practices for implementing these provisions and to support businesses seeking to comply with the law.
“It’ll be a first-in-the-nation bill,” said Wicks. “Given the size and scope of California and you have a lot of these companies based in California, we have the ability to have a ripple effect.”
The bill is modelled on the U.K.'s age-appropriate design code, which became law there in 2021 and has served as a blueprint for other countries. The California bill adopts a similar approach in the form of a code, with an agency supervising compliance and empowered to issue recommendations. In cases of non-compliance, the California attorney general would be responsible for enforcing the state's rules.
The proposal came the same day that Senators Richard Blumenthal and Marsha Blackburn announced their intention to introduce the Kids Online Safety Act, also aimed at holding social-media platforms responsible for harm they cause to children. The two bills differ not only in that one is federal and the other a state measure, but also in that the Kids Online Safety Act would require companies to regularly assess their algorithms and advertising systems to ensure they are not harming children. The obligations the federal bill would place on Big Tech companies are more far-reaching than those in the California code.
Read More: US Lawmakers Propose Bill To Impose Platform Content Moderation
Also this week, two other California lawmakers, Senator Richard Pan and Assemblymember Evan Low, introduced separate measures targeting online companies to stop the spread of misinformation. Pan's proposal would require online platforms that use algorithms to report how these features rank content and to disclose data for legitimate research purposes. Low's initiative would insert language into state law stating that the promotion of COVID-19 misrepresentations by physicians and surgeons constitutes unprofessional conduct.
Read More: California Lawmakers Target Online Platforms’ Algorithms
Countries around the world are stepping up their efforts against Big Tech companies, strengthening privacy protections for users and introducing new rules for content moderation. Europe recently approved the Digital Services Act (DSA), which will hold Big Tech companies accountable for illegal and harmful content posted on their platforms, and the U.K. is advancing its own legislation, the Online Safety Bill, which would make certain practices a criminal offence and require companies to do their best to ensure harmful content is removed.
The California Age-Appropriate Design Code Act has bipartisan support, and the state has a strong record of backing privacy-enhancing rules. Thus, even though California is home to some of the Big Tech companies the bill would affect, the measure may find sufficient support among lawmakers.