On September 15, California Gov. Gavin Newsom (D) signed the Age-Appropriate Design Code Act into law. The bill had passed the state Senate unanimously in late August despite opposition from the tech industry.
Modeled after the UK Children’s Code, which went into effect last year, the California law protecting children’s privacy and online wellbeing requires companies to assess the impact of any product or service that is designed for children or “likely to be used by children.”
The law will go into effect on July 1, 2024, after which companies that violate it could face fines of up to $7,500 per affected child. While that may seem like a small amount, similar legislation in the European Union has allowed Ireland’s Data Protection Commission to fine Meta $400 million over the way Instagram handled children’s data. (Under the new law, California’s attorney general would impose the fines.)
California’s Age-Appropriate Design Code Act defines a child as anyone under the age of 18, in contrast to the Children’s Online Privacy Protection Act (COPPA) of 1998, which sets the cutoff at 13.
COPPA codified protections for children’s data, prohibiting “unfair or deceptive acts or practices with respect to the collection, use and/or disclosure of personal information from and about children on the Internet.”
California’s new law goes further. It requires that the highest privacy settings be the default for young users, and that companies “provide a clear indication” to let children know when their location is being tracked.
“This is a very significant victory for children and families,” Jim Steyer, founder and CEO of Common Sense Media, one of the bill’s major sponsors, told HuffPost.
The law comes down strongly in favor of protecting children over profit, stating: “If a conflict arises between the business interests and the best interests of children, companies must prioritize the privacy, safety and welfare of children over business interests.”
In a 2019 interview with The New York Times, Baroness Beeban Kidron, chief architect of the UK Children’s Code, described her meetings with tech executives.
The main thing they ask her, she said, is: “Are you really expecting companies to give up profits by limiting the data they collect on children?” Her response? “Of course I am! Of course everyone should.”
“If a conflict arises between the business interests and the best interests of children, companies must prioritize the privacy, safety and welfare of children over business interests.”
– California Age-Appropriate Design Code Act
How will the Age-Appropriate Design Code Act protect children online?
The dangers of the internet to children extend beyond being contacted by strangers online (though by making high privacy settings the default, the California act seeks to prevent such interactions).
Increasingly, parents worry about the amount of time children spend online, platforms’ reliance on autoplay and other addictive features, and the risk of children being exposed to content that promotes self-harm or dangerous behaviors such as eating disorders.
The Age-Appropriate Design Code Act requires companies to write a “Data Protection Impact Assessment” for each new product or service, detailing how children’s data may be used and whether that use could cause harm.
“Basically, [companies] have to assess whether their product design exposes children and teens to harmful content, allows harmful contact from others, or uses harmful algorithms,” Steyer said.
Under the law, Steyer explained, YouTube, for example, would still be able to recommend videos. The difference is that the company would have less data to pull from when making those recommendations. Companies will also be responsible for assessing whether their algorithms are amplifying harmful content and, if so, taking action.
Haley Hinkle, policy counsel at Fairplay, an organization “dedicated to ending marketing to children,” told HuffPost that by mandating impact assessments, big tech companies “will be responsible for assessing the impact of their algorithms on children before they provide a product or new design feature to the public.”
Hinkle continued, “This is critical in shifting the burden of protection onto the digital platforms themselves, and away from families who don’t have the time or resources to decode endless pages of privacy policies and settings options.”
Under the law, a company may not “collect, sell, share or retain” any young person’s information unless doing so is necessary for the app or platform to provide its service. The law instructs businesses either to “estimate the age of child users with a reasonable level of certainty” or to extend data protections to all users.
“You can’t profile a child or teenager by default unless the business has reasonable security measures in place,” Steyer said. “And you may not collect precise geolocation information by default.”
Hinkle explained companies’ motivation for collecting such data: “Online platforms are designed to capture as much of kids’ time and attention as possible. The more data a platform collects on a child or teen, the more effectively it can target them with content and design features to keep them online.”
Although the scope of the law is limited to California, it is expected to spur broader reform, since some companies changed their practices worldwide before the Children’s Code took effect in the UK. Instagram, for example, made teenagers’ accounts private by default and disabled direct messages between children and adults they don’t follow. How Instagram defines “adult” varies by country, however: it is 18 in the UK and “certain countries” but 16 elsewhere in the world, according to the company’s statement announcing the changes.
While it is uncertain whether Instagram will now raise this age limit to 18 in California, the Age-Appropriate Design Code Act requires companies to take into account the “unique needs of different age ranges” and developmental stages, which the law defines as: “0 to 5 years of age or ‘preliterate and early literacy,’ 6 to 9 years of age or ‘core primary school years,’ 10 to 12 years of age or ‘transition years,’ 13 to 15 years of age or ‘early teens,’ and 16 to 17 years of age or ‘approaching adulthood.’”
“Child development and social media don’t align optimally.”
– Devorah Heitner
What is the biggest threat to children online?
Some threats to children come from large, impersonal corporations that collect data to subject them to targeted advertising, or to profile them with targeted content that may promote dangerous behavior, such as disordered eating.
Other threats come from people your child knows in real life, or even from your child themselves.
Devorah Heitner, author of “Screenwise: Helping Kids Thrive (and Survive) in Their Digital World,” told HuffPost that in addition to “interpersonal harm from people they know,” like cyberbullying, there are ways kids can compromise their own reputations.
“When you’re 12, what you share can stay with you for a really long time,” Heitner explained.
While no law can prevent a child from posting something they probably shouldn’t, the Age-Appropriate Design Code Act requires that businesses “take into account the unique needs of different age categories,” setting a precedent that children and teens are developmentally different from adults and require different protections.
“Child development and social media don’t align optimally,” Heitner said.
What can parents do now to protect the privacy and safety of their children?
Parents don’t have to wait for the new California law to take effect, or for big tech companies to change their behavior. There are a few things you can do now to increase your child’s online privacy and security.
Hinkle suggests keeping children off social media until they are at least 13. To do this, she says, it can be helpful to communicate with the parents of your child’s friends, since the presence of peers is one of social media’s biggest draws for kids.
Once they do have social media accounts, Hinkle suggests that you “review the settings with your child, and explain why you want the most protective settings.” These include turning off location data, opting for private accounts, and disabling contact with strangers.
Heitner advocates an approach she calls “mentoring over monitoring.” Because privacy settings can only do so much, and because kids are adept at finding workarounds, she says your best defense is an ongoing conversation with your child about their online habits and the effect their actions can have on themselves and others.
Your children will see harmful content at some point during their time online. You want them to feel comfortable telling you about it or, when appropriate, reporting it.
When it comes to examining their own behavior, kids need to know that you are open to discussion and won’t be quick to judge. Heitner suggests using phrases like, “I know you’re a good friend, but it might not sound that way if you posted it.”
Children should understand how what they post can be misinterpreted, and why they should always think before posting, especially when they are feeling angry.
There’s a delicate balance between respecting how important your child’s online life is to them and teaching them that social media can “make you feel terrible, and that [companies are] benefiting from my time there,” Heitner said.
Parents’ goal should be to make kids aware of these issues and to “buy kids into a healthy skepticism” of big tech, Heitner said.
In addition to the resources available from Common Sense Media, Steyer recommends that parents take advantage of Apple’s privacy settings, which Common Sense Media helped develop.
He also suggested that parents be role models in their own media consumption.
“If you’re spending all your time [there] yourself, what message is that sending to your child?”