A new law designed to keep kids safe online will change the internet for adults, too

In September, California Governor Gavin Newsom signed AB-2273, the California Age-Appropriate Design Code Act (Cal-AADC), into law. The bill, designed to protect minors online, goes beyond today’s simple parental controls: it requires fundamental changes to how platforms are designed so that they protect children’s online privacy and reduce harms such as bullying, exploitation, and inappropriate content.

The law requires protective default settings for children, tools for managing privacy settings, impact assessments before new products are released, and changes to design features that manipulate children into spending more time on a product.

While the law is aimed at the safety and well-being of children, its impact could be much broader, says Jennifer King, Privacy and Data Policy Fellow at Stanford HAI. “It’s aimed at children, but it will reach an adult audience, too,” she says. “What we’re seeing is a shift to a world where you get more choice about what is delivered to you and how, which isn’t just a company’s version of personalization.”

In this conversation, she explains the implications of the new law, how it will affect AI developers, and what’s next for US privacy and AI regulation.

What will this new law actually do?

This law shifts the baseline of defaults; this is privacy by design. It’s similar to how Apple introduced App Tracking Transparency, where you now have to opt in to mobile app tracking instead of opting out. More than 75 percent of Apple customers say no when asked whether they want to be tracked. This law goes further: the baseline isn’t being asked whether you want to be tracked; you’re simply not tracked. These protective defaults apply to children under 18, but they can apply to every user of a website if the operator decides that offering them to everyone is easier than identifying which visitors are children.
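
To make the shift in defaults concrete, here is a minimal sketch of the two regimes, assuming a toy settings object (every name is illustrative; the law mandates outcomes, not code):

```python
# A hypothetical sketch of the default shift Cal-AADC describes.
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    ad_tracking: bool
    precise_geolocation: bool
    public_profile: bool

def default_settings(user_is_under_18: bool) -> PrivacySettings:
    if user_is_under_18:
        # Cal-AADC baseline: the most protective settings, on by default,
        # with no opt-in prompt needed.
        return PrivacySettings(ad_tracking=False,
                               precise_geolocation=False,
                               public_profile=False)
    # Pre-Cal-AADC status quo: permissive defaults the user must opt out of.
    # An operator could instead apply the protective branch to everyone.
    return PrivacySettings(ad_tracking=True,
                           precise_geolocation=True,
                           public_profile=True)
```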

While most of the bill deals with privacy, there is also a section on health and well-being that addresses algorithmic manipulation. There is growing evidence that algorithmic systems such as news feeds and recommendation engines can be harmful: they encourage compulsive use, especially among vulnerable users such as children, and can damage mental health and self-esteem. The bill says that if your algorithms are causing these kinds of harm to children, you must change them.

Companies can respond in a variety of ways. They can choose, for example, to verify users’ ages and keep children off their sites entirely. Or they can change their defaults across the board and apply the protective defaults set by the law to all visitors.

Last year, we saw a story about Instagram’s algorithm making young girls’ body image issues worse. So does this bill say that Instagram can no longer use this algorithm?

Instagram would likely be barred from using that algorithm to serve content to children, since there is both research and anecdotal evidence that it causes harm. The company could split its product into an over-18 version and an under-18 version, and in the under-18 version the algorithms couldn’t serve content in a way that perpetuates that harm. One solution, proposed by the whistleblower Frances Haugen, is to move away from displaying content based on engagement and toward ordering it by time, so that the most recent posts are displayed by default. There are many potential design changes companies could consider, ranging from simple feature changes (such as turning off autoplay, which the Children’s Code, a similar law in the UK, already requires) to rethinking how an algorithmic system could promote well-being.
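
To illustrate the difference between the two feed designs, here is a minimal sketch (the Post fields are hypothetical placeholders, not any platform’s real data model):

```python
# A hypothetical contrast of engagement-ranked vs. chronological feeds.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    text: str
    created_at: datetime
    predicted_engagement: float  # e.g., predicted likes, comments, shares

def engagement_ranked(posts: list[Post]) -> list[Post]:
    # Status quo: surface whatever the model predicts users will engage with,
    # which can keep amplifying content that performs well but causes harm.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def chronological(posts: list[Post]) -> list[Post]:
    # Haugen's proposal: most recent first, with no behavioral optimization.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)
```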

Does this only apply to large tech companies?

Generally yes, unless your site is aimed at children or you know that a significant number of your visitors are under 18. I also think it will affect EdTech [educational technology] sites and products.

This is a California law, not a federal one. How will that complicate compliance and enforcement?

First, it will be necessary to determine whether the existing federal law, COPPA, conflicts with this law; COPPA, for example, only covers children under the age of 13. As I mentioned, a UK version of this law already exists, which I think is one reason we didn’t see a big public pushback from the large platforms in the run-up to the bill: the Googles and Metas of the world have already had to comply with the UK Children’s Code for a year. But for some US-focused businesses, Cal-AADC will be brand new. We may also see some companies segmenting their visitors through IP filtering; for example, you may be in Nevada and I may be in California, and we may visit the same website, but I will be served the California version and you will not. This already happens today with CCPA compliance.
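
As a sketch of how that segmentation might work (the IP-to-region table is a stand-in for a real geolocation database; the addresses come from a reserved documentation range and the mapping is invented):

```python
# A hypothetical sketch of jurisdiction-based segmentation by IP address,
# the approach already used for CCPA compliance.
GEO_TABLE = {
    "203.0.113.7": "US-CA",  # documentation-range addresses, invented mapping
    "203.0.113.8": "US-NV",
}

def site_version_for(ip_address: str) -> str:
    region = GEO_TABLE.get(ip_address, "unknown")
    if region == "US-CA":
        # California visitor: serve the Cal-AADC-compliant defaults.
        return "california_version"
    return "standard_version"

print(site_version_for("203.0.113.7"))  # california_version
print(site_version_for("203.0.113.8"))  # standard_version
```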

So what’s next for this law?

It won’t take effect until 2024, and it will go through a rulemaking process that includes an opportunity for public comment.

What’s the most important takeaway for you?

Like the UK Children’s Code, Cal-AADC is a first attempt to regulate algorithms specifically from a health and well-being perspective. It signals a shift toward what I think of as “AI safety”: placing the burden on AI developers to demonstrate that an algorithm does not harm people before deploying it. You may not have to prove to a regulator before release that an algorithm causes no harm, but you do have an obligation to find a way to present content that does not exacerbate the harms Cal-AADC identifies. Very few of these online platforms considered what would be safe and healthy for children during product development. Now they will have to.

Stanford HAI’s mission is to advance AI research, education, policy, and practice for the betterment of the human condition. Learn more.
