The latest version of the Kids Online Safety Act (KOSA) would require the removal of online information that people of all ages should be able to see. Letting governments—state or federal—decide what information anyone gets to see is a dangerous endeavor. Worse, this bill, ostensibly designed to protect our privacy, would actually require tech companies to collect more data about Internet users than they already do.
The EFF has long supported comprehensive privacy protections, but the details matter. KOSA consistently gets the details wrong, and that’s why we urge members of Congress to oppose this bill.
Although lawmakers have updated KOSA since introducing it in February, and made some improvements, it is still a dangerous bill that presents censorship and surveillance as solutions to some legitimate, and some not-so-legitimate, problems facing today’s young Internet users.
KOSA is a sweeping update to the Children’s Online Privacy Protection Act, also known as COPPA. COPPA is the reason many websites and platforms ask you to verify your age, and why many services require users to be over 13: data privacy laws are much stricter for children than for adults. Lawmakers have hoped to expand COPPA for years, and there have been good proposals to do so. KOSA, for its part, has some good ideas: more people should be protected by privacy laws, and the bill expands COPPA protections to include minors under 16. In theory, this would be beneficial: the more people we can protect under COPPA, the better. But why stop at protecting the privacy of minors under the age of 16? The EFF has long supported comprehensive data privacy legislation for all users.
Another good provision in KOSA obliges sites to allow minor users to delete their accounts and personal data, to limit the sharing of their geolocation data, and to disclose whether they are being tracked. Again, the EFF believes that all users—regardless of age—should have these protections, and expanding them gradually is better than the status quo.
But KOSA’s main purpose is not to protect the privacy of young people. The bill’s main goal is to censor a wide range of speech in response to concerns that young people are spending too much time on social media and being exposed to harmful content too often. KOSA requires sites to “prevent and mitigate mental health disorders,” including promoting or exacerbating “self-harm, suicide, eating disorders, and substance use.” Make no mistake: this is a requirement for platforms to censor content.
This broad set of content restrictions won’t just apply to Facebook or Instagram. Platforms covered by KOSA include any “online platform that connects to the Internet and is used or likely to be used by minors.” As we’ve said before, this will likely cover everything from messaging apps like iMessage and Signal to web browsers, email clients and VPN software, as well as platforms like Reddit, Facebook and TikTok – platforms with wildly different user bases and patterns of use, and with a huge variety of content moderation capabilities and expectations.
Thus, a huge number of online services will be forced to make a choice: over-filter so that no one encounters content that could even arguably be harmful, or raise the minimum user age to 17. Many platforms may well do both.
Let’s consider the dangerous consequences of KOSA’s censorship. The vague standard would prevent both adults and children from accessing health and medical information online. That is because it will be virtually impossible for a website to make individual decisions about which content promotes self-harm or other disorders, and which provides necessary health information and advice to those who suffer from them. This will hit hardest the children who lack the family, social, financial, or other means to obtain health information elsewhere. (Research shows that the vast majority of young people use the Internet for health-related research.)
Another example: KOSA also requires that these services ensure that young people do not see content that exacerbates a substance use disorder. At first glance, this may seem simple enough: just remove content that talks about drugs or hide it from young people. But how do you find and flag such content? Simply put: not all drug-related content increases drug use.
There’s no real way to find and filter just that content without also removing a huge amount of useful content. Here’s just one example: social media posts describing how to use naloxone, a drug that can reverse opioid overdoses, could be viewed either as promoting self-harm (because naloxone can reduce the danger of a fatal overdose) or as providing essential health information. But under KOSA’s vague standard, a website owner is in a better legal position if it removes the information, heading off later claims that the information was harmful. This will reduce the availability of important, potentially life-saving information on the Internet. KOSA pushes website owners into government-sanctioned censorship.
To verify that users are the correct age, KOSA would force massive data collection efforts that lead to even greater potential privacy invasions.
KOSA would direct a federal study of creating an age verification system at the device or operating system level, “including the need for potential hardware and software changes.” The likely end result is a complex age verification system, operated by a third party, that maintains a huge database of all Internet users.
Many of the risks of such a scheme are obvious. It would require every user, including children, to hand over personal data to a third party simply to use a website, should that user ever want to step outside of government “parental” controls.
In addition, the bill lets Congress decide what is appropriate for children to see online. This verification scheme would make it much more difficult for real parents to make individual choices for their own children. Because it is very difficult to distinguish discussion of these topics that encourages harm from discussion that discourages it, the safest course for services under this bill is to block minors and teenagers from discussing or viewing these topics at all. If KOSA passes, instead of letting parents decide what young people see online, Congress will make that decision for them.
A recent study of attitudes toward age verification found that a majority of parents are “willing to make an exception or allow their child to bypass age requirements altogether, but then require direct supervision of the account or a discussion about how to safely use the app.” Many also fudge their children’s birthdates so that websites do not have their exact birthdays. KOSA’s strict national age verification system would make it much harder, if not impossible, for parents to decide for themselves which sites and content a young person can be exposed to. Instead, an algorithm will do it for them.
KOSA also fails to recognize that some parents do not have their children’s best interests at heart, or are unable to make appropriate decisions for them. These children would be caught in KOSA’s parental-control regime, which requires services to apply the highest level of parental control by default for children under the age of thirteen.
KOSA is a poor substitute for real online privacy
KOSA’s attempt to improve privacy and security will actually have a negative impact on both. Instead of using heavy-handed age verification to determine who gets the most privacy, and then using that same determination to limit access to vast amounts of content, Congress should focus on creating strong privacy safeguards for everyone. Real privacy protections that prevent data collection without consent address children’s privacy concerns by making age verification unnecessary. Congress should take privacy protections seriously and pass legislation that creates a strong, comprehensive privacy framework with strong enforcement tools.