Children now spend a great deal of time playing online games, yet the gaming industry has received far less privacy scrutiny than social media or streaming platforms. As games evolve, so does the scale of user vulnerability. Many young people do not understand the data risks that online games pose. Fortunately, privacy regulators are increasingly focusing their efforts on protecting the safety and autonomy of children online.
Young people love online games because the games let them play, learn and socialize. Parents worry about the amount of time spent gaming, but research on its effects is only just emerging. A recent study from the Oxford Internet Institute of gamers aged 18 and over found that time spent playing video games had “little or no effect” on well-being. Its authors pointed to the need for further research, which will likely include study of the mental health consequences for gamers under the age of 18.
Games often encourage children to unwittingly share personal information in exchange for “free” access to benefits. Minors who play mobile games face different data risks than peers who play on a personal computer or console. The computer and console gaming industry is dominated by large, established companies. Industry leaders such as Sony, Microsoft and Nintendo have mature privacy programs and understand the risks associated with misuse of personal data. The mobile gaming sector, by contrast, includes many new and small companies that lack internal privacy resources and may not adhere to data protection principles. As a result, regulators have quickly turned their attention to this growing sector, which often markets digital products to young people. The discussion below examines some of the key privacy issues related to children’s online gaming, along with pressing policy solutions.
Risks for game developers and children
Gaming companies face significant legal and business risks if they treat all players the same. Age assurance processes help them manage the risks to children and their personal information in online gaming environments. Many jurisdictions, along with the UN Convention on the Rights of the Child, define a child as anyone under the age of 18. This segment of the population is uniquely vulnerable and therefore deserves special treatment and enhanced protection.
Age assurance should not be seen as a silver bullet or as sufficient protection on its own. It will work effectively only as part of a broader privacy-by-design approach. Age assurance offers flexibility that strict age verification does not: the former provides a wider choice of solutions to reduce risk, while the latter can impose disproportionate measures. Age assurance processes range from verified age checks to self-declaration. Where the data protection risk profile for children is low, age assurance requirements may become less burdensome, even unnecessary. Where the risks to children are high, companies should implement effective measures to reduce the risks of personal data use in online games.
No standard process for distinguishing adults from children has emerged, and there is ongoing debate as to whether such measures provide reliable protection or instead undermine privacy. In some circumstances, collecting additional personal information may increase the risk of harm to children online. Some youth advocates worry that “tagging” a user as underage can attract the attention of predators just as readily as it triggers protective controls.
Despite these difficulties, we believe proportionate, risk-based age assurance requirements are necessary and inevitable. Political realities and stated government intentions point in that direction. Before the internet can become a safe place for children, systems and services will need to demonstrate how they create environments appropriate to the age of child gamers. Knowing the age of users is a key component of a safe online experience. Age assurance processes should support entertainment, exploration and communication by defaulting to a high level of privacy. The goal should not be to keep children away from online gaming; instead, we should work to keep them safe and empowered.
Age assurance requirements now appear in international standards, legislation, regulatory codes and guidelines. For example:
• The UK Information Commissioner’s Age Appropriate Design Code requires a risk-based approach to establishing the age of individual users. Companies must ensure that their online services effectively apply the code’s standards to child users. It offers a choice: either establish age with a level of confidence proportionate to the risks arising from the processing of children’s data, or apply the code’s standards to all users.
• The OECD Recommendation on Children in the Digital Environment, adopted in 2021, and the related Guidelines for Digital Service Providers also address this issue. They require service providers to take steps “necessary to prevent children from accessing services and content that should not be available to them and that may harm their health and well-being or undermine any of their rights.”
• The EU Digital Services Act, agreed in April 2022, requires providers of online platforms accessible to minors to implement appropriate measures to ensure a high level of privacy, security and safety for minors who access their services.
• The UK Online Safety Bill, currently before the House of Commons, proposes a safety duty that would require proportionate systems and processes designed to prevent children of all ages from encountering content that is harmful to them.
European currents moving toward comprehensive protection of children online have now spilled onto the shores of the United States. California, the most populous US state, recently passed its Age-Appropriate Design Code Act, which follows guardrails and requirements similar to those now enshrined in British law. An August 2022 article in The New York Times speculated that the legislation “could herald a shift in how lawmakers regulate the technology industry” more broadly. The article reflects the fact that regional and national laws tend to affect how large technology companies operate across the board, in part because of the effort required to treat users differently based on geographic location or age.
A privacy-by-design approach to game development requires consideration of many issues beyond age assurance. When a gaming company identifies a young user, how will it inform that person of their privacy rights? How easily can children understand how their data is used? What controls or settings will be presented? Online games often bury privacy settings, making it difficult for even relatively experienced users to control their personal information.
Game developers may feel that presenting standard privacy statements would spoil the fun. But when they fail to provide age-appropriate, timely information about data collection, developers place themselves in an asymmetrical relationship with gamers. Users of all ages deserve to know whether personal information will be shared with third parties and how one service connects to other digital platforms, for example through login partners like Google or Facebook. Understanding data collection by mobile gaming companies seems particularly important, as they often record sensitive information, including geolocation and contact lists, from a mobile device. Policymakers and game developers must grapple with a difficult question: Can an underage player consent to the extensive use of their data?
Gameplay itself can sometimes lead to unfair use of personal information. Many games monitor user behavior and encourage extended interaction with the digital environment. The personal information collected may be used to deliver highly targeted in-game advertising. Companies also sometimes use personal information to foster offline connections between players, which can create contact risks for young people. Gamers who reveal behavioral patterns and personal data can be manipulated into making certain social connections or purchases, such as “loot boxes” containing attractive in-game features for their avatars.
In this vein, policy experts have debated the gray area where gaming and gambling intersect. Online games allow young users to accumulate assets and then spend digital tokens or real currency. Such activities can be addictive, so regulation must play a role in governing the relationship between developers and gamers. Some of these issues go beyond data protection, but they should not fall into regulatory gaps.
The regulatory environment puts online gaming companies squarely in the crosshairs of data protection authorities. The UK ICO has published statutory guidance in its Age Appropriate Design Code. This landmark guidance enables companies to take practical steps to protect children’s privacy and helps them prepare for future online safety legislation. The ICO has also recently engaged widely across the online gaming sector regarding compliance with the Children’s Code. Principles and best practices that future gaming innovations will need to observe include data minimization, privacy by design, responsible governance and risk-based treatment of young users.
Game developers already have a track record of creating user experiences that make their products intuitive, interactive and engaging for kids. It is now up to the industry to apply that expertise to protecting privacy. In the UK, the ICO will require evidence of effective, principled design. To prepare for possible investigations, game developers should document their decisions in this area. Companies need to show that they understand and address children’s privacy concerns. Regulators place the burden of responsible data use on service providers, not users. To that end, default settings should set a high privacy bar.
Regulators appear to be catching up with the online gaming industry, which has largely avoided the intense scrutiny directed at social media platforms. But the gaming sector is a moving target, and we are already seeing what gamers might call “boss level” challenges on the horizon. Advances in virtual reality and the emergence of the metaverse are making gaming ever more immersive, and immersive play reveals unprecedented volumes of information about gamers, their connections and their vulnerability to manipulation. Data protection principles still guide the regulation of new online gaming services, although additional guidance will be needed in light of developments such as the metaverse. An excellent article by Notarize Data Protection Officer Gary Weingarden, CIPP/A, CIPP/C, CIPP/E, CIPP/G, CIPP/US, CIPM, CIPT, FIP, PLS, and Deutsche Bank Senior Advisor Matthias Artz, CIPP/E, explores how data protection will apply in this virtual context.
The pace of change in online gaming is dizzying, and it far outstrips enforcement today. Yet overall, regulatory positions and signals from governments indicate that current and future laws will place increasingly stringent requirements on companies that collect and use children’s personal information. These requirements should be practical and account for both the benefits and the risks of online gaming. And they must grow out of solid evidence of how to protect children effectively and ensure safe online gaming. Let the children play – safely.