The latest attempt to solve the online data and privacy crisis

Some crises hit companies quickly, are dealt with by corporate executives, and soon disappear from the spotlight.

Other crises attract public attention but eventually fade into the background unresolved. They can return to the fore at any time.

Consider the case of the online data and privacy crisis, which made international headlines a year ago when whistleblower Frances Haugen told Congress that Facebook and Instagram were negatively affecting the mental health of teenagers.

Unsurprisingly, there have been several rounds of finger-pointing over who is to blame for the crisis, how far social media actually affects mental health, and what should or should not be done about it. Since then, the crisis has simmered in the background.

Senators seek new reforms

That crisis could come to the fore again thanks to U.S. Sen. Edward Markey (D-Mass.), who is leading a group of Senate colleagues asking the Federal Trade Commission to update its rules under the Children’s Online Privacy Protection Act (COPPA).

Markey said in a press release that he wants the government agency to:

  • Expand the definition of “personal information” covered by the Act.
  • Require social media platforms to protect the privacy, security and integrity of children’s data.
  • Introduce regulatory safeguards that reflect the increased use of online platforms for educational purposes.
  • Limit the data a child has to share when participating in online activities.

“Experts agree we’ve reached a tipping point for children and teens online as their rates of mental health problems soar, and the U.S. Surgeon General has called on tech and social media companies to address these threats to young people,” the lawmakers wrote to the FTC.

“In countries around the world, government agencies have begun to take action, implementing policies to combat online threats to children. Now the United States must do the same,” they said.

“On the right track”

“Markey’s proposed changes are the right way to go, and it’s great to see lawmakers looking at the best way to protect children,” Eve Maler, ForgeRock’s chief technology officer and an expert in digital identity and cybersecurity, said via email.

“One step further”

“To make a significant difference, I recommend going even further by expanding current regulatory requirements to include ‘smarter’ consent: stronger permissions to collect, use and share data,” Maler said.

“Many of today’s requirements for compliant online consent do not apply to smart devices or mobile applications, which often work in ecosystems such as smart homes and can remember preferences over time,” she noted.
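
The purpose-scoped, time-limited consent Maler describes can be made concrete with a small sketch. The Python below is purely illustrative: the ConsentRecord fields, the purpose names, and the 90-day lifetime are assumptions made for the example, not requirements drawn from COPPA, the FTC, or any ForgeRock product. The idea is simply that each permission is tied to a purpose and a device context and expires, rather than being remembered indefinitely.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical, purpose-scoped consent record. Permission is granted per
# purpose ("collect", "use", "share") and per device context, and it expires,
# so a smart-home device cannot silently reuse an old preference forever.
@dataclass
class ConsentRecord:
    subject_id: str                 # pseudonymous identifier for the child
    granted_by: str                 # the parent or guardian who approved it
    purposes: set[str]              # e.g. {"collect", "use"} but not "share"
    device_context: str             # e.g. "smart_speaker" or "mobile_app"
    granted_at: datetime
    ttl: timedelta = timedelta(days=90)   # illustrative lifetime, not a legal rule

    def allows(self, purpose: str, device_context: str, now: datetime) -> bool:
        """Allow a request only if the purpose and context match and consent is fresh."""
        if now - self.granted_at > self.ttl:
            return False            # stale consent: re-prompt the guardian
        return purpose in self.purposes and device_context == self.device_context


# Example: consent to collect and use data on a smart speaker, but never to share it.
consent = ConsentRecord(
    subject_id="child-123",
    granted_by="guardian-456",
    purposes={"collect", "use"},
    device_context="smart_speaker",
    granted_at=datetime.now(timezone.utc),
)
now = datetime.now(timezone.utc)
print(consent.allows("use", "smart_speaker", now))    # True
print(consent.allows("share", "smart_speaker", now))  # False: sharing was not granted
```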

“I also recommend strengthening the ability of parents and guardians to maintain adequate control and supervision of their child’s current digital experiences until they are old enough to understand and manage their online data.”

“Great example”

“One great example of a company that has done it right is the BBC, which provides content for people of all ages,” Maler said.

“The broadcaster has a website called BBC Bitesize, which offers quizzes and tutorials for parents and young learners. It had to comply with both earlier (Article 8 of the GDPR) and newer (the UK Children’s Code) provisions on age-appropriate processing of personal data and age-appropriate design.”

The BBC “enables children to have an online experience that is not only age-appropriate but also compliant, by guiding them to voice authentication methods. Context-sensitive and adaptive verification of children’s identity is an important prerequisite for safe and simple handling of personal data,” she commented.

What companies are doing wrong

“Businesses that are casual about the accuracy of age data are doing it wrong,” Maler argued.

“To protect children, companies need to understand when they are interacting with a child. Given today’s regulatory environment and the increasingly low tolerance for offensive or inappropriate online experiences for children, it’s fair to say that services that don’t verify a user’s age are doing it very wrong.”

“Extraordinary pressure to change”

“There is enormous pressure to change the current widespread practice of taking users’ self-declared age at face value, as children can easily provide false information to gain access to age-restricted content,” she said.

“In fact, a recent study by Ofcom, the UK communications regulator, found that around one third of 8-17 year olds create fake profiles to sign up for an adult account, and almost half of 8-15 year olds have accounts claiming to be 16 or older,” Maler concluded.
