
Children’s Privacy: Are we doing enough?


Privacy and AI in the Education Industry

With the increasing integration of technology into daily life, it has become challenging to monitor what data is being shared online and how it is used. While significant progress has been made in safeguarding individual privacy, children remain particularly vulnerable due to limited oversight and protection online. Recent legislative efforts are attempting to address these gaps, but challenges persist. One of the biggest challenges in protecting our children’s privacy is a lack of awareness among both children and their guardians. Parents often begin sharing their children’s information online without realizing they are creating a digital footprint. This can start with something as simple as sharing a baby’s in-utero ultrasound images.


This digital footprint, though often created innocently, can later expose children to significant risks in their online interactions. Children’s own online activity adds further dangers, including physical, mental, and emotional harm. Access to inappropriate content can amplify struggles such as body-image issues and suicidal thoughts, underscoring the need for stricter online safeguards.


In 2019, Google and YouTube settled an FTC investigation for $170 million over COPPA violations tied to YouTube’s collection of children’s data for ad targeting. The settlement also required YouTube to have channel owners identify child-directed content and to stop serving personalized ads on that content. Corporate accountability in cases like this has prompted new technological solutions to protect children online.


In response to growing concerns over AI-generated deepfakes and child exploitation risks, the FTC has proposed new age-verification standards using encrypted identity verification (EIV) technology to strengthen online age-gating mechanisms. These standards aim to prevent minors from circumventing age restrictions on platforms like TikTok, Discord, and gaming networks. Today, however, many platforms simply state a minimum age in their terms of service, a restriction children can easily bypass by misstating their birth date.
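To see why self-declared age gates are so weak, consider a minimal sketch of how one works. This is illustrative Python, not any platform’s actual implementation; the threshold and function names are assumptions.

```python
from datetime import date

MIN_AGE = 13  # COPPA's threshold for child-directed online services


def age_on(birth_date: date, today: date) -> int:
    """Whole years elapsed between birth_date and today."""
    return today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day))


def passes_age_gate(claimed_birth_date: date, today: date) -> bool:
    # The gate sees only what the user typed: a child who enters an
    # earlier birth year sails through, which is precisely the weakness
    # that stronger verification standards aim to close.
    return age_on(claimed_birth_date, today) >= MIN_AGE


print(passes_age_gate(date(2015, 3, 1), date(2024, 6, 1)))  # honest 9-year-old: False
print(passes_age_gate(date(2000, 3, 1), date(2024, 6, 1)))  # false birth year: True
```

Nothing in the check ties the claimed date to the real person, which is why proposals have shifted toward verified identity signals rather than self-declaration.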


The U.S. has historically relied on COPPA (the Children’s Online Privacy Protection Act), which applies to websites and online services directed at children under 13. However, COPPA’s limitations have become apparent as children’s digital habits evolve.


In early 2024, Meta introduced “zero-data tracking” accounts for minors on Instagram and Facebook, restricting behavioral ad tracking. While this is a step forward, experts argue that stronger enforcement mechanisms are still needed at the federal level in the U.S. Previously, Instagram accounts of child users were set to “public” by default, making their social media content public as well; users had to manually switch to “private” through the account privacy settings. This violated the Data Protection by Design and by Default requirement under Article 25 of the GDPR.
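The “by default” part of Article 25 can be illustrated with a small sketch: a settings object whose initial values are the most privacy-protective ones, so a minor’s account starts private without any action from the user. The field names here are hypothetical, not Instagram’s actual configuration.

```python
from dataclasses import dataclass


@dataclass
class MinorAccountSettings:
    # "Data protection by default": every field starts at its most
    # privacy-protective value, and loosening any of them requires an
    # explicit action by the user, never the reverse.
    profile_public: bool = False        # account is private from day one
    behavioral_ads: bool = False        # no ad tracking for minors
    contact_by_strangers: bool = False  # message requests disabled


settings = MinorAccountSettings()        # a new minor's account...
assert settings.profile_public is False  # ...is private unless changed
```

The design Instagram was faulted for is the inverse of this: protective values existed, but the user had to opt in to them.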


The California Age-Appropriate Design Code Act (AADC, AB 2273) was challenged in federal court in 2023 by tech industry groups arguing that it violated the First Amendment. As of 2024, the law’s enforcement has been delayed pending appeals, but similar federal legislation is gaining traction in Congress.


In January 2024, Congress reintroduced a revised version of KOSA (the Kids Online Safety Act) with bipartisan support, incorporating stricter AI content-moderation requirements and increased penalties for social media platforms that fail to protect minors from harmful content. However, debates over free-speech protections and platform liability have stalled the bill’s progress.


In September 2024, the FTC released a report criticizing major social media companies for their extensive data collection practices, particularly concerning children. The report emphasized that these platforms often collect and retain user data indefinitely, including metadata embedded in shared images, which can be exploited by AI-powered facial recognition systems.
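To make the metadata risk concrete: JPEG photos carry such metadata in an EXIF (APP1) segment that can be detected, and removed, before an image is shared. The sketch below operates on raw JPEG bytes using only the Python standard library and a tiny synthetic file; real tooling should handle many more edge cases.

```python
def has_exif(jpeg: bytes) -> bool:
    """Return True if a JPEG byte stream contains an EXIF (APP1) segment."""
    i = 2  # skip the SOI marker (0xFFD8) that starts every JPEG
    while i + 4 <= len(jpeg) and jpeg[i] == 0xFF and jpeg[i + 1] != 0xDA:
        marker = jpeg[i + 1]
        length = int.from_bytes(jpeg[i + 2:i + 4], "big")  # includes itself
        if marker == 0xE1 and jpeg[i + 4:i + 10] == b"Exif\x00\x00":
            return True
        i += 2 + length
    return False


def strip_exif(jpeg: bytes) -> bytes:
    """Copy a JPEG byte stream, dropping any EXIF (APP1) segments."""
    out = bytearray(jpeg[:2])  # keep SOI
    i = 2
    while i + 4 <= len(jpeg) and jpeg[i] == 0xFF and jpeg[i + 1] != 0xDA:
        length = int.from_bytes(jpeg[i + 2:i + 4], "big")
        segment = jpeg[i:i + 2 + length]
        if not (jpeg[i + 1] == 0xE1 and segment[4:10] == b"Exif\x00\x00"):
            out += segment  # keep non-EXIF segments
        i += 2 + length
    out += jpeg[i:]  # scan data and everything after it is copied verbatim
    return bytes(out)


# Tiny synthetic JPEG: SOI + EXIF APP1 segment + SOS + EOI markers.
fake = (b"\xff\xd8"                                          # SOI
        + b"\xff\xe1\x00\x0c" + b"Exif\x00\x00" + b"\x00" * 4  # APP1/EXIF
        + b"\xff\xda\x00\x02"                                # SOS (empty demo)
        + b"\xff\xd9")                                       # EOI
print(has_exif(fake))              # True
print(has_exif(strip_exif(fake)))  # False
```

Location, timestamp, and device fields live in exactly this segment, which is why platforms that retain uploaded images indefinitely retain that information too unless they strip it.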


In response to the growing concerns over children’s online privacy, the FTC has proposed amendments to COPPA, including an updated definition of personal information (PI) that reflects modern digital data collection practices. These updates aim to strengthen protections for minors in the evolving digital landscape.


Parents and guardians play a crucial role in protecting their children’s online privacy. Reviewing app permissions, educating children about data sharing risks, and advocating for safer digital environments are essential steps. However, until laws are enacted and enforced, children remain vulnerable to data exploitation, exposure to harmful content, and long-term privacy risks. The responsibility lies not only with lawmakers but also with businesses and parents to create a safer digital world for the next generation.
