Privacy and AI in the Education Industry
Data privacy has emerged as a growing concern in the U.S. education industry for several reasons. First, the country lacks a comprehensive federal privacy law, leaving states to enact their own regulations, each with its own variations and nuances. Second, education technology (EdTech) services have expanded in reach and importance, especially since the 2020 pandemic. Third, AI-based technology is growing at a rapid pace.
When it comes to children’s privacy, there are complex issues to be solved. Australia recently imposed a ban on minors using social media, but is this the right approach? Does a social media ban completely safeguard children’s data privacy? What about entrusting parents with safeguarding children’s privacy? Is this effective? What about in cases of parental neglect or abuse? How can regulators age gate? Children’s privacy is nuanced, and we are still looking for effective answers to these questions. In this article we will look at the current state of data privacy in the education industry as it affects children, including:
Regulatory and ethical considerations
Current and upcoming laws
Steps to take to avoid negative legal and customer actions
Recent key cases of data privacy enforcement in the education industry
A recent landmark case involved College Board collecting students’ data and licensing it to colleges, scholarship programs, and other customers. The Office of the Attorney General (OAG) investigated and found that College Board solicited students to provide information, such as their GPA, anticipated course of study, interest in a religiously affiliated college and religious activities, and parents’ level of income, during the administration of PSAT, SAT, and AP exams. Students were also asked to provide information when they signed up for a College Board online account. While providing this information was optional, students were pressured to provide it in the context of an important exam and were encouraged to sign up because it would connect them with scholarship and college opportunities. The investigation also found that College Board used students’ data for its own marketing. In February 2024, the Office of the New York Attorney General and the New York State Education Department reached a $750,000 settlement with College Board over these violations.
In another noteworthy case, the Federal Trade Commission sued the EdTech company Edmodo for violating the Children’s Online Privacy Protection Act (COPPA) Rule by collecting personal information (PI) such as device information, IP addresses, and location from schoolchildren without parental consent, and using the data for advertising purposes. Edmodo had placed the responsibility for COPPA compliance on schools and teachers. The FTC, however, stated that Edmodo violated the FTC Act’s prohibition on unfair practices by relying on schools to obtain verifiable parental consent.
In a case concerning student data security, U.S. Senator Tom Cotton launched an inquiry in March 2024 into Tutor.com (which was recently acquired by the Chinese private equity firm Primavera Capital Group) for allegedly collecting students’ personal data, such as location and IP addresses, which can then be accessed by the offshore parent company. In the Senator’s letter to the Secretary of Defense, Lloyd Austin, he wrote, “In January 2022, Primavera Capital Group—a Chinese-owned corporation associated with TikTok’s parent company, ByteDance—acquired Tutor.com,” adding, “While providing educational services, Tutor.com collects personal data on users, such as location, internet protocol addresses, and contents of the tutoring sessions.” U.S. Senator Bill Cassidy also launched an inquiry into Tutor.com, urging the company to provide information on its policies for securing users’ personal information and on how it ensures that Americans’ sensitive data remains safe. Under Chinese law, companies based in China are required to “support, assist, and cooperate with state intelligence work,” meaning a company can be compelled to share information with the Chinese regime. The inquiry is currently ongoing.
Overall, a lack of safeguards and knowledge around children’s data privacy prevails largely due to the absence of appropriate regulations. Currently in the U.S. there are two significant laws that regulate children's data privacy: FERPA and COPPA.
The current legal landscape protecting children’s data
The first, older regulation is the Family Educational Rights and Privacy Act (FERPA), which intends to safeguard the confidentiality of student education records. However, FERPA, enacted in 1974, fails to adequately regulate today’s EdTech landscape. The law allows data sharing with “school officials” who have a “legitimate educational interest” in student data, but lawmakers could not have foreseen that platforms like Google Classroom might one day be designated by schools as “school officials”. Moreover, FERPA only applies to schools, creating gaps in protection when private EdTech companies are involved in education.
The second, more widely enforced law in the EdTech space is the Children’s Online Privacy Protection Act (COPPA). COPPA, enforced by the Federal Trade Commission (FTC), is, according to the Interactive Advertising Bureau, the global gold standard in data privacy rules for children. And even though the law is more than two decades old, momentum for COPPA enforcement has only increased due to heightened consumer awareness of privacy breaches and precedents set in litigation against ad buyers and sellers. In January 2025, the FTC amended COPPA to address children’s privacy concerns in light of current technological advancements. These amendments include a broader definition of personal information (which now includes biometrics and government-issued identifiers), parental consent for data disclosures, stronger data retention limitations, and more.
The lack of a comprehensive federal privacy law has led more than a dozen states to enact their own laws regulating the collection, use, and storage of residents’ personal data; several of these state laws include provisions for children’s data privacy.
In January 2024, California, which has already established itself as a frontrunner in U.S. state privacy legislation, introduced two new bills aimed at bolstering children’s privacy rights. The Children’s Data Privacy Act (AB 1949) proposes amendments to the CCPA to strengthen protections for those under 18, including raising the age limit for affirmative authorization for data selling or sharing from 16 to 18. The second bill proposed is the Social Media Youth Addiction Law, which intends to regulate addictive features on social media platforms for users under 18. Both bills are yet to be passed.
While some state privacy laws, such as those in Utah, Florida, and Texas, cover children’s data privacy, they are largely aimed at social media platforms. Utah’s Social Media Regulation Act, for example, does not permit minors to open social media accounts without express consent from their parents. The Texas Data Privacy and Security Act requires parental consent for specific collections and uses of a minor’s information. The Florida Digital Bill of Rights, meanwhile, limits the access of minors under the age of 18 to social media and other “online platforms,” which are defined to include online games and online gaming platforms, and requires affirmative authorization for these minors.
The Maryland Online Data Privacy Act, which goes into effect in October 2025, has specific requirements for higher education institutions under Code Title 10, Subtitle 13A. Under this law, universities must maintain a privacy program that is periodically reviewed by a third party with expertise in data security. They are required to display their privacy notices clearly on their websites to ensure visibility. And they must include in their contracts requirements that third parties adhere to the institution’s privacy policy.
To address concerns around the health information collected by schools and universities, in 2023 the United States Department of Education (DOE) released two new guidance documents aimed at educational institutions, reminding them of their continued obligations to protect students’ rights under the Family Educational Rights and Privacy Act (FERPA). The first document provides educational institutions with an overview of FERPA; the second is directed at students and is a one-page ‘know your rights’ guide tailored to health data and its disclosure.
Current concerns for children’s data
As of 2024, safeguarding data privacy in the education and EdTech industry faces several roadblocks.
One of them is third-party vendors and contracting: schools, universities, and other educational institutions use third-party software for administration and learning programs, and these third parties often are not compliant with privacy regulations. They use the data they are provided for advertising purposes or, in some cases, sell it to other parties. In the case of Edmodo, the FTC found that the company used the data it collected for its own advertising purposes.
Another growing concern for children’s data comes with the rise in the use of AI. While AI undoubtedly provides plenty of benefits to education, such as automated assessments, personalized learning, translation platforms, collaborative learning platforms, and more, it also has its disadvantages. AI-powered educational tools, by nature, collect student data, which is often personal and/or sensitive, and this collection is often done without prior informed consent. Further, AI systems often exhibit bias, which can result in discriminatory output. This is detrimental for children who depend on AI systems for their learning.
Third is the lack of proper infrastructure and funding for appropriate data security measures. A report by Comparitech found that since 2005, schools and universities in the U.S. have experienced more than 3,500 data breaches. Over the past few years, colleges and universities have increasingly been the targets of such attacks, so much so that the FBI issued a warning for higher education in March 2021 following a string of ransomware attacks that affected colleges like Florida International University, the University of Arkansas, and others. Most schools and some universities have little funding or technical expertise to handle the amount of sensitive data they possess. Even large school districts find it difficult to keep up with the continual security alerts, patches, and updates needed to maintain secure systems of their own, and educators are increasingly reliant on EdTech vendors for basic tasks. While upgrading core systems is an expensive undertaking, higher education institutions are prioritizing it, a Gartner report found, as the average cost to remediate a ransomware attack is $1.42 million, according to Educause.
And finally, there is a general lack of public knowledge about how data is collected, what it can be used for, how AI systems are powered, and what existing laws and guidelines require. This prevents parents and guardians from protecting their children from data privacy risks.
Solutions going forward
As technology continues to evolve and more robust laws are enforced, it will become vital for schools, teachers, and parents to have appropriate knowledge of the privacy concerns around EdTech and AI, and how the two intersect with data privacy. Some immediate steps we can take include:
Parents should be given the opportunity to provide informed consent when necessary. This requires schools to go beyond the legalese of opt-ins and terms and conditions and plainly inform parents how student data is collected and what it will be used for. This shows good faith with regulators and builds goodwill with students and parents.
Vendor/school contracts must be clear. Contracts between schools and EdTech vendors should reference all privacy and security laws applicable to both parties and clearly state the data types that will be collected, used, and disclosed as part of the services. Utilize trusted partners to review both vendor risk and what should be communicated to students and parents.
Ongoing vendor transparency is key. EdTech vendors and schools should maintain transparency about their data collection and processing practices, which includes keeping parents in the loop through privacy policies and other such notices. Schools, and their privacy advisors, should closely monitor changes in vendors’ data usage, transfers, and policies, and update parents and students on any changes.