INTRODUCTION
The advent and evolution of the internet have spurred innovation aimed at making learning easier for children, driving a rapid shift towards digital education. It is a school without walls, through which children can learn globally. From virtual classrooms to interactive apps, online learning platforms have become indispensable. However, this shift has also ushered in new challenges, particularly regarding the privacy and security of children's data. These platforms collect, store, and process vast amounts of personal information, often without following the data-use guidelines laid down by applicable data protection legislation, and concerns over data misuse, surveillance, and long-term digital footprints have gained prominence. It is therefore crucial to examine these risks, assess the legal safeguards in place, and propose best practices for protecting children's rights in the digital education ecosystem.
CHILD DATA AND LEGAL FRAMEWORK
Child data is any information that can directly or indirectly identify a child. Online learning platforms typically collect several categories of such data, including sensitive personal data:
- Personally Identifiable Information: Names, addresses, school IDs, dates of birth.
- Educational Records: Assessment results, performance analytics, learning patterns.
- Behavioural and Usage Data: Time spent on tasks, interaction logs, and click patterns.
- Biometric Data: Facial recognition (for login), voice recordings, and even eye movement tracking in some AI-based tools.
- Device and Location Data: IP addresses, device identifiers, geolocation.
While data collection is often justified on the grounds of personalisation and improvement of educational services, the best interests of the child must be the paramount consideration in any action undertaken by an individual or a public or private body. It is therefore essential to question the necessity, volume, and security of such data gathering, especially where it concerns minors.
Article 8 of the General Data Protection Regulation (GDPR) provides that:
- Where point (a) of Article 6(1) applies, in relation to the offer of information society services directly to a child, the processing of the personal data of a child shall be lawful where the child is at least 16 years old. Where the child is below the age of 16 years, such processing shall be lawful only if and to the extent that consent is given or authorised by the holder of parental responsibility over the child. Member States may provide by law for a lower age for those purposes, provided that such lower age is not below 13 years.
- The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.
- Paragraph 1 shall not affect the general contract law of Member States, such as the rules on the validity, formation or effect of a contract in relation to a child.
The Nigeria Data Protection Act 2023 is largely silent on the usage of children's data.
However, section 8 of the Child Rights Act provides that:
- Every child is entitled to their privacy, family life, home, correspondence, telephone conversations and telegraphic communications, except as provided in subsection (3) of this section.
- No child shall be subjected to any interference with his right in subsection (1) of this section, except as provided in subsection (3) of this section.
- Nothing in this provision of subsections (1) and (2) of this section shall affect the rights of parents and, where applicable, legal guardians, to exercise reasonable supervision and control over the conduct of their children and wards.
It is also to be noted that Chapter IV of the 1999 Constitution of the Federal Republic of Nigeria, as amended, also applies to a child and must be respected. Likewise, Article 16 of the United Nations Convention on the Rights of the Child states that:
- No child shall be subjected to arbitrary or unlawful interference with his or her privacy, family, home or correspondence, nor to unlawful attacks on his or her honour and reputation.
- The child has the right to the protection of the law against such interference or attacks.
We must understand that children are vulnerable, which makes them vulnerable data subjects. The GDPR, in recognition of this, provides a minimum age for lawful data processing without parental consent, with room for national laws to lower the threshold to 13. The Children's Online Privacy Protection Act (COPPA) of the United States mandates verifiable parental consent before collecting data from children under 13. The Age-Appropriate Design Code (Children's Code) of the United Kingdom mandates that online services likely to be accessed by children provide high levels of privacy protection by default. The Nigeria Data Protection Act 2023 is not child-specific; it only requires data controllers to process personal data fairly and with the subject's consent.
Despite these laws, enforcement remains inconsistent. Many jurisdictions lack clear, child-specific data protection frameworks, and where such laws exist, institutional capacity to enforce compliance remains weak.
RISKS ASSOCIATED WITH DATA COLLECTION
1. Invasion of Privacy
Children often lack the legal and cognitive maturity to provide informed consent. EdTech platforms are required to obtain verifiable parental consent; in practice, however, this requirement is often not observed. Many platforms also fail to use child-friendly privacy notices, risking violations of children's privacy rights.
2. Commercial Exploitation
Many EdTech companies use collected data for profiling and behavioural advertising, purposes that go beyond the educational basis on which the data was gathered. This is particularly concerning where monetisation strategies are disguised within educational tools and the data is neither anonymised nor pseudonymised, thereby compromising children's autonomy and exposing them to consumerism at a young age. Where the data is altered or repurposed in this way, it may also amount to a breach of data integrity.
3. Data Breaches and Cybersecurity Threats
Many EdTech platforms have inadequate encryption, weak authentication systems, and poor data storage practices, making them frequent targets for hackers.
4. Permanent Digital Footprints
Children's digital activities create lasting records. These digital footprints contain sensitive personal details that need to be safeguarded and over which individuals may hold rights. When EdTech platforms fail to comply with statutory privacy requirements by improperly storing, sharing, or repurposing data without consent, children's future educational or employment opportunities may be affected.
5. Lack of Transparency
Many platforms provide ambiguous privacy policies that are neither accessible nor understandable to children or their guardians. In such cases, consent may not be validly given, and the data subject may not be properly informed. This obfuscation undermines accountability and informed decision-making.
RESPONSIBILITIES OF STAKEHOLDERS
1. Platform Developers
Developers must entrench privacy-by-design and privacy-by-default principles and align their products with applicable privacy guidelines. They should ensure minimal data collection, proportionate to the purpose for which the data is needed, avoid dark patterns, and restrict third-party tracking to ensure accountability for data privacy.
2. Educational Institutions
Schools and ministries of education owe a duty of care to children and should assess the data protection practices of third-party vendors. Procurement policies should prioritise platforms that demonstrate compliance with international child data protection standards and applicable privacy guidelines.
3. Parents and Guardians
Parents and guardians should be digitally literate, as this will make them proactive in supervising their children's online learning, reading privacy terms, and using platform controls to limit data sharing.
4. Government and Regulators
Governments should enact or strengthen data protection laws that specifically address children's rights in digital spaces and establish institutions to enforce them. Regulators must conduct regular audits of compliance with privacy guidelines, including rules on data retention, impose penalties on defaulters for non-compliance, and raise public awareness through sensitisation.
BEST PRACTICES AND SAFEGUARDS
In the words of Steve Wood: “Regulation has an effective impact in protecting children's safety and privacy online. There is a shift towards substantive design changes that build in safeguards by default - from private account settings to restrictions in targeted advertising”. When protecting a child's safety, safeguards which can be put in place include:
- Data Minimisation: Platforms should collect only the data strictly necessary for legitimate educational purposes.
- Parental Control Dashboards: Parents and guardians should be able to review, limit, or delete their children's data. Because children are vulnerable data subjects in the custody of their parents, the law obliges parents to enforce on their behalf the rights accorded to data subjects.
- Privacy Notices for Children: Notices should be simplified, visual, and age-appropriate, explaining how data is used.
- Independent Audits and Certifications: Third-party reviews must be conducted to verify compliance.
- Transparency Reports: Platforms are required to publish regular reports on data processing and sharing as prescribed by law. (Nigerian Data Regulation 2025)
CONCLUSION
The introduction of online learning has brought children and parents benefits that cannot be ignored, but those benefits must not come at the cost of children's privacy and dignity. Protecting children's data is a shared responsibility that demands ethical design, robust regulation, and conscious parenting; a child's rights are of the utmost concern.
As children increasingly learn, interact, and grow in digital environments, we must ensure their rights are upheld and their vulnerabilities respected. A child's rights are not negotiable; future policies and platform innovations must be grounded in the best interests of the child, not profit or convenience.
REFERENCES
- United Nations, Convention on the Rights of the Child (adopted 20 November 1989, entered into force 2 September 1990) 1577 UNTS 3 (CRC), Art 16.
- Elizabeth Denham, ‘Children and the GDPR: A Rights-Based Approach to Data Protection' (ICO, 2021).
- 5Rights Foundation, ‘Disrupted Childhood: The Cost of Persuasive Design' (2018) https://5rightsfoundation.com/uploads/Disrupted-Childhood.pdf accessed 17 April 2025.
- CRC (n 1) Arts 3, 16.
- Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 (General Data Protection Regulation) [2016] OJ L119/1, Art 8.
- Children's Online Privacy Protection Act of 1998 (15 USC §§ 6501–6506).
- Nigeria Data Protection Act 2023 (NDPA).
- Information Commissioner's Office (ICO), ‘Age Appropriate Design Code' (UK, 2021) https://ico.org.uk/for-organisations/childrens-code-hub/ accessed 17 April 2025.
- Child Rights Act 2003 (Nigeria).