With the dawn of an era intent on preserving the rights of individuals with respect to data privacy, and the associated rights bestowed upon them, it would be unbecoming to leave behind the rights accruing in favor of the "children of men". Given a fast-paced digital ecosystem that is increasingly built around targeted, child-centric business activities and social platforms, it is time to move towards a conforming and morally encompassing privacy framework. With the very little control that a child exercises over their interests, and the excesses they wish to indulge in, it falls upon their guardians, their "digital custodians", to ensure that nothing goes awry.

The proliferation of smart toys, wearables, and mobile devices has led to the unchecked generation, collection, and processing of personal and sensitive personal data. A child's uninhibited desire to partake in online activities, and willingness to try "new and trending" devices, only adds to the woes of their guardians. As these devices blend into the background of their users' daily lives, information on the children who carry them, or who access the associated digital platforms, is seamlessly fed into the systems of systematic collectors.

The onset of augmented reality, virtual reality, and artificial intelligence-enabled tools, which rely intrusively on the incessant churning, processing, and analysis of user data, necessitates an overhaul of the current data privacy framework for the safekeeping of children. Blatant and excessive exposure to the profane corners of the digital space leaves children vulnerable to unfiltered and possibly irresponsible content. Conventionally, the approach taken by parents in controlling the sphere of their children's activities has been parochial, affording children little opportunity to explore beyond what is deemed "right" for them. That control has been upended by the proliferation of automated digital systems, which for the first time empowered children to decide for themselves. While as a society we are progressing towards allowing people (children included) to decide for themselves, it is not unfounded to say that children belong to a demographic bracket that is vulnerable, incapable and, for lack of a better word, uninitiated. It is with this premise that we proceed to examine the consequences of exposing children to pervasive, ubiquitous digital access, against the backdrop of the applicable data privacy and protection framework.

Online Excesses

Recently, Instagram, the photo and video sharing social networking service of the technology giant Facebook, attempted to build a child-centric platform that would have allowed children to create their own profiles without having to deal with adults per se. The company reportedly admitted that its intent was to allow children in the range of 10-12 years to become part of this closed and protected ecosystem while being online1. The company contended that kids of this age group already participate in the existing ecosystem in the guise of an older individual; so they might as well be allowed access under stricter terms and parental guidance.

Without getting into the merits of this, it does seem safer to have kids in a closed-door environment, without "peeping toms" in the vicinity. However, much as kids hold false accounts in the name of older people, there is nothing stopping an offending adult from doing the same, bending the rules to their own advantage. Coupled with issues like the early onset of social aggression, low self-esteem, and a lack of physical activity and real-world interaction, the affected kids could gravitate towards greater psychological problems in the long run.

There have been reports of highly disturbing incidents involving AR/VR experiences, wherein users have been subjected to unwanted sexual advances in the metaverse. The experience left adult users traumatized and scarred, because VR gives a heightened sense2 of real-world stimulus for any action made in the virtual space. Exposure of children to such debased and coarse virtual experiences may leave an indelible impression on their minds and create lasting psychological trauma. Such behavior outstrips the existing severity of online bullying and cyber harassment; its spilling over into the domains that children frequent has consequences far beyond what a child can comprehend.

Entities like Oculus disclaim liability for any inconvenience caused to users from a variety of sources when using Oculus Products, and seek an acknowledgement to the effect that content may be inaccurate, offensive, indecent, or otherwise objectionable3. While the company allows only children above the age of 13 years to participate in the virtual worlds running on its systems, the possibility of younger kids becoming part of this vile experience is real. Worse, even children above the age of 13 years may not be well prepared to encounter the misgivings this ecosystem has to offer, in the absence of appropriate remediation mechanisms put in place by these digital platforms.

Regulatory Landscape

In the realm of data privacy, consent plays a major role; this is inherently absent in the case of a minor availing the services of any service provider, for a child's consent does not qualify as free and/or valid consent.

In view of the aforementioned, and the differing levels of psychological capacity demonstrated by children under external influences, regulators across jurisdictions have placed different age limits below which consent must be sought from a parent or guardian. Accordingly, whereas the draft Data Protection Bill, 2021 (DP Bill) in India identifies any person under the age of 18 as a child, the European Union's General Data Protection Regulation (GDPR) and the Children's Online Privacy Protection Act [USA] (COPPA) benchmark this threshold at 16 years and 13 years respectively.
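For illustration only, a platform operating across these jurisdictions might encode the thresholds above as a simple lookup consulted before deciding whether guardian consent is required. The regime labels and function below are hypothetical, and the GDPR figure is taken as the default threshold without member-state derogations.

```python
from datetime import date
from typing import Optional

# Age-of-consent thresholds as described above (illustrative simplification).
CHILD_AGE_THRESHOLDS = {
    "IN_DP_BILL_2021": 18,  # draft Data Protection Bill, 2021 (India)
    "EU_GDPR": 16,          # GDPR default threshold
    "US_COPPA": 13,         # COPPA (USA)
}

def is_child(date_of_birth: date, regime: str, today: Optional[date] = None) -> bool:
    """Return True if the user falls below the consent threshold of the given regime."""
    today = today or date.today()
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    return age < CHILD_AGE_THRESHOLDS[regime]

# Example: a 14-year-old is a child under the DP Bill and the GDPR, but not under COPPA.
```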

COPPA stands as one of the most stringent standards in place to monitor and regulate the processing of children's data, and to afford children their right to privacy as a core and inalienable construct in the online world. Enacted in 1998, with its implementing rule last amended in 2013, COPPA was the culmination of a series of actions and investigations conducted by the Federal Trade Commission (FTC) to ascertain the data collection and processing practices of e-commerce websites in the USA. COPPA places the responsibility on parents to authorize and verify any data collection and processing activities conducted by websites, ensuring a degree of control over children's data being shared, transmitted, or collected by online portals. Accordingly, entities regulated under COPPA are mandated to post a clear, readable privacy policy describing their processing activities concerning children.

Further, entities must provide direct notice to parents at an identifiable location on the website, obtain verifiable consent from parents before collecting children's data, and provide parents with the right to access their children's information for review. Beyond these specific protocols, COPPA requires regulated entities to abide by the general principles of privacy relating to data integrity, storage minimization, and data retention.
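A minimal sketch of how these obligations might be sequenced before any collection takes place is set out below. The data structure and function names are hypothetical, standing in for whatever consent infrastructure a regulated entity actually operates.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ChildAccount:
    """Hypothetical COPPA-related state attached to a child's account."""
    direct_notice_sent_to_parent: bool = False
    verifiable_parental_consent: bool = False

def may_collect(account: ChildAccount, privacy_policy_posted: bool) -> bool:
    """Collection proceeds only once the obligations described above are met:
    a posted privacy policy, direct notice to the parent, and verifiable consent."""
    return (
        privacy_policy_posted
        and account.direct_notice_sent_to_parent
        and account.verifiable_parental_consent
    )

def parental_review(stored_records: List[dict]) -> List[dict]:
    """Parents retain the right to review the child's stored information on request."""
    return list(stored_records)
```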

Internet of Toys

Unlike smart toys, which merely enable interaction with their users, connected toys establish connectivity between the device in the child's hands and web-based servers, allowing information to be sought from the end user and pushed to those servers to build a profile of the child using the device. Where there is mere use of persistent identifiers, such devices would ideally be left outside the scope of some applicable laws, the data not necessarily qualifying as personal information. A slight overstep from this, into storing or processing a fragment or snippet of a conversation, may qualify the data as personal information, attracting the same level of compliance required for all other processing activities.
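The compliance cliff described above can be illustrated with a minimal, assumption-laden sketch: a connected-toy event carrying only a persistent device identifier is treated differently from one that also stores a conversation snippet. The field names are invented for illustration and do not track any particular law's definitions.

```python
from typing import TypedDict

class ToyEvent(TypedDict, total=False):
    device_id: str         # persistent identifier only
    audio_transcript: str  # stored fragment/snippet of a conversation, if any

def attracts_full_compliance(event: ToyEvent) -> bool:
    """Treat any stored conversation snippet as personal information requiring the
    full compliance regime; a bare persistent identifier may not, as noted above."""
    return bool(event.get("audio_transcript"))

# Example: telemetry with only a device_id -> False; with a transcript -> True.
```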

Verifiable parental/guardian consent must remain the primary ground for data collection and processing, so as to ensure oversight of children's activities and keep legal guardians informed of their wards' conduct in the online sphere. To combat the vulnerabilities that children may be exposed to, digital platforms have been adding riders to block entry by underaged or unqualified persons. Age gating has been viewed unfavorably by netizens, but it must be regarded as a necessary tool to deter unsavory elements from making their way into the accesses and privileges that children become privy to.

The earlier iteration of the data privacy legislation in India created a new segment of data fiduciary, the "guardian data fiduciary", to take up the responsibility of affording minors a semblance of control over the information pertaining to them. This category of custodian has not made its way into the present iteration, possibly because such a relationship between a data fiduciary and a data principal (minor) would not only be onerous, but also fraught with unnecessary challenges, inaccuracies, and uncoordinated implementation.

In the case of adults, a class of data fiduciaries functioning as consent managers has been introduced; such fiduciaries are entrusted with managing the consent of data subjects and, in practice, some have also taken on the task of sifting through the industry practices of the service providers a user might wish to engage with. Similarly, parents and legal guardians must take up the role of consent managers for their wards, managing consent across platforms while keeping the children's best interests at the core of their role as guardians.
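As a purely illustrative sketch of the guardian-as-consent-manager idea, a guardian could maintain a simple per-ward ledger of the consents granted or withdrawn on each platform. The class and platform names below are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, Tuple

@dataclass
class GuardianConsentLedger:
    """Hypothetical ledger a guardian might keep of a ward's platform consents."""
    ward: str
    consents: Dict[str, Tuple[bool, datetime]] = field(default_factory=dict)

    def grant(self, platform: str) -> None:
        self.consents[platform] = (True, datetime.now(timezone.utc))

    def withdraw(self, platform: str) -> None:
        self.consents[platform] = (False, datetime.now(timezone.utc))

    def has_consent(self, platform: str) -> bool:
        granted, _ = self.consents.get(platform, (False, datetime.min))
        return granted

# Usage: grant consent for one (hypothetical) platform, withdraw it for another.
ledger = GuardianConsentLedger(ward="child-001")
ledger.grant("example-learning-app")
ledger.withdraw("example-social-app")
```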

In order to ensure that no inherent bias seeps into the information fed as base datasets to artificial intelligence/machine learning algorithms, European regulators have taken a strict view towards the regulation of artificial intelligence4. Given the need to assess the impact on children who participate in this digital ecosystem, an equivalent set-up is required for the all-pervasive AR/VR solutions currently on offer.

Conclusion

In the context of the foregoing, in an economy like India, marked by disparity amongst socio-economic classes, it is imperative that consideration be given to the level of awareness of the people who participate in this digital ecosystem. For entities that facilitate gaming, gambling, or social interactions between people from distinct classes (where awareness and understanding are inherently unequal), requirements must be in place before such facilities are onboarded or such participants are admitted.

As these data subjects belong to an age group that cannot be treated as responsible for its own actions, and may be unable to understand the consequences of participation in this ecosystem, it is pertinent that an additional layer of protection be provisioned for such categories of data subjects. While data privacy focuses on the preservation of a data subject's rights, the representations and acknowledgements made by a data subject of questionable age remain susceptible to scrutiny in terms of admissibility and enforceability.

It is with the intent of serving a larger number that privacy norms must account for a wider populace, including those incapable of making prudent choices even in the simplest of circumstances. There was a time when the choice was as simple as cultivating plots of land for a single crop (Farmville); we have since moved to a point where children build territories to defend their own interests in an AR/VR world (Age of Empires).

To this end, it is imperative that a clear distinction be drawn between what is to be consumed by a child and what is meant for general public consumption.

Footnotes

1. https://about.instagram.com/blog/announcements/pausing-instagram-kids; last accessed on February 23, 2022 at 1040 hrs.

2. https://www.nytimes.com/2021/12/30/technology/metaverse-harassment-assaults.html; last accessed on February 23, 2022 at 1130 hrs.

3. https://www.oculus.com/legal/terms-of-service/; last accessed on February 22, 2022 at 1037 hrs.

4. Draft Artificial Intelligence Act, 2021

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.