Ayşe ADIGÜZEL, Trainee Attorney | Duygu AYTAÇ, Trainee Attorney
Keywords: Digital Privacy, Protection of Children, Verification Models, Age Verification.
INTRODUCTION
The digital world undeniably offers significant advantages in access to a wide range of products and services, and both adults and children benefit from them extensively. In its current state, however, the digital world can hardly be said to treat the protection of children’s privacy as a primary concern. Children’s inexperience and vulnerability online make them extremely susceptible to threats and abuse. Ensuring children’s digital security is not limited to protecting their data privacy; it is also crucial for building a healthy and secure digital identity by safeguarding their social, psychological, and legal well-being.
Research conducted in partnership with ECPAT, INTERPOL, and UNICEF¹ reveals that more than 175,000 children go online for the first time every day and that children constitute approximately one-third of internet users worldwide. Reports on online child sexual abuse material further indicate that online child sexual abuse has increased by 15,000% over the past 15 years. These findings underscore the necessity of protecting children’s safety in the digital environment.² Against this background, the central question of this article is how responsibility for ensuring children’s online safety should be shared among parents, online service providers, and regulatory authorities.
1. PROTECTION OF CHILDREN’S RIGHT TO PRIVACY IN A GLOBAL PERSPECTIVE AND THE RESPONSIBILITY OF REGULATORY AUTHORITIES
Significant steps have been taken recently to protect children’s privacy in the digital world. Below, this matter will be addressed by examining how data protection standards have developed and diversified on a global scale, with current examples from various regulatory authorities.
The UK Information Commissioner’s Office (“ICO”) published “Age-Appropriate Design: A Code of Practice for Online Services”³ (the “Guide”) on 12.08.2020, and compliance with the Guide became mandatory for online service providers as of 02.09.2021. For children’s safety and privacy, not only services designed specifically for children but all digital services that children may access, even unintentionally, must comply with the Guide’s standards. This approach establishes standards that give children a safe experience in the digital environment and minimize the risks arising from their access to online services. One important standard is that age verification pages must be designed to prevent children’s access to adult content effectively and transparently. The ICO accordingly recommends determining the user’s age accurately and avoiding the associated risks; where accurate age determination is not possible, it recommends applying the children’s standards to all users. In this light, the Guide’s primary purpose is not to protect children from the digital world, but to protect them within it.
In Australia, recognizing the importance of children’s privacy in the digital world, new legislation has recently been introduced under the Online Safety Amendment (Social Media Minimum Age) Bill 2024 (the “Bill”). The Bill defines platforms that enable online social interaction among multiple users and allow content sharing as “age-restricted social media platforms.” Providers of these platforms are obliged to prevent users under 16 years of age from creating accounts, and platforms failing to comply face administrative fines of up to 49.5 million Australian dollars.
One of the most recent developments at the global level is the decision issued against OpenAI by the Italian Data Protection Authority (“Garante”).⁴ In that decision, OpenAI’s failure to use an effective age verification model, and its consequent failure to take adequate measures to prevent children from being exposed to inappropriate content generated by artificial intelligence, was one of the key factors leading to an administrative fine of 15 million euros.
In Turkey, work specifically addressing the protection of children’s personal data is under way, and the Personal Data Protection Authority (the “Authority”) has published announcements, posters, and brochures addressed to children, parents, and online service providers. The Authority has also created a cartoon character named Verican⁵ who offers recommendations for keeping children safe online, aiming to raise children’s awareness of, and interest in, a secure digital future. In addition, the Personal Data Protection Board (the “Board”) imposed a total administrative fine of TRY 11.5 million on Meta because child users’ private Instagram accounts could be converted into business accounts, making those accounts visible to all users, and the necessary security measures for protecting children’s privacy had not been taken in that process.⁶ Notably, this is the highest administrative fine the Board has imposed to date. The Ministry of Family and Social Services (the “Ministry”) has announced that age verification models will be introduced for online service providers to reduce the risks children face on online platforms in Turkey; the “Digital Identity” model mentioned in those announcements is examined below.
In light of this information, it is evident that authorities and governments worldwide are making complementary decisions to protect children in the digital world.
3. RESPONSIBILITY OF ONLINE SERVICE PROVIDERS
Online service providers range from search engines to platforms that enable users to access content or information, share content, or interact online. As the legal regulations discussed above show, the responsibility of online service providers for protecting children’s privacy is increasing significantly. For example, processing biometric data through facial recognition technology to verify the age of young children may amount to a violation of the Guide’s standards rather than compliance with them. Determining a model that can both accurately verify a child’s age and remain privacy-friendly is therefore of critical importance.
3.1. Age Verification on Online Platforms: Common Models and Associated Risks
The need to verify user ages in line with the responsibilities of online service providers has become increasingly critical due to the large user base of these digital platforms and, in particular, the necessity of protecting children. The design and implementation of age verification models aim to protect children’s safety in the digital world by preventing their exposure to harmful content. The models commonly used today are set out below.
Declaration-Based Verification: Users verify their age by declaring their date of birth, confirming they are above a certain age, or ticking a confirmation box.⁷ Because the model includes no additional procedure for checking the accuracy of the information provided, children can easily circumvent it.
Document-Based Verification (“hard identifiers”): Users verify their age by submitting identity documents accepted as valid by competent authorities, such as driver’s licenses or passports. In this model, however, the other personal data contained in the submitted documents is shared as well. Processing that additional data raises concerns: it may violate the data minimization principle in a manner incompatible with the purpose of age verification and may enable practices that further endanger privacy.
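The data minimization concern can be illustrated with a small sketch (the record fields and function below are invented for illustration; a real flow would parse the document via OCR or an identity API): only the field needed for the stated purpose is retained, and everything else is discarded immediately.

```python
# Hypothetical record parsed from a submitted ID document; every field
# name and value here is invented for illustration.
RAW_DOCUMENT = {
    "name": "Jane Doe",
    "document_no": "A1234567",
    "address": "1 Example Street",
    "date_of_birth": "2012-05-14",
}

def minimize_for_age_check(document: dict) -> dict:
    """Keep only the field the stated purpose (age verification) requires.

    Discarding the rest of the record right after extraction is one way
    to honor the data minimization principle the text describes;
    retaining the full document would exceed the verification purpose.
    """
    return {"date_of_birth": document["date_of_birth"]}

print(minimize_for_age_check(RAW_DOCUMENT))  # {'date_of_birth': '2012-05-14'}
```

In practice the risk the text identifies arises exactly when a provider stores the full `RAW_DOCUMENT` rather than the minimized result.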
Third-Party Age Verification Services: Online service providers verify their users’ ages through third-party age verification products and/or services. Such providers can perform verification through methods including identity document checks, biometric analysis, or credit card verification. It is therefore essential that these external providers ensure data security and privacy and adhere to predetermined security protocols.⁸ These models appear in forms such as dedicated age verification platforms, verification via social media, or biometric verification. Age verification platforms mainly rely on identity verification, knowledge-based identity verification, or credit card checks. Verification via social media, by contrast, estimates age by analyzing the data in users’ social media profiles and their activity on the platform.⁹ This model, however, carries threats such as the use of the collected data for profiling, targeted advertising, or unlawful purposes, and may endanger children’s online safety. The most controversial of the commonly used models is biometric verification, in which AI-powered facial recognition systems estimate age from facial features. Biometric methods raise substantial concerns, chief among them third-party access to the camera and the possibility that analyzing facial details reveals not only a person’s age but also their gender and even health status.
As noted above, obtaining age verification services from third parties creates a new market for business models centered on data privacy online. Since the data collected for this purpose may allow more detailed profiling of individuals’ digital identities, third-party providers must not process data beyond the limits of the purpose, and children’s digital privacy must not be endangered. How the obligations arising from age verification services obtained from third parties are to be distributed among the relevant parties should be regulated separately by the competent authorities.
Payment-Based Verification: Users verify their age by providing valid credit card information, by having an amount temporarily held on the card (a provision), or by paying a micro-charge that confirms the card’s validity. Because many children today have access to both prepaid and parent-linked cards, the model’s effectiveness remains debatable unless systems integrated with banks can signal that the cardholder is a child.
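The gap the text identifies can be sketched as follows. All class and method names here are hypothetical (no real payment gateway exposes this API): the `holder_is_minor` flag stands in for the bank-integrated information flow the text says would be needed, because the hold itself proves only that the card is valid.

```python
from dataclasses import dataclass

@dataclass
class Card:
    number: str
    holder_is_minor: bool  # information only the issuing bank would hold

class HypotheticalGateway:
    """Stand-in for a real payment provider; this API is invented."""

    def authorize_hold(self, card: Card, amount_cents: int) -> bool:
        # A real gateway would place a temporary hold (provision) on the
        # card and later release it; here we only model card validity.
        return card.number.isdigit() and len(card.number) == 16

def verify_age_by_card(card: Card, gateway: HypotheticalGateway) -> bool:
    """Card validity alone proves little about the user's age."""
    if not gateway.authorize_hold(card, amount_cents=100):
        return False
    # Without a bank-side signal such as this flag, prepaid and
    # parent-linked cards held by children would pass the check.
    return not card.holder_is_minor
```

If the final line is removed, which reflects how most such checks work today, a child with access to any valid card is treated as an adult.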
User-Based Age Verification: These models can be implemented as knowledge-based verification, in which users are asked questions that individuals in a certain age group would be able to answer, or as device-based verification, in which age is estimated from the user’s device type, browser history, or other digital traces. The Authority has referenced the knowledge-based approach in its brochure on the subject, recommending that age be verified with questions a child of the claimed age could answer.¹⁰ Naturally, such questions must be prepared with expert assistance. Even so, children may unknowingly share a great deal of personal data with the digital world under this model, and device-based verification, by monitoring children’s online behavior and recording all their digital activities, may lead to data collection beyond the intended purpose and thereby endanger children’s digital privacy.
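A knowledge-based check along the lines the Authority’s brochure suggests might look like the sketch below. The question pool, age band, and threshold are all hypothetical, purely illustrative stand-ins for the expert-curated content the text says is required:

```python
# Hypothetical, expert-curated question pools per claimed age band; the
# idea is to ask questions a child of the claimed age could answer.
QUESTION_POOLS = {
    "9-12": [
        {"prompt": "7 x 8 = ?", "accepted": {"56"}},
        {"prompt": "How many days are in a week?", "accepted": {"7", "seven"}},
        {"prompt": "Which season comes after winter?", "accepted": {"spring"}},
    ],
}

def knowledge_check(claimed_band: str, answers: dict, required: int = 2) -> bool:
    """Count correct answers for the claimed age band against a threshold."""
    pool = QUESTION_POOLS.get(claimed_band, [])
    correct = sum(
        1 for q in pool
        if answers.get(q["prompt"], "").strip().lower() in q["accepted"]
    )
    return correct >= required
```

Note that every answer submitted is itself personal data about the user, which is the over-collection risk the paragraph above highlights.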
Account Holder Consent / Parental Consent (Indirect Age Verification Models): The user’s age is verified on the basis of the consent of an existing account holder confirmed to be an adult. This model is particularly common in subscription-based services: adult account holders create child profiles, restrict access through passwords and similar controls, and perform verification in comparable ways. Here, primary responsibility rests with the parents.
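The gating logic of this model can be sketched as follows (all names are hypothetical, and the hashing is deliberately simplified; a real system would use a salted key-derivation function rather than plain SHA-256): a child profile can only exist under a verified adult account, and leaving its restrictions requires the parent’s PIN.

```python
import hashlib
from dataclasses import dataclass, field

def _hash(pin: str) -> str:
    # Illustrative only; a real system would use a salted KDF, not bare SHA-256.
    return hashlib.sha256(pin.encode()).hexdigest()

@dataclass
class Account:
    holder_verified_adult: bool     # established by a separate verification step
    pin_hash: str = ""
    child_profiles: list = field(default_factory=list)

def create_child_profile(parent: Account, child_name: str, pin: str) -> bool:
    """Only a verified adult account may open a restricted child profile."""
    if not parent.holder_verified_adult:
        return False
    parent.pin_hash = _hash(pin)
    parent.child_profiles.append(child_name)
    return True

def unlock_restricted_content(parent: Account, pin: str) -> bool:
    # Leaving the child profile (e.g. to reach unrestricted content)
    # requires the parent's PIN, which is exactly where this model
    # places primary responsibility on the parent.
    return bool(parent.pin_hash) and parent.pin_hash == _hash(pin)
```

The design choice is visible in the code: the platform verifies the adult once, and everything afterwards depends on the parent guarding the PIN.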
3.2. The Future of Age Verification Models: Can Digital Identity Address Concerns?
Digital identity is a method that enables individuals to access online services through verifiable identity information. Today, China, Canada, and Australia have implemented digital identity applications for their citizens, and the approach is expected to be adopted on a global scale. European Commission officials, for their part, have stated that digital wallets are being developed to protect children on social media platforms and that this technology will be implemented primarily within the framework of the Digital Services Act. It has also been announced that an Age Assurance Toolkit¹¹ will be published to raise awareness of existing age verification methods that are both effective and privacy-protective.¹² All these developments show that the digitization of age verification systems is rapidly gaining momentum and that this transformation is growing in importance.
As mentioned above, Turkey likewise aims to adopt the Digital Identity model through a phased implementation based on an age-16 threshold, in line with expert opinions. This planned step would bring Turkey into alignment with digital security policies at the global scale.
To protect children in the digital world, age verification models must not only meet legal requirements but also prioritize user safety and privacy. Indeed, as the example methods above show, the fact that existing solutions can be easily circumvented by children thoroughly at home in the digital world, and may enable excessive data processing, raises significant questions about the effectiveness of age verification systems. It is therefore critically important to develop a usability approach that engages children and to implement innovative, reliable solutions. Ultimately, whichever age verification model is preferred, because the target group consists of children, the process must be conducted with far greater care, sensitivity, and diligence.
4. RESPONSIBILITY OF PARENTS
Although it is important for online service providers to fulfill their obligations under current regulations in full, parents clearly also have a critical role to play in children’s safety. The Convention on the Rights of the Child (“UNCRC”), one of the most fundamental global instruments on parental responsibility, has become one of the most widely ratified international conventions. The right to privacy guaranteed under Article 16 of the UNCRC is regulated in direct relation to the child’s other fundamental rights, such as freedom of expression and access to information. Thirty-two years after the convention’s adoption, and in light of the fact that one in three internet users is a child, the United Nations Committee on the Rights of the Child published General Comment No. 25 on children’s rights in relation to the digital environment on 04.02.2021, with provisions that address the balance between children’s privacy rights and parental supervision.
The aforementioned Guide, prepared by the ICO, states that it aims to reduce the risks children may encounter in the online world through parental controls. Digital tools for limiting browsing time, granting access only to approved sites, and restricting in-app purchases give parents effective support in protecting their children. However, as the child grows older and their right to privacy expands accordingly, parents should gradually reduce such monitoring of online behavior. The Guide details, by age group, methods for parents to protect children without violating their privacy.
In Garante’s decision¹³ on a father who shared photos of a child under the joint custody of divorced parents without the mother’s consent, attention is drawn to parents’ obligation to protect children’s privacy and to their joint responsibility. The decision thus imposes a dual responsibility: both parents must help ensure children’s digital security, and service providers must obtain the consent of both parents in legal approval processes.
CONCLUSION
It is evident that protecting children from the risks they may encounter in the digital world requires multi-stakeholder collaboration and an effective monitoring mechanism. Competent authorities play a critical role here: they must implement legal regulations that protect children’s digital rights and security and ensure those regulations are effectively monitored in practice. Online service providers must act with awareness of their responsibilities and take measures ranging from the effective implementation of age verification systems to stronger content filtering and monitoring, so that children are not exposed to harmful content and risks. In this process it is also essential for platforms to act transparently and cooperate with the relevant regulators. Parents, for their part, must be aware of their children’s online behavior and the content they consume, and must make effective use of the tools provided by digital platforms and authorities. Raising parents’ digital awareness will not only enable their active participation in protecting their children but also make the measures taken by authorities and platforms more effective. Children’s secure existence in the digital world depends on competent authorities, online platforms, and parents acting as mutually supporting and complementary pillars. The absence of any one of these three stakeholders will upset the balance and leave children vulnerable. Security for children online will therefore be achieved not through the efforts of a single party but through the continuous cooperation and coordination of all these stakeholders.
FOOTNOTES
1 ECPAT International, “Disrupting Harm”, Access Date: 19 December 2024, https://ecpat.org/disrupting-harm.
2 ChildFund Alliance, “Every Day Needs to Be a Safer Internet Day for Children”, Access Date: 18 December 2024, https://childfundalliance.org/2024/02/05/every-day-needs-to-be-a-safer-internet-day-for-children/#:~:text=In%20the%20more%20than%20three,third%20of%20internet%20users%20worldwide.
3 ICO, “Age-Appropriate Design: A Code of Practice for Online Services” Access Date: 20 December 2024, https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/childrens-information/childrens-code-guidance-and-resources/age-appropriate-design-a-code-of-practice-for-online-services/.
4 Garante per la Protezione dei Dati Personali, “Chatgpt, the italian data protection authority closes the preliminary investigation.”, Access Date: 23 December 2024, https://www.garanteprivacy.it/home/docweb/-/docweb-display/docweb/10085432.
5 Kişisel Verileri Koruma Kurumu, “Verican İle Kişisel Verileri Öğreniyorum”, Access Date: 19 December 2024, https://www.kvkk.gov.tr/Icerik/8033/Verican-Ile-Kisisel-Verileri-Ogreniyorum.
6 Anadolu Ajansı, “KVKK’den Instagram’ın Sahibi Meta’ya ‘Çocuk Hesapları’ Cezası”, Access Date: 21 December 2024, https://www.aa.com.tr/tr/gundem/kvkkden-instagramin-sahibi-metaya-cocuk-hesaplari-cezasi/3421186.
7 Prove, “Approaches to the Complex Issue of Age Verification” Access Date: 20 December 2024, https://www.prove.com/blog/approaches-to-the-complex-issue-of-age-verification.
8 ICO, “Age-Appropriate Design: A Code of Practice for Online Services” Access Date: 20 December 2024, https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/childrens-information/childrens-code-guidance-and-resources/age-appropriate-design-a-code-of-practice-for-online-services/.
9 Prove, “Approaches to the Complex Issue of Age Verification” Access Date: 20 December 2024, https://www.prove.com/blog/approaches-to-the-complex-issue-of-age-verification.
10 Kişisel Verileri Koruma Kurumu, “Privacy in the Digital Age: Protection of Children’s Personal Data”, KVKK Bulletin, p. 5, Access Date: 19 December 2024, https://www.kvkk.gov.tr/SharedFolderServer/CMSFiles/35b708d7-449d-417d-87cb-d52f46dd570f.pdf.
11 European Commission, “Guide to Age Assurance: Section Updated, Resources Now Available” Access Date: 20 December 2024, https://better-internet-for-kids.europa.eu/en/news/guide-age-assurance-section-updated-resources-now-available.
12 Euractiv, “European Authorities Press on with Digital Wallets for Social Media Age Verification” Access Date: 20 December 2024, https://www.euractiv.com/section/tech/news/european-authorities-press-on-with-digital-wallets-for-social-media-age-verification/.
13 Garante, “Newsletter del 3 dicembre 2024”, Access Date: 23 December 2024, https://www.garanteprivacy.it/home/docweb/-/docweb-display/docweb/10076607#3.
