Can self-certification of applications by developers protect children’s sensitive data? This is the subject of a study by Vincent Lefrère, a lecturer in digital economics, and Grazia Cecere, a professor of economics, both at Institut Mines-Télécom Business School. In an article published in the journal Academy of Management Perspectives, they present their research into data protection in mobile apps designed for children.
In our hyper-connected societies, data collection has become commonplace. This intrusive phenomenon affects everyone, including children, and poses an obvious security problem, making it a major issue of our time. The research team therefore examined the influence of self-regulation and the Designed for Families (DFF) system offered by certain platforms. Google Play, for instance, enables developers to subscribe to the DFF programme, which is intended to make applications accessible to children while safeguarding their sensitive data. By subscribing to this programme, developers agree to comply with the US Children’s Online Privacy Protection Act (COPPA), which safeguards the personal information of children under 13 years of age.
What is COPPA and how does it apply?
COPPA is a US law from 1998 designed to protect children’s sensitive data online. It governs the collection of information on minors under the age of 13 by parties under US jurisdiction. The law defines ten categories of personal information about a minor:

- first name and surname;
- physical address, including street name and municipality;
- online contact information, such as an email address;
- a user name that can be used for online contact;
- a telephone number;
- a Social Security number;
- a persistent identifier that can be used to recognise the user, such as an IP address or customer number;
- a photograph, video or audio recording containing the image or voice of a child;
- geolocation information precise enough to identify a street name and municipality;
- data concerning the child or their entourage, collected from a child who is the target audience for the content.
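For illustration only, the ten categories above can be encoded as a simple checklist against which an app’s declared data requests are compared. The category names and the `requests_coppa_data` helper below are hypothetical conveniences for this sketch, not part of the law or of the study:

```python
# Hypothetical sketch: encode the COPPA "personal information" categories
# listed above and flag whether an app's declared data requests touch any.
COPPA_CATEGORIES = {
    "full_name",              # first name and surname
    "physical_address",       # street name and municipality
    "online_contact",         # e.g. email address
    "screen_name",            # user name usable for online contact
    "telephone_number",
    "social_security_number",
    "persistent_identifier",  # e.g. IP address or customer number
    "photo_video_audio",      # containing a child's image or voice
    "geolocation",            # precise enough to identify a street/town
    "child_or_family_info",   # data about the child or their entourage
}

def requests_coppa_data(declared_requests: set[str]) -> set[str]:
    """Return the COPPA-sensitive categories an app declares it collects."""
    return declared_requests & COPPA_CATEGORIES
```

With this checklist, an app declaring `{"persistent_identifier", "screen_time"}` would be flagged only for the persistent identifier, since screen-time statistics are not among the listed categories.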
This work is particularly timely given that, during the course of the research, several major companies were fined for failing to comply with COPPA’s rules. For instance, Google and YouTube were fined $136 million by the US Federal Trade Commission in September 2019 for breaching COPPA. ByteDance, the parent company of TikTok, was also fined $5.7 million in February 2019 for failing to obtain parental consent before collecting data from underage users.
An empirical study of the issue
The research is based on three years of data collected from over 27,000 apps and 11,000 developers in 128 countries, yielding a total of 1,509,000 observations between 2018 and 2021. The research team found that 70.6% of apps aimed at children under 13 had joined the Designed for Families (DFF) programme. Of the apps in the DFF, only 23% request at least one type of sensitive data, compared with almost 45% of apps that have not joined the programme.
One of the aims of this research is to establish where the DFF is adopted around the world, how it is applied in different countries, and whether it has a tangible effect on the volume of data collected. The findings show that:
- Countries that are members of the Organisation for Economic Co-operation and Development (OECD) and apply the DFF collect less data than countries outside the OECD. This is particularly true of the United States.
- Other countries that have incorporated the DFF programme into their legislation are also seeing a reduction in the collection of sensitive data.
- Only countries with fully independent data-protection authorities, such as Australia, Morocco and Albania, still collect more sensitive data than OECD countries, with or without the DFF.
Results
This study is the first attempt to understand the complexity of the children’s applications market and the impact of national regulations on companies’ decisions around the world. Its originality also lies in its empirical basis, which provides a quantified statistical perspective.

The researchers demonstrate that participating in the DFF programme reduces the collection of sensitive data from minors: the programme and its related texts help to protect children’s data more effectively and curb abusive collection. The programme also promotes these applications, which see a slight increase in user comments after joining. Ultimately, the study suggests that allowing developers to opt in to the DFF programme and comply with existing legislation is more effective than imposing it without consultation.

Future research could examine the relationship between data protection and the quality of children’s apps. Beyond improving the application and predicting user behaviour, the study does not explain why developers collect this data.