The Age Appropriate Design Code, launched by the United Kingdom’s Information Commissioner’s Office, is a non-legally binding framework for digital companies such as Google and Facebook that seeks to explain how the General Data Protection Regulation (GDPR) applies when children use digital services. The code officially launched on 2 September 2020 and will apply to all online services that are accessible to children in the UK. A significant part of the code sets standards for data collection: digital companies are required to collect the “minimum amount of data needed to offer the service the child is actively and knowingly engaged in”. The code also instructs businesses to set children’s privacy settings to “high privacy” by default, to keep location tracking off unless the child switches it on, and to disable targeted advertising by default.
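To make these defaults concrete, the short TypeScript sketch below shows, purely for illustration, what “high privacy by default” and data minimisation could look like inside a service’s settings. Every type, field, and value here is a hypothetical example, not drawn from the code itself or from any real product.

    // Illustrative sketch only: child "high privacy" defaults and data minimisation.
    // All names and fields are hypothetical, not taken from the code or any real service.

    interface PrivacySettings {
      profileVisibility: "private" | "friends" | "public";
      geolocationEnabled: boolean;        // location tracking stays off until the child turns it on
      targetedAdsEnabled: boolean;        // behavioural advertising is off by default
      shareDataWithThirdParties: boolean;
    }

    // Defaults a service might apply when it believes the user is a child.
    function childDefaults(): PrivacySettings {
      return {
        profileVisibility: "private",
        geolocationEnabled: false,
        targetedAdsEnabled: false,
        shareDataWithThirdParties: false,
      };
    }

    type Feature = "watchVideo" | "postComment";

    // Data minimisation: each feature lists only the personal data it actually needs.
    const fieldsRequiredFor: Record<Feature, string[]> = {
      watchVideo: [],                     // nothing beyond the request itself
      postComment: ["displayName"],       // only what the child knowingly provides
    };

    console.log(childDefaults());
    console.log(fieldsRequiredFor.watchVideo);

The point of the sketch is simply that the protective choice is the starting state, and any data collected is tied to a specific feature the child has actively chosen to use.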
The digital advertising industry has pushed back on the new code, both during its drafting and after the code was officially announced. The Internet Advertising Bureau UK (IAB UK) has called for the code to be redrafted, and companies like Facebook and Google have argued against using default settings to minimise data collection on children.
THE IMPLICATIONS OF CHILD PRIVACY IN OUR DIGITAL AGE
In the wider context, the modern advertising industry is built on mining personal information from internet users to build datasets that marketers can apply to targeted advertising. This is an exchange that adults barely understand, so how can children fully comprehend what they are giving up, or even what is happening? In an article in Which-50 magazine, Dylan Collins, CEO of “kid tech” company Super Awesome, stated that his “research estimates that by the time a child reaches the age of 12, more than 72 million pieces of personal data have been collected about them by advertising technology designed for adults”.
Targeted advertising can have detrimental effects on the wellbeing of children, especially in the wake of the COVID-19 pandemic and the subsequent shutdowns. A report by VicHealth, the Foundation for Alcohol Research and Education (FARE), and the Obesity Policy Coalition (OPC) reveals the impact of personal data collection on Australian children and the implications of that data being used by industries to aggressively market inappropriate products. Advertisers can gather a wealth of information about a child: their age, likes, gender, sexuality, location, and school. Such information is gold for advertisers who want to understand how to engage, and ultimately influence, children as consumers. The report states that “an estimated 72 million data points will have been collected by companies on each child by the age of 13”.
Aggressive marketing tactics through social media influencers, “advergames,” and apps encourage both the consumption and the sharing of content. While advertising is not a crime, and there is nothing inherently illegal about companies buying children’s data to create targeted campaigns, we must recognise that such practices have consequences. When children are exposed to these products or to negative content, the effects shape their lives both on and offline. The VicHealth report states that predatory advertising aimed at children can put them at risk of “developing heart disease, stroke and cancer later in life”.
COMMON SENSE SOLUTION
Recently, UK Information Commissioner Elizabeth Denham put out a press release noting how intuitive it should be to protect a child’s data. She stated that “in a generation from now, we will look back and find it astonishing that online services weren’t always designed with children in mind”. When one considers the lengths governments already go to in order to protect children, and the negative effects data collection can have on a child, a framework like the Age Appropriate Design Code seems logical. Governments and regulators take special steps to protect children through driving restrictions, film and video game ratings, and national drinking ages. The Age Appropriate Design Code falls in line with these existing regulations and with the general consensus that rules to protect children are necessary. The code is one step in the right direction towards ensuring children are as protected online as they are offline.
Mayowa is a final-year Honours Politics student with a certificate in Computing. As an aspiring technology lawyer, she is passionate about internet governance and digital innovation, and believes that technology should be used to create an equitable future. When not writing, she can be found compiling JavaScript.