In 2021, Facebook’s market capitalisation was 870.5 billion US dollars, making it the fourth largest company worldwide by market capitalisation. Yet the company’s growth has come at the cost of its users’ mental well-being, particularly that of young people. Due to a lack of adequate internal reform and governmental oversight, these issues persist.
FACEBOOK’S ADDICTIVE ALGORITHMS
Facebook’s immense value is generated by its wide user base, which has grown increasingly younger as the company continues to acquire apps used primarily by young people, such as Instagram. Beyond acquisitions, Facebook is also looking to attract the younger generation by developing new apps targeted at them. Although such steps will generate immense value for the company, they also risk damaging young people’s mental well-being and infringing children’s rights.
Facebook deliberately designs its algorithms to be addictive in order to increase engagement with its platforms. One such technique is the infinite scroll, which enables and induces users to scroll incessantly through an endless feed of content. The brain struggles to control the impulse to keep scrolling, causing users to stare at their screens for lengthy periods of time.
According to recent studies, 5-10% of users are psychologically addicted and unable to control the amount of time they spend online. The brains of users addicted to social media resemble those of drug-dependent adults: brain scans show marked changes in the regions that control emotions, attention, and decision-making. Social media addiction disrupts the reward pathways in users’ brains; the brain rewires itself to crave immediate rewards in the form of likes, comments, and retweets. The positive social feedback a user receives stimulates the brain to release dopamine, rewarding the behaviour and prompting the user to stay on social media.
Staring at a screen for long periods of time has detrimental effects on users’ mental well-being and psychological health, particularly amongst young people. A National Vital Statistics report showed that the suicide rate among 10- to 24-year-olds increased by nearly 60% between 2007 and 2018. Studies conducted by Facebook itself documented harm to young people’s well-being: according to the Wall Street Journal, Facebook’s own research found that 32% of teen girls said Instagram made them feel worse about their bodies. Facebook downplayed the issue in public and did not make adequate efforts to address the problem.
PROFIT FIRST: FACEBOOK’S BUSINESS MODEL
Facebook is well aware of the harmful ramifications of its deliberate product designs. The Wall Street Journal reviewed internal communications indicating how acutely cognisant Facebook is of the detrimental effects of its products. Former Facebook, Google, and Apple employees have also suggested that tech companies deliberately design their platforms to be addictive. In his TED talk, Tristan Harris, a former design ethicist at Google, explained why Facebook newsfeeds operate the way they do: “The newsfeed control room is not accountable to us. It’s only accountable to maximi[s]ing attention.”
Despite internal and external data pointing to the calamitous effects of these methods, the company continues to employ these deleterious algorithmic techniques. Why? To make a profit.
Facebook’s business model is simple. It collects user data and provides this data to advertisers so they can target their ads toward segments of Facebook’s vast user database. It benefits Facebook to have users spend time on its apps: the more time a user spends on the app, the more user data Facebook can collect, and subsequently the more profitable the company becomes.
To sustain its profitable trajectory for decades to come, Facebook has been employing new methods to hook its next generation of users: children. Frances Haugen, a former Facebook product manager, testified before a US Senate panel on October 5th, 2021. In her testimony, Haugen divulged Facebook’s motive to deliberately addict children to its platforms before they develop the ability to self-regulate. To continue amassing profits, the social media giant recognised that it needed to “hook” the next generation of users to its apps, ensuring that they would be as engaged with Instagram as the current generation. By addicting kids to its platforms early on, Facebook secures greater user engagement, and subsequently greater profits, in the future.
After the hearing, Facebook announced that it would pause work on the Instagram Kids app. However, the head of Instagram, Adam Mosseri, defended the decision to launch Instagram Kids, claiming it was a better alternative for kids who were already using online apps.
Senator Richard Blumenthal compared Facebook’s methods to those of Big Tobacco, which the Senator himself had led litigation against. Like Facebook, Big Tobacco had purposefully hooked children on nicotine, prompting them to begin smoking as young as seven or eight. He expressed the urgency of regulating Facebook in order to protect minors from the company’s exploitative methods.
Haugen believes that government oversight of issues concerning Facebook and children should be led by a coalition of smaller countries rather than by larger powers such as the United States and the European Union. She argues that this would drive change because Facebook would be forced to make global changes rather than customising its services country by country, which would be harder to achieve.
GOVERNMENTAL PROGRESS TO PROTECT CHILDREN
There have been promising developments in protecting children from the detrimental effects of Facebook’s profit-driven agenda. For example, on September 2nd, 2021, the United Kingdom’s Children’s Code came into effect. The code requires online services likely to be accessed by children to respect a child’s rights and freedoms. To comply, Facebook made significant changes to protect children’s rights: it limited several forms of targeted advertising for users under 18, and Instagram made teens’ accounts private by default.
The Children’s Code is having a domino effect on governmental efforts worldwide to curb Facebook’s ability to hook kids on its platforms. In the United States, members of Congress have asked Facebook to voluntarily adopt the code’s standards. In Ireland, the Data Protection Commission is planning to introduce a similar set of standards: the Children’s Fundamentals.
These changes fall in line with the United Nations Convention on the Rights of the Child (UNCRC), which recognises that special safeguards must be put in place to protect children’s communications, privacy, and access to information. Similarly, a United Nations report presented to the Human Rights Council recognised the alarming rate at which children’s privacy is being threatened and called for urgent action by both parents and law-makers to protect children’s interests. The report identified the vulnerability of children exposed to personal data collection for advertising purposes and the impact it can have on their self-development and mental well-being.
Overall, there is a dire need to limit the current and potential influence of conglomerates such as Facebook on children and teens at a global level. Since Facebook has shown little interest in upholding this moral responsibility, governmental oversight has become more imperative and urgent than ever.
Sarah is a recent graduate of the University of Toronto, where she specialised in Digital Enterprise Management and majored in Political Science. She is currently working as a Compliance Analyst for the G7 and BRICS Research Groups. Her research interests are digital rights, political violence, and internet regulation.