Human Rights Pulse


The Social Dilemma and the Human Rights Risks of Big Tech

The Social Dilemma, a recently released Netflix documentary, alerts viewers to the societal and psychological problems arising from the products and practices of big tech companies through testimonials, dramatised scenes of family life ruined by social media, and data on matters such as mental health and suicide rates. The film features interviews with former big tech executives and employees including Tristan Harris, former Design Ethicist for Google; Tim Kendall, former Facebook executive and former President of Pinterest; Aza Raskin, former Firefox & Mozilla Labs employee and inventor of the infinite scroll; and a host of others including former YouTube and Twitter engineers.

The majority of these individuals state that they left their previous jobs in big tech companies due to ethical concerns. They all use their role in the documentary to express such concerns, which centre around the ethics of the persuasive technology used by their former employers to keep users active on their sites for as long as possible in order to sell more advertisements. The effects of such practices range from damage to users’ mental health, social media addiction, and over-exposure to photoshopped and filtered images warping users’ self-image; to the spread of disinformation and fake news, threats to democracy, and extreme political divisiveness and polarisation. 

ISSUES RAISED

The documentary examines the algorithms designed to keep users engaged on sites for as long as possible, often by recommending content that will hold their attention or prompt them to comment, share, and repost. The most effective content for this purpose is particularly shocking, divisive, idealised, or untrue. According to an MIT study cited in the documentary, fake news spreads six times faster than truthful content on Twitter. It is therefore in the financial interests of companies like Twitter and Facebook to promote this kind of unregulated content: the longer users stay engaged on the site, the more advertisements they see and the more money the companies make.
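
The incentive structure described above can be made concrete with a small, purely hypothetical sketch. The function, field names, and weights below are invented for illustration only and do not reflect any platform's actual code; they simply show how a feed that ranks posts by predicted engagement will naturally surface the most attention-holding, and often most divisive, material.

from dataclasses import dataclass
from typing import List

@dataclass
class Post:
    post_id: str
    predicted_watch_time: float  # seconds a model expects the user to spend on the post
    predicted_shares: float      # expected number of shares or reposts
    predicted_comments: float    # expected number of comments
    outrage_score: float         # 0-1 estimate of how shocking or divisive the post is

def engagement_score(post: Post) -> float:
    # Hypothetical weights: a feed optimised for time-on-site rewards whatever
    # keeps people watching, commenting, and sharing, regardless of whether it is true.
    return (
        1.0 * post.predicted_watch_time
        + 5.0 * post.predicted_shares
        + 3.0 * post.predicted_comments
        + 10.0 * post.outrage_score
    )

def rank_feed(candidates: List[Post]) -> List[Post]:
    # Show the user the posts most likely to keep them on the site the longest.
    return sorted(candidates, key=engagement_score, reverse=True)

Nothing in such a ranking step checks whether a post is accurate; truthfulness simply is not part of the objective being optimised.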

An internal Facebook report in 2018 found that 64% of the people who joined extremist groups on Facebook did so because its algorithms steered them there. As Roger McNamee, an early Facebook investor, points out, “if everyone's entitled to their own facts, there's really no need for compromise, no need for people to come together - we need to have some shared understanding of reality”. This shared understanding of reality is currently under threat as Facebook and Twitter feeds, and even Google searches, are heavily tailored according to the interests, political leanings, and personal information of whoever is searching or scrolling.

Another related issue briefly raised in The Social Dilemma is Facebook’s deals with telecommunications companies in developing countries, which allow individuals buying a new phone (which comes with Facebook already installed) to use Facebook for free, without paying for the data it would normally require. Facebook Inc. carries out these deals through an arm of the company called internet.org, which it presents as a charitable, development-focussed venture aimed at bringing internet access to more people.

The problem here is that users are not gaining access to the open internet, but rather to a version filtered through Facebook’s corporate lens. They are not being granted the freedoms and information that the internet can provide; they are simply being granted the freedom to use Facebook and to become a unit of engagement for monetisation on its site.

HUMAN RIGHTS PERSPECTIVE 

In her interview for the documentary, Cynthia M. Wong, former senior internet researcher for Human Rights Watch, highlights the case of Myanmar and the use of Facebook by the military and other bad actors to incite hate speech towards the Rohingya. This case is a clear example of how unregulated social media can be used by government actors to legitimise and incite gross human rights violations and crimes against humanity: in this case, genocide.

Further, the issues raised in The Social Dilemma can be seen as slow violence inflicted on the mental health of users, on democracy, and on the notions of truth and unity within our societies, for the sake of tech companies’ profit margins. Slow violence describes harm, such as environmental degradation, that takes place gradually over time yet ultimately has violent consequences, in contrast to the kinds of explosive, attention-grabbing violence which dominate the front pages of newspapers. Whilst the behaviour and business models of large tech companies are not necessarily forms of direct violence, they are just as powerful and dangerous precisely because their effects largely go unseen or under-discussed in our society. In this case, it is our mental health, democracy, and human rights that are being damaged. This can be seen, for example, in the prevalence of gendered hate speech on platforms like Twitter, which erodes the freedom of speech and expression of many women. This kind of problem demands regulation which is grounded in internationally recognised principles of human rights, and which offers protection against abuses of power by governments that may regulate in order to limit their citizens’ freedom of speech.

ISSUES WITH THE DOCUMENTARY

Whilst The Social Dilemma brings the issues around big tech further into public view, particularly by providing an “insider” perspective on the workings of Facebook, Twitter, and Google, it may be criticised for focusing too intently on the perspectives of those who had a large part in causing these problems in the first place.

Maria Farrell highlights the issue of focussing on the voices of the former inventors of this dangerous technology. She states that doing so will stop us from getting “to the bottom of how and why we got here” and will “artificially narrow the possibilities for where we go next”. She also emphasises the problem of turning to these figures and listening when they speak, while failing to uplift the voices of activists who have campaigned on these issues for years without ever having been a key part of the problem.

This issue can be seen in some of the recommendations for change made by the former tech executives in The Social Dilemma. Can keeping your device out of the bedroom and turning off your notifications really do anything to stop the political polarisation and threats of civil war that these very individuals have just described as the risks and effects of their own creations? Perhaps these small changes can momentarily ease the damage to individuals’ mental health. However, they do nothing to make the technology itself less addictive and manipulative, or less prone to spreading disinformation and encouraging political polarisation. Whilst the documentary does point out that regulation cannot be left to the companies themselves, it glaringly omits how ineffective it may be for users to self-regulate their social media use. The likes of Tristan Harris spend the documentary explaining just how manipulative and addictive this technology is, only to turn around and encourage the user to outsmart the tech by turning off notifications. This suggestion relies on users becoming aware of these issues and self-regulating, and completely overlooks those who are unaware or still truly addicted to their feeds. Of the two main problems highlighted (the individual harms and the societal harms of this technology), the film presents some potential solutions to the former and next to none for the latter.

Whilst The Social Dilemma does feature a number of experts and academics who allude to other solutions, such as regulation or removing these companies from existence altogether, it serves somewhat as an example of the problems that come with centring tech industry insiders in the development of solutions to the problems caused by their own creations. The path to a solution cannot be paved through the tech itself, nor can it focus only on repairing the harms to individuals through their own self-control. As this documentary has compellingly highlighted, this is an issue that goes beyond individuals and into the very fabric of our society. What is needed here is regulation which centres internationally recognised human rights principles and protects against abuses of power.

Siân is a recent graduate with an MA in Human Rights from University College London. She currently works coordinating a network of UK supporters for AdvocAid, an NGO providing holistic access to justice for women and girls in Sierra Leone. Her research interests include human rights and global supply chains, the garment industry, gender, and global health.
