
Sustainable and Responsible Investing

Social media: moving towards more moderation?

The role played by social networks, and their impact and social responsibility, is more prominent than ever. Content found on platforms such as Meta (Facebook), Twitter or YouTube has become a key issue, as the negative impacts of hateful, conspiratorial or racist speech have become both visible and significant.

Geneva - CH

The health crisis and the war in Ukraine are prime examples of this. The (potential) takeover of Twitter by billionaire Elon Musk has rekindled discussions on this topic and raised concerns about how the company intends to manage freedom of expression on its platform. The multi-billionaire's relentless promotion of libertarianism reflects a vision of total freedom of expression with minimal self-regulation. This vision, more prominent on the other side of the Atlantic, is in keeping with the spirit of the First Amendment of the US Constitution, which protects freedom of expression. In Europe, on the other hand, the defence of freedom of expression on social networks goes hand in hand with the safeguarding of other human rights, including the right to non-discrimination. This is the direction taken by the "Digital Services Act", the new regulation presented by the European Commission in February 2022 (which will come into force at the beginning of 2023): large platforms such as Facebook, Instagram, Twitter or TikTok will have to demonstrate that they are seriously fighting hate speech, misinformation, harm to minors and similar issues in Europe. Social media will also be required to be transparent about the algorithms used to rank or recommend content.

Online hate speech and its effects on society

The last major digital regulation in Europe dates back to the early 2000s, when digital technology was in its infancy. Since then, social media platforms, through the content they share and disseminate to hundreds of millions of subscribers, have become digital giants with considerable power to influence their users and society as a whole. The impact of platforms such as TikTok on the mental health of young users is now well known and documented (anxiety, depression, low self-esteem, etc.). The influence of platforms in making and breaking elections or steering votes (the 2016 US election and the Brexit vote) has also been widely studied. Online hate and harassment messages have multiplied in recent years, particularly in the anxiety-inducing context of the pandemic. According to a survey conducted in 2021, 41% of Americans say they have experienced some form of online harassment, much of it related to their sexual orientation, religion or background[1]. Beyond the consequences for the individual, hate speech and images also contribute to the discrimination of minorities and pose a threat to society. This was notably the case with the videos of the attack on mosques in Christchurch, New Zealand, which killed 51 people in 2019: the footage circulated widely online before it was finally removed from YouTube. The Myanmar government used Facebook to spread propaganda against the Rohingya Muslim community. In Germany, far-right Facebook posts have been explicitly linked to physical attacks on refugees.

[1] Online Hate and Harassment: The American Experience 2021, ADL Center for Technology & Society, https://www.adl.org/online-hate-2021

The social media business model increases risk

Whether in terms of editorial policies or content control (via artificial intelligence or through the work of human moderators), the moderation measures taken by Twitter or Facebook are a step in the right direction but remain insufficient[2]. For example, it was not until 2020 that Twitter suspended the account of David Duke, a former leader of the Ku Klux Klan, who had used it for ten years to spread his anti-Semitic and racist messages. One of the main reasons for this inadequacy lies in the social media business model. The primary objective of these platforms is the collection of individual data and its sale to third parties for advertising purposes, far more than the promotion of a global public space for free speech. Social media use algorithms that feed certain audiences with controversial and hateful content. This same audience, thanks to its personal data, represents a large advertising target; the platforms therefore have no commercial interest in self-censorship. Furthermore, as mentioned above, the measures taken remain voluntary and vary from company to company.


[2] In 2019, Mark Zuckerberg announced that 5% of Facebook's revenue was spent on moderation, or USD 3.7 billion.

A risk for investors too

The issues of human rights abuses and the social licence to operate have traditionally been associated with heavy industries, such as mining, or with sectors that have large and complex supply chains, such as the fashion or food industries. Nowadays, the ESG analysis of human rights conducted by responsible investors extends to the tech giants and their social networks. In particular, it examines the issue of content and pinpoints major ethical, financial and regulatory risks that could jeopardise their long-term value. It should be remembered that, under its new law, the EU will be able to impose fines of up to 6% of a platform's turnover, or even ban it. Social media companies, already under fire over anti-competitive behaviour, privacy and data security, must now be clear about their commitment to fighting online hate and, above all, must act on it. This task, essential if they are to maintain their legitimacy, will not be easy in view of another important responsibility: upholding freedom of expression in the face of censorship efforts by certain governments. In this respect, some experts have put forward the interesting idea of a neutral global moderation body, neither public nor private, that would set the same standards for all platforms and involve the platforms themselves in defining these good practices.


Please do not hesitate to reach out to your usual contact at Mirabaud, or contact us here, if this topic is of interest to you. Together with our dedicated specialists, we will be happy to assess your personal needs and discuss possible investment solutions tailored to your situation.
