UNITED NATIONS, Jan 06 (APP): UN human rights experts on Friday called on the heads of many of the world’s biggest social media platforms to change their business models and become more accountable in the battle against rising hate speech online.
In a detailed statement, more than two dozen UN-appointed experts – including representatives from three different working groups and multiple Special Rapporteurs – called out chief executives by name, saying that the companies they lead “must urgently address posts and activities that advocate hatred, and constitute incitement to discrimination, in line with international standards for freedom of expression.”
They said the new tech billionaire owner of Twitter, Elon Musk, Meta’s Mark Zuckerberg, Sundar Pichai, who heads Google’s parent company Alphabet, Apple’s Tim Cook, “and CEOs of other social media platforms”, should “centre human rights, racial justice, accountability, transparency, corporate social responsibility and ethics, in their business model.”
They reminded the executives that being accountable as businesses for racial justice and human rights “is a core social responsibility”, advising that “respecting human rights is in the long-term interest of these companies, and their shareholders.”
They underlined that the International Convention on the Elimination of Racial Discrimination, the International Covenant on Civil and Political Rights, and the UN’s Guiding Principles on Business and Human Rights provide a clear path forward on how this can be done.
“We urge all CEOs and leaders of social media to fully assume their responsibility to respect human rights and address racial hatred.”
As evidence of the corporate failure to get a grip on hate speech, the Human Rights Council-appointed independent experts pointed to a “sharp increase in the use of the racist ‘N’ word on Twitter”, following its recent acquisition by Tesla boss Elon Musk.
This showed the urgent need for social media companies to be more accountable “over the expression of hatred towards people of African descent”, they argued.
Soon after Musk took over, the Network Contagion Research Institute of Rutgers University in the US highlighted that the use of the N-word on the platform increased by almost 500 per cent within a 12-hour period, compared to the previous average, the experts said.
“Although Twitter advised this was based on a trolling campaign and that there is no place for hatred, the expression of hatred against people of African descent is deeply concerning and merits an urgent response centred on human rights.”
They added that hate speech, “advocacy of national, racial and religious hatred that constitutes incitement to discrimination and violence, as well as racism on social media, are not just a concern for Twitter but also for other social media giants such as Meta”, the company formerly known as Facebook.
The experts said that although some companies claimed not to allow hate speech, there was a clear gap between stated policies and enforcement.
“This is particularly salient in the approval of inflammatory ads, electoral disinformation on Facebook, and content that talks of conspiracy theories. Research from Global Witness and SumOfUs recently revealed how Meta is unable to block certain advertisements”, the experts stated.
Meta “took a significant step with the establishment of an oversight board in 2020”, in response to complaints, they said, noting that the “group of experts from diverse areas of expertise is in place to ‘promote free expression by making principled, independent decisions regarding content on Facebook and Instagram and by issuing recommendations on the relevant Facebook Company Content policy’”.
The experts acknowledged that the board had been well funded, received around two million appeals regarding content, and made a number of recommendations and decisions.
“However, the effectiveness of the Oversight Board can only be seen over a long-time horizon and will require continued commitment at the highest levels” to reviewing and modifying tools to combat racial hatred online, the experts said.
“There is a risk of arbitrariness and profit interests getting in the way of how social media platforms monitor and regulate themselves”, they added.
Hate speech, whether online or offline, poses a threat to democracy and human rights, the experts pointed out.
They pointed out that High Commissioner Volker Turk, who heads up OHCHR, had recently penned an open letter to Twitter CEO Elon Musk, emphasizing that free speech did not mean “a free pass to spread harmful disinformation that results in real world harms”.
“As he underlined, human rights law is clear – freedom of expression stops at hatred that incites discrimination, hostility or violence. We see too often that the spread of hatred and hate speech against people of African descent, and other groups, not only undermines their rights but creates major fissures in societies. These are increasingly difficult to overcome and a source of various forms of destabilization within countries.”
The independent experts said that allowing and tolerating incitement to hatred, and expression or advocacy of hatred against people of African descent and other marginalized groups, “not only encourages the perpetrators, but also constitutes a continuous source of chronic race-based traumatic stress and trauma.”
The presence of racial hatred further undermines the confidence of those affected in using social media and in seeking justice.
“It is especially alarming” considering that so many youngsters “live a significant part of their lives” online, they added.
“Content moderation can only address a part of what happens in cyber space but does not take into account the intended and unintended effects in society. There are deeper issues about advocacy of racial hatred, lack of accountability for abuses, and an absence of efforts to promote tolerance.
“If addressed, these can be strong determining factors in building a positive future both online and offline.”
Acknowledging the power for good that social media represents if put to positive use, the experts said that it has “a major role to prevent further rifts, so that racial justice and human rights can be upheld, to build less racist, less divisive, more tolerant, just and equitable societies.”
Special Rapporteurs and independent experts are appointed by the Geneva-based UN Human Rights Council and form part of its so-called Special Procedures, which examine and report back on a specific human rights theme or country situation. The positions are honorary and the experts are not paid for their work.