
Social media platforms cannot hide behind "narrative of neutrality"

IN a recent letter to the prime minister, the Asia Internet Coalition (AIC) raised its concerns about Malaysia's proposed licensing framework for social media and private messaging platforms in the country.

In short, the AIC argues that a licensing regime would burden users, and it advocates self-regulation as a more flexible and responsive alternative to government intervention.

There are deep flaws in this argument.

First, the idea that these platforms are "neutral conduits for information" is woefully inaccurate at best, or a blatant untruth at worst.

The reality is that most social media platforms, particularly the ones the AIC claims to represent, utilise algorithmic curation for various purposes.

The most obvious reason would be to maximise user engagement and retention by prioritising one type of content over another.

In such situations, platforms are no longer mere facilitators of free expression, but complicit in algorithmic manipulation.

The consequences of such curation, when it goes wrong, have been well-documented globally — from the spread of conspiracy theories to real-world violence incited by disinformation.

Second, the claim that platforms are able to self-regulate to address online harm is not borne out in practice.

As it stands, scams, child sexual abuse material, cyberbullying, content that undermines public health, and hate speech continue to proliferate despite the platforms' moderation policies.

It is worth remembering that for long stretches of the Covid-19 pandemic, many platforms were hesitant or unwilling to expand their moderation policies to include medical mis- and disinformation, further contributing to negative public health outcomes.

Relatedly, the AIC's statement also alludes to the platforms' apparent willingness to collaborate with the Malaysian Communications and Multimedia Content Forum, the industry forum designated to oversee and promote self-regulation of online content.

Nonetheless, none of the platforms represented by the AIC has subscribed to the Content Code for self-regulation.

Third, it remains unclear what resources these platforms dedicate to individual markets like Malaysia.

This includes the number of human content moderators, as well as related considerations such as their proficiency in local languages, their familiarity with local context, and the training they receive to perform their role.

Similarly, algorithmic and/or automated moderation remains a black box, with equal uncertainty over the training data of these systems and over their effectiveness, as no objective third party has verified either.

Fourth, platforms are eager to point to their transparency reports as proof that they are enforcing their internal standards.

Nonetheless, these reports contain only what the platforms choose to include, while omitting the details necessary for effective scrutiny. How different is this practice from transparency washing?

Likewise, if students were allowed to mark their own exams, it would be no surprise if everyone were tied for first place.

These considerations must be taken into account for a balanced view on MCMC's licensing framework.

While it is expected of people — and we, personally, encourage them — to be healthily sceptical of government overreach, especially in matters of free speech, it is worth remembering that these platforms have neither "clean hands" nor a democratic mandate to determine the red lines of free speech.

Despite this, platforms continue to resist meaningful regulation by hiding behind the narrative of neutrality.

With the AIC arguing that regulation will harm innovation, we must question whether the innovation it claims to promote will genuinely benefit society or merely a company's stock price.

Moving forward from this requires humility, especially since the AIC has since backtracked on its most damning statements.

Platforms must understand that governments and people have wised up to the consequences of the tech industry's "move fast and break things" mentality, and that governments can, indeed, regulate a technical sector.

Here, we posit that there is room for deliberate and delicate policymaking that supports free expression and innovation, while simultaneously holding platforms accountable for the impact they have on society.

With the right policy intervention, these three ostensibly competing objectives can be mutually reinforcing.


Harris Zainul is deputy director of research; Samantha Khoo is research intern at the Institute of Strategic and International Studies (ISIS) Malaysia

The views expressed in this article are the authors' own and do not necessarily reflect those of the New Straits Times
