CMA amendments, socmed licensing to align platforms with Malaysian laws

KUALA LUMPUR: The recent amendments to the Communications and Multimedia Act 1998 and the introduction of social media licensing aim to ensure better compliance with Malaysian laws and norms.

Malaysian Communications and Multimedia Commission (MCMC) deputy managing director Datuk Zulkarnain Mohd Yasin said the current community guidelines set by social media and online messaging providers aligned with international norms and the laws of countries such as the United States.

He said as these countries did not have a royal institution or deal with racial and religious sensitivities, most providers would not adhere to the agency's requests to take down harmful content on these subjects.

"Previously, we operated without clear legislation, relying on standards or terms set by the platforms.

"If content breached the terms they set, they would remove it, but that decision might not align with national laws.

"Their community standards follow international norms, possibly US law, under which there is no monarchy, and freedom of speech is paramount.

"When it comes to issues involving the monarchy, most platforms will not act; they will leave it be.

"Similarly, matters involving insults to religion might not be a problem for them, but for our country, issues related to religion, race and the monarchy are very sensitive," Zulkarnain said on TV3's 'Soal Rakyat' programme.

On Dec 9, the Dewan Rakyat approved the amendments to the Communications and Multimedia Act 1998 (Act 588).

Communications Minister Fahmi Fadzil said these amendments aimed to create a safer and more sustainable Internet network ecosystem for all users.

On Friday, he announced that the government had identified eight social media and online messaging platforms that will be required to obtain a licence under the Communications and Multimedia Act 1998 by next year.

He said the platforms included Meta's WhatsApp, Facebook and Instagram, Elon Musk's X (formerly Twitter), Google's YouTube, Pavel Durov's Telegram, Tencent's WeChat and ByteDance's TikTok.

He said these platforms had met the threshold of at least eight million users in the country, adding that they were not being specifically targeted.

Zulkarnain also said "different perspectives" held by the providers were a factor behind the abundance of advertisements related to online gambling and prostitution on social media platforms.

He said as these advertisements were a source of income, providers were less inclined to take the content down from their sites despite MCMC's requests.

"Why are there so many online gambling advertisements? From a different perspective, their source of income is advertising.

"When the advertisements are related to gambling, platforms have no incentive to remove them because they generate revenue for the platform.

"But for us, gambling is a social ill that we need to address. We don't want influencers promoting gambling, so we see the need to take action," he said.

Thus, he believed the amendments and the social media licensing provided the government with a more comprehensive legal approach to address these issues.

"The amendments, along with the recently announced Code of Conduct (Best Practice) for Internet Messaging Service Providers and Social Media Service Providers, will allow the government to issue directives to the providers. These directives aim to prevent harm to social media users."

Zulkarnain said the mechanism, among others, provided a set "response time" for providers to remove harmful content.

"When we issue instructions for them to follow, we set a specific timeframe and standards they need to comply with. If they fail to comply, it becomes an offence under the Act.

"We believe that with this licensing and standards, they will be more aware of their responsibilities.

"Perhaps there is a mechanism we can set, with conditions on which content should be removed quickly.

"Their (artificial intelligence) system should be able to identify it. This is not something new, they already have such systems (in place)."

On Friday, MCMC introduced the Code of Conduct, which includes age verification measures aimed at ensuring child safety.

However, MCMC said, the measures must be implemented "with due respect to the privacy of child users" and in line with industry best practices.

The code, aimed at supporting the regulatory framework for digital service providers, outlines best practices to address harmful online content and ensure online safety, particularly for children and vulnerable groups.

The code also stipulates best practices that service providers with more than eight million users, now required to be licensed under the Communications and Multimedia Act 1998, must adhere to.

These measures include "clear and robust systems, mechanisms and procedures" to safeguard against harmful content, accessibility features for users with disabilities, and identification and removal of "any potential risks of online harm on their platforms before such risks materialise".
