
Social media firms must curb harmful content

KUALA LUMPUR: Social media and online messaging service providers must combat online harm against children, scams, and harmful artificial intelligence content.

A frequently asked questions (FAQ) released by the Malaysian Communications and Multimedia Commission (MCMC) said that service providers operating in Malaysia were expected to observe several "conduct requirements" that MCMC would set.

These could include restricting users under the age of 13 from accessing their platforms, and addressing cyberbullying and sexual grooming issues.

Minors should also be protected from misleading advertisements, while scams, harmful AI content and deepfakes should be curbed on social media platforms.

The FAQ said these guidelines would be fully developed within five months of the framework's gazettement on Aug 1.

During this period, service providers can apply for the Applications Service Provider Class (ASP(C)) licence and comply with its requirements.

Service providers must form a locally incorporated entity before applying for a licence.

MCMC said it would conduct public consultations to develop the guidelines.

The licensing requirement is part of a new regulatory framework to ensure a safer online ecosystem.

Communications Minister Fahmi Fadzil said MCMC had discussed the licensing requirements with most major social media platforms.
