In the rapidly evolving landscape of the Technology and Communications industry, ensuring online safety and effective content moderation has become a top priority for regulatory bodies worldwide.

It was recently reported that the Malaysian Communications and Multimedia Commission ("MCMC") will be taking legal action against Meta Platforms Inc., the parent company of Facebook and WhatsApp.1 The reasons cited by the MCMC include:

  1. failure to remove undesirable content, such as content touching on race, royalty and religion ("3R"),2 defamation, impersonation, online gambling, and scam advertisements, which has jeopardised user safety and posed a threat to social harmony and cultural sensitivities;3 and
  2. failure to remove the aforementioned undesirable content despite several warnings having been issued.4

The decision to take legal action against Meta not only sends a strong message to all industry players, but also aligns with the increasing global trend of regulatory bodies prioritising user safety in the digital realm. This development highlights the importance of effective content moderation strategies and the responsibility of platform owners to promptly address content-related issues.

This alert provides insight into the MCMC's powers, specifically on directions to compel compliance under the Communications and Multimedia Act 1998 ("CMA 1998") and the legal implications it holds for both Meta and companies operating in the Technology and Communications industry.

MCMC's Powers

While the MCMC did not explicitly specify the legal action to be taken against Meta, it possesses extensive powers under the CMA 1998, including the power to issue directions and make determinations to address non-compliance with regulatory requirements.5 Where a company such as Meta fails to comply with the requirements set out in the CMA 1998 or its subsidiary legislation, the MCMC may issue a direction compelling compliance under Section 51 of the CMA 1998. Accordingly, where Meta was found to host undesirable content and failed to remove it despite several warnings, the MCMC may exercise this power and direct Meta to take the actions necessary to comply.

In general, the Commission adopts a comprehensive process when issuing directions under Section 51 of the CMA 1998. It begins with a written notice, in which the Commission clearly communicates the compliance required of the companies or individuals involved. This step grants them an opportunity to be heard and to make written submissions within a defined timeframe. After evaluating the submissions received, the Commission proceeds to issue a direction stipulating the precise actions necessary to address or prevent any violation.

It is worth noting that, according to Section 53 of the CMA 1998, failure to comply with the Commission's directions, including the removal of undesirable content, constitutes an offence. Such non-compliance can result in a fine not exceeding RM300,000 and/or imprisonment for a term not exceeding 3 years.

By utilising its powers granted under the CMA 1998, the MCMC showcases its commitment to ensuring a safe digital environment. This decision against Meta serves as a strong reminder to all industry players that non-cooperation in the removal of harmful content will not be tolerated. Moreover, it demonstrates the MCMC's resolve in enforcing content moderation standards and holding platform owners accountable for their responsibilities.

The action taken by the MCMC against Meta has proven to be effective. Last week, the Communications and Digital Ministry announced that Meta has agreed to collaborate with the MCMC and the Police, following a meeting held at the headquarters of the Royal Malaysia Police.6

Key Takeaways

Companies operating in the Technology and Communications industry should closely observe the MCMC's actions against Meta, as they establish a precedent for the regulatory body's approach to ensuring online user safety.

To stay ahead in the industry, companies must remain updated on regulatory developments and industry best practices. It is crucial to implement robust content moderation policies and practices that comply with evolving regulations and guidelines.

Collaborating with compliance and risk management teams facilitates ongoing assessments of content policies and compliance procedures. This proactive approach not only ensures user safety, but also helps to build trust and maintain a positive brand image in an increasingly interconnected world.

By staying informed, proactive, and responsive to regulatory changes, companies can ensure compliance, mitigate legal risks, and promote a safe and secure digital space for all users.


1. MCMC Media Statement, ‘Non-Cooperation to remove Undesirable Contents from its platform: MCMC to take Legal Action against Meta' (MCMC, 23 June 2023) (MS_MCMC_TO_TAKE_LEGAL_ACTION_AGAINST_META.pdf); see also Reuters, ‘Malaysia to take legal action against Meta over harmful content' (Reuters, 27 June 2023) (accessed 11 July 2023).

2. Astro Awani, ‘PDRM pandang serius isu 3R, pasukan siasatan khas ditubuh sejak Mac lalu' [PDRM takes 3R issues seriously, special investigation team established since last March] (Astro Awani, 18 May 2023) (accessed 11 July 2023).

3. New Straits Times, ‘744 Facebook scams reported since January' (New Straits Times, 4 June 2023) (accessed 11 July 2023).

4. The Star Online, ‘MCMC to take legal action against Meta over malicious content on Facebook' (The Star, 23 June 2023) (accessed 11 July 2023).

5. Communications and Multimedia Act 1998, ss 51 and 55.

6. The Star Online, ‘Meta has agreed to cooperate with MCMC, police to curb online crime, says Fahmi' (The Star, 4 July 2023) (accessed 11 July 2023).

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.