More harmful social media content detected in Q1 2024 than all of last year, says MCMC
2024-04-09 | The Star (Nation)

       

       KUALA LUMPUR: Social media platform providers have been told to be more alert and proactive as harmful social media content detected in the first quarter of this year exceeded the total for the whole of last year.

       The Malaysian Communications and Multimedia Commission (MCMC) said that from January to March, a total of 51,638 cases were referred to online platform providers for further action.

       "This number is high compared to 42,904 cases recorded for the whole of 2023," it said in a joint statement with the Royal Malaysia Police (PDRM) on Tuesday (April 9).

       The statement urged providers like Meta (Facebook and Instagram) and ByteDance (TikTok) to take a holistic approach to such content, especially content touching on 3R (race, religion and royalty) issues.

       "Both platform providers were asked to step up and follow through with monitoring efforts owing to the increase in harmful online content.


       "TikTok and Meta were asked to provide a comprehensive improvement and strategy plan (and) to deal with such content effectively by referring to (their own) guidelines as well as Malaysian laws.

       "In addition, TikTok and Meta are urged to curb non-compliant behaviour (coordinated inauthentic behaviour or CIB, using real and fake social media accounts to manipulate issues), as well as monitor and take immediate action against harmful content such as scams and illegal online gambling," it said.

       On Monday (April 8), MCMC and PDRM met with representatives of the platform providers in Cyberjaya.


       The statement said other matters discussed at the meeting, chaired by Communications Minister Fahmi Fadzil, included ensuring online safety measures for children below the age of 13.

       "We discussed online safety improvements, especially with the authentication process to ensure the minimum age for social media access is 13.

       "This is to ensure that children are not affected by harmful content on social media.

       "We also revisited the efficiency of the (platforms' respective) artificial intelligence algorithm and capabilities of learning to detect 3R content, CIB and other harmful content," the statement added.

       

