Tighter control of ads on younger teens' social media accounts … – The Straits Times

SINGAPORE – A new code of practice will require social media platforms to promptly inform users of actions taken on their reports of online harms. This is in response to feedback from users here that they are often left in the dark after submitting their reports.

Advertisements that could have a harmful effect on young users' mental health should be kept away from them, and platforms must submit online safety reports annually to be published on the Infocomm Media Development Authority's (IMDA) website.

These requirements, which take effect on Tuesday, are among the dos and don'ts for social media platforms here under the online safety code of practice that IMDA announced on Monday.

The designated social media services named in the code of practice are Facebook, HardwareZone, Twitter, TikTok, Instagram and YouTube, said IMDA, which added that Singapore is one of the first jurisdictions in the world to introduce laws for platforms to take preventive measures to ensure online safety.

The code of practice sets in stone how popular platforms should operate here following the introduction of the new law in February, the Online Safety (Miscellaneous Amendments) Act.

The law gives the authorities the power to direct social media platforms to remove online harms, such as sexual and violent content, content that promotes cyber bullying, vice, organised crime, suicide or self-harm, and content that may incite racial or religious tensions or endanger public health.

Platforms that fail to comply may face a fine of up to $1 million, or a direction to have their social media services blocked here.

The regulations come amid a crackdown on online harms across app stores, social media platforms and messaging apps.

Under the code of practice, each platform must establish its own community guidelines that clearly state what content is allowed and not allowed on its services, said IMDA.

These rules should be enforced through effective content moderation, including the removal of content that violates its own community standards and blocking or banning users who break the rules.

Users should also be given tools to manage their own safety, such as options to hide harmful content and unwanted interactions, limit location sharing, and restrict the visibility of their accounts to other users.

Each platform must also create separate community guidelines for younger users, along with content moderation and online safety information that they can easily understand.

Accounts belonging to children must not receive advertisements, promoted content and content recommendations that designated social media services are reasonably aware to be detrimental to children's physical or mental well-being, said IMDA.

Platforms are also required to include tools that allow children or their parents to manage their safety on these services, and mechanisms for users to report harmful content and unwanted interactions.

Parents and guardians must also be given tools to manage the content that their children can see, the public visibility of their accounts and permissions for who can contact and interact with them.

Users who use high-risk terms related to self-harm or suicide must be actively offered local safety information that is easy to understand. These include safety resources or information on support centres.
