MeitY Acts Against Online Platforms X, Telegram & YouTube Over Child Sexual Abuse Material
The notices issued to these social media giants call for the implementation of proactive measures, including the deployment of content moderation algorithms and efficient reporting mechanisms, to prevent the dissemination of CSAM in the future

The Ministry of Electronics and Information Technology (MeitY) has cracked down on the social media intermediaries X, YouTube, and Telegram over the presence of Child Sexual Abuse Material (CSAM) on their platforms on the Indian internet.

The notices issued to these social media giants emphasize the imperative of swiftly and permanently removing or disabling access to any CSAM found on their platforms. Furthermore, they call for the implementation of proactive measures, including the deployment of content moderation algorithms and efficient reporting mechanisms, to prevent the dissemination of CSAM in the future.

MeitY’s notices explicitly warn that failure to comply with these directives will be considered a breach of Rule 3(1)(b) and Rule 4(4) of the IT Rules, 2021.

Rule 3(1)(b) mandates that intermediaries, including social media platforms, must exercise due diligence. They must inform users that they may not host, display, upload, modify, publish, transmit, store, update, or share information that:

- belongs to another person and to which the user has no right;
- is defamatory, obscene, pornographic, invasive of another's privacy, insulting, harassing, or otherwise contrary to law;
- is harmful to children;
- infringes any patent, trademark, copyright, or other proprietary right;
- violates any law in force;
- deceives or misleads, or knowingly communicates false or misleading information;
- impersonates another person;
- threatens India's unity, security, foreign relations, or public order, or encourages the commission of offences;
- contains malicious software; or
- is patently false and misleading, intended for financial gain or to cause harm.

Rule 4(4) of the IT Rules applies to significant social media intermediaries and obliges them to deploy technology-based measures. They must use automated tools or other mechanisms to proactively identify content depicting rape, child sexual abuse, or similar conduct, as well as content identical to material previously removed or disabled, and must display a notice informing users about such identification. The rule also requires that these measures be proportionate, having regard to free speech and expression, user privacy, and the appropriate use of technical safeguards. It further mandates human oversight, including periodic reviews of automated tools that evaluate their accuracy and fairness, propensity for bias or discrimination, and impact on privacy and security.

The Ministry has also cautioned the three social media intermediaries that any delay in adhering to the notices will result in the withdrawal of their safe harbour protection, as per Section 79 of the IT Act. This provision currently shields them from legal liabilities related to content posted by users.

Union Minister of State for Skill Development & Entrepreneurship and Electronics & IT Rajeev Chandrasekhar affirmed the government’s unwavering commitment to ensuring a safe and trusted internet environment under the IT rules. He stated: “We have sent notices to X, YouTube, and Telegram to ensure that there is no Child Sexual Abuse Material present on their platforms. The government is determined to build a safe and trusted internet under the IT rules.”

The action aligns with the provisions of the Information Technology (IT) Act, 2000, which provides the legal framework for addressing pornographic content, including CSAM. The Act imposes stringent penalties and fines under Sections 66E, 67, 67A, and 67B for the online transmission of obscene or pornographic material.
