MeitY Takes Action Against X, Telegram & YouTube for Hosting Child Sexual Abuse Material – News18

The notices issued to these social media giants call for proactive measures, including the deployment of content moderation algorithms and efficient reporting mechanisms, to prevent the dissemination of CSAM in the future.

The Ministry of Electronics and Information Technology (MeitY) has taken action against social media intermediaries — X, YouTube, and Telegram — for hosting Child Sexual Abuse Material (CSAM) on their platforms in the Indian internet space.

The notices emphasize the need to promptly and permanently remove or disable access to any CSAM found on the platforms. They also require proactive measures, such as content moderation algorithms and efficient reporting mechanisms, to prevent the spread of CSAM in the future.

MeitY’s notices explicitly state that non-compliance with these directives will be considered a violation of Rule 3(1)(b) and Rule 4(4) of the IT Rules, 2021.

Rule 3(1)(b) requires intermediaries, including social media platforms, to exercise due diligence. It prohibits them from hosting, displaying, uploading, modifying, publishing, transmitting, storing, updating, or sharing information that: belongs to another person without permission; is defamatory, obscene, pornographic, invasive of privacy, insulting, or harassing; is harmful to children; infringes patents, trademarks, copyrights, or other proprietary rights; violates any law in force; knowingly spreads false or misleading information; impersonates another person; threatens India's unity, security, foreign relations, or public order, or incites the commission of an offence; contains malicious software; or is patently false and misleading and intended to cause financial gain or harm.

Rule 4(4) of the IT Rules applies to significant social media intermediaries and obliges them to use technology-based measures. This includes deploying automated tools or other mechanisms to proactively identify content that depicts rape, child sexual abuse, or similar conduct, as well as content identical to material previously removed or disabled, and informing users when content is flagged through this process. These measures must be proportionate, with regard to the interests of free speech and the privacy of users, and must operate under human oversight, including periodic reviews of the automated tools to evaluate their accuracy, fairness, propensity for bias or discrimination, and impact on privacy and security.
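
The "identical content" prong of Rule 4(4) is commonly implemented with hash matching: the platform keeps digests of files it has previously removed and checks new uploads against that set. The sketch below is a minimal illustration of the idea using exact SHA-256 digests; the class and method names are hypothetical, and production systems typically rely on perceptual hashes shared through vetted industry hash lists, since an exact cryptographic match misses even trivially altered files.

```python
import hashlib


class RemovedContentIndex:
    """Toy index of digests for files previously removed from a platform.

    Hypothetical sketch only: real deployments use perceptual hashing and
    shared industry hash lists, not exact SHA-256 matching.
    """

    def __init__(self) -> None:
        self._digests: set[str] = set()

    @staticmethod
    def _digest(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    def record_removal(self, data: bytes) -> None:
        # Called when a moderator removes or disables access to a file.
        self._digests.add(self._digest(data))

    def is_reupload(self, data: bytes) -> bool:
        # Called in the upload path to flag content identical to
        # previously removed material before it is published.
        return self._digest(data) in self._digests


if __name__ == "__main__":
    index = RemovedContentIndex()
    index.record_removal(b"previously removed file bytes")
    print(index.is_reupload(b"previously removed file bytes"))  # True
    print(index.is_reupload(b"new, unseen file bytes"))         # False
```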

MeitY has warned the three social media intermediaries that any delay in complying with the notices will result in the withdrawal of their safe harbor protection, as per Section 79 of the IT Act. This provision currently shields them from legal liabilities related to user-generated content.

Union Minister of State for Skill Development & Entrepreneurship and Electronics & IT Rajeev Chandrasekhar has reiterated the government’s commitment to ensuring a safe and trusted internet environment under the IT rules. He stated, “We have sent notices to X, YouTube, and Telegram to ensure that there is no Child Sexual Abuse Material present on their platforms. The government is determined to build a safe and trusted internet under the IT rules.”

This action aligns with the provisions of the Information Technology (IT) Act, 2000, which provides the legal framework to address pornographic content, including CSAM. The Act imposes strict penalties and fines under Sections 66E, 67, 67A, and 67B for the online transmission of obscene or pornographic material.

My perspective: CSAM is a serious problem that must be tackled across all platforms, and it is commendable that MeitY has acted against intermediaries hosting such content. Content moderation algorithms and efficient reporting mechanisms will be crucial to preventing its spread, but they work only if platforms exercise genuine due diligence and prioritize the safety of their users. By complying with these directives, social media giants can help build a safe and trusted internet for everyone.
