India must update its laws to tackle AI-generated Child Sexual Abuse Material (CSAM). This includes redefining CSAM, clarifying "sexually explicit" content, holding intermediaries like cloud services accountable, adopting global conventions, and amending the Digital India Act to address emerging tech threats and protect children’s dignity and safety online.
Artificial Intelligence (AI) is being used to create Child Sexual Abuse Material (CSAM).
It includes pictures, videos, or audio clips that show children in sexually explicit situations. For example, someone might use computer software or AI to create a fake video of a child in a harmful situation.
Even though the child isn't real, such a video can still cause harm by spreading lies or encouraging abusive behavior. CSAM is dangerous because it violates a child's right to safety and dignity.
The U.K. is drafting new laws that make it illegal to possess, create, or share AI tools used to make CSAM. For example, anyone who possesses software designed to create fake images of children in harmful situations would face legal action.
The U.K. is also targeting "paedophile manuals," which are guides that teach people how to use AI tools to create CSAM. This is a significant step because it focuses on stopping the tools themselves, not just punishing people after they commit crimes.
India has laws to protect children from online abuse; however, these laws are outdated. For example:
Current laws don't address AI-generated CSAM. If someone uses AI to create fake images of children, the law might not cover it because no real child is involved.
India needs to update its laws to include AI-generated CSAM and hold companies like cloud services accountable for blocking harmful content.
The National Cyber Crime Reporting Portal recorded 1.94 lakh incidents of child pornography as of April 2024. India has agreements with international organizations like the National Center for Missing and Exploited Children (NCMEC) in the U.S., which shares tip-line reports about CSAM cases. As of March 2024, over 69 lakh reports had been shared with Indian authorities.
Expand the Definition of CSAM: Replace the term "child pornography" with "CSAM" in the POCSO Act. This makes the law broader and more effective.
Define "Sexually Explicit": Clearly define what counts as "sexually explicit" under the IT Act. This helps authorities block harmful content faster.
Include New Technologies: Add Virtual Private Networks (VPNs), Virtual Private Servers (VPS), and Cloud Services to the list of intermediaries responsible for blocking CSAM.
Adopt International Conventions: Push for global agreements to stop criminals from using technology for illegal activities.
Create Stronger Laws: Amend the Digital India Act to include rules specifically targeting AI-generated CSAM. For example, if a company provides cloud storage and someone uses it to store harmful images, the company should be held responsible unless it acts quickly to remove the content.
PRACTICE QUESTION
Q. "The internet was meant to empower but has become a tool for harm." Critically analyze. (150 words)