Reclaiming Digital Spaces for Child Safety
Technology-Facilitated and Online Child Abuse and Exploitation

As digital platforms, artificial intelligence (AI), and encrypted networks become increasingly embedded in everyday life, they have also emerged as powerful tools in the hands of human traffickers, exposing individuals, including children, to new and evolving forms of harm. The rights of children are enshrined in international law, most comprehensively in the United Nations (UN) Convention on the Rights of the Child (CRC), which outlines a child’s right to protection from violence, abuse, neglect, and exploitation, including in digital spaces.1 Organizations such as the UN Children’s Fund (UNICEF) and the UN Office on Drugs and Crime (UNODC) emphasize that these rights extend into the virtual world, where online safety is an increasingly urgent human rights issue. While UN-led initiatives, such as the CRC and the UN’s Child and Youth Safety Online agenda, provide a universal framework for protecting minors from online harm, their implementation is often left to the discretion of states. UNICEF continues to advocate for stronger global governance, but in practice, international consensus has yet to translate into effective, enforceable protections across jurisdictions.2
In the European Union, the AI Act, which came into force in August 2024, explicitly prohibits AI systems that exploit age-based vulnerabilities.3 However, critics point out that recent drafts of the EU Code of Practice on General-Purpose AI have weakened safeguards for children’s rights by relying on voluntary detection of abuse content rather than binding obligations.
While Moldova has implemented a National Strategy for Preventing and Combating Human Trafficking (2021–2025), limited resources and institutional constraints continue to challenge its effectiveness.4
On April 28, 2025, the U.S. Congress passed S. 146, the Take It Down Act, a bill that criminalizes the nonconsensual publication of intimate images, including “digital forgeries” (i.e., deepfakes), in certain circumstances.5 Another proposed U.S. bill, S. 1748, the Kids Online Safety Act, would protect children on the internet by establishing a duty of care and requiring covered platforms to implement safeguards and provide notices to protect minors and their online activities.6
In Southeast Asia, countries such as Indonesia and the Philippines have introduced anti-trafficking laws; for instance, the Philippines’ Anti-Trafficking in Persons Act was first enacted in 2003 and further expanded and amended in 2012.7
Meanwhile, in Qatar, authorities have criminalized child sexual abuse material (CSAM) and introduced the Safe Space campaign, a digital education initiative to promote safer online behavior among children and caregivers.8
The absence of binding international digital safety standards, combined with uneven national implementation, leaves millions of children vulnerable to exploitation through online platforms, AI manipulation, and unregulated virtual spaces. Moreover, as noted in a 2025 study, Digital Traps: The Critical Role of Online Encounters in the Entrapment of Minors in Sex Trafficking, online encounters—through social media, gaming apps, and encrypted messaging—are increasingly the gateway to exploitation and trafficking of minors.9