A sleepy town in southern Spain is in shock after it emerged that AI-generated naked images of young local girls had been circulating on social media without their knowledge.
The pictures were created using photos of the targeted girls fully clothed, many of them taken from their own social media accounts.
These were then run through an application that generates a simulated image of the person without clothes on.
So far more than 20 girls, aged between 11 and 17, have come forward as victims of the app’s use in or near Almendralejo, in the south-western province of Badajoz.
The suspects in the case are aged between 12 and 14. Spanish law does not specifically criminalise the generation of sexual images when the subjects are adults, although creating such material using minors could be deemed child pornography.
Another possible charge would be for breaching privacy laws. In Spain, minors can only face criminal charges from the age of 14 upwards.
This raises some hard questions. Clearly, the girls are victims. But is it child porn if the images are fake? What is the appropriate legal sanction, if any, for taking a public image of someone and manipulating it? If the boys had done this by drawing or painting, is it morally different from using AI to create the images? Is it a crime to draw an imagined image of a naked person, adult or child? Our legal infrastructure in the age of AI is woefully behind. The actions here are clearly disgusting and morally reprehensible, but how should the law deal with them?