Parent Advisory: “Nudifying” Apps Available Through Google Target Kids with Revenge Porn, Sextortion, and Harassment
AI “nudifying” apps are a scary and growing trend, owing in part to the ease with which kids can access them through Google search and the Google Play Store. These apps do exactly what their name suggests: they take photos of people who are clothed and make them appear nude. In several recent cases, kids have used these apps on innocent photos of their classmates and then circulated the images. Nudifying apps can also be used to create content for sextortion, a crime that often targets children and teens and has increased by more than 300%.
Some 99% of the victims of these fake images are female, including both celebrities and everyday tween and teen girls. Victims report feeling harassed, violated, and traumatized.
Parents should know that creating or sharing sexual images of children is illegal, even when the images are AI-generated. Parents should talk to their kids about the ethics of using these apps and how they strip away people’s consent and bodily autonomy, and about how to respond if they learn of AI-generated nude images of themselves or their friends.
Unfortunately, dozens of these apps are easily accessible through a Google search or the Google Play Store. Google also makes it easy to find images created with these tools, including disturbing images of child stars like Emma Watson, Demi Lovato, and Zendaya, who are now adults but are depicted as children and teens in sexual and sometimes violent imagery.
More than 12,000 parents have signed a petition asking Google to stop promoting unethical and dangerous nudifying apps that are being used to target young girls.
Social media is a primary spreader of these images. In addition to limiting access to unethical and harmful apps like “nudifying” apps, ParentsTogether supports the Kids Online Safety Act (KOSA) at the federal level, which would require tech platforms to design their products with kids’ safety in mind, and age-appropriate design codes at the state level, like those in Minnesota, Maryland, and Vermont, which would require platforms to build their services in ways that promote, rather than threaten, children’s health and well-being.