Exposed deepfake database reveals horrific ways users manipulated celebrity images
Thousands of AI-generated nude deepfakes were left exposed in an unsecured database, including those that reportedly portrayed celebrities as young children.
The database, discovered by cybersecurity researcher Jeremiah Fowler, contained 93,485 images produced by a "nudify" app from the South Korea-based AI company GenNomis.
In an article for vpnMentor, Fowler noted that the database also contained text files with the prompts used to generate each image. Among the files he viewed...