> “I’ve never filmed pornography,” Wylder says. “But they have plenty of pictures and videos of my body and me walking in and out of the office, my TikTok, photos of me on Twitter before my account was deleted. They could easily make a composite of me, and I could be starring in adult videos that I never agreed to and never consented to.”
baubino•39m ago
It just needs to be illegal to make an AI likeness of someone without their consent. I honestly still don’t understand why it’s not. It’s a whole new form of exploitation when brothel owners want to create AI porn that will be more profitable for them than the sex work taking place in the brothel, yet they don’t want to pay the sex workers for the porn (or even get their consent to create it).