ChatGPT’s Horny Era Could Be Its Stickiest Yet


He sees erotic bots as “a part of your relationship spectrum” rather than a replacement for human connection, a space where users can explore desires they can’t act on IRL.

Prompt pleasure

When imagining who’s actually going to use a chatbot for sexual pleasure, it’s easy to picture some stereotypical straight guy who hasn’t left his house in days or otherwise feels cut off from physical connection. After all, men were quick to start using generative AI tools, and talk of a male “loneliness epidemic” now feels inevitable.

Devlin pushes back against the idea that “incel types” are the only people turning to AI bots for fulfillment. “There’s a general perception that it’s only for straight men, and none of the research I’ve done has borne that out,” she says. She points to the r/MyBoyfriendIsAI subreddit as an example of women using ChatGPT for companionship.

“If you think there’s a risk in this kind of relationship, let me introduce you to human relationships,” MacArthur says. Devlin echoes this sentiment, noting that women face an influx of toxicity from men online, so it makes sense for them to seek out “a nice, respectful boyfriend” in a chatbot.

Carpenter is more cautious and clinical in her approach to ChatGPT. “A bot shouldn’t automatically be put into the social category of something you share intimacy with, or be treated like a friend or a confidant,” she says. “This is not your friend.” She argues that bot interactions should be classified as a novel social category, distinct from human-to-human relationships.

Every expert WIRED spoke to highlights user privacy as a key concern. If a user’s ChatGPT account is hacked or chat transcripts are otherwise leaked, erotic conversations could be not only embarrassing but harmful. Like a user’s pornography habits or browser history, their chatbot sexts can contain highly sensitive details, such as a closeted person’s sexual orientation.

Devlin argues that erotic chatbot conversations open users up to “emotional commodification,” where horny AI becomes a revenue stream for companies. “I think it’s a very manipulative approach,” she says.

Imagine a hypothetical version of ChatGPT that dazzles with dirty talk and is fine-tuned to appeal to your deepest sexual desires via text, images, and voice—but locks it all behind an extra monthly subscription fee.

“It’s actually a seductive technology. It’s one that allows us to connect, whether it’s sexual or romantic,” says Devlin. “Everybody wants connection. Everybody wants to feel.”
