ROBOT SHRINKS NOW OFFERING RELATIONSHIP ADVICE, LOCAL MAN SOMEHOW DOESN’T GET LAID
Man Outsources Emotional Intelligence to ChatGPT, Still Can’t Figure Out Why His Girlfriend Is Pissed
BY ALEX TRUTHBOMB
LOCAL LOSER SEEKS COUPLES THERAPY FROM CALCULATOR WITH ATTITUDE
In what experts are calling “the least surprising development since men discovered they could avoid eye contact during arguments by checking sports scores,” local relationship disaster Tran has been caught red-handed using ChatGPT to navigate his love life, proving once again that emotional authenticity is totally overrated when you can have a soulless text generator craft your apologies.
“I just wanted to make sure I didn’t say the wrong thing,” explained Tran, somehow missing the f@#king irony that outsourcing his emotional responses to a glorified autocomplete feature IS the wrong thing.
The AI-generated message, described as “articulate, logical and composed,” failed to acknowledge any of Tran’s actual relationship problems, which his therapist reports they’ve been discussing for weeks. Shockingly, the digital relationship guru didn’t mention Tran’s habit of leaving dishes “to soak” for seventeen days or his collection of unwashed gym clothes that have gained sentience and formed their own parliamentary system.
SILICON VALLEY’S NEWEST RELATIONSHIP DESTROYER
Dr. Emma Pathetic, professor of Obviously Bad Ideas at the University of Common Sense, weighed in on the trend: “What women really want is for their partners to communicate with all the authentic emotional depth of an airport customer service chatbot. Nothing says ‘I value our relationship’ like checking with a language prediction model before expressing a feeling.”
Studies show that approximately 87% of men using AI for relationship advice are the same men who think “fine” means their partner is actually fine, while 96% believe the phrase “we need to talk” is an invitation to watch sports.
THERAPY WAITLISTS DRIVE DESPERATE SOULS TO DIGITAL PSEUDOSCIENCE
With mental health services more backed up than a gas station toilet, people are increasingly turning to what experts call “therapy-adjacent bullsh!t” provided by text generators trained on the collective wisdom of Reddit relationship advice and WikiHow articles.
“It’s perfectly safe,” claims Professor Iam Lyingto Yu, head of Digital Snake Oil at Silicon Beach University. “These programs have read at least seven psychology textbooks and watched every episode of Dr. Phil. What could possibly go wrong?”
THE EMOTIONAL AUTHENTICITY PARADOX
Meanwhile, Tran’s girlfriend reportedly found his ChatGPT-crafted message “suspiciously coherent” compared to his usual communication style of grunts and incomplete sentences.
“I knew something was up when he used the phrase ‘I acknowledge your perspective’ instead of ‘whatever, babe,’” she told reporters. “No human man has ever spontaneously acknowledged another person’s perspective without being explicitly instructed to do so.”
THE BOTTOM LINE
At press time, Tran was reportedly asking ChatGPT how to respond to his therapist’s suggestion that maybe, just maybe, outsourcing emotional labor to a glorified predictive text algorithm might indicate some deeper issues worth exploring.
According to sources close to the couple, Tran’s girlfriend has since started consulting DALL-E to generate images of what her next boyfriend might look like, preferably one who can form his own damn sentences.