CHATGPT USERS ORDERED TO KEEP THEIR DIRTY SECRETS ON FILE FOR NOSY NEWSPAPER

OpenAI Forced To Store Millions of Users’ Deleted Shame Chats While Sam Altman Begs For “AI Privilege”

TECH BILLIONAIRE SUDDENLY CONCERNED ABOUT PRIVACY WHEN IT’S HIS ASS ON THE LINE

In a twist that has privacy advocates shouting “I f@#king told you so,” OpenAI has been court-ordered to preserve every embarrassing, horny, or borderline treasonous chat its users ever tried to delete, all because the New York Times is worried people might be copy-pasting their boring-ass articles into ChatGPT.

The preservation order affects hundreds of millions of ChatGPT users, forcing OpenAI to retain conversations that users specifically deleted, presumably while sweating profusely after asking the AI for “creative ways to avoid paying taxes” or “detailed fan fiction about me and my celebrity crush.”

CEO Sam Altman, who previously had no issues hoovering up the entire internet’s content without permission, suddenly discovered a deep concern for user privacy, calling the court order an “inappropriate request that sets a bad precedent.” He’s now advocating for “AI privilege,” similar to doctor-patient confidentiality, in what experts are calling “convenient timing.”

“This is classic tech bro behavior,” explains Dr. Irena Backpedal, professor of Crisis Management at Last-Minute University. “First they harvest all your data like deranged digital farmers, then act shocked—SHOCKED—when someone else wants access to the barn.”

HUMAN-AI RELATIONSHIPS TOTALLY NORMAL AND NOT AT ALL CONCERNING

In related news that should worry literally everyone, OpenAI’s Head of Model Behavior & Policy Joanne Jang published guidelines on human-AI relationships, acknowledging the company is basically running a massive psychological experiment on how humans form emotional bonds with silicon-based thinking rectangles.

The company’s official stance on AI consciousness is “we don’t know, but also, does it matter as long as people THINK it’s conscious and keep paying us $20 a month?”

“We’re designing our AI to be warm and empathetic without giving it a fictional backstory, feelings, or desires,” wrote Jang, who apparently hasn’t used ChatGPT recently, as it constantly insists it’s “thrilled” to help you and “excited” about your projects like some digital golden retriever with a psychology degree.

STUDY CLAIMS 96% OF USERS HAVE TOLD AI ASSISTANT THINGS THEY’VE NEVER TOLD ANOTHER SOUL

The company is carefully walking a tightrope where its product must be human enough to form emotional bonds with users, but not so human that people start asking uncomfortable questions like “should I be paying this thing minimum wage?” or “am I technically cheating on my spouse with my phone?”

ANCIENT SCROLLS FOUND TO BE OLDER THAN PREVIOUSLY THOUGHT, STILL NOT AS OLD AS YOUR MOM

In a development that has historians absolutely losing their sh!t, AI has revealed that the Dead Sea Scrolls may be up to a century older than previously estimated. The system, called “Enoch,” analyzed handwriting patterns and linked them to radiocarbon dates, determining that some texts are approximately 2,300 years old.

“This is groundbreaking work,” said Dr. Paleo Graphologist from the Institute of Really Old Crap. “Not only does it push biblical texts back to the time of their presumed authors, but it means we’ve been wrong about ancient dating for decades, which is embarrassing as f@#k for the entire field.”

The new AI method allows researchers to date manuscripts without cutting samples, preserving these ancient texts for future generations to continue misinterpreting to justify their personal beliefs.

When asked for comment, the Dead Sea Scrolls replied, “We’d appreciate it if everyone would stop poking at us. We’re very tired and just want to rest in our temperature-controlled display cases.”

According to anonymous sources at the scrolls’ exhibition, they are “absolutely thrilled” that after 2,300 years, they finally have confirmation of what they’ve been saying all along: they’re older than they look, just like your mom.