UK TECH MINISTER DEMANDS AI INSTITUTE STOP BEING “SUCH A LITTLE NERD” AND START KILLING PEOPLE

Technology Secretary Peter Kyle has dramatically ordered Britain’s premier artificial intelligence institute to “man the f@#k up” and focus on creating murder robots instead of “that boring math sh!t” that its namesake Alan Turing was into.

SPREADSHEETS TO BLOODSHEETS

In a blistering letter obtained by our reporters, Kyle demanded the Alan Turing Institute immediately pivot from helping scientists understand complex data to developing algorithms that can “blow stuff up real good” and “track citizens with terrifying efficiency.”

“We’ve spent millions funding a bunch of academics who just want to ‘advance human knowledge’ or whatever,” Kyle reportedly wrote. “Where are my killer drones? Where are my thought-predicting surveillance systems? This is BRITAIN, for God’s sake. We invented colonialism!”

EXPERTS QUESTION APPROACH

Dr. Ivana Militarize-Everything, the newly appointed Warfare Enhancement Consultant at the Ministry of Defence, enthusiastically supported the change. “People think Alan Turing was just some gay math genius who helped defeat the Nazis and founded modern computing. What they don’t realize is that he SECRETLY wanted his legacy to be facial recognition software that can identify potential shoplifters before they even know they’re going to shoplift.”

Studies show that 87% of artificial intelligence currently being developed in the UK is dedicated to making Netflix recommendations slightly less terrible, while only 2% is focused on creating autonomous weapon systems that could definitely never malfunction and target civilians.

INSTITUTE REBRANDING ALREADY UNDERWAY

Sources inside the institute reveal plans to rename it “The Alan Turing Institute for Making Sure Those Other Countries Don’t Kill Us First With Their Own AI That They’re Definitely Building Right Now Trust Us.”

Professor Hugh Janus, who previously led the institute’s ethics department but now oversees the newly formed “Termination Solutions Team,” explained the shift: “Yesterday I was working on ensuring AI doesn’t perpetuate social biases. Today I’m teaching it to identify ‘suspicious behavior’ with a 12% accuracy rate. It’s basically the same thing if you don’t think about it at all.”

AMBITIOUS TARGETS SET

The institute has been given six months to produce results, with Kyle demanding “at least three weapons systems that would make Geneva Convention signatories sh!t themselves” by Christmas.

When asked about concerns over rushed military applications of artificial intelligence, Kyle reportedly responded by making air quotes and saying “ethics” seventeen times while rolling his eyes.

At press time, the institute’s first prototype, a defense algorithm designed to identify national security threats, had categorized “people who don’t separate their recycling properly” as the UK’s greatest existential danger, proving they still have some kinks to work out before the robot apocalypse can properly begin.