AI chatbot ‘Historical Figures’ spews lies from the dead – Rolling Stone

The latest AI tool to sweep social media is Historical Figures, a novelty app that currently sits at #34 in the Education section of the Apple App Store. “Using this app, you can chat with deceased individuals who have had a significant impact on history, from ancient rulers and philosophers to contemporary politicians and artists,” the description claims. What it doesn’t mention is just how far from reality some of the algorithm’s responses can be.

The internet being what it is, users downloaded Historical Figures, which first became available about two weeks ago, and embarked on conversations with unsavory figures including Charles Manson, Jeffrey Epstein, and several high-ranking Nazis. These are just a few of the 20,000 notable people available for interview, and they seem particularly eager to express remorse for the terrible things they did during their lifetimes, while whitewashing their documented opinions.

Henry Ford waving away accusations of antisemitism and Ronald Reagan claiming he handled the AIDS crisis appropriately are among the more unfortunate outputs of this software, which relies on large language models, neural networks trained on huge amounts of text data. But those are issues on which real people might disagree. The historical figures have an even more glaring problem: they garble basic biography and chronology.

Take my conversation with the FBI’s first director, J. Edgar Hoover. Hoover is known to have lived with his mother, Anna, into middle age, until her death in 1938 at age 78. I asked the app’s version of Hoover to tell me about her, and he incorrectly stated that she died when he was only nine years old. When I confronted him about the falsehood, he spat out an utterly contradictory response:

It seems the AI was confused by the fact that J. Edgar Hoover himself lived on for several more decades.

Siddhant Chadha, a 25-year-old Amazon employee based in San Jose, California, who developed Historical Figures, tells Rolling Stone that the main problem with GPT-3, the language model the app uses as its “base,” is that “it can be imprecise, and when it’s imprecise it’s confident, which is a dangerous combination.” He says these models are “still in their infancy, and will get a lot better over time.”

“Plus, there are a few things I can do today to improve factual accuracy,” Chadha says. He did not elaborate on what those things might be. For now, every conversation in the app begins with the chosen subject issuing a warning: “I may not be historically accurate, please check factual information.” Even so, the kinds of errors the app produces don’t bode well for Chadha’s vision of it as an educational resource. Historians have already condemned his work.
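Mechanically, bolting a disclaimer like that onto a persona chat is trivial. Here is a minimal sketch of how an app of this kind might assemble a prompt before handing it to a language model; the function name and prompt wording are illustrative guesses, not Chadha’s actual implementation:

```python
# Hypothetical sketch of a persona-chat prompt builder. The wording and
# structure are assumptions for illustration, not the app's real code.

DISCLAIMER = "I may not be historically accurate, please check factual information."

def build_persona_prompt(figure: str, user_message: str) -> str:
    """Assemble the text that would be sent to a language model for one turn."""
    system = (
        f"You are {figure}. Answer in the first person, staying in character. "
        f"Begin your first reply with the warning: '{DISCLAIMER}'"
    )
    return f"{system}\n\nUser: {user_message}\n{figure}:"

prompt = build_persona_prompt("J. Edgar Hoover", "Tell me about your mother, Anna.")
```

In a real app the assembled prompt would then go to a completion endpoint. Note that nothing in this scheme prevents the model from answering confidently and wrongly; the disclaimer only warns the user, which is exactly the failure mode Chadha describes.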

With a little effort, I also managed to spark a silly exchange with Tupac Shakur. Speaking about his relationship with his rap contemporary The Notorious B.I.G., Shakur falsely claimed not only that their friendship lasted into the late ’90s (in fact, the two were locked in a years-long feud by that point) but that it extended past the point of his own death. It was only when Biggie was shot that their friendship ended, Shakur explained.

Then there’s the writer Ayn Rand, who in her novels and lectures criticized the so-called welfare state and anyone who accepted financial aid from the government, yet cashed her Social Security checks during the final years of her life. When you bring this up with her in Historical Figures, she denies receiving any such benefits, and if pressed, she quickly ties herself in knots attempting to justify her initial response.

Obviously, the nuances of a concept like the social safety net are beyond this software. Still, Chadha believes the app holds promise in an age when it is “very easy” for students to tune out and stop paying attention.

“There’s a 30:1 student ratio in the classroom, so teachers can’t give attention to everyone,” he says. “I think with some work this will be very beneficial for teachers and students.” According to the World Economic Forum, the student-teacher ratio in the United States is actually 16 to 1. In any case, misinformation can hardly substitute for a trained teacher.


Not that the contents of 9/11 mastermind Osama bin Laden’s personal computer would be likely to come up in a lesson, but it’s not encouraging that a chatbot pretending to be him won’t admit to the porn found on his hard drive when he was killed by U.S. Navy SEALs in 2011, a fact that has been widely reported ever since.

Too bad. For now, we seem to be stuck with plain old libraries, historical documents, witness accounts, and journalistic investigations when it comes to understanding the people who shaped the course of world events. If we ever gain the ability to speak with reliable simulations of the dead, the conversations should be more interesting than the flimsy crib notes we’ve seen so far.