Artificial intelligence is increasingly becoming the focus of societal debate and discussion following the emergence last November of ChatGPT, a "chatbot," or intelligent, trainable software application capable of conducting an online conversation without the participation of a human. It lets you pose questions or comments in ordinary language and then responds conversationally, drawing on vast amounts of information from the internet.
If it occurs to you that this is eerily like HAL 9000 — the fictional computer in the 1968 film 2001: A Space Odyssey that was capable of speech, speech recognition, facial recognition, lip-reading, interpreting and expressing emotions, and playing chess — we're thinking the same thing.
ChatGPT follows closely on revelations about AI technology that can create "deep fakes": highly realistic, simulated photographic and videographic creations that are difficult to distinguish from the actual people whom they portray and can quite literally put words in their mouths.
Consider an example from earlier this month that was spawned by the possibility that a New York City grand jury would indict former President Donald Trump and that he would be arrested. Eliot Higgins, the founder of an open-source investigative journalism group, used an AI generator to visualize Trump's as-yet imaginary arrest. By providing it with simple prompts such as "Donald Trump being chased by the police," he obtained images that are quite extraordinary for their realism and emotional impact.
Within a couple of days, Higgins's posts depicting an imaginary event had been viewed millions of times, illustrating the sophistication and mischief of AI-generated images and the ease with which they can be promulgated.
We are also subjected regularly and unwittingly to the influence of internet "bots," software algorithms that represent themselves as human on social media or when accessing websites, in order to achieve some ideological, political, commercial, or criminal purpose.
As scientifically astonishing as these technologies may be, there is a very dark side to them, which will become increasingly clear as they merge and interact. They will enable the creation of what amounts to digital synthetic people under the control, at least initially, of the developers who created the algorithms, though their personae may turn out to be somewhat unpredictable as the algorithms "learn" from new data fed to them.
Related to that unpredictability is the possibility that the chatbots will be able to independently select sources of information from the internet. Will they glean medical and scientific information from the Mayo Clinic and NIH or the Church of Scientology?
It is not difficult to conceive of a coming Age of Media Deception, in which media outlets' willingness to suppress unwelcome information and to highlight preferred narratives, according to their biases, crosses the line into the manipulation of reality. The notion of objective journalism, if not already dead, is certainly in intensive care as practitioners see deadlines shrink from days to minutes or seconds and become increasingly beholden to click counts and eyeballs.
These powerful applications can be benign, such as 60 Minutes showing fabricated posthumous "interviews" of Holocaust survivors, but, of course, in our Age of Misinformation and Disinformation, they'll be widely used for nefarious purposes as well. There is also little to stop deep fakes with content generated by GPTs from popping up from obscure or even anonymous sources and going viral, just as false rumors do now.
This is the dystopian underbelly of AI, George Orwell's 1984 arriving several decades late. Welcome to the coming media hellscape.
Andrew I. Fillat spent his career in technology venture capital and information technology companies. He is also the co-inventor of relational databases. Henry I. Miller, a physician and molecular biologist, is the Glenn Swogger distinguished fellow at the American Council on Science and Health. They were undergraduates together at M.I.T.