This Mind Reading AI Model Can Turn Your Thoughts Into Text So Think Happy Thoughts

Researchers have developed an artificial intelligence (AI) system that can, in effect, read the human mind. The group from the University of Texas at Austin trained a neural network to noninvasively decode functional magnetic resonance imaging (fMRI) signals from the human brain.

Innovation in AI has been accelerating rapidly as of late. While some are exploring how AI can improve how we work, others are looking into ways it can advance scientific research. Such is the case with a study from a group of scientists recently published in Nature Neuroscience. The paper describes how the group sidestepped the invasive methods currently needed to decode thoughts into words with a new artificial intelligence system called a semantic decoder.

"Here we introduce a noninvasive decoder that reconstructs continuous language from cortical semantic representations recorded using functional magnetic resonance imaging (fMRI)," the group explains in the opening of their research paper.

The researchers use a GPT language model, an early version simply called GPT-1, to interpret a person's brain activity from the blood-flow changes captured by fMRI. This gives the system access to what the person is "hearing, saying, or imagining."
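The general idea of pairing a language model with fMRI data can be sketched as a candidate-scoring loop: the language model proposes plausible word sequences, and each candidate is scored by how well a predicted brain response matches the measured one. The sketch below is purely illustrative and is not the authors' code; `propose_continuations` and `predict_brain_response` are toy stand-ins for GPT-1 and a trained fMRI encoding model.

```python
# Illustrative sketch of semantic decoding as candidate scoring.
# All functions here are hypothetical stand-ins, not the published method.
import numpy as np

def propose_continuations(prefix):
    """Stand-in for a language model suggesting possible next words."""
    return [prefix + [w] for w in ["drive", "license", "start"]]

def predict_brain_response(candidate):
    """Stand-in for an encoding model mapping a word sequence to a
    predicted fMRI activity vector (deterministic toy hash -> vector)."""
    seed = sum(ord(ch) for ch in " ".join(candidate))
    return np.random.default_rng(seed).normal(size=8)

def decode_step(prefix, observed_response):
    """Keep the candidate whose predicted brain response correlates
    best with the observed fMRI signal."""
    candidates = propose_continuations(prefix)
    scores = [np.corrcoef(predict_brain_response(c), observed_response)[0, 1]
              for c in candidates]
    return candidates[int(np.argmax(scores))]

# Pretend we measured the response evoked by "she drive":
observed = predict_brain_response(["she", "drive"])
print(decode_step(["she"], observed))  # → ['she', 'drive']
```

Because the model ranks whole word sequences by semantic fit rather than reading out individual words directly, paraphrases of the true stimulus can score well, which is consistent with the approximate translations described below.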

Perhaps the most exciting aspect of the new research is that the entire process is noninvasive, whereas other systems currently in use require a surgical implant. The group also points out that participants are not limited to words from a prescribed list.

"For a noninvasive method, this is a real leap forward compared to what's been done before, which is typically single words or short sentences," explained Alex Huth in a press release.

The translation is not perfect, however. In one example, a research participant listened to a recording of the sentence, "I don't have my driver's license yet." As GPT-1 processed the data, it decoded the sentence as, "She has not even started to learn to drive yet." While the wording differs, the model clearly captures the general meaning of what it is translating.

Researchers are hopeful that the new AI system can eventually help people who are conscious but unable to speak, for instance after a stroke, to communicate intelligibly once again. The system is not yet practical outside the laboratory, however, because it requires lengthy sessions in an fMRI machine. The group does think it could one day be used with a "more portable brain-imaging system, such as functional near-infrared spectroscopy (fNIRS)."