With technology becoming ever more advanced, even mind-reading now seems within reach.

Scientists from the University of Washington have developed a computer program that can peer into human thoughts and decode them in real time.

The researchers say they can predict what a person is thinking just by analyzing the electrical signals recorded by electrodes implanted in the brain.

The discovery may have significant applications to studies of neurological health including memory, motor function and disorders such as epilepsy.

"The computational tools that we developed can be applied to studies of motor function, studies of epilepsy, studies of memory," says co-author and neurosurgeon Jeff Ojemann.

Co-author Rajesh Rao explains the two goals of the study. First, the team wants to understand how the brain perceives objects. Second, they want to determine whether a computer program can extract and predict, in real time, what a person is seeing.

The experiment involved seven patients diagnosed with epilepsy whose seizures are not relieved by medication. Surgeons temporarily implanted electrodes in a brain region called the temporal lobe to locate the focal points of the patients' seizures.

Temporal lobes are the regions of the brain where most epileptic seizures occur. These lobes are located behind the eyes and ears and are also linked to dementias. Temporal lobes are also considered vulnerable sites for head traumas, Ojemann says.

The researchers asked the participants to watch a series of images of houses, faces and blank screens that flickered briefly across a computer screen. Their task was to signal whenever an upside-down house appeared.

During the experiment, a computer program analyzed the signals coming from the participants' brains, sampling them 1,000 times per second.

During the first two-thirds of the experiment, the program learned what the brain signals looked like when a participant was viewing a house, a face or a blank screen.

In the last third of the experiment, the computer predicted what each person was actually seeing. Most remarkably, it did so with 96 percent accuracy and within about 20 milliseconds of the image appearing.
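To give a general sense of how such a train-then-predict workflow can be set up (the study's actual decoder is more sophisticated and is not reproduced here), the following minimal sketch trains a standard classifier on the first two-thirds of labeled signal epochs and evaluates its predictions on the final third. The synthetic data, array sizes and the choice of scikit-learn's LinearDiscriminantAnalysis are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: train on the first two-thirds of labeled epochs,
# then predict the stimulus class on the final third. Data are synthetic.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

n_epochs, n_features = 300, 64          # assumed sizes, for illustration
labels = rng.integers(0, 3, n_epochs)   # 0 = house, 1 = face, 2 = blank screen

# Synthetic "brain signal" features whose mean shifts with the stimulus class.
X = rng.normal(size=(n_epochs, n_features)) + labels[:, None] * 0.5

split = (2 * n_epochs) // 3             # first two-thirds used for training
clf = LinearDiscriminantAnalysis()
clf.fit(X[:split], labels[:split])

# Predict the stimulus seen during the held-out final third of epochs.
predictions = clf.predict(X[split:])
accuracy = (predictions == labels[split:]).mean()
print(f"Held-out accuracy: {accuracy:.2%}")
```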

The researchers found that the program needs two kinds of brain signals to decode what a person is seeing. The first is the event-related potential, a response triggered immediately after an image appears. The second is the broadband spectral change, which reflects processing that persists after the initial surge of information.
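As a rough illustration of how these two signal types might be turned into numbers a decoder can use, the sketch below computes an event-related-potential-style feature (the average amplitude in an early post-stimulus window) and a broadband-style feature (high-frequency spectral power) from a single synthetic epoch. The 1,000-samples-per-second rate matches the article; the window boundaries, the 70 Hz cutoff and the use of NumPy's FFT are illustrative assumptions.

```python
# Illustrative sketch: derive two kinds of features from one signal epoch,
# loosely mirroring the event-related potential (ERP) and broadband
# spectral change described above. All parameter choices are assumptions.
import numpy as np

fs = 1000                                  # samples per second, as in the article
rng = np.random.default_rng(1)
epoch = rng.normal(size=fs)                # one second of synthetic signal

# ERP-style feature: mean amplitude in an early post-stimulus window
# (here 50-250 ms after image onset, an assumed window).
erp_feature = epoch[int(0.05 * fs):int(0.25 * fs)].mean()

# Broadband-style feature: total spectral power above an assumed 70 Hz cutoff,
# capturing sustained high-frequency activity after the initial response.
spectrum = np.abs(np.fft.rfft(epoch)) ** 2
freqs = np.fft.rfftfreq(epoch.size, d=1 / fs)
broadband_feature = spectrum[freqs >= 70].sum()

print(erp_feature, broadband_feature)
```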

Past research typically examined single neurons. The present study instead captured how a larger network of neurons behaves when an individual sees and perceives complex visual material.

The study was published in PLOS Computational Biology.

Photo: Keoni Cabral | Flickr
