AI Processes Brain Electrical Activity to Diagnose Depression, Mind-control Gadgets, and More

Researchers from Skoltech, HSE University, and the RAS Institute of Higher Nervous Activity and Neurophysiology have developed a toolbox and an online crowdsourcing platform for analyzing electroencephalography data. The automated solution identifies meaningful components in EEG signals faster and more consistently than human experts. As more researchers and clinicians contribute to the platform, its creators expect the algorithms to become progressively more accurate. They envisage the platform as a hub for the community of medical scientists and geeks spanning sleep studies, stroke rehabilitation, epilepsy diagnostics, brain-computer interfaces, and more. The paper came out in Frontiers in Neuroinformatics.

Spatial distribution of electrical activity on the scalp. The left and right pictures show how two different components extracted from the same EEG signal might look. Image Credit: Gurgen Soghoyan et al./Frontiers in Neuroinformatics.

Electroencephalography is a technique that noninvasively measures the electrical activity of the brain via electrodes positioned on the scalp (read about an EEG device recently developed at Skoltech). The resulting brain wave readings are used to study sleep, diagnose coma and epilepsy patients, enable users to mentally interact with gadgets, and help people recover from a stroke or other conditions that impair normal brain activity.

EEG is cheap and noninvasive, but the recorded signals are very noisy compared with those detected by implanted electrodes. Since the sensors are positioned on the scalp, each of them picks up the sum of the electrical activity of many neurons, and the signal gets distorted by propagating through bone, skin, and other tissue. Moreover, an EEG may contain unwanted electrical activity, including sources that are either close to the brain (such as eye blinking) or simply strong (heartbeat), and even the electrical current powering the diagnostic equipment itself.

“So there are two problems. First, the signal we record is messy and we have to subtract whatever is not meant to be there: the effects of breathing, head movements, sweating, and so on. Second, there is so much going on in the brain at any given time that even a ‘clean’ signal is actually a combination of many signals representing different cognitive processes. And depending on what the EEG is used for, one might need to zero in on very specific signal components, such as the motor activity responsible for limb movements,” principal investigator Maxim Sharaev, a senior research scientist at Skoltech, commented.

In a real-life setting, a noisy EEG is a mixture of several independent noise sources and genuine brain activity components. Preprocessing typically relies on an experienced physician recognizing the respective signal contributions, a tedious and fairly subjective analysis.

“We automated this process and made it more consistent using machine learning. Now there’s an algorithm trained on hundreds of EEG recordings marked up by multiple human experts. It can denoise the signal and recognize specific signal components,” Sharaev said.
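The unmixing step Sharaev describes is commonly performed with independent component analysis (ICA), which separates a multichannel recording into statistically independent components that can then be labeled as brain activity or artifact. As an illustrative sketch only, not the authors' actual pipeline, the example below simulates three mixed sources and unmixes them with scikit-learn's FastICA; all signals and parameters are invented for the demo:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
fs = 250                      # sampling rate, Hz (assumed for the demo)
t = np.arange(0, 4, 1 / fs)   # 4 seconds of data

# Simulate three sources: a 10 Hz "brain" rhythm, a spiky eye-blink
# artifact, and background noise
brain = np.sin(2 * np.pi * 10 * t)
blinks = np.abs(np.sin(2 * np.pi * 0.5 * t)) ** 8
noise = 0.3 * rng.standard_normal(t.size)
sources = np.c_[brain, blinks, noise]

# Mix them linearly, roughly as scalp electrodes would record them
mixing = rng.standard_normal((3, 3))
eeg = sources @ mixing.T      # shape: (samples, channels)

# Unmix with ICA; components come back in arbitrary order and scale,
# which is why a trained classifier (or a human expert) must label them
ica = FastICA(n_components=3, random_state=0, whiten="unit-variance")
components = ica.fit_transform(eeg)
print(components.shape)       # (1000, 3)
```

In a real pipeline the recovered components would be fed to the trained model, which assigns each one a label such as "brain activity", "eye blink", or "noise", replacing the manual inspection step.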

He also noted that while similar attempts have been made before, the new solution has an important advantage: it is backed by a dedicated crowdsourcing platform. Other experts can upload their own EEGs, recorded with different equipment and from other patients, and reassess the original scans. This means the platform has the potential to become a major hub for EEG analysis, and as it attracts more specialists, the AI will produce ever better results.

As more and more data are accumulated, it is conceivable that EEG will at some point become a valid way to diagnose not just the more obvious disorders such as epilepsy — where abnormal brain activity is readily recognizable — but other, finer conditions, such as major depressive disorder, schizophrenia, or autism. “As of today, this area of research is only just emerging and is not part of clinical practice. We are working on this, too,” Sharaev added.

Another major application has to do with brain-computer interfaces. This refers to the technology that converts brain signals picked up by EEG into commands for external or implanted devices, either to make up for lost functions in the body or for plain fun. The commands could range from moving the arm of an exoskeleton worn by a paralyzed person to turning on the TV. “For example, we showed in this paper that the algorithm can identify so-called mu waves, the signal component responsible for voluntary body movements,” the scientist noted.
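Mu waves occupy roughly the 8–13 Hz band over the motor cortex, so a simple way to flag a candidate mu component is to check whether its spectral power concentrates in that band. The sketch below is a hedged illustration of that idea using SciPy's Welch spectral estimate on a simulated signal; the signal, sampling rate, and band edges are assumptions for the example, not the paper's method:

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(1)
fs = 250                      # sampling rate, Hz (assumed)
t = np.arange(0, 4, 1 / fs)

# Simulated ICA component: a ~10 Hz mu-like rhythm plus background noise
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# Estimate the power spectral density with Welch's method
freqs, psd = welch(signal, fs=fs, nperseg=fs)

# Compare mean power inside the mu band (8-13 Hz) with the rest of
# the spectrum; a strong mu component shows a pronounced peak there
mu_band = (freqs >= 8) & (freqs <= 13)
mu_power = psd[mu_band].mean()
baseline = psd[~mu_band].mean()
print(mu_power / baseline)    # ratio well above 1 for a mu-like signal
```

A classifier like the one in the paper would use richer features than a single band-power ratio, but this captures the basic signature that makes mu components recognizable.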

A related but distinct application is post-stroke rehabilitation. This involves training a patient to generate activity in a certain brain region by repeatedly making a mental effort and getting visual feedback on the screen.

Fundamental studies of human cognitive faculties also use EEG data. That includes sleep research and experiments with people performing cognitive tasks while their brain activity is monitored to detect regions of the brain involved in particular cognitive processes.

“We see this project as an important platform for collaboration,” Sharaev stressed. “At Skoltech, this means the collaboration between our own Research Center for Applied AI and Carbon Footprint Reduction and Vladimir Zelman Center for Neurobiology and Brain Rehabilitation. Beyond Skoltech, hopefully this will grow into a major community around EEG studies and applications.”
