New Technique Based on Artificial Intelligence Addresses a Critical Roadblock in Neuron Analysis

A new automated process developed by biomedical engineers at Duke University is capable of tracing the shapes of active neurons as precisely as human researchers but in just a fraction of the time.

This video from two-photon imaging shows neurons firing in a mouse brain. Recordings like this enable researchers to track which neurons are firing, and how they potentially correspond to different behaviors. (Video credit: Yiyang Gong, Duke University)

The new method uses artificial intelligence to interpret video images, addressing a major bottleneck in neuron analysis and allowing scientists to rapidly gather and process neuronal signals for real-time behavioral studies. The study was recently published in the Proceedings of the National Academy of Sciences.

To measure neural activity, scientists typically use a technique called two-photon calcium imaging, which lets them record the activity of individual neurons in the brains of live animals. These recordings allow researchers to track which neurons are firing and how they potentially correspond to different behaviors.

While these measurements are useful for behavioral studies, identifying individual neurons in the recordings is a painstaking process. Currently, the most accurate method requires a human analyst to circle every "spark" they see in the recording, often stopping and rewinding the video until the targeted neurons are identified and saved. To complicate matters further, researchers are often interested in identifying only a small subset of active neurons that overlap in different layers within the thousands of neurons that are imaged.

This process, called segmentation, is slow and fiddly. Segmenting the neurons in a 30-minute video recording can take a researcher anywhere from four to 24 hours, and that assumes they are fully focused for the duration and don't take breaks to sleep, eat, or use the bathroom.

In contrast, a new open-source automated algorithm developed by image processing and neuroscience researchers in Duke's Department of Biomedical Engineering can accurately identify and segment neurons in a fraction of that time.

As a critical step towards complete mapping of brain activity, we were tasked with the formidable challenge of developing a fast automated algorithm that is as accurate as humans for segmenting a variety of active neurons imaged under different experimental settings.

Sina Farsiu, the Paul Ruffin Scarborough Associate Professor of Engineering, Department of Biomedical Engineering, Duke University.

The data analysis bottleneck has existed in neuroscience for a long time—data analysts have spent hours and hours processing minutes of data, but this algorithm can process a 30-minute video in 20 to 30 minutes. We were also able to generalize its performance, so it can operate equally well if we need to segment neurons from another layer of the brain with different neuron size or densities.

Yiyang Gong, Assistant Professor, Department of Biomedical Engineering, Duke University.

Our deep learning-based algorithm is fast, and is demonstrated to be as accurate as (if not better than) human experts in segmenting active and overlapping neurons from two-photon microscopy recordings.

Somayyeh Soltanian-Zadeh, PhD Student and Study First Author, Department of Biomedical Engineering, Duke University.

Deep-learning algorithms allow scientists to rapidly process large amounts of data by passing it through multiple layers of nonlinear processing units, which can be trained to identify different parts of a complex image.
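As a loose illustration of that idea only, and not the authors' published network, the Python sketch below (assuming the PyTorch library; the layer counts and sizes are arbitrary) stacks a few nonlinear processing units that turn a single imaging frame into a per-pixel score for whether each pixel belongs to a neuron:

```python
# Minimal sketch of "layers of nonlinear processing units": convolutions
# followed by nonlinear activations. Illustrative only; not the published model.
import torch
import torch.nn as nn

layers = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),   # first processing layer
    nn.ReLU(),                                    # nonlinearity
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # deeper layer: more complex features
    nn.ReLU(),
    nn.Conv2d(32, 1, kernel_size=1),              # map features to a per-pixel score
    nn.Sigmoid(),                                 # probability a pixel belongs to a neuron
)

frame = torch.rand(1, 1, 256, 256)                # one imaging frame (batch, channel, height, width)
neuron_probability_map = layers(frame)
print(neuron_probability_map.shape)               # torch.Size([1, 1, 256, 256])
```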

Within this framework, the researchers built an algorithm that processes both spatial and temporal information in the input videos. They then "trained" the algorithm to mimic the segmentation of a human analyst while improving its accuracy.
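The sketch below is a hedged illustration of those two ingredients rather than the published architecture: 3D convolutions process a short video clip jointly in time and space, and a standard supervised loss nudges the output towards a human analyst's mask. The class name, shapes, layer choices, and loss here are all illustrative assumptions.

```python
# Illustrative spatiotemporal segmenter trained against human-drawn masks.
# Not the authors' network; every architectural detail is an assumption.
import torch
import torch.nn as nn

class SpatiotemporalSegmenter(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=(5, 3, 3), padding=(2, 1, 1)),  # (time, y, x)
            nn.ReLU(),
            nn.Conv3d(8, 8, kernel_size=(5, 3, 3), padding=(2, 1, 1)),
            nn.ReLU(),
        )
        self.head = nn.Conv3d(8, 1, kernel_size=1)   # per-voxel neuron score

    def forward(self, video):                        # video: (batch, 1, frames, H, W)
        scores = self.head(self.features(video))
        return scores.mean(dim=2)                    # collapse time into spatial mask logits

model = SpatiotemporalSegmenter()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()                     # compare output to the analyst's annotation

video_clip = torch.rand(1, 1, 32, 128, 128)          # 32 frames of a 128x128 recording (synthetic)
human_mask = (torch.rand(1, 1, 128, 128) > 0.95).float()  # stand-in for a human-drawn mask

logits = model(video_clip)
loss = loss_fn(logits, human_mask)                   # penalize disagreement with the analyst
loss.backward()
optimizer.step()
print(float(loss))
```

In a real training loop, many annotated clips would be iterated over and the predicted masks compared against the analyst's labels each step; the single forward/backward pass above is only meant to show the shape of that supervision.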

These advances are a major step towards allowing neuroscientists to monitor neural activity in real time. Given the tool's broad usefulness, the researchers have made their software and annotated dataset freely available online.

Gong is already using the new method to more closely study the neural activity associated with different behaviors in mice. He says a better understanding of which types of neurons fire for which activities could eventually enable researchers to manipulate brain activity to modify behavior.

This improved performance in active neuron detection should provide more information about the neural network and behavioral states, and open the door for accelerated progress in neuroscience experiments.

Somayyeh Soltanian-Zadeh, PhD Student and Study First Author, Department of Biomedical Engineering, Duke University.

The study was supported by a National Institutes of Health Medical Imaging Training Program pre-doctoral fellowship (T32-EB001040 and P30-EY005722) and the National Science Foundation BRAIN Initiative (NCS-FO 1533598). Sina Farsiu is supported by a Google Faculty Research Award, and Yiyang Gong is supported by a Beckman Young Investigator Award.
