
Artificial Intelligence Used to Identify Critical Patterns from Video Footage of Baby Movements

Subtle traits in the spontaneous movements of very young babies may reveal clinically critical aspects of their neurodevelopment. Visual assessment of common movement patterns (General Movements, GM) by a trained clinical professional is known to be effective in the early detection of, for example, cerebral palsy (CP).

A three-month-old infant shows frequently occurring stereotypical, dance-like movements throughout the body and limbs. A noted absence of these movements is highly predictive of the later emergence of CP.

Sampsa Vanhatalo, Professor of Clinical Neurophysiology, University of Helsinki.

Very early identification and follow-up therapeutic intervention would be highly advantageous for lessening the neurodevelopmental impact of CP. At present, CP is diagnosed much later, usually between 6 months and 2 years of age. GM analysis holds potential for the early detection of CP; however, it requires special expertise that is presently acquired through international training courses, which effectively restricts the number of therapists and doctors with the applicable skills. Furthermore, GM analysis in its current form is built around visual assessment, which is inherently subjective.

“There is an urgent need for objective and automated methods. They would allow employing movement analysis at a much wider scale, and make it accessible to basically most, if not all, children in the world,” says Vanhatalo.

The stick man reveals the essentials

Scientists at the University of Helsinki and the University of Pisa set out to investigate whether a conventional video recording of a baby lying in bed could be converted into a quantified analysis of the baby’s movements. They partnered with Neuro Event Labs, an AI company based in Tampere, whose engineers developed a technique for accurately extracting the infants’ movements (using a method called pose estimation), enabling the construction of a simplified “stick man” (skeleton) video.

Then, the scientists delivered the stick figure videos to doctors with GM proficiency to see whether diagnostically vital information was well maintained in those videos.

Using just the stick figure videos, the doctors were able to assign diagnostic groups in 95% of cases, establishing that the clinically vital information had been preserved.

The researchers demonstrated that an automated algorithm can extract clinically crucial movement patterns from standard video recordings. The stick figure extractions can be used directly for quantitative analyses. To demonstrate this potential, the team presented a proof-of-concept analysis in which basic measures of stick figure movement revealed distinct differences between groups of babies with abnormal and normal movements.

Application of the stick figure videos also allows global sharing among research communities without privacy issues. This has been a major bottleneck in the development of multinational research activities within this field.

This will finally enable a genuinely Big Data kind of development for better quantitative movement analyses in infants. Since this study, we have collected larger datasets, including 3D video recordings, and we are currently developing an AI-based method for assessing infantile motor maturity. The rationale is straightforward: there is a developmental issue with the child if the computational assessment of motor maturity does not match the child’s true age.

Sampsa Vanhatalo, Professor of Clinical Neurophysiology, University of Helsinki.

Movement analysis reveals neurodevelopment and the effectiveness of therapeutic interventions

Besides early CP detection, automated movement analyses have a number of potential applications in the assessment of infant neurological development. “We could create one kind of functional growth chart,” says Vanhatalo.

Movement analyses could also be applied in various ways to support therapeutic decisions. Such approaches could offer quantitative means to objectively measure the effectiveness of different therapeutic strategies, one of the universal hot topics in rehabilitative medicine. Automated movement analyses could also enable out-of-hospital screening to identify children who need additional care, or to offer reassurance when a child’s development is normal.

The use of machine learning and artificial intelligence allows the extraction of substantial amounts of clinically useful information from a simple home-grade video recording. The ultimate aim is to find methods that make it possible to provide consistently high-quality infant healthcare everywhere in the world.

Sampsa Vanhatalo, Professor of Clinical Neurophysiology, University of Helsinki.

The research was a partnership between scientists from the University of Helsinki, Helsinki University Hospital, the University of Pisa, Scuola Superiore Sant’Anna and the IRCCS Stella Maris Foundation in Pisa, the Istituto Superiore di Sanità in Rome, and Neuro Event Labs Ltd in Tampere. The research was supported by the Arvo and Lea Ylppö Foundation, the Finnish Pediatric Foundation, and the Academy of Finland.

Pose estimation of an infant’s spontaneous movements

This video shows an example of a stick man (skeleton) video, created automatically using the novel pose estimation method. Quantitative analysis of movements is directly accessible from the trajectories of the different body parts in the stick man. (Credit: University of Helsinki)
