Researchers at the University of South Australia have developed a computer vision system that uses a digital camera to detect a premature baby’s face in a hospital bed and remotely monitor its vital signs with accuracy comparable to an electrocardiogram machine.
Artificial intelligence-based software is widely used to recognize adult faces, but this is the first time researchers have built software that can accurately detect the face and skin of a preterm baby whose body is obscured by tubes and clothing and who is undergoing phototherapy.
An electrocardiogram (ECG) is frequently used alongside other tests to diagnose and monitor heart conditions. It can be used to investigate symptoms such as chest pain, palpitations (suddenly noticeable heartbeats), dizziness, and shortness of breath that may indicate a heart problem.
Using a digital camera, UniSA engineering researchers and a neonatal critical care specialist monitored the heart and respiratory rates of seven infants in the Neonatal Intensive Care Unit (NICU) at Flinders Medical Centre in Adelaide.
“Babies in neonatal intensive care can be extra difficult for computers to recognize because their faces and bodies are obscured by tubes and other medical equipment,” says UniSA Professor Javaan Chahl, one of the lead researchers.
“Many premature babies are being treated with phototherapy for jaundice, so they are under bright blue lights, which also makes it challenging for computer vision systems.”
The ‘baby detector’ was built from a dataset of videos of newborns filmed in the NICU, so that it could reliably locate their faces and exposed skin. The vital-sign readings closely matched those of an electrocardiogram and, in some cases, appeared to outperform the conventional electrodes, demonstrating the feasibility of non-contact monitoring of preterm infants in intensive care.
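The researchers have not published the detector’s architecture or training code. As a rough illustration of how a face-and-skin detector might be fine-tuned on NICU footage of this kind, here is a minimal PyTorch sketch in which the class labels, dataset format, and hyperparameters are assumptions rather than details from the study:

```python
# Hypothetical sketch: fine-tuning an off-the-shelf detector on NICU video frames.
# The study's actual architecture and data are not public; the classes, dataset,
# and hyperparameters below are illustrative only.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 3  # assumed: background, "face", "exposed skin"

def build_baby_detector():
    # Start from a detector pretrained on everyday images, then replace the
    # classification head so it predicts the neonatal classes instead.
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)
    return model

def train_one_epoch(model, loader, optimizer, device="cpu"):
    # loader is assumed to yield video frames plus annotated bounding boxes.
    model.train()
    for images, targets in loader:
        images = [img.to(device) for img in images]
        targets = [{k: v.to(device) for k, v in t.items()} for t in targets]
        loss_dict = model(images, targets)   # detection + classification losses
        loss = sum(loss_dict.values())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```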
The research is part of a larger UniSA initiative to replace contact-based electrical sensors with non-contact video cameras, reducing the skin tearing and infections caused by adhesive pads on newborns’ fragile skin.
Infants were filmed at close range with high-resolution cameras, and vital physiological data was extracted using advanced signal-processing algorithms that detect subtle color changes produced by each heartbeat and tiny body movements, neither of which is visible to the naked eye.
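The article does not spell out the algorithm, but camera-based heart-rate estimation of this kind typically averages the color of the detected skin region frame by frame and looks for a periodic component at plausible pulse frequencies. The sketch below illustrates that general idea; the frame rate, frequency band, and input format are assumptions, not details from the study:

```python
# Illustrative sketch of photoplethysmography-style heart-rate estimation from
# video: average the green channel over a detected skin region in each frame,
# band-pass the resulting signal around plausible pulse rates, and report the
# dominant frequency. Frame rate, band limits, and ROI format are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

FPS = 30.0                   # assumed camera frame rate
LOW_HZ, HIGH_HZ = 1.5, 4.0   # roughly 90-240 bpm, a plausible neonatal range

def heart_rate_bpm(skin_frames):
    """skin_frames: array of shape (n_frames, h, w, 3) with RGB skin crops."""
    # Mean green-channel intensity per frame; blood volume changes modulate it.
    signal = skin_frames[..., 1].mean(axis=(1, 2))
    signal = signal - signal.mean()

    # Band-pass filter to keep only frequencies consistent with a heartbeat.
    b, a = butter(3, [LOW_HZ / (FPS / 2), HIGH_HZ / (FPS / 2)], btype="band")
    filtered = filtfilt(b, a, signal)

    # The dominant frequency of the filtered signal gives the pulse rate.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / FPS)
    return freqs[np.argmax(spectrum)] * 60.0
```

Respiratory rate can be estimated in a similar way by tracking the slower periodic motion of the chest and abdomen across frames rather than color changes.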
Kim Gibson, a neonatal critical care specialist at the University of South Australia, says that using neural networks to detect babies’ faces is a significant advance for non-contact monitoring.
“It is extremely difficult to get clear videos of preterm babies in the NICU. With so many obstructions and changing lighting conditions, obtaining accurate results can be challenging, but the detection model has exceeded our expectations.”
“Worldwide, more than 10 percent of babies are born prematurely and due to their vulnerability, their vital signs need to be monitored continuously. Traditionally, this has been done with adhesive electrodes placed on the skin that can be problematic, and we believe non-contact monitoring is the way forward,” Gibson says.
Professor Chahl believes the findings are especially relevant in light of the COVID-19 pandemic and the need for physical distancing. In 2020, the UniSA team developed world-first technology that reads adults’ vital signs to screen for COVID-19 symptoms, and it is now used in commercial products sold by Draganfly in North America.