AI Learns to Spot COVID-19 in Lung Ultrasound Images

How does AI detect COVID-19 in lung ultrasound images?

Recent research reveals that artificial intelligence (AI) can identify COVID-19 in lung ultrasound images, similar to how facial recognition technology identifies faces in a crowd, as reported in Medical Xpress.

This advancement may mark a significant stride in AI-driven medical diagnostics, potentially enabling healthcare professionals to swiftly diagnose COVID-19 and other pulmonary ailments by employing algorithms that scrutinize ultrasound images for signs of disease.

AI Detects COVID-19 in Lung Ultrasound Images

The study marks the culmination of efforts initiated early in the pandemic to aid clinicians in rapidly evaluating a large number of patients, particularly in overwhelmed emergency rooms.

The study's senior author, Muyinatu Bell, emphasized the tool's value in emergency settings, where timely and accurate diagnoses are paramount.

"We developed this automated detection tool to help doctors in emergency settings with high caseloads of patients who need to be diagnosed quickly and accurately, such as in the earlier stages of the pandemic," Bell said in a statement.

"Potentially, we want to have wireless devices that patients can use at home to monitor progression of COVID-19, too."

Additionally, the researchers note that the tool holds promise for the development of wearables capable of tracking various illnesses, such as congestive heart failure, by monitoring fluid buildup in the lungs.

According to the researchers, this innovation could also advance point-of-care diagnostics, with potential applications in wearable ultrasound patches that provide real-time feedback to patients, indicating the need for medical intervention or medication adjustments.

Identifying B-lines

The AI system functions by analyzing ultrasound images of the lungs to identify specific features known as B-lines, which manifest as bright, vertical abnormalities indicative of inflammation in patients with pulmonary complications.
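To make the idea of B-line detection concrete, here is a minimal, hypothetical sketch of the kind of feature involved: bright vertical streaks show up as image columns that are noticeably brighter than the rest of the frame. This hand-coded heuristic is purely illustrative and is not the study's method, which relies on learned neural networks rather than fixed rules; the synthetic frame and threshold below are assumptions.

```python
# Illustrative only: a hand-coded heuristic for the kind of feature (B-lines)
# the learned model detects. The study's system uses deep neural networks,
# not this rule; the threshold and synthetic image here are assumptions.
import numpy as np

def bline_column_scores(image: np.ndarray) -> np.ndarray:
    """Score each column by how much brighter it is than the frame average.

    B-lines appear as bright vertical streaks, so columns containing one
    tend to have elevated mean intensity over most of the image depth.
    """
    col_means = image.mean(axis=0)      # average brightness per column
    frame_mean = image.mean()           # overall brightness of the frame
    return (col_means - frame_mean) / (image.std() + 1e-8)

def detect_blines(image: np.ndarray, z_threshold: float = 1.5) -> np.ndarray:
    """Return column indices whose brightness z-score exceeds the threshold."""
    return np.flatnonzero(bline_column_scores(image) > z_threshold)

if __name__ == "__main__":
    # Synthetic 128x128 "ultrasound" frame: speckle noise plus two bright vertical streaks.
    rng = np.random.default_rng(0)
    frame = rng.rayleigh(scale=0.3, size=(128, 128))
    frame[:, 40] += 1.0
    frame[:, 90] += 1.0
    print("Candidate B-line columns:", detect_blines(frame))
```

In practice, a trained network learns far subtler combinations of brightness, texture, and depth cues than this single-column rule can capture.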

The tool was developed by combining computer-generated images with real ultrasound scans, including scans from patients at Johns Hopkins.

Bell's team overcame initial challenges posed by the scarcity of patient data and limited understanding of COVID-19 manifestations by developing software capable of learning from a combination of real and simulated data.

The software employs deep neural networks, mimicking the interconnected neurons of the human brain to recognize patterns and detect abnormalities in ultrasound scans indicative of COVID-19 infection.

"Early in the pandemic, we didn't have enough ultrasound images of COVID-19 patients to develop and test our algorithms, and as a result our deep neural networks never reached peak performance," said first author Lingyi Zhao.

"Now, we are proving that with computer-generated datasets we still can achieve a high degree of accuracy in evaluating and detecting these COVID-19 features."

The research team's findings were published in the journal Communications Medicine.

