(Photo: YouTube/CBC News: The National) AI in hospitals to help doctors

When a patient is rushed to the hospital, doctors often ask them to rate their pain on a scale of 1 to 10.

However, pain tolerance is subjective, which makes it difficult for doctors to know whether one patient's pain is genuinely worse than another's.

Artificial intelligence in hospitals

According to a study published in Nature Medicine, researchers used AI techniques to analyze knee X-rays and predict patients' experienced pain, particularly in those suffering from osteoarthritis of the knee.

The study involved 36,369 observations gathered from 4,172 patients. The computer analysis could pick up things that a radiologist might not record.


Ziad Obermeyer, an assistant professor at Berkeley and co-author of the study, said that they did not train the algorithm to predict what the doctor was going to say about the knee X-ray. The algorithm was trained to predict what the patient was going to say about their own experience of pain in the knee.

Obermeyer said that the algorithm was able to explain more of the pain that people were feeling. This finding is important because doctors have been accused of discrimination when judging patients' pain.

Racial inequality

Studies have highlighted healthcare inequalities between black patients and white patients in the United States, according to AJPH.

Doctors are less likely to take some groups seriously when they say that they are in pain. Studies indicate that black patients are more likely to have their pain underestimated, which can affect their treatment and recovery.

One focus of the study was the question of why black patients report higher levels of pain. The study found that in arthritis cases that radiologists judged to be very similar, black patients reported more pain than white patients.

However, the algorithm indicated that the cases were less similar than they appeared. It took into account additional undiagnosed features that doctors relying on the commonly used radiographic grading systems would overlook.

Patients who reported severe pain and scored highly on the algorithm's own measure, but low on the official grading systems, were more likely to be black, which suggests that traditional diagnostics may be serving them poorly.

The study is getting attention because AI has been accused of being discriminatory too. This is usually because the datasets the algorithm was trained on suffered from accidental bias, according to PNAS.

The resulting algorithm would probably be less accurate when applied to a smaller group than to the group that makes up the majority of the country's population.

The charge is that AI systems often suffer from bias because they have learned to distinguish patterns in the features and habits of white people, and those patterns may not work as well when applied to people of other skin tones.

The use of artificial intelligence in healthcare is not meant to replace doctors; rather, it is meant to assist them, especially with tasks that are tedious or do not directly relate to patient care.


This article is owned by Tech Times

Written by Sieeka Khan

© 2021 All rights reserved. Do not reproduce without permission.