A survey has revealed that digital health applications are increasingly serving as de facto suicide prevention hotlines, with patients using these apps to relay their suicidal thoughts.

One of these apps is Ada, a leading medical app available in over 130 countries.

Health Aid From Ada

Ada is an AI-powered platform established in 2011 by Ada Health, a European company that delivers personalized health care to the public. It was built by a team of doctors, scientists, and engineers. The app is available in five languages, has over 6 million registered users, and was recognized as the best health start-up of 2018.

Powered by sophisticated AI technology, the Ada app features a personalized interactive chat. The app asks simple, relevant questions and compares the answers to thousands of similar cases to help the user find possible explanations for the symptoms experienced.

The app claims to support clinical decision-making and to enable patients and health care providers to deliver higher-quality, more effective medical care. It also has a growing medical library where users can search for information on common medical conditions.

Such digital apps are still figuring out how to best respond to users with suicidal thoughts.

According to a survey of digital health companies, the Ada app has completed 10 million health assessments since it was launched. Of those, about 130,000 involved users expressing suicidal thoughts or behaviors. Some patients even go so far as to relay suicidal thoughts through emails to the app's customer support team.

How Do Apps Respond?

How do digital health apps respond to patients' suicidal thoughts and behaviors?

Ada can go as far as advising a suicidal patient to call an ambulance or get emergency help from a hospital. The app may also suggest that the patient find someone to talk to.

In the case of American Well, a telemedicine company that connects patients with doctors over secure video, a concerned physician once called 911 after learning during a video visit that a patient had been punched by her spouse.

Most digital health companies reported that they follow a response plan to help patients expressing suicidal thoughts. Lower-risk patients are encouraged to call a crisis hotline or immediately contact a friend. Startups say that, if needed, they will call emergency medical services or the police when dealing with patients at higher risk of suicide.

Limitations Of Apps

While apps are trying to assess patients' situations, no one really knows the probability of a patient harming themselves.

"This has been a problem that people have been struggling with for a really long time — and there's just no science on this. People are largely winging it and using their clinical wisdom to try and figure out when and how to intervene," said Matthew Nock, a psychologist at Harvard University.

Nock said digital health companies must conduct research, evaluate their practices, and gather data on which responses work, and which do not, at different levels of suicide risk. He said that the decision to respond or not to respond carries risks either way.
