Google's DeepMind and the UK's National Health Service (NHS) have recently teamed up to help doctors detect early signs of eye diseases.
DeepMind, Google's AI division based in the United Kingdom, will put machine learning to work analyzing more than 1 million anonymized eye scans. The goal is to give researchers algorithms able to detect subtle early warning signs that physicians might miss during diagnosis.
This is not the first time DeepMind has worked with the NHS. It is, however, the first time the partnership will tap into the potential of artificial intelligence.
The project will focus on two eye conditions: diabetic retinopathy and wet age-related macular degeneration. The former is widely considered the leading cause of blindness around the globe.
"There's so much at stake, particularly with diabetic retinopathy," says Mustafa Suleyman, the co-founder of DeepMind.
He goes on to add that the risk of going blind is 25 times higher for people with diabetes, and notes that early detection could prevent 98 percent of cases.
The partnership between DeepMind and Moorfields Eye Hospital (MEH) began when Pearse Keane, a consultant ophthalmologist at Moorfields, saw the potential of machine learning for image recognition.
Keane thought the company's machine-learning expertise would come in handy for analyzing a type of eye scan known as optical coherence tomography (OCT).
He praised the openness and responsiveness of Suleyman, who agreed to jump in on the research project.
Moorfields is set to provide anonymized OCT data, which should make the project less controversial than a previous one. For that earlier project, dubbed Streams, DeepMind worked with the Royal Free Hospital, which handed over the full care histories of some 1.6 million patients for AI analysis.
At the time, some chastised Google for bypassing the standard authorization for handling personal data. Regardless, both the Royal Free Hospital and DeepMind maintained that their agreement was within legal boundaries. One result of that project, the Streams app, is still undergoing testing.
Regarding the eye scans from Moorfields, Google says the data is delivered in such a way that nobody can "identify any individual patients" from it.
The scans are also historical, meaning that while the findings of the research may contribute to better care in the future, the care current patients receive will remain unchanged.
Peng Tee Khaw, the leader of Moorfields' ophthalmology research center, explains that the surging number of precise retinal scans will make all the difference.
"These scans are incredibly detailed, more detailed than any other scan of the body we do: we can see at the cellular level," Peng Tee Khaw notes.
When doctors must sift through such a huge amount of data on their own, time becomes a scarce commodity. Deep learning should come in handy for filtering and categorizing the scans, saving precious time.
"In many cases, once sight is lost it cannot be restored, so earlier detection that leads to rapid treatment will be hugely beneficial," notes Clara Eaglen, the eye health campaigns manager for the Royal National Institute of Blind People.