Semiconductor Tech Diagnoses Eye Disease Over the Internet
An imaging analysis technique developed to find defects in semiconductors is being used to diagnose the eye problems associated with diabetes over the internet.
Pictures of diabetic patients’ retinas, the inner surfaces of their eyes, are uploaded to a server that compares them to a database of thousands of images of healthy and diseased eyes. The algorithms assign a disease level to each new image by looking at the same factors, mainly damage to blood vessels, that an eye doctor would.
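To get a feel for that kind of comparison, here is a minimal Python sketch that grades a new retinal image by finding its nearest neighbors in a labeled library of image features. The features, the distance measure and the voting rule are illustrative assumptions, not the actual Oak Ridge algorithm.

```python
# Illustrative sketch only: grade a retinal image by comparing a small
# feature vector against a labeled reference library. The feature choices,
# nearest-neighbor vote and grade scale are assumptions for this example.
import numpy as np

def extract_features(image: np.ndarray) -> np.ndarray:
    """Summarize a grayscale retina image (pixel values 0-1) as a feature vector:
    overall brightness, contrast, and the fractions of unusually bright and
    dark pixels (rough stand-ins for lesion-related features)."""
    bright = (image > image.mean() + 2 * image.std()).mean()
    dark = (image < image.mean() - 2 * image.std()).mean()
    return np.array([image.mean(), image.std(), bright, dark])

def grade_image(image: np.ndarray,
                library_features: np.ndarray,  # shape (n_refs, n_features)
                library_grades: np.ndarray,    # shape (n_refs,), 0 = healthy ... 4 = severe
                k: int = 5) -> int:
    """Assign the new image the most common grade among its k nearest
    neighbors in the reference library."""
    query = extract_features(image)
    distances = np.linalg.norm(library_features - query, axis=1)
    nearest = np.argsort(distances)[:k]
    votes = library_grades[nearest].astype(int)
    return int(np.bincount(votes).argmax())
```

One appeal of a retrieval-style approach like this sketch is that the “training” is simply a growing library of graded images, so each new case the server handles can, in principle, be folded back into the database.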
Right now, ophthalmologist Edward Chaum of the University of Tennessee double-checks the system’s work, but he expects the algorithms to be diagnosing patients on their own within three months.
“At that point, the system becomes completely automated with just oversight from me,” Chaum said. “That’s unique. There isn’t anything like that going on anywhere in the world.”
Chaum’s work goes beyond telemedicine, in which physicians connect to patients through data networks, to automated medicine. There are huge advantages to the system: Chaum is expensive, while a bit of computer processing power is cheap. Also, like other telemedicine systems, it moves images over the internet, instead of patients through a health care network, which is easier for everyone involved. Patients get faster, cheaper care and doctors can spend their time treating patients that computers have already spotted as needing help. Increasing acceptance of these types of technologies could mean better medical care for people in areas of the country and world in which access to doctors is limited.
“We don’t want to manage the patients, we want to manage the images [of their eyes] and leverage the power of the connectivity of the internet and image analysis methods,” Chaum said. “We collect large numbers of images and manage that data and do the screening through data processing.”
More than 25 million Americans suffer from diabetes, which, if left untreated, can cause blindness, among other physical problems. The huge number of people who need to be screened for diabetes-linked eye problems has created a challenge that our health care system, with its relatively small number of ophthalmologists, is not well structured to meet. Because of the time and expense involved, only half the people who should be getting screened so they won’t go blind actually go in for tests. But new technology could help, reducing the cost and increasing the availability of screening for the eye problems that impair the vision of thousands of patients each year.
In the poor, rural areas of the Mississippi Delta where the special internet-linked retinal cameras are being installed, preventative care could be transformed for communities where diabetes affects up to 20 percent of the population.
“Basically, we’re putting these cameras in communities in which there are no eye doctors,” Chaum said. “Certainly, there are no retina specialists who can diagnose and refer those patients in a way that makes sense to get them in for the care that they need at the time that they need it.”
The project spun out of a chance visit by Chaum to the Oak Ridge National Laboratory in Tennessee. He listened to Ken Tobin, an engineer at the lab, who’d developed the image-processing ideas for the semiconductor industry. In that world, they’d used huge databases filled with images of defective products to help engineers spot similar types of failures.
As Tobin described his wafer defect work to the visiting Tennessee faculty, Chaum realized the same image-recognition system could be geared to find diseased eyes using his huge database of retinal pictures.
“As he was describing his methodology to me, it became very clear that what he was doing was exactly what I do as a physician when I’m examining a patient with diabetic retinopathy,” Chaum said. “I look for specific features that are present in that retina and I go into my own [mental] library — thousands and thousands of patients I’ve seen over the years — to say, ‘This is diabetic retinopathy of a certain level.’”
After several years of collaboration, Chaum has successfully transferred that knowledge from his brain into the server that does the calculations.
“The computer is a reflection of my perspective,” Chaum said.
Now, Tobin claims that the system correctly identifies between 90 and 98 percent of diabetic patients, grading each on a scale from healthy to severe disease.
“We’re looking for lesions. They are like the defects on a semiconductor device. White spots or dark spots,” Tobin said. “By finding those and knowing how many there are, and certain combinations of bright and dark lesions, we can tell not just whether they have the disease but how bad it is.”
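Tobin’s description maps onto a simple image-processing recipe: estimate the normal retinal background, flag pixels that are much brighter or darker than it, and count the resulting blobs. The sketch below, in Python with NumPy and SciPy, uses made-up thresholds and grade cutoffs to illustrate the idea; it is not the published Oak Ridge and University of Tennessee pipeline, which also weighs particular combinations of bright and dark lesions.

```python
# Illustrative sketch only: flag bright and dark "lesion" blobs in a
# grayscale retinal image and map their counts to a coarse severity grade.
# The thresholds and cutoffs are assumptions for this example.
import numpy as np
from scipy import ndimage

def count_lesions(image: np.ndarray, sigma: float = 10.0):
    """Return (bright_count, dark_count) for a grayscale image in [0, 1]."""
    background = ndimage.gaussian_filter(image, sigma)  # smooth estimate of the normal retina
    residual = image - background                       # lesions stand out from that background
    bright_mask = residual > 0.1                        # exudate-like bright spots (assumed cutoff)
    dark_mask = residual < -0.1                         # hemorrhage-like dark spots (assumed cutoff)
    _, n_bright = ndimage.label(bright_mask)            # count connected blobs of each kind
    _, n_dark = ndimage.label(dark_mask)
    return n_bright, n_dark

def severity_grade(n_bright: int, n_dark: int) -> str:
    """Map lesion counts to a coarse grade (illustrative cutoffs only)."""
    total = n_bright + n_dark
    if total == 0:
        return "healthy"
    if total < 5:
        return "mild"
    if total < 20:
        return "moderate"
    return "severe"
```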
The retinal images are particularly well-suited for analysis by computers. Tobin describes them as nearly two-dimensional with well-defined areas of light and dark. Other areas of the body are tougher. Mammograms and lung X-rays, for example, look at areas with more depth and less well-defined disease indicators.
“In a chest x-ray, you are looking for things that are sort of cloud-shaped amongst other cloud-shaped objects,” Tobin said. “It’s not really something where it’s at a point where it could replace an oncologist or radiologist.”
That’s why automated diagnosis faces an uphill battle for widespread acceptance in the health care industry. The presence of a doctor just seems necessary — and institutions are loath to take chances with a computer misdiagnosis when doctors do a generally adequate job.
It doesn’t help automated diagnosis that, as a review article on the use of computers in diagnosis describes, early missteps led many medical practitioners to write off the technique based on the outdated technology of previous decades. One doctor wrote, “We do not see much promise in the development of computer programs to simulate the decision-making of a physician.”
The other big hurdle is that insurance companies require a doctor’s sign-off for reimbursement. Practically, that’s a deal-breaker for most clinics.
Chaum and Tobin’s system could be groundbreaking as the first in-the-field test of an automated diagnostic system that the pair are confident will work. That could turn some heads in the medical field and get more doctors thinking about how to treat more patients for less money by using technology.
“What we’re trying to show is that at least in a screening environment, we can take the ophthalmologist out of the loop,” Tobin said.