Ayyy-EYE! Google code 'predicts heart disease' by eyeballing retinas

Eye see what you did there, machine-learning boffins


AI researchers at Google have developed algorithms that can assess the risk of heart attacks by analyzing retinal scans.

By looking for common patterns in the retinal scans and matching them up with data in the patients’ medical records, one algorithm could determine whether someone was a smoker or a non-smoker with 71 per cent accuracy. Another, focused on the blood vessels in the eye, could tell whether someone had severely high blood pressure, a condition associated with an increased risk of stroke.
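Google hasn't bundled its training code with the article, but in broad strokes this is a standard image-classification setup: feed the network a retinal photo, ask it for a label lifted from the patient's record. The sketch below is a minimal illustration of that idea in Keras; the Inception-style backbone, 299×299 input size, and training settings are assumptions made for the sake of the example, not the team's confirmed configuration.

    # Minimal sketch of an image-to-medical-record classifier (assumed setup,
    # not the paper's published architecture or hyperparameters).
    import tensorflow as tf

    IMG_SIZE = (299, 299)  # assumed input resolution

    def build_smoking_classifier() -> tf.keras.Model:
        # Inception-style convolutional backbone with a single sigmoid output
        # for a binary label taken from the medical record (smoker = 1).
        backbone = tf.keras.applications.InceptionV3(
            include_top=False, weights=None,
            input_shape=IMG_SIZE + (3,), pooling="avg",
        )
        risk = tf.keras.layers.Dense(1, activation="sigmoid")(backbone.output)
        model = tf.keras.Model(backbone.input, risk)
        model.compile(optimizer="adam",
                      loss="binary_crossentropy",
                      metrics=[tf.keras.metrics.AUC(name="auc")])
        return model

    # Training pairs each scan with its record label, e.g.:
    # model = build_smoking_classifier()
    # model.fit(retina_images, smoker_labels, validation_split=0.1, epochs=10)

Swap the label column (hypertension, age bracket, and so on) and the same recipe gives the other predictors the paper describes.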

Their models can also predict other factors such as age, gender, and the chance of a heart attack or stroke, the boffins claim in a paper published in Nature Biomedical Engineering journal on Monday.

“Given the retinal image of one patient who (up to 5 years) later experienced a major [cardiovascular] event (such as a heart attack) and the image of another patient who did not, our algorithm could pick out the patient who had the cardiovascular event 70% of the time,” Lily Peng, a product manager at Google Brain, explained in a blog post this week.
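That 70 per cent figure is the usual pairwise reading of a model's AUC, or c-statistic: hand the model one patient who went on to have an event and one who didn't, and it's the fraction of such pairs in which the event patient gets the higher risk score. A toy calculation with made-up scores shows the arithmetic:

    # Pairwise reading of the 70% figure (the AUC / c-statistic): how often a
    # randomly chosen patient who later had a cardiovascular event scores
    # higher than one who did not. The risk scores below are invented.
    import itertools

    event_scores    = [0.80, 0.62, 0.55, 0.48, 0.90]  # hypothetical: had an event
    no_event_scores = [0.40, 0.65, 0.58, 0.50]        # hypothetical: no event

    wins = ties = 0
    for e, n in itertools.product(event_scores, no_event_scores):
        if e > n:
            wins += 1
        elif e == n:
            ties += 1

    pairs = len(event_scores) * len(no_event_scores)
    auc = (wins + 0.5 * ties) / pairs
    print(f"Fraction of correctly ranked pairs: {auc:.2f}")  # -> 0.70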

The training dataset was collected by EyePACS, a programme developed by doctors to test for diabetic retinopathy, an eye disease that can affect people with diabetes. That dataset is predominantly made up of Hispanic patients. The validation data also include patients from UK Biobank, a health charity whose participants are mainly Caucasian.

Scientists from Stanford University, Google Brain and Verily - the latter being an Alphabet company focused on life sciences - used over 1.6 million retinal scans taken from 284,335 patients to train their models. Another 25,996 images were held back to validate the algorithms.
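With many scans per patient, the usual precaution when holding images back for validation is to split by patient rather than by image, so the same person's eyes never appear on both sides of the fence. The snippet below is a generic sketch of that kind of grouped split; the DataFrame, column names, and ten-per-cent holdout are illustrative assumptions, not the study's published protocol.

    # Generic patient-level (grouped) train/validation split, so images from
    # the same patient never land in both sets. All names and the 10% holdout
    # fraction are illustrative assumptions.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)

    # Hypothetical index of scans: one row per image, several images per patient.
    scans = pd.DataFrame({
        "image_id": range(12),
        "patient_id": [0, 0, 1, 1, 2, 3, 3, 3, 4, 5, 5, 6],
    })

    patients = scans["patient_id"].unique()
    rng.shuffle(patients)
    n_holdout = max(1, int(0.1 * len(patients)))      # hold out ~10% of patients
    holdout_patients = set(patients[:n_holdout])

    val_set   = scans[scans["patient_id"].isin(holdout_patients)]
    train_set = scans[~scans["patient_id"].isin(holdout_patients)]
    print(len(train_set), "training images,", len(val_set), "validation images")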

The level of accuracy is, apparently, similar to that of the more traditional method of drawing blood to measure cholesterol levels. Peng said the work “may represent a new method of scientific discovery.”

“Traditionally, medical discoveries are often made through a sophisticated form of guess and test — making hypotheses from observations and then designing and running experiments to test the hypotheses. However, with medical images, observing and quantifying associations can be difficult because of the wide variety of features, patterns, colors, values and shapes that are present in real images.

“Our approach uses deep learning to draw connections between changes in the human anatomy and disease, akin to how doctors learn to associate signs and symptoms with the diagnosis of a new disease. This could help scientists generate more targeted hypotheses and drive a wide range of future research,” she concluded. ®
