We know you all want to shove AI where the sun doesn't shine. And that's exactly where it's going – detecting prostate cancer
Your wish come true, thanks to these US neural net boffins
Artificially intelligent software could help doctors treat a problem that is, quite literally, a pain in the arse: prostate cancer.
A team of radiologists at the University of California, Los Angeles, built a convolutional neural network to analyse MRI scans of male nether regions and detect signs of the cancer. These scans are a far less invasive method of screening than doctors delicately delving in to collect tissue for biopsies.
Inspecting the MRI images requires doctors to undergo extensive training, and assessments often vary with interpretation. The researchers reckoned computer algorithms might be able to help medics diagnose the disease.
The neural network, named FocalNet, detects prostate cancer by analysing the pixels in MRI scans for lesions in the prostate area. These small areas of damaged tissue are either benign or cancerous. One way of checking is to compare the structure of the cells using Gleason scoring, a grading system that estimates how aggressive the cancer is.
Prostate cancer can be cured when it’s caught in its early stages. FocalNet predicts the severity of the disease in patients from their MRI scans by classifying each bloke into one of six groups. One group contains fellas with a healthy prostate and no lesions, while the other five groups correspond to different Gleason grades. A higher grade indicates more advanced disease and a greater chance that cancerous cells have spread to other parts of the body.
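For the curious, the five cancerous groups line up with the standard ISUP Gleason grade groups, which are derived from the two most common cell patterns a pathologist sees. The cut-offs below are the standard clinical definitions, not code from the FocalNet paper; the function names are illustrative.

```python
def gleason_grade_group(primary: int, secondary: int) -> int:
    """Map a Gleason pattern pair to one of the five ISUP grade groups.

    Standard ISUP definitions -- an illustration of the grading the
    article describes, not code from the FocalNet paper.
    """
    total = primary + secondary
    if total <= 6:
        return 1                          # low grade, least aggressive
    if total == 7:
        return 2 if primary == 3 else 3   # 3+4 is less aggressive than 4+3
    if total == 8:
        return 4
    return 5                              # Gleason 9-10, most aggressive


def lesion_class(primary=None, secondary=None):
    """FocalNet-style six-way label: 0 = no lesion, 1-5 = grade group."""
    if primary is None:
        return 0
    return gleason_grade_group(primary, secondary)
```

Note that Gleason 3+4 and 4+3 both sum to seven, yet land in different groups: the dominant pattern matters, which is exactly the sort of nuance that makes manual grading tricky.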
The scans of 417 patients, who had MRI scans prior to prostate cancer surgery, were used to train the CNN. The data contained a total of 728 lesions, and the researchers used an Nvidia Titan Xp graphics processing unit with 12GB of memory to crunch through all the numbers. Each 2D scan was first processed as a vector and fed into FocalNet as input, whilst the outputs were classified into the six different categories.
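The plumbing described above can be sketched in a few lines: flatten a 2D slice into an input vector, and turn the model's six raw class scores into a probability and a predicted group. This is a minimal dependency-free sketch of the input/output shape only; the convolutional layers and trained weights in between are elided, and the function names are our own, not FocalNet's.

```python
import math

NUM_CLASSES = 6  # healthy prostate plus five Gleason grade groups


def flatten(scan):
    """Flatten a 2D MRI slice (a list of pixel rows) into one input vector."""
    return [px for row in scan for px in row]


def softmax(logits):
    """Convert six raw class scores into probabilities summing to 1."""
    m = max(logits)                             # subtract max for stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]


def predict(logits):
    """Pick the most probable of the six categories (0 = no lesion)."""
    probs = softmax(logits)
    return max(range(NUM_CLASSES), key=lambda i: probs[i])
```

In practice a network like this would be trained so that, for a scan with a known grade, the logit for the correct category comes out largest; the argmax in `predict` then recovers the label.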
FocalNet achieved its best results when analysing patients at the highest risk of having prostate cancer. Its accuracy was 80.5 per cent for the most aggressive lesions and 79.2 per cent for less advanced stages that were still considered “clinically significant,” falling to 55.6 per cent across the rest of the range. For comparison, three radiologists with at least ten years’ experience of analysing MRI scans scored better: 83.9 per cent accuracy for the most aggressive cases, 80.7 per cent for clinically significant lesions, and 61.8 per cent for all lesions.
This was a like-for-like comparison: both the radiologists and the software were held to the same rate of 0.62 false positives per patient.
So, anally speaking, human experts still beat the neural network, and this remains an academic study rather than a system in production use. Hopefully, the tech will improve as it is fed more data.
The research was presented at the IEEE’s International Symposium on Biomedical Imaging in Italy, this month. ®