Artificial intelligence is starting to combine with smartphone technology in ways that could have profound impacts on how we monitor health, from tracking blood volume changes in diabetics to detecting concussions by filming the eyes. Using the technology to spot melanoma in its early stages is another exciting possibility, and a new deep-learning system developed by Harvard and MIT scientists promises a new level of sophistication by drawing on a method dermatologists commonly use, known as the “ugly duckling” criteria.
Using smartphones to detect skin cancers is an idea that scientists have been exploring for more than a decade. Back in 2011 we looked at an iPhone app that used the device’s camera and image-based pattern recognition software to provide risk assessments of unusual moles and freckles. In 2017, we looked at another exciting example, in which a deep-learning system was able to detect potential skin cancers with the accuracy of a trained dermatologist.
The new system developed by the MIT and Harvard researchers again leverages deep-learning algorithms to take aim at skin cancer, but with some key differences. So far, algorithms built to automatically detect skin cancers have been trained to analyze individual skin lesions for odd features that could be indicative of melanoma, which is a little different from how dermatologists operate. A dermatologist applying the “ugly duckling” criteria assesses a suspicious lesion in the context of the patient’s other moles, flagging the one that stands out from the rest.
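To make the distinction concrete, the comparative idea behind the ugly-duckling criteria can be sketched in a few lines of code. This is not the researchers’ actual system; it is a minimal illustration that assumes each lesion has already been reduced to a numeric feature vector (in practice a deep network would produce such embeddings from photos), and it simply scores each lesion by how far it sits from the patient’s typical lesion appearance.

```python
import numpy as np

def ugly_duckling_scores(lesion_features):
    """Score each lesion by how much it deviates from the patient's
    other lesions -- the 'ugly duckling' idea: lesions are judged
    relative to each other, not in isolation."""
    X = np.asarray(lesion_features, dtype=float)
    centroid = X.mean(axis=0)                     # the patient's "typical" lesion
    dists = np.linalg.norm(X - centroid, axis=1)  # each lesion's distance from typical
    # Normalize so scores are comparable across patients
    return dists / (dists.mean() + 1e-9)

# Toy example: four similar lesions and one that looks different
features = [[0.10, 0.20], [0.12, 0.19], [0.11, 0.21], [0.09, 0.20], [0.90, 0.80]]
scores = ugly_duckling_scores(features)
print(int(np.argmax(scores)))  # index of the lesion that stands out
```

A per-lesion classifier, by contrast, would score each row independently and could miss a lesion that only looks suspicious relative to its neighbors.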