Artificial intelligence (AI) has the potential to help clinicians care for patients and treat disease — from improving the screening process for breast cancer to helping detect tuberculosis more efficiently. When we combine these advances in AI with other technologies, like smartphone cameras, we can unlock new ways for people to stay better informed about their health, too.
Google has shared a preview of an AI-powered dermatology assist tool that helps people understand what's going on with issues related to their skin, hair and nails. Using many of the same techniques that detect diabetic eye disease or lung cancer in CT scans, this tool gets users closer to identifying dermatologic issues using their phone's camera. For further information see the IDTechEx report on AI in Medical Diagnostics 2020-2030: Image Recognition, Players, Clinical Applications, Forecasts.
Each year there are almost ten billion Google searches related to skin, nail and hair issues. Two billion people worldwide suffer from dermatologic conditions, but there is a global shortage of specialists. While many people's first step is the Google Search bar, it can be difficult to describe what you're seeing on your skin through words alone.
Google's AI-powered dermatology assist tool is a web-based application, which Google hopes to launch as a pilot later this year, that makes it easier to figure out what might be going on with skin issues. Once the tool launches, users simply take three images of the skin, hair or nail concern from different angles with their phone's camera. The tool then asks questions about skin type, how long the issue has been present and other symptoms that help narrow down the possibilities. The AI model analyzes this information and draws on its knowledge of 288 conditions to give a list of possible matching conditions that can then be researched further.
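The article does not describe Google's model internals, but the workflow above (image analysis plus questionnaire answers, yielding a ranked list of possible conditions rather than a single diagnosis) can be illustrated with a minimal sketch. Everything here is hypothetical: the condition names, scores and the blending scheme are illustrative stand-ins, not Google's actual method.

```python
# Hypothetical sketch: blend per-condition scores from an image model
# with boosts derived from questionnaire answers, then return the
# top-k matches as a ranked differential list.
import math

# Stand-ins for the 288 conditions the real model covers.
CONDITIONS = ["acne", "eczema", "psoriasis", "rosacea"]

def softmax(scores):
    """Convert raw scores into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def rank_conditions(image_scores, symptom_boosts, top_k=3):
    """Combine image-based scores with symptom-derived boosts
    (e.g. from answers about duration or itching), then rank
    conditions by probability and return the top_k matches."""
    combined = [score + symptom_boosts.get(name, 0.0)
                for score, name in zip(image_scores, CONDITIONS)]
    probs = softmax(combined)
    ranked = sorted(zip(CONDITIONS, probs),
                    key=lambda pair: pair[1], reverse=True)
    return ranked[:top_k]

# Example: the image model favors acne, but the questionnaire
# answers (illustrative) shift the ranking toward eczema.
matches = rank_conditions(
    image_scores=[2.1, 1.4, 0.3, 0.8],
    symptom_boosts={"eczema": 1.0},
)
for name, prob in matches:
    print(f"{name}: {prob:.2f}")
```

The key design point mirrored here is that the output is a list of candidates with relative likelihoods for further research, not a single definitive answer.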
For each matching condition, the tool shows dermatologist-reviewed information and answers to commonly asked questions, along with similar matching images from the web. The tool is not intended to provide a diagnosis or to substitute for medical advice, as many conditions require clinician review, in-person examination or additional testing such as a biopsy. Rather, the intention is to give people access to authoritative information so they can make a more informed decision about their next step.
The tool is the culmination of over three years of machine learning research and product development. To date, Google has published several peer-reviewed papers validating its AI model, with more in the works. Its landmark study, featured in Nature Medicine, debuted its deep learning approach to assessing skin diseases and showed that the AI system can achieve accuracy on par with US board-certified dermatologists. Its most recent paper, in JAMA Network Open, demonstrated how non-specialist doctors can use AI-based tools to improve their ability to interpret skin conditions.
To make sure the tool works for everyone, the model accounts for factors like age, sex, race and skin type, from pale skin that does not tan to brown skin that rarely burns. Google developed and fine-tuned its model with de-identified data encompassing around 65,000 images and case data of diagnosed skin conditions, millions of curated skin concern images and thousands of examples of healthy skin, all across different demographics.
Recently, the AI model that powers the tool passed clinical validation, and the tool has been CE marked as a Class I medical device in the EU. In the coming months, Google plans to build on this work so that more people can use the tool to answer questions about common skin issues.
Source: Google Health