Are AI-Powered Skin-Check Tools on the Horizon for Dermatologists, PCPs?

An influential Nature paper predicted in 2017 that advances in artificial intelligence (AI) could unleash remarkable changes in dermatology, such as using phones to help detect skin cancer earlier.

Given that about 6.3 billion smartphones would soon be in use, this AI approach could provide a gateway for “low-cost universal access to vital diagnostic care,” wrote Justin Ko, MD, MBA, a dermatologist, and his colleagues at Stanford University, a team that included other dermatologists and engineers.

Ko and his coauthors described how they trained a computer system to identify both benign and cancerous skin lesions. They used an approach known as a convolutional neural network, often deployed for projects seeking to train computers to “see” through image analysis. They said that their test of this system found it to be on par with the performance of 21 board-certified dermatologists.

“This fast, scalable method is deployable on mobile devices and holds the potential for substantial clinical impact, including broadening the scope of primary care practice and augmenting clinical decision-making for dermatology specialists,” they wrote in their paper.
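
The Stanford team’s training code is not reproduced here, but the underlying technique is now a routine pattern in image classification: take a convolutional network pretrained on general photographs and fine-tune it on labeled lesion images. A rough, hypothetical sketch of that general approach in Python, using torchvision and a generic ResNet backbone rather than the specific architecture, dataset, or settings the Stanford team used, might look like this:

```python
# Hypothetical sketch of the general technique: fine-tune a pretrained CNN on
# labeled lesion photos. The backbone, folder layout, and hyperparameters are
# illustrative placeholders, not details from the Nature study.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),                      # backbone's expected input size
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],    # ImageNet normalization stats
                         std=[0.229, 0.224, 0.225]),
])

# Expects one subfolder per class, e.g. lesions/train/benign and lesions/train/malignant
train_data = datasets.ImageFolder("lesions/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_data, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained network and replace its final layer so it
# predicts the lesion classes instead of the original 1000 ImageNet categories.
model = models.resnet18(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, len(train_data.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):                                  # placeholder epoch count
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

The hard part, as the rest of this article makes clear, is not the training loop but assembling enough verified, biopsy-confirmed images to train and validate such a model.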

More than 6 years later, there are signs that companies are making progress toward moving skin checks using this technology into US primary care settings — but only with devices that rely on specialized tools, not smartphones alone.

It may prove tougher for companies to eventually secure the sign-off of the US Food and Drug Administration (FDA) for mobile apps intended to let consumers handle this task with smartphones.

Such tools would need to be proven highly accurate before release, because too many false positives would needlessly expose people to biopsies, said Sancy Leachman, MD, PhD, director of the melanoma research program and chair of the Department of Dermatology at Oregon Health & Science University (OHSU).

And false-negative readings would allow melanoma to advance and even prove fatal, Leachman told Medscape.

Roxana Daneshjou, MD, PhD, a dermatologist at Stanford who has studied the promise and the pitfalls of AI in medicine, said that developers of a consumer skin-check app would need to know how people would react to their readings. That includes a good sense of how often they would appropriately seek medical care for a concerning reading. (She was not an author of the previously cited Nature paper but has published widely on AI.)

“The direct-to-consumer diagnostic space makes me nervous,” Daneshjou said in an interview. “In order to do it, you really need to have good studies in consumer populations prior to release. You need to show how effective it is with follow up.”

FDA Shows Interest — and Reservations

As of July, the FDA had not yet given its okay for marketing of any consumer apps intended to help people detect signs of skin cancer, an agency spokesperson told Medscape.

To date, the agency has cleared only two AI-based products for this task, both meant to be used by dermatologists. And only one of these two products, SciBase’s Nevisense, remains in use in the United States. The other, MelaFind, has been discontinued. In 2017, Strata Skin Sciences said that the product did not win “a significant enough level of acceptance by dermatologists to justify the continued investment” in it. And the company said it notified the 90 owners of MelaFind devices in the United States that it would no longer support the device.

But another company, DermaSensor, said in a 2021 press release that it expected its AI-powered tool, also named DermaSensor, to be the “first ever FDA cleared or approved skin cancer detection device for primary care providers.”

The Miami-based firm said that the FDA had granted its product a “breakthrough” device designation. A breakthrough designation means that agency staff will offer extra help and guidance to companies in developing a product, because of its expected benefit for patients.

In a 2020 press release, 3Derm Systems, now owned by Digital Diagnostics, made a similar announcement about winning FDA breakthrough designation for an AI-powered tool intended to allow skin checks in primary care settings.

(The FDA generally does not comment on its reviews of experimental drugs and devices, but companies can do so. Several other companies have announced FDA breakthrough designations for AI-driven products intended to check for skin lesions, but these might be used in settings other than primary care.)

Both DermaSensor and Digital Diagnostics have chairs with notable track records for winning FDA approvals of other devices. DermaSensor’s Maurice Ferre, MD, is also the chairman of Insightec, which in 2016 won the first FDA approval for a breakthrough-designated device, one that uses ultrasound to treat tremors.

In 2018, the FDA allowed Digital Diagnostics, then called IDx, to introduce in the United States the first medical device using AI in primary care offices to check for signs of diabetic retinopathy. This product also had an FDA breakthrough designation. The executive chairman and founder of Digital Diagnostics is Michael Abramoff, MD, PhD, professor of engineering and ophthalmology at the University of Iowa. Abramoff and the team behind the AI tool for retinopathy, now called the LumineticsCore system, also scored a notable win with Medicare, which agreed to cover use of the product through a dedicated CPT code.

FDA Draft Guidance

The FDA has acknowledged the interest in broadening access to skin checks via AI.

This was a topic of discussion at a 2-day advisory committee meeting the FDA held last year. In April 2023, the FDA outlined some of its expectations for future regulation of skin-analyzing tools as part of a wide-ranging draft guidance document intended to aid companies in their efforts to develop products using a form of AI known as machine learning.

In the document, the FDA described how it might approach applications for “hypothetical” devices using this kind of AI, such as a special tool to help primary care clinicians identify lesions in need of further investigation. Such a product would use a specific camera for gathering data for its initial clearance, in the FDA’s hypothetical scenario.

The FDA staff offered technical suggestions about what the developer of this hypothetical device would have to do to extend its use to smartphones and tablets while keeping clinicians as the intended users.

Some of these expanded uses could fall within the bounds of the FDA’s initial clearance and thus not trigger a need for a new marketing submission, the agency said. But seeking to shift this hypothetical product to “patient-facing” use would require a new marketing submission to the FDA, the agency said.

In this scenario, a company would expect people to follow up with a dermatologist after receiving a report suggesting cancer. This kind of change could therefore expose patients to “many new, unconsidered risks,” the FDA said.

Reality Check?

The state of current efforts to develop consumer apps for skin cancer checks may be summarized best on the website for MoleMapper, an app developed by researchers at OHSU to help people track how their moles change over time.

“Mole Mapper is NOT designed to provide medical advice, professional diagnosis, opinion, or treatment. Currently, there is not enough data to develop an app that can diagnose melanoma, but if enough data is collected through Mole Mapper and shared with researchers, it may be possible in the future,” the app’s website says.

OHSU released MoleMapper as an iPhone app in 2015. The aim of this project was to help people track the moles on their skin while also fostering an experiment in “citizen science,” OHSU’s Leachman told Medscape.

OHSU researchers hoped that the digital images taken by members of the public on cell phones could one day be used to develop diagnostic algorithms for melanoma.

But around 2017, the MoleMapper team realized that they would not be able to create a diagnostic app anytime soon, Leachman explained. They could not collect enough data of adequate quality.

And by 2021, it was clear that they could not even develop a successful triage app to determine which patients needed to be seen quickly. The amount of data required was, at that point, beyond what the team could collect, Leachman said in an interview.

That was a disappointment because the team had successfully completed the difficult task of creating a confidential pathway for collecting these images via both iPhones and smartphones running Android.

“We thought if we built it, people would come, but that’s not what happened,” Leachman said. Many patients didn’t want their images used for research or would fail to follow up with details of biopsy reports. Sometimes images were not captured well enough to be of use.

“You need at least hundreds of thousands, if not millions, of data points that have been verified with pathologies, and nobody was giving us back that data. That was the reality,” Leachman said.

There were valuable lessons in that setback. The OHSU team now has a better grasp of the challenges of trying to build a data-collection system that could prove helpful in assessing skin lesions.

“If you don’t build it, you don’t know” what can go wrong, she said.

Leachman said other scientists who have worked on similar projects to build skin-analyzing apps have probably encountered the same difficulties, although they may not reveal these issues. “I think that a lot of people build these things and then they try to make it into something that it’s not,” she said.

In addition to the challenges with gathering images, dermatologists frequently need to rely on touch and other clues from in-person visits when diagnosing a suspicious lesion. “There’s something about seeing and feeling the skin in person that can’t be captured completely with an image,” Leachman said.

Public Demand

Still, regulators must face the strong and immediate interest consumers have in using AI to check on moles and skin conditions, despite continuing questions about how well this approach might work.

In June, Google announced in a blog post that its Google Lens tool can help people research skin conditions.

“Just take a picture or upload a photo through Lens, and you’ll find visual matches to inform your search,” the company said. “This feature also works if you’re not sure how to describe something else on your body, like a bump on your lip, a line on your nails or hair loss on your head. This feature is currently available in the U.S.”

Google also continues work on DermAssist, an app that’s intended to help people get personalized information about skin concerns using three photos. It is not currently publicly available, a Google spokesperson told Medscape.

Several skin-analyzing apps are already available in the Apple and Google Play stores. The British Association of Dermatologists last year issued a press release warning consumers that these apps may not be safe or effective and thus may put patients at risk for misdiagnosis.

“Unfortunately, AI-based apps which do not appear to meet regulatory requirements crop up more often than we would like,” the association said. “Additionally, the evidence to support the use of AI to diagnose skin conditions is weak which means that when it is used, it may not be safe or effective and it is possible that AI is putting patients at risk of misdiagnosis.”

Delicate and Difficult Balancing Act

At this time, regulators, entrepreneurs, and the medical community face a delicate balancing act in considering how best to deploy AI in skin care, Stanford’s Justin Ko told Medscape. (In addition to being one of the authors on the widely cited 2017 Nature paper mentioned above, Ko served until March as the initial chair of the American Academy of Dermatology’s Augmented Intelligence Committee.)

There are many solid reasons why AI has not been deployed in dermatology as speedily as many envisioned a few years ago, Ko said.

Some of those reasons are specific to dermatology; this field doesn’t have a ready set of robust data from which to build AI-driven tools. In this aspect, dermatology is decades behind specialties like radiology, pathology, and ophthalmology, where clinicians have long been accumulating and storing images and other data in more standardized ways, Ko said.

“If you went to most dermatology practices and said, ‘Hey, let me learn from the data accumulated over the course of your 30-year practice to help us develop new tools,’ there may not be a whole lot there,” Ko said.

Beyond the start-up hurdles is a larger concern Ko shares with other dermatologists working in this field, such as Daneshjou and Leachman: What would patients, and clinicians without much dermatology training, do with the readings from AI-driven tools and apps?

There would need to be significant research to show that such products actually help get people treated for skin diseases, including skin cancer.

Ko praised Google for being open about the stumbles with its efforts to use its AI tool for identifying diabetic retinopathy in a test in Thailand. Real-world hitches included poor internet connections and poor image quality.

Developing reliable systems, processes, and workflows will be paramount for eventual widespread use of AI-driven tools, Ko said.

“It’s all those hidden things that are not sexy,” Ko said, unlike announcements about algorithms performing about as well as clinicians in diagnosis. “They don’t get the media attention, but they’re going to be make or break for AI, not just in our field but [for] AI in general.”

But he added that there also needs to be a recognition that AI-driven tools and products, even if somewhat imperfect, can help people get access to care.

In many cases, shortages of specialists prevent people from getting screened for treatable conditions such as skin cancer and retinopathy. The challenge is setting an appropriate standard to make sure that AI-driven products would help most patients in practice, without raising it so high that no such products emerge.

“There’s a risk of holding too high of a bar,” Ko said. “There is harm in not moving forward as well.”

Kerry Dooley Young is a freelance journalist based in Washington, D.C. Follow her on Twitter @kdooleyyoung
