Artificial intelligence and clinical decision support – FDA lends a helping hand

In February last year, the US Food and Drug Administration (FDA) cleared the first medical device that uses artificial intelligence (AI) to provide clinical decision support for stroke. The Viz.AI Contact application uses an AI algorithm to identify a suspected stroke and notify a specialist more quickly than was previously possible. Faster treatment, in turn, lessens the extent of a stroke or its progression. Subsequent FDA clearances and a recent decision to formalize regulations for such evaluations are likely to stimulate further innovation in, and acceptance of, AI devices.

Saving time
Viz.AI Contact analyses CT images of the brain and sends a text notification by smartphone or tablet to a vascular neurologist or a neuro-interventional specialist should a large vessel occlusion (LVO) be suspected. The algorithm notifies the specialist automatically at the same time as a first-line provider reviews the images. This is faster than the usual standard of care, in which patients wait for a radiologist to first review the CT images and then notify a neurovascular specialist.
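As a rough illustration of this parallel-notification idea, the sketch below issues the specialist alert alongside, rather than after, the first-line review. All names, probabilities and thresholds here are hypothetical and do not reflect Viz.AI's actual implementation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TriageResult:
    notifications: List[str] = field(default_factory=list)

def triage_ct_scan(lvo_probability: float, threshold: float = 0.5) -> TriageResult:
    """Illustrative triage step: if the algorithm suspects a large vessel
    occlusion (LVO), page the neurovascular specialist immediately, in
    parallel with the first-line radiology review (threshold is invented)."""
    result = TriageResult()
    # The first-line review always happens, as in the standard of care.
    result.notifications.append("radiologist: review CT images")
    # The AI short-circuits the serial workflow by alerting the
    # specialist at the same time, rather than after the review.
    if lvo_probability >= threshold:
        result.notifications.append("specialist: suspected LVO, review on mobile")
    return result
```

The time saving comes purely from removing the serial dependency: the specialist's review starts while the radiologist's is still in progress.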

Retrospective study and real world data
Viz.AI, Inc., which developed the Contact application, submitted a retrospective study of 300 CT scans that compared the performance of the image analysis algorithm and notification functionality against that of two trained neuro-radiologists.
Real-world evidence from a clinical study demonstrated quicker notification of a neurovascular specialist in cases where blockage of a large vessel in the brain was suspected. In more than 95 percent of cases, the automatic notification was faster, saving an average of 52 minutes (with a range of 6 to 206 minutes).

De Novo premarket review
The Viz.AI application was reviewed by the FDA through its De Novo premarket review process, a regulatory pathway for new types of medical devices that carry low to moderate risk but lack a legally marketed predicate device on which to base a determination of equivalence. The FDA action creates a new regulatory classification, allowing other devices with the same medical imaging intended use to obtain marketing authorization via 510(k) notification. One of the first areas to benefit will be AI-based computer-aided triage devices, whose potential in fields such as emergency medicine is likely to be vast. Viz.AI, Inc., itself is developing Viz ICH, which uses AI to automatically detect intra-cerebral hemorrhages and triage the patient directly to the neurosurgeon on call.

Decision support for breast cancer screening
Nine months after the FDA clearance of Viz.AI, at the 2018 Radiological Society of North America (RSNA) annual meeting in November, Siemens Healthineers showcased the AI-based features of syngo.Breast Care, a mammography solution that aims to provide interactive decision support for breast cancer screening.
Transpara, the mammography reading software used in the solution, is based on deep learning techniques and was trained on more than 1 million images. As a result, syngo.Breast Care's AI-based algorithms evaluate and interpret individual lesions in both 2-D mammograms and 3-D tomosynthesis studies. The system also sorts and scores cases on a 10-point scale, based on radiologists' weighting of risk factors such as lesions, micro-calcifications and other abnormalities.
Siemens Healthineers aims to integrate interactive decision support into syngo.Breast Care and thereby reduce radiologists' workload in interpreting mammograms. This has become especially challenging given the rapid growth in the use of techniques such as 3-D breast tomosynthesis.

Small firms also in play
Smaller firms have also targeted this area. iCAD's ProFound AI, for example, also leverages AI to detect cancer in breast tomosynthesis. The software, which was FDA cleared less than a month after syngo.Breast Care was unveiled, examines every image in a tomosynthesis scan and detects malignant soft-tissue densities and calcifications.
ProFound AI estimates a 'Certainty of Finding' for each detection and, like the classification system in syngo.Breast Care, assigns a Case Score to each case to represent confidence that a detection or case is malignant. The scores run on a scale from 0 to 100 percent, with higher scores indicating higher confidence of malignancy. This, in turn, is expected to improve detection, lead to fewer patient recalls and save mammographers time in reading images. It makes the software geared toward screening, although it can evidently be used for diagnostic studies as well.
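The score-based prioritization described here can be sketched in a few lines; the case IDs and Case Scores below are invented for illustration and do not come from ProFound AI.

```python
def prioritize_worklist(case_scores):
    """Return case IDs ordered so the highest Case Scores (0-100 percent,
    higher = more confidence of malignancy) are read first. The scoring
    model that produced the numbers is hypothetical."""
    return sorted(case_scores, key=case_scores.get, reverse=True)

# Invented example worklist: three cases with illustrative Case Scores.
worklist = {"case-A": 12.0, "case-B": 87.5, "case-C": 45.0}
reading_order = prioritize_worklist(worklist)
# case-B, with the highest score, would be read first
```

Sorting the worklist this way is what lets a reader spend time on the cases the algorithm is most confident about, which is where the recall and reading-time savings are expected to come from.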

AI at inflection point
The above examples demonstrate that the use of AI in clinical decision support tools is now close to an inflection point. These tools will provide physicians with usable, interactive and dynamic pathways that move beyond decision support to true evidence-based decision making, along with personalized care recommendations.
To many experts, AI seems to have been the missing link for tools that help radiologists improve the appropriateness of follow-up recommendations for incidental findings, and thereby enhance adherence to guidelines available at the point of care. One consequence of such AI-assisted tools will be reduced variability in follow-up recommendations, as well as fewer unnecessary imaging studies.

Diagnosis and decision support versus analysis and detection

Most attention to AI in imaging currently focuses on diagnosis and decision support. AI in areas such as quantitative analysis and assisted detection can be considered a spin-off from automation, which has been around far longer but has more recently been reinforced by machine learning.
Automated quantification tools are now sufficiently mature to be routinely accepted in the market. AI algorithms make measurements from imaging exams and perform calculations that were previously manual and time-consuming. AI-driven quantitative analysis tools are also being used in data analytics, mining electronic medical records, billing systems and patient scheduling, and even in stand-alone scanners. Mined data range from the radiation dose used by particular technologists for specific protocols to predictive analytics that pinpoint spikes in demand by day and time and schedule back-up staff in the radiology department.
By contrast, the application of AI (and even automation) in medical fields such as computer-aided diagnosis and clinical decision support is very recent, and it is likely to be some time before such tools become commonplace. The principal focus of AI use in image diagnosis is on cases where timing is crucial, such as a heart attack or stroke (e.g. Viz.AI Contact). Closely related areas include tools that reduce review time for complex exams and help triage patients needing more immediate care or other kinds of back-up.

Other new AI imaging applications

One exciting new entrant into AI in imaging is icobrain, from Belgium's icometrix. This FDA-cleared algorithm analyses CT scans to characterize traumatic brain injury, using deep learning to quantify typically qualitative indicators of brain injury such as hyperdense volumes, compression of the basal cisterns and midline brain shift.
Another FDA-cleared device is Cardio AI MR, which analyses MR images for cardiovascular blood flow. Its developer, Arterys, also offers other AI tools that measure and track liver lesions and lung nodules, accelerate the display of medical images, and display mammograms through the common Google Chrome desktop browser.

The challenge of integration
Although the FDA is clearing the way for follow-on AI products, there are concerns that the process is constrained to highly specific medical imaging diagnostic reviews. Some radiologists question the viability of new AI software systems if they require scores of different contracts and integrations into a hospital or enterprise imaging system – a problem not only for hospital IT departments but also for legal review.
One way forward is to reconfigure approaches to enterprise imaging by streamlining workflow. Some vendors are developing bridges between different AI applications. One immediate goal is to have AI imaging dovetail into picture archiving and communication systems (PACS) as well as vendor-neutral archives. For example, software can be designed to receive DICOM images directly from any CT scanner on a local virtual machine (VM) behind the network's firewall.
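One conceptual way to picture such a bridge, assuming a simple modality-based dispatch, is sketched below. Everything here is hypothetical glue code, not a real DICOM or PACS interface.

```python
def route_study(study, ai_apps, archive):
    """Illustrative integration bridge: hand an incoming study to every
    AI application registered for its modality, collect their findings,
    then store study and findings together in the PACS/VNA archive.
    All interfaces are invented for illustration."""
    findings = [app(study) for app in ai_apps.get(study["modality"], [])]
    archive.append({"study": study, "findings": findings})
    return findings

# Hypothetical AI app: flags a suspected LVO on head CT studies.
def lvo_detector(study):
    return {"app": "lvo-detector", "suspected_lvo": study["body_part"] == "HEAD"}

registry = {"CT": [lvo_detector]}   # apps keyed by imaging modality
pacs = []                           # stands in for the PACS/VNA
route_study({"modality": "CT", "body_part": "HEAD"}, registry, pacs)
```

A bridge of this shape is what would spare each hospital from negotiating a separate integration per AI vendor: new applications register against the bridge once, and studies flow to them and on to the archive automatically.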

Major firms nurture start-ups
Leading healthcare technology vendors are also starting to actively partner with smaller companies to provide a combination of in-house and third-party apps via web-based AI app store platforms. One good example is Siemens' Digital Ecosystem, which offers an online menu of apps from Siemens and its partners, including some offering AI-enabled technology. Similar AI app store initiatives are being taken by other vendors.
At RSNA 2018, where Siemens showcased syngo.Breast Care, IBM Watson said it would begin to partner with AI vendors to offer products on its new AI Marketplace, providing standardized application programming interfaces (APIs) for building or integrating third-party software and making it available through the IBM Cloud. Smaller vendors have seized such opportunities. French imaging agent vendor Guerbet, for instance, is working with IBM Watson Health to develop AI software to support liver cancer diagnosis and care.
IBM had initially planned to develop and launch its own AI solutions across the healthcare spectrum. However, it has had to cope not only with delays in commercializing its own AI products but also with smaller, nimbler start-ups, such as Viz.AI, getting ahead of it in obtaining FDA clearance. The biggest setback was MD Anderson ending its partnership with IBM on cancer imaging.
Other major players are treading similar paths. GE Healthcare's Edison platform is designed to help accelerate the development and adoption of AI and other new technologies, with clinical partners using Edison to develop and test algorithms and pair them with Edison applications and smart devices. For its part, Philips Healthcare launched its IntelliSpace Discovery 3.0 visualization and analysis platform at RSNA 2018 to prepare patient data for training and validating deep learning algorithms. The platform is designed specifically to support imaging research.

FDA to formalize De Novo rules
Developments in AI-enabled clinical decision support, like broader AI healthcare applications, are likely to pick up after the FDA's decision in December 2018 to formally establish regulations for the De Novo classification process. Although the De Novo process stems from the Food and Drug Administration Modernization Act, the FDA Safety and Innovation Act and the 21st Century Cures Act, it is currently not covered by any specific regulations. If finalized, the proposed rules are intended to provide clarity and transparency on the De Novo classification process.