Cardiovascular disease is the most frequent cause of mortality globally, with cancer the second most frequent: CVD accounts for over 30 percent, and cancer around 17 percent, of deaths worldwide. In the more affluent western countries, because of the enormous improvements in the diagnosis and management of CVD, cancer has overtaken CVD as the leading cause of death. However, as populations age the two conditions frequently coexist. Many of the modifiable risk factors are shared, but CVD is also a known complication of cancer therapy, and recent robust population studies have shown that patients with some forms of CVD have an increased risk of cancer.
Most of the modifiable risk factors for both conditions are well known, and include tobacco smoking, physical inactivity, unhealthy dietary habits and obesity. There are also well-established risk factors for CVD that recent studies suggest may also be risk factors for cancer, such as Type 2 diabetes, hypertension and hyperlipidaemia, the latter two being prevalent in cancer survivors. Alcohol consumption, a known risk factor for cancers including those of the alimentary tract, liver and breast, is also a risk factor for CVD (unless consumption is light, which is still considered protective against CVD).
As the number of patients surviving cancer continues to increase, more and more data are available demonstrating that the risk of morbidity and mortality from CVD in these individuals is greater than in subjects without a history of cancer. For instance, a robust analysis involving over a million female survivors of breast cancer, compared with control women who had not had cancer, found that the risk of CVD mortality was significantly higher among the survivors. Cancer itself can cause local and systemic cardiovascular conditions such as effusions and arrhythmias, and in addition many of the drugs and drug combinations used in cancer chemotherapy can be cardiotoxic, such as anthracyclines, trastuzumab and most of the approved tyrosine kinase inhibitors. Radiation therapy can affect the pericardium, valves and myocardium long term.
Recently, a Danish cohort of over 9,000 cancer-free patients with chronic heart failure (HF) was compared over time with the general Danish population, and a significantly increased risk of cancer was demonstrated in the HF group. Over a thousand cancer-free US survivors of myocardial infarction followed by HF were also shown to have a significantly higher risk of developing cancer than patients without HF.
It is surely prudent that all healthcare providers as well as CVD and cancer patients are informed about this bidirectional relationship.
There are growing concerns about an unfortunate but often-unavoidable scenario in modern medicine. Although the latest generation of drugs has improved patient survival for a vast array of diseases, the prolongation of life is often accompanied by a sharp increase in the probability of adverse effects of medication. Treatment of one disease can provoke or complicate another.
Clinicians, of course, focus on the more urgent and life-threatening condition. However, the choice is neither always straightforward nor easy. In certain cases, there are both short-term complications and long-term consequences.
One major area of attention in recent years is cardio-oncology (or onco-cardiology), which concerns the development of heart problems in patients treated for cancer. In cancer survivors, years or even decades can elapse after chemotherapy or radiation before such problems emerge and are detected.
Origins in anthracycline side effects
The origins of ‘cardio-oncology’ date back to the late 1960s/early 1970s, when the use of anthracycline anti-cancer medication began to be associated with cardiac dysfunction – a major side effect.
Anthracyclines like doxorubicin are commonly used in the treatment of solid tumours (e.g. breast cancer, osteosarcoma) and hematologic malignancies (acute lymphoblastic leukemia, Hodgkin and non-Hodgkin lymphoma, etc.).
A variety of studies from the late 1990s through to the late 2000s found that the risk of congestive heart failure (CHF) rises with the cumulative anthracycline dose: 3-5% at 400 mg/m², 7-26% at 550 mg/m², and 18-48% at 700 mg/m². Since then, better management of the total anthracycline dose has reduced CHF significantly.
However, given two demographic factors (growing incidence and survival rates of cancer patients in a high-risk ageing population), the number of patients with cardiac complications remains elevated and is likely to grow further in the coming years.
Cardio-toxicity near-universal for anti-cancer drugs
Though breakthroughs in cancer research have led to therapies that selectively target malignant cells, many new treatments continue to cause problems with the heart. In reality, virtually all anti-cancer agents are associated with a significant degree of cardio-toxicity. The effects range from direct cytotoxicity and cardiac systolic dysfunction to ischemia, arrhythmias, pericarditis and repolarization abnormalities.
Trastuzumab, a monoclonal antibody targeting HER2, for example, also affects cardiac function. Indeed, the HER2/ErbB2 protein targeted by trastuzumab in certain breast cancer cells also plays a major role in the myocardium, and it was the occurrence of severe cardiac side effects with trastuzumab that led to the recent revival of serious interest in cardio-oncology.
Other challenges are also seen with newer anti-cancer agents such as imatinib and bevacizumab. The first contributes to cardiac decompensation by altering preload through fluid retention, while the latter achieves the same effect by altering afterload through hypertension. Ifosfamide is associated with arrhythmias, while cisplatin and the anti-metabolite 5-fluorouracil cause cerebrovascular disease.
Type I and II cardio-toxicity
Since 2005, physicians have been using a classification model to define and distinguish between two types of cardio-toxicity.
Type I results in direct and irreversible damage to the cardiomyocyte, principally in a dose-dependent manner. Anthracyclines are a good example of agents causing Type I cardio-toxicity.
Conversely, Type II cardio-toxicity entails cardiac dysfunction with less prominent structural injury or irreversible cell damage. Crucially, it does not exhibit dose dependency, is usually transient and carries a better prognosis. Trastuzumab is associated with Type II cardio-toxicity.
No rest for the heart
Overall, the heart is especially vulnerable to cancer treatments. Cardiac cells are largely incapable of division or regeneration, and lack sufficient ability to heal if damaged, especially while active – a particular issue since the heart in a living person never rests or stops beating. Cardiac cells are also highly sensitive to stress, and disruptions can significantly impair cardiac function. Such stress and disruption can be caused by medications, not least those used against cancer.
An understanding of onco-cardiology will therefore be critical for effective, long-term care of cancer patients, and there is growing recognition that cardiologists should be involved or consulted when cancer drugs are given to patients.
There are already some promising results from such involvement. The cardio-toxic effects of chemotherapy seem to be decreased by the concurrent use of angiotensin-converting enzyme (ACE) inhibitors, angiotensin receptor blockers, or beta-blockers. Anti-platelet or anticoagulation therapy offers an improved outlook for cancer patients with the potentially hyper-coagulable status associated with chemotherapy.
Cardiac risks of radiation therapy
Medication, however, is not the only problem.
Radiation therapy, too, is associated with involvement of the whole heart (myocardium, pericardium, valves and coronary arteries) and leads to accelerated atherosclerosis in the great vessels and fibrotic changes to the valves, pericardium and myocardium. However, reduction in left ventricular ejection fraction (LVEF) and the development of congestive heart failure (CHF) are considered among the most serious problems and have consequently drawn the most attention. Compounding the problem is the long lag: for most patients, such effects appear only a decade or more after radiotherapy.
New approaches
Once again, new cardio-oncological approaches are seeking to improve longer-term outcomes by reducing the dose of radiation to the heart in cancer patients. Included here are techniques such as intensity-modulated radiation therapy, proton beam therapy, breath-hold techniques and prone positioning, as well as 3-D treatment planning with dose-volume histograms to precisely calculate both heart volume and dose.
The so-called normal tissue complication probability (NTCP) model takes account of the dose and the volume of normal tissues subject to radiation exposure and can be used to make a correlation between a given dose and the risk of cardiac mortality, over a period of 15 years.
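For orientation, the sketch below shows one widely used NTCP formulation, the Lyman-Kutcher-Burman model, which reduces a dose-volume histogram to a generalized equivalent uniform dose and maps it onto a complication probability. It is a minimal illustration, not necessarily the specific model referenced above, and the parameter values (TD50, m, n) are invented for the example rather than published cardiac tolerance constants.

```python
# Minimal sketch of a Lyman-Kutcher-Burman (LKB) style NTCP calculation.
# TD50, m and n below are illustrative assumptions, not clinical constants.
from math import erf, sqrt

def generalized_eud(dvh, n):
    """Reduce a differential DVH [(dose_Gy, fractional_volume), ...]
    to a generalized equivalent uniform dose (gEUD)."""
    return sum(v * d ** (1.0 / n) for d, v in dvh) ** n

def ntcp_lkb(dvh, td50, m, n):
    """Complication probability given whole-organ tolerance dose TD50,
    slope parameter m and volume-effect parameter n."""
    t = (generalized_eud(dvh, n) - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))  # standard normal CDF

# Toy case: the whole organ receives a uniform 40 Gy
dvh = [(40.0, 1.0)]
print(f"NTCP = {ntcp_lkb(dvh, td50=48.0, m=0.1, n=0.35):.1%}")
```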
Cardiac disease as a therapeutic barrier to cancer
Given the growing connection between today’s cancer survivor and tomorrow’s heart disease patient, many hospitals have begun to establish dedicated multidisciplinary programmes focused on cardio-oncology. Their aim is to proactively, and sometimes aggressively, balance the benefits of cancer treatments against the risks of adverse cardiovascular effects. Though the immediate goal is to improve outcomes for cancer patients with cardiac challenges, cardio-oncology ultimately seeks to eliminate cardiac disease as a barrier to effective cancer therapy.
Some cardio-oncology programmes emphasize the need to consider cardiovascular health in the shortest possible interval after a cancer diagnosis. The objective is not just to manage complications as they arise, but to assess and mitigate cardiovascular risks, both acute and chronic, in order to optimize long-term outcomes.
On their part, cardiologists are expected to stay abreast of all current and emerging cancer therapies – in terms of their cardio-toxic effects. This will allow them to recommend concurrent heart-protective interventions and establish a tailored approach to cardiac therapies for cancer patients.
Detecting cardio-toxicity with echocardiography
There are currently several approaches for detecting cardio-toxicity and assessing cardiac function. The most commonly used is 2-dimensional echocardiography (2-D echo), which identifies anthracycline-induced cardiomyopathy from left ventricular ejection fraction (LVEF) parameters. One recent study at the European Institute of Oncology in Milan, in a mainly breast cancer population treated with anthracyclines, used standard 2-D echo for close, prospective monitoring of LVEF over the first 12 months after completion of chemotherapy. The technique provided early detection of almost all cases of cardio-toxicity (98%), and prompt treatment led to normalization of cardiac function in most cases (82%). In addition, LVEF at the end of chemotherapy was an independent predictor of further development of cardio-toxicity.
However, only 11% of patients made a complete recovery (with LVEF at least equal to the value before initiation of chemotherapy). The researchers concluded that approaches to prevent the development of left ventricular dysfunction (LVD) appear more effective than therapeutic interventions aimed at countering existing damage, which can be progressive and irreversible in many cases.
Indeed, some research suggests that diastolic dysfunction precedes LVEF reduction in patients with chemotherapy-induced cardio-toxicity. However, to date, no diastolic parameters have been proven to definitively predict cardio-toxicity, and the role of diastolic dysfunction in cardio-toxicity screening remains controversial.
Strain-echocardiography
A newer technology promising greater accuracy than LVEF-based assessment alone is strain-echocardiography, which measures myocardial deformation. One common metric, the peak systolic longitudinal strain rate, is increasingly accepted as a tool to identify early changes in myocardial deformation during anticancer therapy.
However, long-term data on large populations confirming the clinical significance of these measures are not yet available. There are also several other limitations, such as the need for offline, time-consuming analysis and variability between echo machines and software packages.
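To illustrate the quantity being measured, the sketch below computes Lagrangian longitudinal strain, the percentage shortening of a myocardial segment between end-diastole and end-systole, and the kind of relative fall during therapy that strain imaging is meant to flag. The segment lengths and the drop in the example are illustrative; clinical packages derive these values from speckle tracking rather than manual input.

```python
# Minimal sketch of the strain arithmetic underlying strain-echocardiography.
# Segment lengths below are invented for illustration, not patient data.
def longitudinal_strain_pct(len_end_diastole_mm, len_end_systole_mm):
    """Negative values indicate shortening; normal global longitudinal
    strain is roughly in the -18% to -22% range."""
    return (len_end_systole_mm - len_end_diastole_mm) / len_end_diastole_mm * 100.0

baseline = longitudinal_strain_pct(80.0, 64.0)    # -20.0 %
follow_up = longitudinal_strain_pct(80.0, 67.2)   # -16.0 %

# A sizeable relative fall in strain magnitude during therapy (here 20%)
# is the sort of early change this technique aims to detect.
relative_fall = (abs(baseline) - abs(follow_up)) / abs(baseline) * 100.0
print(f"baseline {baseline:.1f}%, follow-up {follow_up:.1f}%, relative fall {relative_fall:.0f}%")
```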
Biomarkers
There is fast-growing enthusiasm about the use of biochemical markers, in particular cardiac troponins, for early, real-time identification and monitoring of antitumour drug-induced cardio-toxicity. Cardiac troponins are proteins within the myocardium, released within hours of damage to the myocyte. Studies show that troponins can detect cardio-toxicity at a preclinical phase, long before any reduction in LVEF in patients treated with anticancer drugs.
Such an approach would also avoid the inter-observer variability reported with ultrasound imaging. However, more research is needed to determine the precise timing of biomarker measurement.
The most promising research priorities concern prediction of the severity of future LVD, given that the peak troponin value after chemotherapy correlates closely with LVEF reduction. Some researchers also seek to stratify cardiac risk after chemotherapy in order to personalize post-chemotherapy follow-up, excluding patients who are not at risk from prolonged monitoring.
Although not commonplace, plaque protrusion into the stent lumen and a variety of intra- and post-operative ischemic complications occasionally accompany carotid artery stenting (CAS). Among these complications, plaque protrusion (PP) into the stent and thrombus on the stent after CAS are some of the most worrying for clinicians.
The correction of PP is achieved by additional post-dilations or stent-in-stent implantation.
First noticed in mid-1990s
The first observations of PP date back over two decades and provided the impetus for embolic protection devices. In a paper published in the mid-1990s, a team led by Frank Veith, MD (from the Mayo Clinic), suggested that restoration of flow and removal of protection devices might lead to the continuing break-off of protrusions and provoke delayed strokes.
Today, apart from PP, key causes of late in-stent stenosis seen in CAS include neo-intimal proliferation due to self-expanding stents as well as restricted post-procedural stent dimensions from inefficient balloon dilation.
The covered stent
One way to prevent PP is by covering the plaque with a stent graft. Covered stents also offer an advantage in usability, removing the need for distal protection devices in difficult internal carotid arteries, where a protection filter may be near-impossible to place. However, covered stents are accompanied by a high rate of restenosis. In one randomized trial in the mid-2000s, researchers at the Medical University of Vienna reported a 38% restenosis rate in patients treated with covered stents for carotid artery stenosis, while restenosis was absent in the bare-stent group. The precise causes of the high restenosis rates with stent grafts require further research. One suspected factor is the buckling of a covered stent’s proximal and distal ends and the resulting prevention of endothelialization.
Both symptomatic and asymptomatic events
During stent implantation, plaque disruption and distal migration of plaque particles may cause symptomatic or asymptomatic ischemic events, in spite of protection devices. These can be viewed on diffusion-weighted (DW) magnetic resonance imaging (MRI).
In some CAS cases, physicians encounter plaque particles filling the filters, leading to symptomatic cerebral embolism. This is particularly true with ulcerated plaque and severe stenosis. What is now of growing concern is that, after stent implantation, plaque protrusion into the lumen can lead to peri-procedural stent thrombosis, 30-day stroke, and late in-stent stenosis.
Stroke biggest complication
The biggest complication with carotid artery stenting is stroke. This can occur during CAS and for up to 30 days after the procedure. Although the cause of late stroke after CAS remains unknown, PP is generally suspected to be a key cause.
PP incidence is evaluated by IVUS (intravascular ultrasound) and angiography, although, as we shall see, results vary with the method used. The prognosis of PP (over a 30-day period) and the incidence of ischemic lesions (48 hours after CAS) are usually assessed by diffusion-weighted imaging.
The Tokai study
In recent years, one oft-cited report concerns a study by the Department of Cardiology at Tokai University School of Medicine (Isehara). The findings were published in October 2014 in the ‘Journal of Stroke and Cerebrovascular Diseases’. The Tokai researchers evaluated 77 CAS procedures, performed consecutively with IVUS between May 2008 and December 2012. All cases were distally protected with filter devices. The rate of PP was assessed at the end of each procedure using IVUS and angiography.
Six plaque protrusions (7.8%) through the stent struts were detected by IVUS, but only two (2.6%) by angiography. One of the major predictors of PP was pre-procedural severe stenosis with flow delay. The overall stroke rate was 2.6% (major 0%, minor 2.6%), and these strokes occurred in the catheterization laboratory. No late stroke was observed at 30 days after the procedure.
One of the key outcomes of the Tokai study was that IVUS seems to detect plaque protrusion better than angiography. Since adequate management of plaque protrusion is considered a means to reduce stroke complications, IVUS usage is worth considering.
The Yao-Nara study
In 2017, results of another, broader Japanese effort by researchers at Ishinkai Yao General Hospital (Yao) and Nara Medical University (Nara) suggested different conclusions. The study, published in the April 2017 issue of ‘JACC: Cardiovascular Interventions’, sought to clarify the frequency and prognosis of plaque protrusion in CAS by analysing data on 328 patients treated under IVUS guidance between 2007 and 2016, using different types of stents and embolic protection devices.
At 30 days, the rate of ipsilateral ischemic stroke was 2.8% and the rate of transient ischemic attack was 2.6%. There were no patient deaths. Moreover, in most stroke cases, symptoms were observed immediately after dilatation. New ischemic lesions were found in 35.7% of patients within 48 hours of the procedure, based on diffusion-weighted imaging (DWI).
No difference by plaque stability, but stent type and evaluation method matter
One of the most intriguing conclusions was the lack of difference in the incidence of new ischemic lesions, in terms of stable versus unstable plaques. Analysis by stent type, however, did indicate difference. There were more ipsilateral ischemic lesions with open-cell stents as compared to closed-cell stents.
The authors suggest the findings indicate a need to minimize PP “to prevent periprocedural ischemic stroke” and that the placement of open-cell stents with high radial force may disintegrate unstable plaque, causing protrusions. One strategy mentioned by the authors to manage PP is to perform IVUS to check for large-volume protrusions, which are then classified as either ‘convex’ or ‘non-convex’. For convex protrusions, stent-in-stent placement is performed using closed-cell stents until the protrusion disappears. For ‘non-convex’ protrusions, the authors recommend 5-10 minutes of observation, followed by stent-in-stent placement should the protrusion enlarge, or clinical follow-up within 30 days after CAS if it does not.
As we observed previously, there are differences in PP incidence based on whether it is evaluated by angiography or IVUS. One of the most significant limitations of the Japanese study above was the occurrence of 27 cases of plaque protrusion on IVUS, but just nine cases on angiography. The study protocol required confirmation by both modalities.
Limitations to Yao-Nara study
In an editorial accompanying the study, William A. Gray, MD (Lankenau Heart Institute, Wynnewood, PA), cautioned that this two-thirds difference “will clearly affect many of the subsequent associations and conclusions.” Gray also underlined that by treating plaque protrusion with stent-in-stent placement in approximately half of the cases, the researchers might have changed the clinical and imaging outcomes. Furthermore, he cautioned, the study was not core-lab controlled, MRI was not routinely used before and after procedures, and the assessors were not blinded. Finally, the researchers did not mandate use of specific stents or perform independent neurological assessment of clinical outcomes.
As a result, the association between stent type and plaque protrusion is ‘likely’. However, it may not be as strong as the authors contend.
Such shortcomings are likely to be addressed when the Japanese effort is paired with emerging data showing reductions in both plaque protrusion and ischemic lesions via the use of mesh-covered stents. Gray agrees that this is strengthening the case for “improvements in stent design.” Indeed, emerging micromesh stent designs are expected to contribute greatly to prevention of plaque protrusion and may become a new standard for CAS.
SCAFFOLD trial
In the United States, the SCAFFOLD trial, led by Peter A. Schneider, MD, of Kaiser Foundation Hospital in Honolulu (Hawaii), is completing evaluation of a mesh-covered, open-cell, heparin-coated stent in patients at high surgical risk. The objective is to make the first 30 days safer, on the understanding that reduced cell size equates to less plaque prolapse and fewer delayed events.
Other similar trials using different mesh technologies are also under way, and more are imminent. In Italy, for instance, the University of Roma La Sapienza has begun a positive-control study to analyze and compare the rate of off-table subclinical neurological events in two groups of patients undergoing CAS, one with a closed-cell stent and one with a new mesh-covered carotid stent called C-Guard.
Overall, new parameters are coming into place, via stents with differences in pore size, flexibility etc. The drivers for such efforts range from new materials to a broad range of cardiovascular conditions. In France, for example, University Hospital Grenoble is conducting trials with Mguard, a stainless-steel closed cell stent covered with an ultra-thin polymer mesh sleeve, to prevent distal embolization during percutaneous coronary intervention in ST-segment-elevation myocardial infarction.
Case selection and stenting success
One of the observations of SCAFFOLD was that case selection for stenting was a key “to good clinical results.”
So far, patient selection criteria for CAS are largely based on surgical risk related to other co-morbidities. The morphology of the atherosclerotic plaque is given little attention, although studies have demonstrated extensive variability, which in turn confers specific risks for plaque vulnerability. Overall, the detection of unstable plaque on MR plaque imaging and the use of open-cell stents are considered significant predictive factors for PP.
In recent years, there have been growing calls for devising best practices in peri-procedural management and follow-up, and for continuous feedback from clinicians to industry to improve stent design.
In general, achieving better outcomes of CAS is seen as the best method to solidify its place as a frontline treatment of carotid vascular disease.
One promising approach for patient selection and identification of plaque has been the use of virtual histology intravascular ultrasound imaging (VH IVUS). Researchers have suggested a strong correlation between VH IVUS plaque characterization and the true histological examination of plaque following endarterectomy, especially in ‘vulnerable’ plaque types.
The results of one of the earliest efforts in this area were published in October 2007 in ‘The Journal of Endovascular Therapy’. This followed a prospective, two-arm study by the Arizona Heart Hospital & Translational Research Center. The researchers enrolled 30 patients.
In the first arm of the study, 15 patients underwent VH IVUS examination of carotid plaque with a cerebral protection device. This was immediately followed by carotid endarterectomy (CEA), and the comparison of ‘virtual’ with true histology (classifying plaque type by VH IVUS and histopathology in a blinded study).
In the second arm, 15 patients undergoing CAS had a preliminary VH IVUS scan performed with cerebral protection. Debris collected from the filter following stenting was examined histologically and compared with the VH IVUS data.
The diagnostic accuracy of VH IVUS to agree with true histology in different carotid plaque types was 99.4% in thin-cap fibroatheroma, 96.1% for calcified thin-cap fibroatheroma, 85.9% in fibroatheroma, 85.5% for fibrocalcific, 83.4% in pathological intimal thickening, and 72.4% for calcified fibroatheroma.
Rupture of the coronary plaque surface, accompanied by the exposure of thrombogenic, red cell-rich necrotic core material, is one of the most important underlying mechanisms in acute coronary syndrome (ACS). After decades of indolence, such plaques can rupture suddenly, precipitating this life-threatening condition. Rupture occurs during the evolution of coronary atherosclerotic lesions and is often accompanied by superimposed thrombosis.
To date, the precise mechanisms involved in plaque erosion remain largely unknown; coronary spasm is one commonly suspected contributor.
Prevention is therefore considered to be the only effective means for reducing the mortality and morbidity of coronary heart disease.
Coronary lesions that are prone to rupture have a distinct morphology compared with stable plaques, and provide a unique opportunity for non-invasive imaging to identify vulnerable plaques, before they lead to clinical events.
Plethora of terminology
The severity and prognosis of plaque rupture is characterized by a plethora of terminology. Plaque vulnerability describes the risk of symptomatic thrombosis in the short term, whereas plaque ‘activity’ remains ambiguous (referring to one of a wide variety of processes associated with progression).
‘Plaque burden’ is accepted to denote the extent of disease. It is a measure of the extent of atherosclerosis, regardless of the cellular composition or activity of plaques. There are various ways to measure the burden: plaque volume, lesion coverage of the arterial surface – sometimes based on using computed tomography (CT) to measure the coronary calcium score, or ultrasound to assess plaque area in the carotid bed. Given that atherosclerosis is multi-focal (and impacts upon the entire vasculature), a high plaque burden in one region (e.g. the lower limbs) may be a marker for advanced disease elsewhere; the greatest concern is the coronary arteries, given their susceptibility to clinical events.
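To show how one of these burden measures is actually computed, the sketch below implements the core weighting rule of Agatston-style coronary calcium scoring on CT: each calcified lesion contributes its area multiplied by a weight derived from its peak attenuation. Real scanner software also accounts for slice spacing, minimum lesion size and lesion connectivity; the lesion values here are invented for illustration.

```python
# Minimal sketch of Agatston-style calcium scoring (the CT plaque-burden
# measure mentioned above). Lesion areas and attenuations are made up.
def agatston_weight(peak_hu):
    """Density weight from a lesion's peak Hounsfield value."""
    if peak_hu < 130:
        return 0   # below the calcium threshold, not scored
    if peak_hu < 200:
        return 1
    if peak_hu < 300:
        return 2
    if peak_hu < 400:
        return 3
    return 4

def agatston_score(lesions):
    """lesions: iterable of (area_mm2, peak_hu), one entry per lesion per slice."""
    return sum(area * agatston_weight(hu) for area, hu in lesions)

# Example: three calcified foci -> 4.5*1 + 12.0*3 + 2.0*4 = 48.5
print(agatston_score([(4.5, 180), (12.0, 320), (2.0, 450)]))
```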
Size is not everything
Rather than plaque size alone, the risk of rupture depends more on the composition and type of plaque – inter alia, its richness in soft extracellular lipids and macrophages. Structurally, what is required for plaque rupture is an extremely thin fibrous cap. As a result, ruptures are usually minuscule and occur mainly at the periphery of the cap covering the lipid-rich core – among lesions clinically defined as thin-cap fibroatheromas. Such caps have reduced tensile strength and are more extensible than intact caps, and their content of collagen and smooth muscle cells is lower. In effect, as extracellular lipid accumulation progresses (usually due to external stress/triggers), the fibrous cap weakens and the predisposition of a plaque to rupture increases.
Several other factors are also believed to play a concurrent role, among them inflammatory cell recruitment, macrophage formation, necrosis, matrix synthesis, calcification, arterial remodelling, etc.
The interaction between these factors is not only complex but variable too, as far as the development of plaque is concerned. This leads to unpredictable rates of progression and variable clinical outcomes.
Not all ruptures lead to ACS
Nevertheless, certain other issues are in play with regard to the clinical relevance of vulnerable plaque detection. Most plaques remain subclinical and asymptomatic; others elicit acute thrombosis and may lead to an acute coronary syndrome (ACS).
However, not all plaque ruptures cause ACS. Some instead lead to gradual obstruction (stable angina). Indeed, on its own, stable angina pectoris derived from atherosclerosis is rarely fatal unless the myocardium is scarred – scarring that can provoke an arrhythmia presenting as sudden cardiac death.
Confusion arises in other contexts too. For example, the presence of thrombosis is not the same as the occurrence of ACS. Indeed, some physicians believe that the majority of ruptures and erosions are asymptomatic in the short term, although they may sometimes lead to gradual coronary narrowing.
Nightmare for prognosis
This lack of clarity has proven to be a nightmare for prognosis. ACS occurs only when vulnerable plaque, platelet activation and impaired fibrinolysis occur alongside inflammatory states. Such vulnerability may change with time, and it is these changing dynamics vis-a-vis stress/triggers that determine the exact moment and point of rupture. As a result, the non-invasive detection of vulnerable plaques is considered to be of great clinical relevance, especially in ultra-high-risk patients.
At the cutting-edge
Currently, a host of new, non-invasive techniques are being harnessed to assess and predict the likelihood of coronary plaque rupture. Leading the way are computational fluid dynamics (CFD) and fractional flow reserve (FFR) methodologies. They are based on harnessing supercomputing capability to the analysis of CT angiography.
The high image quality and sub-millimetre resolution of modern computed tomography (CT) scanners allow characterization and quantification of lesions at accuracies unimaginable barely a decade ago. CFD adds functional information to CT-based plaque assessment by calculating lesion-specific endothelial shear stress and FFR. This supplementation of quantified morphologic data about coronary plaques with functional information is considered one of the best means to detect vulnerable plaques.
FFR guided therapy
For patients with coronary calcification and hemodynamically significant obstructive disease, FFR has long been considered the best solution for guiding revascularization of lesions and improving outcomes. FFR provides an index of lesion significance in atherosclerotic disease, measured with a pressure-sensitive angioplasty guidewire. FFR-guided therapy has improved patient outcomes and reduced stent insertions. However, it is used in less than one-tenth of cases due to procedural and operator-related factors – above all, patient discomfort due to time and motion artifact, as well as cost.
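The index itself is simple to state: FFR is the ratio of mean distal coronary pressure, measured with the pressure wire beyond the lesion, to mean aortic pressure during maximal hyperemia, with values at or below about 0.80 commonly treated as hemodynamically significant. The minimal sketch below shows that calculation on invented pressure samples; it illustrates the definition only, not any vendor's analysis software.

```python
# Minimal sketch of the FFR calculation: mean distal pressure (Pd) over
# mean aortic pressure (Pa) during hyperemia. Pressure samples are invented.
from statistics import mean

def fractional_flow_reserve(pd_samples_mmhg, pa_samples_mmhg):
    """Return FFR = mean Pd / mean Pa over the sampled hyperemic window."""
    return mean(pd_samples_mmhg) / mean(pa_samples_mmhg)

pd_samples = [68, 70, 66, 69, 67]   # distal coronary pressure (mmHg)
pa_samples = [92, 95, 90, 94, 93]   # aortic pressure (mmHg)

ffr = fractional_flow_reserve(pd_samples, pa_samples)
print(f"FFR = {ffr:.2f} -> {'significant' if ffr <= 0.80 else 'not significant'}")
```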
Coupling FFR to CT angiography
More recently, due to the developments in non-invasive CT imaging and the application of CFD modelling to CT angiography datasets, FFR can be derived non-invasively without requiring modification of standard CT angiography acquisition protocols or inducing hyperemia.
Such non-invasive FFR, moreover, has been shown to demonstrate excellent correlation with invasive FFR.
PLATFORM Study
One of the key studies investigating the impact of combining FFR and CT was called PLATFORM (the Prospective LongitudinAl trial of FFRCT: Outcome and Resource Impacts).
PLATFORM, which ran from the end of 2013 to 2015 at centres in the US and Europe, demonstrated improved patient selection for invasive angiography using a combination of coronary CT angiography (CCTA) along with fractional flow reserve CT (FFRCT). The so-called CCTA-FFRCT approach increased the chance of identifying obstructive coronary artery disease among those intended for invasive testing and held forth the promise of serving as an efficacious gatekeeper to invasive coronary angiography (ICA).
The findings were conclusive, with numbers presented by researchers at the European Society of Cardiology congress in London in 2015. The use of FFRCT in patients with planned invasive catheterization, they noted, was associated with a reduction in the rate of finding no obstructive CAD at ICA, from 73% to 12%. It also resulted in the cancellation of 61% of ICAs.
Computational fluid dynamics
In effect, the adoption and translation of CFD modelling may be considered to have revolutionized cardiovascular medicine.
CFD is a specialist IT discipline bringing together advanced mathematics and fluid mechanics. Its roots lie in mission-critical/high-performance engineering systems. Much of its history is intimately connected to the aerospace industry, to enhance the accuracy of complex simulation scenarios such as transonic or turbulent air flows.
In medicine, the first-ever CFD investigations began in cardiovascular research, to clarify the characteristics of aortic flow at a level of detail below the threshold of experimental measurement. Computer-aided design (CAD) models of the human vascular system were built using modern imaging techniques, coupled to rapid, economical, low-risk 3-D prototyping. The ensuing models precisely computed factors such as blood flow and tissue behaviour and response, taking close account of boundary conditions such as complex systemic/physiological pressures and ‘virtualized’ metrics such as wall shear stress.
CFD modelling has already revolutionized the development of devices such as stents, valve prostheses, and ventricular assist devices.
CFD is currently being translated into cardiovascular clinical tools for minimally invasive application to a wide spectrum of coronary, valvular, myocardial and peripheral vascular diseases. One of the biggest advantages of combining high-resolution imaging with CFD is that unique patient-specific data can be incorporated into multi-scale, variable-duration models, making individualized risk prediction and planning possible. This stands in contrast to registry-based, population-averaged data.
In the future, it is expected that the trend to ‘digital patient’ representation, combined with population-scale numerical models, will reduce cost, time and risk associated with clinical trials.
The massive processing power brought to play by CFD quickly led to the understanding that mechanistic forces of arterial wall shear stress (WSS) and axial plaque force acting on coronary plaques might be responsible for both the development of coronary plaque and its vulnerability to rupture.
For example, it is difficult to measure WSS, a key factor in the development of atherosclerosis and in-stent restenosis, without invasive procedures – with all the latter’s attendant risks and frequent futility. One study demonstrated that less than a third of patients with suspected obstructive coronary artery disease (CAD) showed its presence on invasive coronary angiography (ICA), while an even smaller number had flow-limiting obstructive disease based on invasive fractional flow reserve (FFR).
In contrast, CFD models can both compute and map the spatial distribution of WSS, establishing links between haemodynamic disturbance and atherogenesis and explaining why atherosclerotic plaque tends to be deposited at arterial bends or bifurcations.
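While a full CFD solver resolves WSS over the patient-specific 3-D geometry, the idealized straight-vessel case has a closed-form answer that is often used as a sanity check: for steady laminar (Poiseuille) flow, wall shear stress is 4μQ/(πR³). The sketch below evaluates that expression with illustrative, not patient-derived, values for blood viscosity, flow and vessel radius.

```python
# Idealized Poiseuille wall shear stress, a common sanity check for CFD
# results. Viscosity, flow and radius values are illustrative assumptions.
from math import pi

def poiseuille_wss(flow_m3_s, radius_m, viscosity_pa_s=0.0035):
    """Wall shear stress (Pa) for steady laminar flow in a rigid straight
    tube: tau_w = 4 * mu * Q / (pi * R^3)."""
    return 4.0 * viscosity_pa_s * flow_m3_s / (pi * radius_m ** 3)

# Example: ~1 mL/s of blood through a coronary segment of 1.5 mm radius
q = 1.0e-6   # flow rate, m^3/s
r = 1.5e-3   # lumen radius, m
print(f"WSS = {poiseuille_wss(q, r):.2f} Pa")   # about 1.3 Pa
```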
CFD modelling has also been central to comprehending the role of WSS in endothelial homoeostasis. While turbulent blood flow reduces WSS and stimulates adverse vessel remodelling, undisturbed laminar blood flow seems to be associated with higher WSS – which reduces endothelial cell activation. In a March 2012 issue of ‘Circulation’, researchers from Johns Hopkins University School of Medicine, the CVPath Institute in Maryland and the Mount Sinai School of Medicine in New York established that a complex series of WSS-related signalling pathways and interactions underlies this phenomenon.
Though much more remains to be understood before such pathways can be exploited to their full extent to yield new anti-atherosclerotic therapies, few doubt that the way forward lies in further CFD models that combine dynamic fluid behaviour analysis with cellular response.
Other emerging techniques
Apart from CFD, other methodologies under consideration to quantify measurement of coronary plaque and lesions include the use of radio-frequency (RF) backscatter intravascular ultrasound. A prospective study in 2011 in the US known as ATLANTA sought to make the first-ever assessment of the accuracy of 3-dimensional, quantitative measurements of coronary plaque by computed tomography angiography (CTA) against intravascular ultrasound with radiofrequency backscatter analysis (IVUS/VH).
For the ATLANTA study, 60 patients underwent coronary X-ray angiography, IVUS/VH and coronary CTA. Plaque geometry and composition were quantified after spatial co-registration on segmental and slice-by-slice bases. The researchers found significant correlations for all pre-specified parameters in both the segmental and slice-by-slice analyses. Compositional analysis suggested that high-density non-calcified plaque on CTA correlated best with fibrous tissue, and low-density non-calcified plaque with necrotic core plus fibrofatty tissue, by IVUS/VH.
The march of healthcare technology is not always even. Benefits on one front can often be outweighed by problems on another. Radiology is no exception to this rule.
Like other medical professionals, radiologists have begun using portals and social media to connect to patients and join the move towards personal healthcare.
Websites and radiology
Today, websites staffed by imaging professionals seek to directly address the public about radiology. Such a trend is especially pronounced in the US. Examples include radiology Q&A portals at the University of Texas’ John P. and Kathrine G. McGovern Medical School, Northwest Radiology Consultants in Atlanta, Georgia, and a host of others. One of the best known is the RSNA/ACR public information website, RadiologyInfo.org, which offers a library of resources for patients including information on how various imaging procedures are performed. In Europe, the ESR has a website page dedicated to ‘Radiation and Patients’ and an ‘Ask EuroSafe Imaging’ Q&A page, split into three sections (CT, interventional radiology and pediatric imaging), with answers provided by radiologists from across the continent.
Radiologists and social media
Radiologists have also sought to use social media to build and continuously strengthen interactive relationships with patients outside a formal hospital or physician office setting. Such approaches have spilled over into tackling concerns arising from widespread media reports about the ‘over-use’ of medical radiation. In the US, for example, the Health Physics Society has a site dedicated largely to addressing such risk perceptions in the general public. The UK too has seen such a step, with the British Institute of Radiology and the Institute of Physics and Engineering in Medicine endorsing Ask for Evidence, as part of which a panel of radiologists and medical physicists responds to questions from the public on radiation safety.
Technology versus patient downtime
These are clearly significant and laudable developments, and many physicians increasingly regard informed patients as better patients. However, other recent developments in technology, above all electronic medical/health records (EMR/EHR), are placing a growing burden on clinicians to update medical documentation in order to facilitate real-time sharing and reduce errors. This leaves less time for patient care. A key driver in the US is the federal government’s meaningful use (MU) requirements, which provide physicians with financial incentives to use EHRs.
For radiologists, these incentives are hardly negligible, ranging from 44,000 to 63,750 dollars (39,000 to 56,735 euros) over a 5- or 6-year period via Medicare and Medicaid, respectively.
On the other hand, MU also requires that 10% of patients view, download or transmit their electronic health information, and that over 40% of all imaging scans be made accessible via certified EHR technology.
The above requirements are hardly a testimonial to efficiency. One study on MU published by the Radiological Society of North America (RSNA) in 2012 found that medical residents reported having to spend the bulk of their time updating charts and documentation, and that EHR adoption correlated directly to reduced time for direct patient care.
Radiology strives to remain at technology cutting edge
This is a profound challenge. Radiology has traditionally been the medical speciality at the cutting edge of technical advancement; it was radiology that first moved away from paper to digital technology. As a result, radiologists and industry are currently seeking to fast-track solutions to the growing loss of time for patient care and to improve workflow.
Data use
In the first stage, the focus was on enhancing the use of available data. Ironically, illustrating the unevenness and asynchronicity in the progress of technology, efforts were concentrated on getting more usable data out of electronic records, which did not always trickle down to radiologists. One reason was a lack of skills: referring physicians often left the responsibility of getting approval for imaging to office staff, many of whom lacked the clinical knowledge required to seek such approval.
Automation: From CPOE to CDS
Soon after, the effort shifted to automation, especially in the shape of decision support (and so-called assisted clinical reasoning) tools. This process continues, with evolution from static to dynamic, patient-centred tools. A good example is the computerized physician order entry (CPOE) system. In 2012, a study in the ‘Journal of the American College of Radiology’ demonstrated the clinical viability of combining radiology CPOE with imaging decision support, including pathways and algorithms, as well as classification for actionable findings.
One of the longest-used clinical decision support systems is ACR Assist from the American College of Radiology, which is designed to blend seamlessly into radiology workflow. Clinical data is encoded in vendor-neutral ways, so that commercial applications can be built quickly. The ACR has since created guidelines for radiologists and referring physicians on how to proceed after clinical findings. Others are also stepping in with new initiatives to enhance automation and decision support. Massachusetts General Hospital, for example, has developed Procedure Order Entry (PrOE), a surgical appropriateness system to help identify whether a procedure is necessary, and the implications of this for radiology are under active investigation. By using evidence-based guidelines rather than having a less-informed entity authorize diagnostic imaging, CPOE in radiology enhances not only efficiency but also the quality of care.
iPads and speed
One unexpected finding cited in the 2012 RSNA study on meaningful use was that residents using iPads were able to enter and update data more rapidly. Indeed, a majority of those surveyed found that iPads led to significant increases in work efficiency.
This was an opportune moment, given that a year previously, the US Food and Drug Administration had cleared the first mobile app to allow physicians to make diagnoses using iPads or iPhones.
Currently, radiology imaging applications for mobile platforms allow remote monitoring and control for a PACS administrator. Fuelled by standard tools such as DICOM viewers, these impact directly on quality control, data management and workflow efficiency.
The implications of teleradiology connectivity are especially dramatic in emergency settings. About five years ago, Mayo Clinic physicians deployed smartphones in order to assess their utility in a telemedicine stroke-management network which connected radiologists to neurologists and emergency physicians at a remote facility. The findings were encouraging, with over 90% agreement on the key radiological findings. In the future, smartphone-based teleradiology systems are likely to become commonplace among first responders.
Image management and automation
Image management is also being used as a means to automate processes. The fast growth of technology has also necessitated unprecedented collaborations between specialists. Oncologists, for example, have been working with radiologists to analyse datasets for tumour detection and monitoring, and some studies report sharp reduction in the time required to study suspicious tissue.
On its part, Massachusetts General has also developed QPID (Queriable Patient Interface Dossier) to integrate electronic records and streamline providers’ abilities to access details in a patient’s medical history.
Other areas for attention include voice-enabled documentation, accompanied by structured reporting and data sets that pre-populate a radiology report. These not only reduce human error when inputting data but also enable radiologists to interpret and diagnose a study when a referring physician is most in need of the information – while meeting a patient.
From automation to deep machine learning
The greatest benefit of automation is to maximize the use of available data. This enhances the ability to provide not just personal but precision medicine, too. When interfaced to an appropriate radiology-focused IT platform, individual radiologists and the broader radiology (as well as clinical) community will be empowered to benefit from feedback loops that reinforce positive lessons, de-emphasize negative ones and continuously build appropriateness and best-practice guidelines. Based on the templated information in a report, colleagues (real and virtual) would be able to rapidly offer second opinions and perspectives on how to best serve a specific patient-case.
Further down the road are deep machine learning tools which will provide sophisticated, structured and in-depth data on a patient, to enable increasingly informed decisions in the context of specific and individual challenges – influenced by factors ranging from pharmacogenomics to disease staging, age and lifestyle. Such knowledge, which would create highly actionable reports, is expected to dramatically impact upon patient outcomes.
Radiology and public perception
It is no secret that professional radiological societies strongly believe there is a need to improve patient (and public) perception of the role played by radiologists in healthcare, and that this necessitates closer contact with patients. Patients after all seldom choose a radiologist. This choice is made by a referring physician or health plan.
Though radiology is essential to patient care, radiological services often seem inconvenient, a threat to privacy, and sometimes mysterious and scary. The connection between a radiologist and patient is mediated by nurses and assistants (e.g. for injecting contrast material or preparing patients for the imaging procedure), or by technologists seen as managers of machines. Various studies have shown that radiologists are not always present during the performance of a study and seldom introduce themselves to a patient.
As a result, patients increasingly consider radiologists to be supervisors of a technological process. The clinician requesting the examination and receiving the radiology report is considered to be the one interpreting the study and making the decision.
Patient at the core
To sum up, the core value proposition in transforming and keeping radiology up to date involves the patient. Although the growing digitization of healthcare pushes radiologists away from patients, there is a need to make these interactions more prominent. Some radiologists warn that otherwise, there is a risk of their services becoming commoditized. For such a process, there is clearly a need to draw more patient data into decision-making. One of the most ambitious efforts on this count was launched at the turn of the decade by RSNA, with funding from the National Institute of Biomedical Imaging and Bioengineering. The project, which promotes patient access to self-management tools, is known as Image Share, and consists of a secure network based on open-standards architecture. Images are exchanged between servers at radiology departments and imaging centres via the Cloud. A two-year pilot began in 2011 at Mount Sinai Medical Center in New York, followed by university hospitals in several states. In 2016, RSNA introduced a validation programme for the project, to test vendor system compliance with standards for exchange of medical images. To date, results have been satisfying.
Although initiatives like this will continue to grow in importance, they are unlikely to do more than enhance the efficiency of radiologists – and their professional judgement – in improving patient care.
Magnetom Vida, the new high-end 3-Tesla MRI scanner with BioMatrix technology from Siemens Healthineers, was launched to the public at University Hospital Tübingen, where the first system is installed. It has been undergoing clinical tests in the hospital’s Department for Diagnostic and Interventional Radiology since December 2016.
Magnetom Vida is the first scanner equipped with BioMatrix, a brand-new, innovative scanner technology that addresses inherent anatomical and physiological differences among individual patients, as well as variability among users. Magnetom Vida and BioMatrix allow users to meet the growing demand for MR imaging, perform the full range of routine as well as complex examinations, and deliver robust results for every patient. Furthermore, the scanner also makes MRI more cost-effective by reducing rescans and increasing productivity. High-precision imaging means that radiologists can deliver essential and robust information to choose the right treatment for each patient every time. Siemens Healthineers, in collaboration with its customers, is playing an important role in taking healthcare forward in the development of precision medicine.
Siemens Healthineers has been developing this disruptive and innovative BioMatrix technology for over five years. Its introduction represents a further advance in MR imaging as well as the next level of automation and patient centricity.
High image quality and efficient workflows – regardless of user or patient
Due to high levels of exam variability, MRI is often considered one of the most complex medical imaging modalities. Physiological and anatomical differences between patients, as well as different experience levels among users, contribute to this unwanted variability. This is frequently a source of errors, rescans, and inefficient workflows in MR imaging, making it all the more important that MRI scanners deliver reliable and reproducible image data irrespective of the patient being examined or the person operating the system. This is precisely the issue the new BioMatrix technology addresses.
BioMatrix sensors in the table automatically track a patient’s respiratory pattern, giving users insights into a patient’s individual ability to hold his or her breath during the scan. This allows the user to select the optimal exam strategy, while also saving time during the examination.
BioMatrix tuners can help avoid rescans, which represent a major burden on productivity as well as a driver of additional costs in radiology. In cervical spine examinations, for example, this feature uses intelligent coil technology to automatically set the optimal scan parameters based on the individual patient anatomy, all without any additional user interaction. BioMatrix tuners also improve the quality and reproducibility of whole-body diffusion. Precise control of scan parameters in real-time to match the individual patient anatomy makes it possible to avoid distortions, which can render diffusion imaging non-diagnostic, especially in 3 Tesla MRI.
Innovative interfaces also help ensure a consistently high examination quality, accelerating workflows, and improving quality of care. BioMatrix Interfaces accelerate the scanning process by up to 30 percent. Automated patient positioning based on intelligent body models automatically moves the patient table to the correct scan position. An intuitive touchscreen user interface integrated onto the scanner allows for one-touch positioning. A new, easy-to-move motorized patient table further simplifies examinations, especially for adipose, immobile, and trauma patients.
Magnetom Vida is the first system to be equipped with the new BioMatrix technology, designed to tackle the challenges of variability and thereby, reduce unwanted variability in MRI examinations. It will help users achieve fewer rescans, predictable scheduling, and consistent, high-quality personalized examination results.
The ability to provide consistent and reproducible quality regardless of the individual patient and user will help reduce rescans, which can be a great financial burden for healthcare institutions. As publications have shown, rescans can account for up to €100,000 per year and system in additional costs.
Professor Konstantin Nikolaou, Medical Director of the Department of Diagnostic and Interventional Radiology at University Hospital Tübingen, considers Magnetom Vida to be part of the general trend toward precision medicine: “To provide our patients with individual therapies, we need every piece of information available. When it comes to imaging, this means that we need robust, standardized, and reproducible image data that are always of the same quality regardless of the patient or user. Only then can we compare results and link them with additional information, such as data from laboratory medicine or genetics,” says Nikolaou, referring to the clinical validation of the new MRI scanner in his department. “Magnetom Vida gives us this data quality and comprehensive image information so that we can choose the right kind of personalized therapy and evaluate it – to see, for instance, how a patient responds to chemotherapy before tumour removal. This MRI scanner along with BioMatrix technology is the perfect fit for our current medical approaches, and is helping us on our way to quantitative radiology,” says Nikolaou.
Faster scans with very high patient comfort
Magnetom Vida has another major advantage: “We can examine sick patients faster with Magnetom Vida,” says Professor Mike Notohamiprodjo who, as head of MRI at University Hospital Tübingen, works intensively with the new scanner. “The scanner offers the highest degree of patient comfort with the performance of a research system, which speeds up our workflows,” he says. As examinations in Tübingen show, the new scanner decreases measurement times for musculoskeletal and prostate imaging compared to previous MRI systems. What is more, it does so with significantly improved image quality: “The signal-to-noise ratio in the clinical images is up to 30 percent higher than with systems from the previous generation,” says Notohamiprodjo.
While this is partly due to BioMatrix technology, it is also a result of the diverse insights that developers at Siemens Healthineers gathered from intense fundamental research and close customer collaborations. Key learnings from the development of a 7-Tesla research MRI system were translated into a new 3-Tesla magnet design. Magnetom Vida’s all-new system architecture offers extremely high performance and unmatched long-term stability – without requiring any more space than previous clinical systems. The new scanner’s 60/200 XT gradient system provides over 2.7 megawatts of power, making its gradients the most powerful commercially available in a 70-centimetre-bore scanner. And, thanks to a very large field of view (55x55x50 cm), Magnetom Vida can also cover larger body regions in one step, such as full-coverage abdominal exams.
The result is a great increase in productivity for routine examinations of the brain, spine, and joints – from correct patient positioning at the touch of a button to transferring the clinical images to the PACS archiving system. This is made possible by the GO technologies, which automate and simplify workflows from the start of the scan right through to the quality control of the image data. A new user interface allows not only for automated acquisition and processing, but also for more advanced post-processing applications to run at the scanner. With spine examinations, for instance, GO technologies reduce the time needed by about a fifth. This means that a department could carry out four additional spine examinations per day and per system. Given the decline in reimbursement rates, this is of great value to many radiological institutes.
Broader patient groups and new clinical growth areas
The system also allows customers to access additional clinical growth fields – for instance, by serving patient groups that were previously deemed unsuitable for MRI due to issues such as cardiac arrhythmias, excess weight, or health problems that prevent them from actively supporting the scan. With the introduction of Magnetom Vida, Siemens Healthineers expands its Compressed Sensing applications – which can make MRI scans up to ten times faster – to cover more body regions. It features Compressed Sensing Cardiac Cine, which allows free-breathing cardiology examinations (even when using contrast medium for comprehensive tissue characterization). Now, Compressed Sensing Grasp-Vibe, which enables dynamic, free-breathing liver examinations in one comprehensive scan at the push of a button and for every patient, is also available. Until now, in contrast, dynamic liver imaging has required four steps with exhausting breath-holds and complex timing. Grasp-Vibe technology also makes the post-processing of liver images significantly faster. During the studies he carried out in Tübingen, Professor Notohamiprodjo found that post-processing times fell from 20 minutes to just four.
Magnetom Vida even simplifies whole-body scans, which are currently particularly challenging because they have to cover multiple scan sections and demand highly trained users. A new special technology, the Whole-Body Dot Engine, allows these difficult scans to be carried out in predictable time slots, as short as 25 minutes, with very high quality. This is accomplished through intelligent automation: the planning and execution of the scan require only a few simple clicks. Providing high-quality diffusion-weighted imaging is important for whole-body exams; Magnetom Vida, with its BioMatrix Tuner technology, can deliver this distortion-free. Combined with its strong 60/200 gradients and a large homogeneous field of view, Magnetom Vida makes whole-body examinations simple to perform, reproducible, and of very high quality. This is a major advantage, particularly when treating oncology patients, such as those with multiple myeloma, where guidelines have recently been moving toward whole-body MRI scans for therapy control.
Magnetom Vida offers not only numerous clinical advances, but also a number of improvements in energy consumption. These help to lower the total cost of ownership of the system over its entire life-cycle. Technologies such as Eco-Power provide intelligent control of power-hungry components by switching them off when they are not needed for longer periods of time. The result is an MR scanner that consumes 30 percent less energy than the industry average for 3-Tesla scanners, as reported by the European Coordination Committee of the radiological, electromedical and healthcare IT industry (COCIR).