Mortara’s New Milwaukee Manufacturing Facility

Based in Milwaukee (Wisconsin, USA), Mortara Instrument, Inc. recently opened its new manufacturing and distribution facility.

The 64,000-square-foot, air-conditioned, high-tech facility consolidates and expands Mortara’s manufacturing and distribution operations, which were previously split between the company’s headquarters building and its warehousing operation on Sleske Drive in Milwaukee.

‘I am proud of our continued growth and our increasing prominence in the market and in our community,’ said Justin Mortara, the company’s Chief Executive Officer, on the occasion of the ground-breaking ceremony in August 2015. ‘We’re excited to further invest in our community and truly live out our promise that all Mortara products are Built with Pride in Milwaukee.’

The facility will allow Mortara to continue its growth in Milwaukee, where the company has added approximately 150 jobs since May 2013, bringing its total global workforce to over 420. The expansion is part of a larger growth plan that allows for the creation of more than 150 additional jobs over the next five years.

Mortara’s entire portfolio of products is ‘Built with Pride in Milwaukee.’ The company is committed to delivering the highest quality products to healthcare providers and their patients, and to do so consistently it remains dedicated to manufacturing all of its products in the United States, and more specifically in Milwaukee.

The company is also committed to keeping its supply chain as physically close as possible, with 30 per cent of components produced within 100 miles of the facility. This allows Mortara to ship most orders within 72 hours, even though 80 per cent of orders require custom configurations; many are shipped within 24 hours, if not the same day.

Local sourcing is strategic both to reduce production times and to invest in the community. As Mayor Tom Barrett said during the ceremony, ‘Mortara’s investment in Milwaukee pays dividends for our entire community. Mortara’s success means more jobs and more economic activity in our city.’

A flourishing community also benefits the company’s commitment to innovation. ‘We try to innovate on a timeline of one to three years, whereas the typical rhythm in medical devices is five to seven years,’ Mortara said. The company invests about 8 percent of its revenue in research and development, devoted especially to enhancing diagnostic capability and connectivity. This commitment requires recruiting top talent, which can be attracted from the main universities only if the region is thriving.

The new facility was built with the aim of reducing its environmental footprint. Its main features include porous asphalt, LED lighting, heating and cooling powered by a geothermal system, and a blue roof that retains water and releases it slowly, functioning as a retention pond.

CardioConfirm: a Brand-New Connectivity Solution from Mortara Instrument

CardioConfirm is Mortara Instrument’s latest tool for connectivity and IT. It has been launched almost a decade after Mortara Instrument began the successful path that led to the adoption of the DICOM standard across all its ELI™ series cardiographs and its stress testing, Holter and monitoring equipment.

The DICOM standard allows users to seamlessly integrate reports from Mortara devices with the information systems already available in hospitals. CardioConfirm takes connectivity a step further: in addition to traditional viewing options, its user-friendly interface is designed to provide full editing capabilities for all DICOM-enabled systems. Besides opening, editing and storing resting ECGs, physicians can now use dedicated tools for zooming in on or measuring ECG waveforms, and can take advantage of a library of statements, based on those normally used in reports, that appear with just a few keystrokes.
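As a generic illustration of what DICOM-level integration makes possible, the sketch below reads a resting-ECG waveform object with the open-source pydicom library. This is not Mortara or CardioConfirm code, and the file name is hypothetical; it simply shows how any DICOM-enabled system can access stored waveform data.

```python
# Generic sketch: reading a DICOM ECG waveform with the open-source pydicom
# library (not a Mortara product). The file name is hypothetical.
import pydicom

ds = pydicom.dcmread("resting_ecg.dcm")  # any DICOM waveform object
group = ds.WaveformSequence[0]           # first waveform multiplex group
print(group.NumberOfWaveformChannels,    # e.g. 12 leads
      group.SamplingFrequency,           # e.g. 500 Hz
      group.NumberOfWaveformSamples)

samples = ds.waveform_array(0)           # numpy array: (samples, channels)
lead_one = samples[:, 0]                 # first channel, ready for display
```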

CardioConfirm also offers the possibility of editing the final reports of stress and Holter tests. Preliminary reports generated by DICOM-enabled systems can be edited by physicians from the main system workstation, a feature that makes the workflow smoother and reduces the time needed to review and edit these types of reports. This new OEM software allows any hospital or clinic to leverage its existing system by simply embedding CardioConfirm into it, eliminating the need for significant capital investment in an entirely new system.

CardioConfirm allows medical professionals to concentrate on patient care through its ability to integrate a high quality diagnostic display of tests with all patient information (test results, vitals, and personal and family history) in one convenient location. Cardiologists therefore do not need to look for additional test results that a technician may have recorded elsewhere.

As the healthcare landscape continues to change, CardioConfirm contributes to that transformation: hospitals and healthcare professionals gain a more streamlined workflow, increased efficiency and the ability to focus more attention on patient care.

Mortara Instrument supplies CardioConfirm to PACS and EMR providers and to hospitals that want to expand or complete their PACS/EMR systems to include the diagnostic cardiology workflow. It is available in a variety of versions to meet the need for seamless integration with third-party systems.

DICOM is the registered trademark of the National Electrical Manufacturers Association for its standard publications relating to digital communications of medical information.

Reducing Alarm Fatigue: the New Challenge for Mortara’s Suite of Algorithms

In addition to designing and manufacturing a complete line of diagnostic cardiology and patient monitoring equipment, Mortara Instrument has always been recognized as a leader in the development of algorithms for safe and reliable ECG analysis. Now, Mortara has taken up a new challenge: Alarm Fatigue management.

Due to the increased number of monitored parameters available today and the need to reduce healthcare costs, algorithms must both be more sensitive and prevent more false alarms than in the past. Cardiac Care Units are such busy environments that the large number of false alarms generated every day by monitoring systems can become a serious issue for healthcare personnel. This flood of false alarms induces so-called ‘Alarm Fatigue’: in a nutshell, healthcare professionals, tired of wasting time silencing false alarms, not only lose trust in their monitoring system but also tend to ignore potentially real alarms.

Mortara VERITAS™ covers a large variety of diagnostic fields: from automatic resting ECG interpretation, to ambulatory Holter monitoring, to real-time algorithms specifically designed for bedside monitors and central stations, widely employed in Coronary Care and Intensive Care Units. Integrated into all Mortara product lines, VERITAS is constantly updated with new features and improved specificity and sensitivity.

Achieving levels of sensitivity and specificity in line with major manufacturers is not enough to fight Alarm Fatigue. That is why much attention and investment have been devoted to reducing false alarm rates without affecting sensitivity. The updated VERITAS Arrhythmia algorithm defines a new standard in Alarm Fatigue management: up to 60 percent fewer false alarms for lethal arrhythmias compared with the most common algorithms available on the market, resulting in a vast improvement in the reliability of the systems on which it is installed.

The new version of VERITAS will be available on the Mortara monitoring line – Surveyor™ Central, Surveyor S4 telemetry, and Surveyor S12 and S19 bedside monitors – starting November 2016*.

Mortara, Surveyor™ and VERITAS™ are trademarks or registered trademarks of Mortara Instrument, Inc.

*Not available in the U.S.

The hazards of radiation exposure in the cath lab

The medical device industry is continually improving diagnostic imaging systems in order to lower radiation dose without compromising image quality, and both company articles and studies by cardiologists published in peer-reviewed journals stress the benefits for patients. However, much less emphasis is given to the radiation exposure of the relevant healthcare workers, a problem that is particularly acute in the catheterization lab, where the use of imaging, albeit at low radiation doses, has increased exponentially. Diagnostic procedures utilizing ionizing radiation, such as coronary angiography, are now standard, as are interventions such as coronary artery angioplasty and stenting. Interventions such as atrial fibrillation ablation can take several hours and require up to an hour’s screening time. And the huge growth in the number of trans-catheter aortic valve implantation (TAVI) procedures carried out in the cath lab also adds to the cumulative radiation dose to which operators are exposed.
The potential hazards of operator exposure include skin erythema, from hands being constantly within the primary beam, and damage to the eyes. Relatively low radiation doses can irreversibly damage the lens; higher doses can affect the conjunctiva, iris, sclera and retina. Of most concern, increasing radiation exposure can result in irreversible damage to cellular DNA and carcinogenesis; the brain, thyroid and skin are most susceptible to cancers. A survey published earlier this year in an American Heart Association journal compared 466 healthcare personnel with an average of ten years’ cath lab experience with 280 personnel working in cardiology but without radiation exposure. The prevalence of skin lesions, cataracts and cancers was significantly higher in the radiation-exposed group, as was that of hypertension and orthopedic problems such as back pain. But in the high-stress environment of the cath lab, exacerbated because these healthcare workers are frequently ‘on call’ after completing their regular shifts, it is understandable that monthly reports of radiation exposure are not scrutinized by staff, and that effective protective measures such as special glasses, thyroid collars, gloves and lead aprons (the wearing of which has itself been linked to lower back pain) are not always utilized.
It is therefore essential that hospitals provide intensive training in radiation protection for the whole cath lab team, ensure that all staff know the relevant protocols and adhere to them, and regularly examine shielding equipment for defects. In addition, radiation protection supervisors should monitor exposure on a monthly basis, via operator badges and, ideally, with the systems now available that can provide real-time data throughout every procedure involving ionizing radiation.

Dose reduction in medical radiation – regulators, industry and healthcare professionals seek common front

Ionizing radiation, from the sun and even the earth, is a daily fact of life. There is little that can be done about this, except to avoid too much sunlight and protect the skin with sunscreens. On the other hand, people are also sometimes exposed to radiation for medical reasons, such as diagnostic X-rays, CT scans or a range of interventional radiology procedures. These procedures offer tremendous benefits for patients and for healthcare providers. The evidence for such benefits has become indisputable in recent years, and covers a wide range of diseases and conditions.

Medical imaging has profound impact on patient management
The ‘American Journal of Roentgenology’ reported in 2011 that the number of abdominal surgeries fell significantly after CT scanning. Before CT, physicians planned to admit 75 percent of patients to hospital; after CT, the plan changed to hospital discharge with follow-up for 24 percent of patients. The conclusions of the researchers, from Massachusetts General Hospital, were unambiguous: CT ‘changes the leading diagnosis, increases diagnostic certainty, and changes potential patient management decisions.’
Massachusetts General Hospital was indeed one of the first institutions to study the impact of medical imaging. In 1998, a team from the hospital reported that CT was 93-98 percent accurate in confirming or ruling out appendicitis. The condition accounted for 1 million patient-days per year in the US, with a similar number of suspected cases eventually found to have other conditions.

From emergency rooms to lung cancer
More recently, the ‘New England Journal of Medicine’ published a study on non-invasive coronary CT imaging in the emergency room. The study found that of the 8 million visits per year to emergency rooms by patients with chest pain, only 5-15 percent were eventually found to involve heart attacks or other serious cardiac diseases. As many as 60 percent of patients faced unnecessary admission and testing to exclude acute coronary syndrome.
Meanwhile, the National Lung Screening Trial in the US reported that low-dose CT screening reduced lung cancer deaths by at least 20 percent in a high-risk population of current and former smokers aged 55 to 74.

Fight against Alzheimer’s, speeding up clinical trials

In the future, medical imaging holds significant promise as a tool in the fight against diseases ranging from osteoporosis to Alzheimer’s, whose incidence is likely to grow sharply as the population ages.
Medical imaging also offers increasing promise as a surrogate endpoint in clinical trials, allowing the effect of a new drug to be measured far earlier than traditional endpoints such as survival times or clinical benefit.

Concerns about over-use, some alarmist
Nevertheless, there are several concerns about ‘over-use’, especially for imaging modalities involving radiation, such as CT. In the US, according to a June 2012 review in the ‘Journal of the American Medical Association’, CT scans tripled in the period 1996-2010, corresponding to a 7.8 percent annual increase. Although this was less than the near four-fold increase in MRI use (nuclear medicine use fell 30 percent over the same period), CT has been the target of sometimes emotive campaigns.
One good illustration of this was an Op-Ed in the ‘New York Times’ on January 31, 2014. The article was titled ‘We Are Giving Ourselves Cancer.’ It opened with the observation that we are ‘silently irradiating ourselves to death,’ while its closing sentence urged finding ways to use CTs ‘without killing people in the process.’

The ‘Times’ Op-Ed cited a British study which ‘directly demonstrated’ evidence of the ‘harms’ of CT, and it is here that its authors over-stretched their credibility. The study they referred to was published in the ‘Lancet’ in August 2012 and titled ‘Radiation exposure from CT scans in childhood and subsequent risk of leukemia and brain tumours: a retrospective cohort study’. Its authors used data on 175,000 children and young adults and found that while the cumulative 10-year risk was higher in relative terms, it translated into one extra case of leukemia and one extra case of brain tumour per 10,000 head CT scans.

ALARA and the principle of necessity and justification
In other words, while few would argue that there is no risk from radiation, it is clear that such risks are small and that even these small potential risks could be controlled further by reducing exposure to radiation.
Both industry and healthcare professionals are endeavouring to ensure that such a goal is achieved.
Manufacturers of CT and other radiation imaging equipment seek to keep exposure to radiation for both patients and medical staff to a minimum – and below their regulatory limits – by using the ALARA (As Low As Reasonably Achievable) principle to design their products. Key methods include use of the most dose-efficient technologies available and seeking to ensure that optimum scan parameters are used for a patient and examination type.
Meanwhile, in the clinical setting, doctors seek to ensure that radiation imaging examination is ordered only when absolutely necessary and justified, while radiographers optimize the radiation dose used during each procedure.

Safety, information and awareness
Since the mid-2000s, radiologists and medical physicists have taken steps to increase controls on radiation risks to patients. These have essentially focused on promoting the safe use of medical imaging devices, supporting informed clinical decision making and increasing patient awareness.
One of these initiatives is known as Image Gently, a collaboration between radiology professional organizations and other concerned groups. Its specific aim is to lower radiation dose in the imaging of children.
A related initiative, led by the American College of Radiology (ACR) and the Radiological Society of North America (RSNA), is Image Wisely. This is essentially an awareness campaign whose goals are to eliminate ‘unnecessary’ procedures and, when procedures are necessary, to lower doses to the minimum required for clinical effectiveness. One aspect of Image Wisely is collaboration between radiologists and manufacturers to improve the performance of radiology equipment and allow physicians to make real-time assessments of whether radiation levels are acceptable.

Initiatives by professional societies
Such initiatives are closely supported by professional radiology societies. The ACR has developed Appropriateness Criteria (corresponding to the federal requirements on appropriate use) to assist referring physicians and radiologists in prescribing the best imaging examination for patients, based on symptoms and circumstances. One tool displays the imaging options and associated radiation levels for a specific procedure. The aim is to reduce the number of imaging examinations by ensuring that the most suitable exam is done first.
In Europe, the European Society of Radiology’s flagship initiative, ‘EuroSafe Imaging’, has the same objective: to maximize radiation protection and quality/safety in medical imaging. The initiative was launched at the European Congress of Radiology in 2014 and has so far attracted over 50,000 individual supporters (known as ‘Friends of EuroSafe Imaging’). Over 200 institutions (industry and healthcare providers) have also endorsed it.

Accreditation programmes
Accreditation programmes are also being pursued by the ACR and the European Society of Radiology (ESR), in order to assess facilities on imaging competence, adherence to the latest dose guidelines, and personnel training. Given the pace of technology development in imaging, certified radiology and nuclear medicine professionals are increasingly recommended or (in some cases) required to earn continuing education credits on radiation safety.
In Europe, the ESR has joined forces with the European Federation of Organizations for Medical Physics (EFOMP), the European Federation of Radiographer Societies (EFRS), the European Society for Therapeutic Radiology and Oncology (ESTRO), the European Association of Nuclear Medicine (EANM) and the Cardiovascular and Interventional Radiological Society of Europe (CIRSE) on an EU-promoted radiation education project called MEDRAPET. The findings, published in 2014, revise the previous Radiation Protection 116 Guidelines on Education and Training.

The Bonn Call for Action sets roadmap for the future

Many of these initiatives have been inspired by a conference held in Bonn, Germany, at the end of 2012, which was sponsored jointly by two United Nations bodies – the International Atomic Energy Agency (IAEA) and the World Health Organization (WHO). The outcome of the conference, which was attended by participants from 77 countries, is known as the Bonn Call for Action, and aims to strengthen medical radiation practices into the 2020s.

The Bonn Call consists of ten major actions. These are described below:

  • Enhance the implementation of the principle of justification. There is explicit emphasis on the use of clinical decision support (CDS) technology towards this goal.
  • Enhance the implementation of the principle of optimization of protection and safety. There is a specific call to ensure the establishment, use and regular updating of diagnostic reference levels for radiological procedures, including interventional procedures, and to develop and apply technological solutions for patient exposure records, harmonize the dose data formats provided by imaging equipment and increase the utilization of electronic health records.
  • Strengthen manufacturers’ role in contributing to the overall safety regime. This seeks to enhance radiation protection features in the design of both physical equipment and software, and to make these available as default features rather than optional extras.
  • Strengthen radiation protection education and training of health professionals.
  • Increase the availability of improved global information on medical exposures and occupational exposures in medicine, with specific attention to developing countries.
  • Improve prevention of medical radiation incidents and accidents. One interesting facet here is a call to work towards including all modalities of medical ionizing radiation in a voluntary safety reporting process, with specific emphasis on brachytherapy, interventional radiology and therapeutic nuclear medicine, in addition to external beam radiotherapy.
  • Strengthen radiation safety culture in healthcare.
  • Foster an improved radiation benefit-risk dialogue.
  • Strengthen the implementation of safety requirements globally.
  • Develop practical guidance for the implementation of the International Basic Safety Standards in healthcare globally.

Although some of the Bonn Call points are repetitive, the document is noteworthy in terms of setting a minimal set of common rules for a very wide range of stakeholders – manufacturers, health professionals and professional societies.

Point 6 seeks new work on ‘effective’ dose
Point 6 of the Bonn Call is both ambitious and timely. Although the concept of ‘effective dose’ (or effective dose equivalent) was introduced in the mid-1970s to provide a common framework for evaluating the impact of exposure to ionizing radiation from any source, technology’s uneven leaps have not made it easy to follow through. Data on doses from the different radiographic imaging modalities used in radiation therapy are scattered widely through the literature, making it difficult to estimate the total dose that a patient receives during a particular treatment scenario. In addition, interventional systems are often configured differently from diagnostic set-ups, and imaging systems do not distribute radiation in similar ways. For example, planar kV imaging attenuates rapidly along the line of sight, while CT dose is distributed uniformly through a patient. This makes it difficult to sum dose in a radiobiologically consistent manner.

Dynamic contrast-enhanced magnetic resonance – new frontiers against cancer, but some way still to go

Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a functional imaging technique. It consists of MRI scans coupled with the injection of a contrast agent. The agent shortens relaxation times and allows extremely detailed characterization of the micro-circulation of blood through tissue.
DCE-MRI assessments typically use the characteristics of signal intensity (SI) and time-intensity curves (TIC) within regions of interest (ROIs). Early DCE-MRI efforts assumed a linear relationship between signal enhancement and contrast uptake. However, given that signal enhancement depends to a very great degree on intrinsic tissue and acquisition parameters, more complex models have been developed to control for the effects of tissue characteristics, such as the pre-contrast longitudinal relaxation time, and of the longitudinal or transverse relaxivities of the contrast agent.
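The physical relationship at the heart of these models can be stated compactly. In the fast water exchange limit, the contrast agent raises the longitudinal relaxation rate in proportion to its local concentration:

\[ \frac{1}{T_1(t)} = \frac{1}{T_{10}} + r_1 \, C(t) \]

where \(T_{10}\) is the pre-contrast longitudinal relaxation time, \(r_1\) the longitudinal relaxivity of the agent and \(C(t)\) its tissue concentration. The measured signal enhancement is a sequence-dependent, generally non-linear function of \(T_1(t)\), which is why the early linear assumption breaks down.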

Two-phased process
DCE-MRI is a two-phased process. Typically, at first, a T1-weighted MRI scan is conducted. This is followed by injection of the contrast agent, and then repeated acquisition of T1-weighted fast spoiled gradient-echo MRI sequences to obtain measurements of signal enhancement as a function of time.
The contrast agents are usually based on gadolinium and include gadoterate meglumine (Gd-DOTA), gadobutrol (Gd-BT-DO3A), gadoteridol and albumin-labelled Gd-DTPA.

Image acquisition and voxel comparison
Typically, 3D image sets are obtained sequentially every few seconds for up to 5-10 minutes. Shorter intervals allow for the detection of early enhancement, and many researchers consider 10 seconds to be good enough; longer intervals typically make it harder to identify early enhancement.
At the moment, the debate about the upper limit for intervals continues.
After image acquisition, comparison of the T1 values per voxel in each scan allows the identification of permeable blood vessels and tumour tissue. Both spatial and temporal resolution must be adjusted to obtain adequate sampling of the contrast enhancement over time for each tissue voxel. The speed with which MRI images must be acquired necessitates larger voxels in order to maintain adequate signal-to-noise ratios, so DCE-MRI often has lower resolution than conventional T2-weighted sequences.

Range of biomarkers
Although DCE-MRI can be performed on conventional scanners (typically 1.5 T), it requires specialist image analysis to extract the biomarker information it provides. Such information includes tissue perfusion, vascularity, endothelial permeability and cellularity.
The biomarkers can be used to provide measurements of tumour vascular function and to improve the diagnosis and management of diseases in a variety of organs.

DCE-MRI in the brain
Clinical applications of DCE-MRI have principally focused on in-vivo characterization of tumours.
One of its earliest applications was to analyse blood vessels in a brain tumour, since the blood-brain barrier (BBB) blocks the contrast agent in normal brain tissue, but not in vessels generated by a tumour.
The contrast agent’s concentration is measured as it passes between the blood vessels and the extracellular space of tissue, and then as it returns to the vessels. In tissues with healthy cells or high cell density, the re-entry of the contrast agent into vessels is quicker since it cannot pass cell membranes. In tissues which are damaged or have a lower cell density, the agent is present in the extracellular space for a longer duration.

Numerous DCE-MRI studies on the brain have researched the correlation between BBB disruption and diseases such as acute ischemic stroke, pneumococcal meningitis, brain metastases, multiple system atrophy, multiple sclerosis and Type-II diabetes. One of the most exciting areas of research is the difference in signal intensity profiles over time between Alzheimer’s disease patients and controls.

Tumours and DCE-MRI
Elsewhere, researchers have also established the benefits of DCE-MRI for the differential diagnosis of tumours in the head and neck region, such as salivary gland tumours and lesions in the jaw bone. DCE-MRI has also been used to establish the nature of a lymphoma and to make a differential diagnosis versus other lesions.
Prostate cancer is becoming a major area of application for DCE-MRI. One of the key limitations of past standards of care was the need for random prostate biopsies after the discovery of elevated PSA values. This often led to the discovery of inconsequential tumours, while the very same biopsies sometimes missed significant disease. DCE-MRI, in conjunction with PSA, can identify tumours likely to cause death if left untreated.

Assessing response to chemotherapy
DCE-MRI is also being used to assess responses to chemotherapy. One example of an ongoing project in this area is CHERNAC (Characterizing Early Response to Neoadjuvant Chemotherapy with Quantitative Breast MRI), which is funded by the Breast Cancer Campaign in the UK.
Elsewhere, DCE-MRI has shown promise in detecting cancer recurrence. For example, biochemical relapse after radical prostatectomy occurs in as many as 15 to 30 percent of prostate cancer patients. Detection of tumour recurrence in such cases can be difficult due to the presence of scar tissue. Determining the precise site of recurrence matters, since patients with isolated recurrence could benefit from less-invasive treatments, such as radiation to the resection bed.
Other areas for DCE-MRI application include cardiac tissue viability – for example, to evaluate sub-clinical fibrosis and micro-vascular dysfunction. Researchers have also shown its utility in measuring renal function and partial/segmental liver function.

A full spectrum of methods
In general, the analysis of DCE-MRI is based on a full spectrum of methods from the qualitative to quantitative, with an intermediary semi-quantitative approach.

Qualitative analysis
Qualitative analysis is visual and depends on clinical experience and expertise. It assumes that tumour vessels are leaky and enhance more readily after IV contrast material is administered. As a result, DCE-MRI patterns for malignant tumours show an early and rapid enhancement of the time-intensity curve (TIC) after injection of the agent, followed by a rapid decline. Normal tissue, on the other hand, shows a slower and steadily increasing signal after agent injection.

Quantitative analysis
Quantitative analysis is based on the pharmacokinetics of contrast agent exchange. It is complex but allows for a degree of comparability, although this is limited by a lack of standards. Better and wider use of software has, however, led to a growing consensus on approaches to the quantitative analysis of DCE-MRI data.
One of the most widely used tools is the Tofts and Kermode (TK) model, which is showing considerable promise in predicting and monitoring tumour response to therapy.

TK provides data on the influx forward volume transfer constant, Ktrans, from plasma into the extravascular-extracellular space (EES). Ktrans is equal to the permeability surface area product per unit volume of tissue, and represents vascular permeability in a permeability-limited situation (high flow relative to permeability), or blood flow into tissue in a flow-limited situation (high permeability relative to flow). Ktrans is known to be elevated in many cancers.
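To make the model concrete, the sketch below implements the standard Tofts equation, in which tissue concentration is the plasma curve convolved with an exponential kernel, and fits Ktrans and kep to a tissue curve. It is a minimal illustration with a toy plasma curve; the function names are ours, not from any specific analysis package.

```python
# Minimal sketch of the standard Tofts model:
#   C_t(t) = Ktrans * integral_0^t C_p(tau) * exp(-kep * (t - tau)) dtau
# Assumes uniformly sampled curves; the plasma curve below is a toy bolus shape.
import numpy as np
from scipy.optimize import curve_fit

def tofts_concentration(t, ktrans, kep, cp):
    """Tissue concentration: plasma curve convolved with an exponential kernel."""
    dt = t[1] - t[0]                      # uniform sampling interval (minutes)
    kernel = np.exp(-kep * t)
    return ktrans * np.convolve(cp, kernel)[: len(t)] * dt

def fit_tofts(t, ct, cp):
    """Fit Ktrans and kep (both per minute) to a measured tissue curve ct."""
    model = lambda tt, ktrans, kep: tofts_concentration(tt, ktrans, kep, cp)
    (ktrans, kep), _ = curve_fit(model, t, ct, p0=[0.1, 0.5], bounds=(0, [5, 10]))
    return ktrans, kep

t = np.arange(0, 300, 5.0) / 60.0          # 5-minute acquisition, 5 s sampling
cp = 5.0 * t * np.exp(-t / 0.3)            # toy arterial/plasma input function
ct = tofts_concentration(t, 0.25, 0.6, cp) # synthetic "measured" tissue curve
print(fit_tofts(t, ct, cp))                # recovers approximately (0.25, 0.6)
```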

Pharmacokinetic modeling for analysing DCE-MRI dates to the early 1990s, and was followed by a consensus paper at the end of the decade (Tofts P.S., Brix G., Buckley D.L., Evelhoch J.L., Henderson E., Knopp M.V. ‘Contrast-enhanced T1-weighted MRI of a diffusible tracer: standardized quantities and symbols.’ Journal of Magnetic Resonance Imaging, 1999).
Over the years, improvement of imaging techniques (e.g. higher temporal resolution and contrast-to-noise ratio) and greater knowledge of the underlying physiology have catalysed development of more complex pharmacokinetic models.
The TK model, for example, was originally developed for measuring blood-brain barrier (BBB) permeability, and overlooked the contribution of plasma to total tissue concentration. However, as the model gained popularity in assessing tumours throughout the body, vascular contributions to signal intensity were also included.

Semi-quantitative models
The semi-quantitative model seeks to fit a curve to the data. Like the visual/qualitative approach, it assumes early and intense enhancement and washout as a predictor of malignancy. However, semi-quantitative analysis also calculates a variety of dynamic curve parameters after initial uptake, such as the shape of the time-intensity curve (TIC), the time of first contrast uptake, time to peak, maximum slope, peak enhancement, and wash-in and washout curve shapes.
Broadly speaking, there are three types of curve: Type 1 (persistent increase), Type 2 (plateau) and Type 3 (decline after initial upslope). One of the most attractive features of the semi-quantitative model is its relative simplicity in using parameters to differentiate malignant from pathologic but benign tissue.
For example, in the head-and-neck region, a rapid increase in TIC (fast wash-out pattern) indicates a strong possibility of Warthin’s tumour – a benign, sharply demarcated tumour. A persistent increase suggests the possibility of pleomorphic adenoma. A plateau pattern with a slow washout is characteristic of both a malignant tumour and adenoma.
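A minimal sketch of how this curve typing can be automated is shown below, assuming a baseline-normalized TIC sampled at uniform intervals over roughly five minutes; the two-minute early window and the 10 percent tolerance are illustrative thresholds, not values taken from any guideline.

```python
# Illustrative classifier for DCE-MRI time-intensity curves (TICs).
# Type 1: persistent increase; Type 2: plateau; Type 3: washout after upslope.
import numpy as np

def classify_tic(t, signal, early_window=2.0, tol=0.10):
    """Compare late signal with the early-phase peak of a normalized TIC."""
    early_peak = float(np.max(signal[t <= early_window]))   # peak in early window
    change = (float(signal[-1]) - early_peak) / early_peak  # relative late change
    if change > tol:
        return "Type 1 (persistent increase)"
    if change < -tol:
        return "Type 3 (washout after initial upslope)"
    return "Type 2 (plateau)"

t = np.linspace(0.0, 5.0, 31)                       # 5-minute study, 10 s sampling
print(classify_tic(t, 1.0 + 0.5 * t))               # Type 1
print(classify_tic(t, np.minimum(1.0 + t, 2.0)))    # Type 2
print(classify_tic(t, 2.0 * t * np.exp(-t) + 0.1))  # Type 3
```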

In spite of enthusiasm about the semi-quantitative approach, it cannot be generalized across acquisition protocols and sequences, or across the several other factors that impact on MR signal intensity. These, in turn, affect curve metrics such as maximum enhancement and washout percentage. Differences in temporal resolution and injection rates can also change the shape of wash-in/washout curves, making comparison difficult. Finally, such descriptive parameters provide no physiologic insight into the behaviour of the tumour vessels.

The limitations of DCE-MRI
DCE-MRI itself faces some major limitations. Firstly, there is a lack of standardization in DCE-MRI sequences and analysis methodology, making it difficult to compare published studies. In general, shorter acquisition times lend themselves to greater comparability.
One frequent problem is patient movement and organ motion (e.g. in the gut, kidney or bladder). Since a DCE-MRI procedure takes over 5 minutes, there can be considerable misregistration between consecutive imaging slices, leading to noise in the wash-in and washout curves, and to problems fitting pharmacokinetic models to the curve.
New DCE-MRI postprocessing software seeks to correct this by automatically repositioning sequential images for better alignment. However, these packages do not use common algorithms to process the data and generate parametric maps, and can show differences, e.g. in tumour vascularity. To enable further investigation of the value of DCE-MRI of the prostate, the technique and the pharmacokinetic model used to analyse it must become more standardized.

One of the most serious problems with DCE-MRI, however, is its non-specificity, which can lead to both false negatives and false positives.
Other sources of uncertainty in DCE-MRI studies include a lack of data. For example, one typical assumption is fast water exchange between compartments, in spite of suspicions about the influence of restricted water exchange. Indeed, many quantitative models disregard the intracellular space, since it is assumed that there is no contrast media exchange. However, others have pointed out that water itself can exchange between the cell and the extracellular space, thereby influencing signal changes in the extracellular space. This is clearly an area that calls for more study.
Further research is also required in areas such as relaxivity values for a contrast agent, field strength and tissue/pathology. Currently, relaxivity across tissues and compartments is generally assumed to be uniform.

To conclude, DCE-MRI is a significant and promising diagnostic modality. However, for most clinical applications, it cannot be used on a standalone basis, regardless of curve shape or intensity of enhancement. DCE-MRI needs to be viewed in the context of other MRI parameters such as diffusion-weighted MRI and MR spectroscopic imaging as well as T2-weighted MRI.

Point-of-care testing – enhancing throughput in emergency departments

Point-of-care testing (POCT) refers to diagnostic tests which are performed physically close to a patient, with the results obtained on site. They are conducted at primary care centres and at hospital bedsides (increasingly, in emergency departments and intensive care units, too).
POCTs are also used in the field in settings such as natural or man-made disasters, and accompanied by telemedicine, in patients’ homes.

Saving time and space
While traditional diagnostic tests involve taking patient specimens, transporting them to a laboratory for analysis and then returning the results to a physician, POCTs cut out both the transport and laboratory. As a result, they provide quicker turnaround time (TAT), sometimes near-instantaneously.
In the past, the traditional laboratory-centric process was unavoidable due to the sheer size of the equipment required for diagnostic tests. In recent years, technology developments, especially in miniaturization, have made it possible to perform a growing number of tests outside the laboratory. One recent book on biomedical engineering (D. Issadore and R.M. Westervelt (eds.), ‘Point-of-Care Diagnostics on a Chip’, Biological and Medical Physics, Biomedical Engineering, Springer-Verlag, Berlin 2013) notes the array of sophisticated, low-power and small ‘microfilters, microchannels, microarrays, micropumps, microvalves and microelectronics … integrated onto chips to analyse and control biological objects at the microscale’ that have made decentralized diagnostics possible.

Impact on efficiency, outcomes – and costs
Such time savings can have a dramatic impact on downstream clinical efficiency and patient outcomes. In many cases (although not universally or under all circumstances), they also save costs.
For example, POCT can reduce revenue losses due to workflow delays in test-dependent medical procedures, such as disruptions in the magnetic resonance imaging (MRI) or computed tomography (CT) queue. This is not a rare occurrence, and delays in radiology testing have been shown to extend total length of stay in the emergency department (ED).

From lab downscaling to targeted solutions
Early POCTs were based on the simple transfer of traditional methods from a central laboratory, accompanied by their downscaling to smaller platforms. At a later stage, unique, innovative assays were designed specifically for POCT (such as the rapid streptococcal antigen test). This was accompanied by the development of wide arrays of POCT-specific analytic methods, ranging from the simple (such as pH paper for assessing amniotic fluid) to the ultra-sophisticated (for example, thromboelastogram for intra-operative coagulation assessment).
Today, the typical POCT test arsenal includes cardiac biomarkers, hemoglobin concentrations, differential complete blood count (CBC), blood glucose concentrations, coagulation testing, platelet function, pregnancy testing as well as tests for streptococcus, HIV, malaria etc.

Bedside and near-bedside POCT
POCT devices are used in a wide range of healthcare settings. They can be divided into two broad groups, depending on size and portability – bedside and near-bedside.
Bedside POCT devices are smaller, usually hand-held, and offer the greatest mobility. Due to their compact nature they are often more specialized and limited in overall functionality. Many are enclosed in test cassettes (such as easy-to-use membrane-based strips) and based on portable, sometimes handheld, instruments. This family of POCT requires only a single drop of whole blood, urine or saliva, and the tests can be performed and interpreted by a general physician in minutes. Nevertheless, some of them can be quite sophisticated.
New POCTs for the early detection of rheumatoid arthritis are a case in point. Two of the earliest efforts in this area were made in Europe: the first, from Sweden’s Euro-Diagnostica, detects antibodies to CCP, while Rheuma-Chec, from Orgentec in Germany, combines two biomarkers, rheumatoid factor and antibodies to MCV. These tests are targeted at primary care.

Near-bedside (or neighbourhood) devices are larger and typically located in a designated testing area. They provide higher calibration sensitivity and quality control and are used for more complex diagnostic tests than their smaller bedside counterparts.
They are themselves also far more complex, with high degrees of automation in comparison to their bedside POCT counterparts. This automation contributes to the increased speed and ease-of-use of the devices. However, it also leads to challenges in training users.

The imperatives of turnaround time
As mentioned, the principal interest in POCT is to reduce turnaround time (TAT) – the duration between a test and the obtaining of results which aid in making clinical decisions. The impact of this has been profound in the emergency department.
A randomized, controlled trial in the A&E department of a British teaching hospital assessed the impact of POCT on health management decisions as early as 1998. The results, published in the ‘British Medical Journal’ that year, showed that physicians using POCT reached patient management decisions an average of 1 hour and 14 minutes faster than physicians relying on traditional means.

Use in emergency departments
Though the bulk of POCT is conducted by primary care physicians, one of its fastest growing users has been hospital EDs, a trend the ‘British Medical Journal’ study hinted at almost 20 years ago.
POCT’s relevance for emergency departments is multi-faceted.
In the ED, prolonged wait times and overcrowding correlate directly with reduced patient satisfaction and adverse clinical outcomes. Several European countries have regulations setting length-of-stay targets in EDs, requiring that patients transit through the department within four to eight hours. Though there are several factors at play here, few would dispute that reducing the delay between sample collection and test results enables healthcare professionals to arrive at quicker decisions and increases patient throughput. POCTs make this possible.
One study in Switzerland evaluated POCT of B-type natriuretic peptide levels for ED patients presenting with acute dyspnea as their primary symptom. POCT was associated not only with significant decreases in time to treatment initiation, but also with a shorter length of stay and a 26 percent reduction in total treatment costs.
Another study, on D-dimer POCT in the ED, found a 79 percent reduction in TAT compared to central laboratory testing, resulting in shorter ED lengths of stay and reduced hospital admissions, while a randomized study in coagulopathic cardiac surgery patients found that POCT-guided hemostatic therapy led to reductions in transfusion and complication rates, and improved survival.

From ACS to pregnancy tests, and overcrowding

Favourable perspectives on POCT in the ED have strengthened over time. One recent study in ‘Critical Care’ found POCT increased the number of patients discharged in a timely manner, expedited the triage of urgent but non-emergency patients, and decreased delays to treatment initiation. The study quantitatively assessed several conditions such as acute coronary syndrome (ACS), venous thromboembolic disease, severe sepsis and stroke, and concluded that POCT, when used effectively, ‘may alleviate the negative impacts of overcrowding on the safety, effectiveness, and person-centeredness of care in the ED.’
A great deal of attention has been given to the use of POCT in emergency settings to screen patients presenting with symptoms of acute coronary syndrome. The rapid identification and treatment of ACS patients is critical.
Due to the time-sensitive nature of ACS, reduced TATs offer a clear advantage. POCT has been shown to increase the speed at which positive cases of ACS are accurately identified, allowing physicians to admit patients and initiate treatment faster than previously possible. Decreased TATs can also result in the earlier identification of negative cases, thereby increasing the number of successful discharges and allowing for more efficient use of hospital resources.

The ICU and POCT
Unlike the ED, the use of POCT in intensive care units is still in its infancy. In 2013, researchers at Germany’s Klinikum rechts der Isar in Munich sought to retrospectively investigate whether POCT predicted hospital mortality in over 1,500 ICU admissions. The results were mixed. Lactate and glucose seemed to independently predict mortality. So did some forms of metabolic acidosis, especially lactic acidosis. However, anion gap (AG)-acidosis failed to show any use as a biomarker.
One of the most important areas of POCT focus in the ICU is sepsis, which is directly correlated with poor outcomes. ICU patients often have other ongoing disease processes whose biomarkers are shared with sepsis, such as a raised white blood cell count and fever. More crucially, many ICU patients are already on antibiotics at admission, which compromises microbiological cultures.

POCT as part of health management strategy
Overall, POCTs have the greatest impact, and make the most sense, when utilized as part of an overall health management strategy that enhances the efficiency of clinical decision-making. The rapid TAT provided by POCT allows for accelerated identification and classification of patients into high-risk and low-risk groups, improving quality of care and increasing clinical throughput.
POCT results are often available in minutes. However, decreased TATs mean nothing on their own unless clinical pathways provide the means for them to have an impact on workflow, and such pathways vary widely across healthcare settings.

Differences in practice

Achieving this is by no means straightforward. In Europe, for example, POCT use is highly variable and differs greatly between institutions and countries. Though differences in operating procedures are natural by-products of institutional cultures, there are oversight and quality control issues which healthcare leaders must address to take maximum advantage of POCT.
Addressing them is not a question of ‘if’ but ‘when’.

Regulation – the future?

The future of POCT may well be shaped by regulators, and their response to the kind of pressures mentioned above.
In Europe, POCT devices are regulated under the 1998 European Directive 98/79/EC on in vitro diagnostic medical devices, which became operational in 2001. POCT devices are not specifically mentioned in this directive; at the European level, POCT is instead covered by the international standard ISO 22870:2006, used in conjunction with ISO 15189, which covers competence and quality in medical laboratories.
In the US, CLIA88 (the Clinical Laboratory Improvement Amendments of 1988) provided a major impetus for growth in POCT. The rules, published in 1992, expanded the definition of ‘laboratory’ to include any site where a clinical laboratory test occurs (including a patient’s bedside or a clinic) and specified quality standards for personnel, patient test management and quality assurance.
One of CLIA88’s biggest contributions to POCT growth was to define tests by complexity (waived, moderate complexity and high complexity), with minimal quality assurance requirements for the waived category.
CLIA88 has been followed by US federal and state regulations, along with accreditation standards developed by the College of American Pathologists and The Joint Commission. These have established POCT performance guidelines and provided strong incentives to ensure the quality of testing.