All-optical ultrasound – rejuvenating an imaging workhorse

One of the most exciting recent developments in imaging is all-optical ultrasound. Unlike traditional ultrasound, which relies on piezoelectric transducers, all-optical systems generate ultrasound with pulsed light and then optically detect the ultrasonic reflections returning from the tissue being imaged.

Reducing the need for trade-off
Though ultrasound is one of the most common medical imaging tools, conventional devices tend to be bulky and typically cannot be used at the same time as other imaging technologies. This is why a hybrid combination of optics and ultrasound, coupled to inexpensive fibre-based probes for intravascular imaging, promises to open up new possibilities for medical imaging.
In terms of imaging, optical techniques ensure satisfactory contrast, while ultrasound provides high resolution. Optical technologies can also be manipulated to generate low-frequency ultrasound, which penetrates deeper into tissue, or high-frequency ultrasound, which yields higher-resolution images at shallower depths. In practical terms, such a combination also gives physicians flexibility in how they use imaging to diagnose and treat medical problems. For example, in intravascular imaging of atherosclerotic plaque, ultrasound can reveal the plaque's morphology while optical imaging highlights its composition.
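As a back-of-the-envelope illustration of this frequency trade-off (the formulas and figures below are generic textbook assumptions, not taken from any specific system), axial resolution can be approximated as half a wavelength, while penetration is limited by frequency-dependent attenuation:

```python
# Back-of-the-envelope ultrasound trade-off: higher frequency gives finer
# axial resolution but shallower penetration. All values are assumptions:
# c = 1540 m/s, ~0.5 dB/cm/MHz one-way attenuation, 60 dB round-trip budget.

SPEED_OF_SOUND_M_S = 1540.0
ATTENUATION_DB_PER_CM_PER_MHZ = 0.5
DYNAMIC_RANGE_DB = 60.0

def axial_resolution_mm(freq_mhz: float) -> float:
    """Approximate axial resolution as half a wavelength."""
    wavelength_mm = SPEED_OF_SOUND_M_S / (freq_mhz * 1e6) * 1e3
    return wavelength_mm / 2

def penetration_depth_cm(freq_mhz: float) -> float:
    """Depth at which round-trip attenuation exhausts the dynamic range."""
    round_trip_loss_per_cm = 2 * ATTENUATION_DB_PER_CM_PER_MHZ * freq_mhz
    return DYNAMIC_RANGE_DB / round_trip_loss_per_cm

for f_mhz in (5, 15, 40):
    print(f"{f_mhz:>2} MHz: ~{axial_resolution_mm(f_mhz):.3f} mm resolution, "
          f"~{penetration_depth_cm(f_mhz):.1f} cm usable depth")
```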

Broadband imaging

In technical terms, traditional ultrasound image formation covers a narrow frequency band (usually half the central frequency), while signal generation in optical ultrasound is broadband (covering sub-MHz to several hundred MHz frequencies). In addition, the tomographic principles used by optical ultrasound generally entail data collection over wide angles. This improves image quality and resolution, while minimizing image artifacts.
Efforts to develop broadband all-optical ultrasound transducers date back more than a decade. One prototype was developed and tested for high-resolution ultrasound imaging in 2007 at the University of Michigan, Ann Arbor. It consisted of a two-dimensional gold nanostructure on a glass substrate, followed by polydimethylsiloxane and gold layers. The system achieved a pulse-echo signal-to-noise ratio of over 10 dB in the far field of the transducer, with a centre frequency of 40 MHz and a −6 dB bandwidth of 57 MHz. In a paper published in the August 2008 issue of ‘IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control’, the developers of the system concluded that preliminary imaging results “strongly suggest that all-optical ultrasound transducers can be used to build high-frequency arrays for real-time high-resolution ultrasound imaging.”
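Using the figures quoted above, a short calculation shows what “broadband” means in practice: the Michigan prototype's −6 dB bandwidth corresponds to a fractional bandwidth of over 140% of the centre frequency, versus roughly 50% for a conventional narrowband transducer. A minimal sketch:

```python
# Fractional bandwidth comparison, using the figures quoted above.
def fractional_bandwidth(centre_mhz: float, bandwidth_mhz: float) -> float:
    return bandwidth_mhz / centre_mhz

optical = fractional_bandwidth(40.0, 57.0)    # Michigan prototype figures
piezo = fractional_bandwidth(40.0, 40.0 / 2)  # ~half the centre frequency
print(f"all-optical: {optical:.0%}, conventional piezo: {piezo:.0%}")
# -> all-optical: ~142%, conventional piezo: 50%
```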
Innovation not enough to offset drawbacks
As mentioned, the major driver of research into hybrid optical alternatives has been the limitations of conventional ultrasound systems. Such drawbacks persist in spite of developments in processing speed, improvements in signal-to-noise ratios and enhancements in the quality and timing of image capture.
Innovations such as matrix transducers have enabled the emergence of volumetric (3D/4D) ultrasound, while elastography has offered physicians the ability to view both stiffer and softer areas inside tissue.
Elastography uses B-mode ultrasound to measure the mechanical characteristics of tissues, which are then overlaid on the ultrasound image. However, its use in clinical practice remains complicated due to the wide range of techniques used by different manufacturers, alongside differences in the parameters used to characterize tissues.
Other areas of innovation include micro-ultrasound, which harnesses ultrasound at microscopic levels and provides a 3- to 4-fold improvement in resolution compared to conventional ultrasound. One of the first applications of micro-ultrasound is better targeting of biopsies – for example, by urologists treating prostate cancer.

Ultrasound-modulated optical tomography
In the late 2000s, ultrasound-modulated optical tomography (UOT) showed considerable promise for imaging biological soft tissues, with potential applications in several areas, including cancer detection. UOT detects ultrasonically modulated light to localize and image subjects. The key limitation of UOT, however, is weak modulated signal strength.

Photoacoustic tomography

Considerable attention has also been given to photoacoustic tomography, which converts absorbed light energy into an acoustic signal. The technique provides compositional information on body tissue in real time without requiring any contrast agents. It also allows much higher depth penetration than conventional optical techniques. Photoacoustic tomography has been used for mapping the deposition of lipids within arterial walls.
Photoacoustic tomography begins by sending pulsed light into tissue, typically from a Q-switched Nd:YAG laser. This creates a slight rise in temperature which causes the tissue to expand, producing an acoustic response that is detected by an ultrasound transducer. The data are then used to visualize the tissue.
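The strength of that acoustic response is conventionally estimated with the standard photoacoustic relation p0 = Γ · μa · F (Grüneisen parameter × optical absorption coefficient × fluence). The sketch below plugs in illustrative soft-tissue values; none of the numbers are from the article:

```python
# Initial photoacoustic pressure from the standard relation p0 = Gamma * mu_a * F.
# All parameter values are illustrative assumptions, not measured data.

GRUENEISEN = 0.2           # dimensionless, typical soft-tissue value
MU_A_PER_M = 50.0          # optical absorption coefficient (1/m), assumed
FLUENCE_J_PER_M2 = 200.0   # fluence: 20 mJ/cm^2, a commonly cited safety limit

p0_pa = GRUENEISEN * MU_A_PER_M * FLUENCE_J_PER_M2
print(f"initial pressure: {p0_pa:.0f} Pa ({p0_pa / 1e3:.1f} kPa)")  # 2000 Pa
```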
However, photoacoustic tomography systems have proved difficult to translate into clinical applications due to their high cost, as well as a relatively large footprint which requires a dedicated optical table to house the laser. On a technical level, moreover, a low pulse repetition rate (in the dozens of hertz) prevents photoacoustic tomography from being used for high-frame-rate imaging, which is required for clinical applications such as cardiac problems, where the rate of blood flow is high, or in other similarly fast-moving settings.
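A quick calculation makes the bottleneck concrete. If each laser pulse yields one image line, the frame rate is simply the pulse repetition frequency divided by the number of lines per frame; the 128-line frame below is an assumed, typical figure:

```python
# Frame rate from pulse repetition frequency (PRF): one pulse -> one line.
# The 128-line frame is an assumed, typical figure for illustration.
def frame_rate_hz(prf_hz: float, lines_per_frame: int) -> float:
    return prf_hz / lines_per_frame

print(frame_rate_hz(20, 128))      # 20 Hz Q-switched laser -> ~0.16 fps
print(frame_rate_hz(10_000, 128))  # assumed 10 kHz diode source -> ~78 fps
```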
There are nevertheless several efforts to cope with the challenges facing photoacoustic tomography. The first is enhancing the signal-to-noise ratio and the depth of penetration. Researchers at Purdue University in the US, who are at the forefront of investigations into the technique, believe that new optical manipulation techniques to maximize photon density might provide a way forward. They have recently announced the development of a motorized photoacoustic holder, which allows the aim of the device to be manoeuvred and the depth at which light is focused to be fine-tuned. This, they believe, could significantly improve light penetration as well as the signal-to-noise ratio.
Other efforts seek to cope with fast-moving and dynamic settings. At Singapore’s Nanyang Technological University, for example, researchers have demonstrated B-mode photoacoustic imaging at rates of up to 7,000 Hz, using a pulsed laser diode as an excitation source and a clinical ultrasound imaging system to capture and display the photoacoustic images.

All-optical ultrasound

All-optical ultrasound, which has recently attracted the most interest, involves using pulsed laser light to generate ultrasound. Scanning mirrors control where the waves are transmitted into tissue. A fibre-optic sensor then receives the reflected waves, which are combined to create a visualization of the area being imaged.
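Conceptually, image formation in such a scanned-source, single-receiver geometry can be sketched as delay-and-sum beamforming. The following minimal, unoptimized illustration uses hypothetical parameters and array shapes; it is not the reconstruction code of any published system:

```python
# Minimal delay-and-sum sketch for a scanned-source, single-receiver geometry.
# Hypothetical parameters and array shapes; not any published system's code.
import numpy as np

C = 1540.0   # assumed speed of sound in tissue (m/s)
FS = 100e6   # assumed receiver sampling rate (Hz)

def das_reconstruct(rf, src_xz, rx_xz, grid_x, grid_z):
    """rf: (n_sources, n_samples) A-lines, one per optical source position."""
    image = np.zeros((len(grid_z), len(grid_x)))
    for i, (sx, sz) in enumerate(src_xz):
        for ix, x in enumerate(grid_x):
            for iz, z in enumerate(grid_z):
                # time of flight: source -> pixel -> fixed fibre receiver
                dist = np.hypot(x - sx, z - sz) + np.hypot(x - rx_xz[0], z - rx_xz[1])
                s = int(round(dist / C * FS))
                if s < rf.shape[1]:
                    image[iz, ix] += rf[i, s]
    return np.abs(image)

# Tiny synthetic example: 16 source positions along x, receiver at the origin.
sources = [(x, 0.0) for x in np.linspace(-5e-3, 5e-3, 16)]
rf = np.zeros((16, 2048))  # placeholder received data
img = das_reconstruct(rf, sources, (0.0, 0.0),
                      np.linspace(-5e-3, 5e-3, 32), np.linspace(1e-3, 11e-3, 32))
print(img.shape)  # (32, 32)
```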

Bandwidth, acquisition time and electromagnetic interference

Such a modality exhibits wide bandwidth and satisfactorily addresses one of the major shortcomings of previous efforts at clinical application – namely, prolonged acquisition times (ranging from minutes to hours). Unlike conventional ultrasound imagers, which use electronic transducer arrays to transmit sound waves into tissue and receive the reflections for reconstruction as images by a computer, all-optical ultrasound imagers are also immune to electromagnetic interference. As a result, an all-optical ultrasound system can be safely used alongside a magnetic resonance imaging (MRI) scanner, allowing physicians to obtain a more comprehensive picture of tissues around an area of interest, such as a tumour or blood vessel. Immunity from electromagnetic interference and MRI compatibility also mean that all-optical ultrasound can be used during brain or fetal surgery, or for guiding epidural needles.

Miniaturization
The absence of electronic components offers yet another advantage. Components of conventional ultrasound devices are difficult to miniaturize for internal use, for two reasons: sensitivity drops as the area of the active piezoelectric transducer is reduced, and the casing of the piezoelectric element and its electrical insulation add to the transducer's size.
Miniaturization is particularly important in minimally invasive measurements such as medical endoscopy, or for inspection of lumens in non-destructive testing. Small-area detectors are also preferred in tomographic applications, given that detector size correlates inversely with spatial resolution.
Due to difficulties in miniaturization, most ultrasound devices use large, handheld probes placed against the skin. Although some high-resolution probes have been developed, they are considered too expensive for routine clinical use.
Conventional ultrasound devices aside, the miniaturization of optical detectors (e.g. via interferometric resonators) does not reduce sensitivity, because their sensitivity does not depend on the active detection area. Finally, optical components are not only easily miniaturized but also significantly less expensive to manufacture than compact electronic ultrasound systems.

First video-rate all-optical ultrasound system
The world’s first all-optical ultrasound system capable of video-rate, real-time imaging of biological tissue has been demonstrated by a research team from University College London (UCL) and Queen Mary University of London (QMUL). It was used to capture the dynamics of a pulsating ex-vivo pig carotid artery, and revealed the key anatomical structures of the heart required to safely perform a transseptal crossing, namely the left and right atrial walls, the right atrial appendage and the limbus fossae ovalis.
The researchers believe the new technology will allow ultrasound to be integrated into a wide range of minimally invasive devices in different clinical contexts, and provide ultrasound imaging of new and previously inaccessible regions of the body. Above all, its real-time imaging capabilities allow differentiation between tissues at significant depths, helping to guide surgeons during some of the highest-risk moments of procedures. This will reduce the chance of complications in cases such as cardiac ablation.

Designed for clinical advantage

The new system from UCL and QMUL uses pulsed light, guided by miniature optical fibres encased within a customized clinical needle, to generate ultrasonic pulses. Reflections of these pulses from tissue are detected by a sensor on a second optical fibre, providing real-time imaging.
The developers based their design on a nano-composite optical ultrasound generator coupled to a fibre-optic acoustic receiver with extremely high sensitivity. In turn, harnessing eccentric illumination provided an acoustic source with optimal directivity. This was then scanned with a fast galvo mirror which provided video-rate image acquisition (compared to a time-frame of several hours in previous experiments). It also increased image quality in both 2D and 3D, and made it possible to acquire the images in different modes.
The scanning mirrors in the new system are flexible. They allow seamless toggling between 2D and 3D imaging, as well as a dynamically adjustable trade-off between image resolution and penetration depth. Unlike with conventional ultrasound systems, this is achieved without swapping the imaging probe – a step which, in a minimally invasive interventional setting in particular, extends procedure times and introduces risks to the patient.
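In software terms, such trade-offs reduce to per-frame scan parameters rather than hardware swaps. The sketch below uses entirely hypothetical names and values purely to illustrate that idea:

```python
# Hypothetical per-frame scan configurations: switching imaging behaviour in
# software rather than by swapping probes. Names and values are illustrative.
from dataclasses import dataclass

@dataclass
class ScanConfig:
    mode: str          # "2D" or "3D"
    scan_lines: int    # mirror positions per frame
    depth_mm: float    # requested imaging depth

# A shallow high-detail view and a deeper survey, selectable between frames.
detail_view = ScanConfig(mode="2D", scan_lines=256, depth_mm=15.0)
survey_view = ScanConfig(mode="3D", scan_lines=64, depth_mm=40.0)

for cfg in (detail_view, survey_view):
    print(f"{cfg.mode}: {cfg.scan_lines} lines to {cfg.depth_mm} mm")
```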
The technology has been designed from the outset for use in a clinical setting, with sufficient sensitivity to image moving tissue inside the body at centimetre-scale depth and the ability to fit into existing workflows. The researchers are currently working on developing a flexible imaging probe for free-hand operation, as well as miniaturized versions for endoscopic applications.

POC tests – market drivers and barriers

Point-of-care (POC) testing consists of diagnostic tests performed in the physical proximity of a patient, with results obtained on site. One useful definition is from the German Society of Clinical Chemistry and Laboratory Medicine, which describes it as “diagnostic testing at or near the site of patient care, with an easy-to-use instrument, under the immediate health care (e.g. emergency room, operating room, intensive care unit) and not by laboratory personnel.”

Turnaround time, regulations and guidelines
POC tests are a major contrast to their traditional counterparts, which involve taking patient specimens, transporting them to a laboratory and then returning the results to a physician. By eliminating both transport and laboratory, POC tests provide quicker turnaround times. In turn, this enables clinicians to focus on patient care rather than spending critical time waiting for test results from the lab. It also leads to better patient flow in hospitals and clinics, since patients can be diagnosed, triaged and treated earlier. Regulations and guidelines have also encouraged adoption of POC testing. For example, in Germany, certification of a chest pain clinic requires that it be able to perform blood gas analysis within 15 minutes. According to German Society of Cardiology (DGK) guidelines, “the time from blood collection to result documentation may not exceed 45-60 minutes. If this is not possible, a point-of-care test unit on site to determine cardiac markers is mandatory.”

Concerns remain

In spite of such drivers, concerns about reliability and benefits have impeded POC tests from achieving their full potential. There are no universally accepted standards on their use or effectiveness. In addition, testing is often performed by personnel without training in clinical laboratory science, or occasionally (in emergency situations) by volunteers. Several POC tests are also conducted by patients themselves, in what is known as self-testing.

One meta-study in 2016, published in the peer-reviewed journal ‘Critical Reviews in Clinical Laboratory Sciences’, echoed such observations. The most prevalent barriers to growth of POC testing, it noted, were associated with the economics of adoption and regulatory issues (such as accreditation), alongside poorly trained staff. Another problem was competition with clinicians who favoured traditional centralized tests. The authors also highlighted the greater cost per POC test when compared to centralized testing, although they acknowledged the difficulty of gauging its cost-effectiveness, given the complexities of making comparisons.

Integrating POC tests into health management strategy
In spite of these limitations, POC tests have yielded measurable improvements in workflow efficiency and patient care, according to numerous studies, some of which go back several years.
Two decades ago, a randomized, controlled trial at a British teaching hospital assessed the impact of POC tests on health management decisions. The results, published in the ‘British Medical Journal’ in 1998, found that physicians using POC tests reached patient management decisions an average of 1 hour and 14 minutes faster than when patients were evaluated through traditional means.
Indeed, the rapid turnaround time provided by POC tests allows for accelerated identification and classification of patients into high-risk and low-risk groups, leading directly to improvements in quality of care and an increase in clinical throughput.
More recently, studies have shown that POC tests can reduce revenue losses due to workflow delays in test-dependent medical procedures – such as disruptions in magnetic resonance imaging (MRI) or computed tomography (CT) queues.
Overall, POC tests have the greatest impact when they are implemented as part of an overall health management strategy to increase the efficiency of clinical decision-making. The quick availability of test results can support clinical pathways which directly impact on outcomes.

Decentralization of healthcare

One of the key drivers of growth in POC testing is the progressive decentralization of healthcare and the move towards patient-centric care. In early 2018, the influential Joint Commission International highlighted these two factors as key to the growth in use of POC testing.
Interest in POC testing and its potential contribution to decentralized healthcare models, however, dates back several years. In the late 2000s, ongoing efforts to optimize information and communications technologies and enhance the efficacy of healthcare began to emphasize the impact of diagnostic services at the point of care.
Such processes have accelerated in recent years, following the closure of several laboratories and the emergence of new structures such as micro-hospitals and community paramedicine systems.

Lab closures
In the US, the federal Centers for Medicare and Medicaid Services imposed major reductions in clinical laboratory test fees in 2018 in order to make savings, with further cuts planned for 2019-2022. The impact is expected to be profound for small community lab companies, particularly those servicing nursing homes and those in small, rural hospitals. Some labs, such as Peace Health Labs in Oregon, estimated a 20% cutback in revenues due to the decision and put themselves up for sale.
Peace Health Labs was bought in late 2017 by Quest Diagnostics. Soon after, Quest began to close some of Peace Health Lab’s facilities and patient service centres in many smaller communities in Oregon and Washington. Other small labs have rationalized, for example by selling business units. Many have simply begun closing.

Such a phenomenon extends to laboratories elsewhere, too. In Britain, labs are being downsized by government programs to consolidate medical diagnostic testing at larger facilities – especially at the regional level – with the aim of lowering costs through economies of scale. Such a process is especially pronounced at hospitals in communities where the scale of local demand is inadequate to support full-service clinical and pathology testing.
In 2006, a British government report known as the Carter Review recommended that clinical pathology labs be run as managed pathology networks. The report noted that the standard District General Hospital delivering a full range of services would become increasingly unsustainable, impacting adversely on both service quality and cost-effectiveness. The impact of the Carter Review can be seen in the recent downsizing of the pathology laboratory at Queen Elizabeth Hospital (QEH), a 480-bed facility in the county of Norfolk, about 150 kilometres from London, and in an accompanying effort to create a regional pathology network known as the Eastern Pathology Alliance.
 
Remote settings and POC testing
In spite of its vast and sparsely populated interior, Australia too has witnessed major cutbacks and consolidation of its pathology laboratory network. Casualties include the pathology laboratories at Maryborough Base Hospital in Queensland, about 250 kilometres from Brisbane, and at Gold Coast Hospital, about 80 kilometres southeast of the same city. The economics of high-volume, large-scale laboratories ensure that officials will seek to further consolidate the clinical laboratory testing done in smaller hospitals, so as to make savings via increased economies of scale.

Due to issues of accessibility and distance, remote settings would be considered natural demonstrators of the need for POC testing. In March 2015, for example, an article on European POC testing perspectives, published by researchers from Sweden, the Netherlands and Britain in the ‘Upsala Journal of Medical Sciences’, observed that outside a hospital setting, POC testing “provides laboratory quality services to underserviced areas and general practitioners.”

Hybrid models
One suggested approach to replacing the existing community hospital model for rural areas is the so-called hybrid model. It is based on freestanding emergency departments (EDs) which have links to primary care providers. Such a care model, however, challenges the ability of large, regional clinical laboratories to provide the necessary medical laboratory testing to rural freestanding EDs, and requires the presence of small rural labs.
However, realizing such advantages is hardly straightforward.
In Australia, a Government-funded study between 2005 and 2007 investigated POC tests covering a total of 4,968 patients in urban, rural and remote locations. This multi-centre, cluster randomized controlled trial, led by a team from Flinders University Rural Clinical School, Adelaide, sought to determine the safety, clinical effectiveness, cost-effectiveness and satisfaction of the tests.
One of the key findings of the study was that rural and remote practices showed a greater need for training compared to their urban counterparts.

POC testing in EDs and chronic care

Most such macro-processes focus on consolidation at acute care hospitals, with attention traditionally given to emergency departments. Indeed, EDs have become the gateway to unscheduled hospital care – by some estimates accounting for over three out of four such admissions in the US.
However, the benefits of POC testing are also evident for individuals suffering from chronic disease, who require regular check-ups to monitor disease progression. In such patients, decisions to change or modify treatment are often directly dependent on clinical testing. POC tests provide healthcare professionals with the means to perform and act upon test results during the same office visit.
Such observations have profound resonance in the context of increasingly decentralised medical care.

Compliance and adherence: the contribution of POC testing

Meanwhile, there is increasing evidence that the impact of POC testing extends beyond convenience to compliance. In November 2009, a study in ‘The Medical Journal of Australia’ found similar or greater levels of self-reported medication adherence in patients undergoing long-term treatment for diabetes or coagulation disorders when POC testing was performed at regular office visits.
Having access to immediate test results through POC is associated with the same or better medication adherence compared with having test results provided by a pathology laboratory. POC tests can provide general practitioners and patients with timely and complete clinical information, facilitating important self-management behaviours such as medication adherence.
In March 2010, ‘The British Journal of General Practice’ published the results of a study on POC testing in a general practice setting. The authors sought to determine whether patients were more satisfied with point-of-care testing than with pathology laboratory testing for three chronic conditions, namely diabetes, hyperlipidemia and anticoagulant therapy. Their findings showed that patients had significantly higher levels of confidence in their physician and more motivation to look after their condition when POC testing was used. Such subjective factors, according to the authors, can translate into quantifiable improvements in disease management.

Lab staff hold key to good training
In a decentralized healthcare setting, the wider acceptance of POC testing will depend on the training of staff in key processes such as sample collection; the calibration, maintenance and use of instruments; and the documentation and reporting of critical findings. These processes vary across healthcare settings and countries, but this on its own may not be an impediment.
In August 2013, the ‘Journal of Clinical Nursing’ published a study of six different POC test training initiatives for nurses on three continents. One of its most interesting findings was that a key factor for success was the involvement of laboratory staff.

Functional MRI – opening new frontiers in the brain

Functional magnetic resonance imaging (fMRI) is by far the principal method used to investigate the brain’s cortical areas and subcortical structures. fMRI has dramatically transformed perceptions of the human brain, allowing precise delineation of regions associated with a vast range of external stimuli and moods – ranging from depression and anger to laughter and play. 
Researchers are now exploring further expansion in the scope of fMRI. These range from the development of more precise sensors and probes with quicker response times to the use of fMRI in new applications such as artificial intelligence. Some have even sought to extract images seen by viewers directly out of their brains.

From dog language to crocodile music
Some have also sought to see whether fMRI can work in other species.
In 2016, scientists in Hungary concluded that dogs can understand the meaning and tone of human speech, and that they process language in the same way humans do. To reach this conclusion, they managed to get 13 pet dogs to lie completely motionless in an fMRI scanner for eight minutes while wearing earphones and a radio-frequency coil on their heads.
Earlier this year, a team at Germany’s Ruhr-University in Bochum went further than canines by using fMRI to study the brain of a Nile crocodile as it heard complex sounds, including classical music by Bach.

The eye sees, the brain predicts vision

Given the increasing number of ultra-high field systems available worldwide, experts expect a dramatic impact on our understanding of the brain due to sustained enhancements in resolution (both spatial and temporal), as well as in sensitivity and specificity.
Early this year, researchers from the University of Glasgow published the results of an fMRI-based experiment confirming the capability of the visual cortex to make predictions about what a viewer will see next. The study sought answers to a seemingly perplexing question. Human beings move their eyes approximately four times per second, requiring their brains to process new visual data every 250 milliseconds. In spite of such rapid and constant variation in perspective and image, how is it that the world remains stable?
The functional MRI used by the Glasgow researchers showed that the brain rapidly adjusts its predictions, with the visual cortex feeding back updates to a new predicted coordinate every time the eyes move.

The Glasgow study established the importance of fMRI in new frontiers of neuroscience research. fMRI is now seen as a means to contribute to research into mental illness as well as to aid the development of artificial intelligence. Indeed, a better understanding of the predictive mechanism in the human brain may directly lead to breakthroughs in brain-inspired artificial intelligence – especially in terms of visual predictive capabilities.

The role of calcium ions in brain activity
Beyond such frontiers, MRI technology is also undergoing other forms of evolution. Some of these, which involve new sensors and pathways to monitor neural activity deep within the brain, are not just path-breaking but also offer the possibility of profound new insights into understanding how human beings think.
One of the most exciting developments in such a context involves the tracking of calcium ions, which are closely correlated to neuronal firing and brain signalling. MRI typically detects changes in blood flow, and its utility derives from the fact that when a region of the brain is in use and neuronal activation ensues, blood flow to that region also increases. However, such a process provides only indirect clues; the signals are difficult to attribute to a specific underlying cause. By contrast, sensing based on calcium ions may allow linkage of neuron activity patterns to specific brain functions, and thereby enable researchers to understand how different parts of the brain intercommunicate during particular tasks.
Indeed, neuroscientists have known for several years that calcium ions rush into a cell after a neuron fires an electrical impulse, and have used fluorescent molecules to label calcium and then image it via traditional microscopy. Though the technique allows neuron activity to be tracked precisely, its practical use has been limited to small regions of the brain.

MIT designs calcium detecting molecular probe

At the Massachusetts Institute of Technology (MIT), researchers have sought a way to image calcium using MRI, in order to allow for the analysis of much larger volumes of brain tissue than was possible by fluorescent labelling. To do this, the MIT researchers designed a new molecular probe whose architecture can detect subtle changes in calcium concentrations outside of cells and respond in a way that can be tracked with MRI. Such a process allows for direct correlation to neural activity deep within the part of the brain known as the striatum.
Tests in rats enabled the MIT researchers to establish that the calcium sensors accurately detect changes in neural activity triggered by electrical or chemical stimulation. Extracellular calcium levels correlate inversely with neuronal activity: when calcium concentrations drop, neurons in the area are firing electrical impulses.
The goal of the researchers is to greatly enhance precision in mapping neural activity patterns. By measuring activity in different regions of the brain, they hope to find how different types of sensory stimuli are encoded by the spatial pattern of neural activity which is induced.

The MIT probe essentially consists of a sensor made up of two components which bind in the presence of calcium: synaptotagmin, a naturally occurring calcium-binding protein, and a lipid-coated magnetic iron oxide nanoparticle which binds to synaptotagmin only if calcium is present. Calcium binding causes the particles to clump together and appear darker in the MRI image.
The researchers are now attempting to increase the speed of the sensor's response, which currently takes a few seconds after stimulation. A more important goal is to modify the sensor so that it can pass through the blood-brain barrier. This would enable the delivery of the particles without the need to inject them directly into the test site, as is required at present.
Research into new sensors and neurochemical pathways, as being done at MIT, will no doubt open new vistas in fMRI. However, other efforts too are expected to greatly enhance the range and spectrum of its applications.

Powering up fMRI machines

In May 2013, the European Journal of Radiology published the results of a study comparing fMRI at 7T with 3T in imaging of the amygdala, a ventral brain region of specific importance to psychiatry and psychology. Traditionally, MRI of such areas is prone to signal losses along susceptibility borders – alongside signal fluctuations due to physiological artifacts from respiration and cardiac action. The increase from 3T to 7T showed a significant gain in percentage signal change and demonstrated the potential benefits of ultra-high-field fMRI in ventral brain areas.

UC Berkeley targets massive resolution boost in fMRI

More recent efforts are also aimed at enhancing resolution. Today’s top-of-the-line scanners, incorporating 10T magnets, can typically localize activity within a region comprising 100,000 neurons or more, about the size of a grain of rice. Concentrating more finely, on smaller groups of neurons, requires a bottom-up redesign of almost the entire gamut of scanner components and sub-systems.
The University of California at Berkeley is currently targeting a 20-fold boost in fMRI resolution in order to provide the most detailed images of the brain ever seen. The project is funded by a BRAIN Initiative grant from the National Institutes of Health.

New approach to fMRI design and architecture
The leap in resolution will be directly due to innovations in hardware design, scanner control and image computation. Currently, the spatial resolution of fMRI recordings depends on variations in the magnetic field as well as, indirectly, on the size of the detectors. The latter consist of coils of wire, arrayed around the head of a subject, which pick up signals. The Berkeley system uses a far larger number of smaller coils than clinical MRI scanners, which use smaller numbers of large coils. The result is straightforward – much higher resolution of the brain’s outer surface, which is needed to identify key layers of the cortex.
Reducing dimensions in such ultra-high-resolution MRI holds the key to imaging the brain in functional regions, where neurons are all essentially involved in the same type of processing. The target which researchers hope to reach is in the range of 0.4 millimetres. This is because the cerebral cortex, the brain’s outer layer, consists of columns of neurons which correspond to a specific sensory feature (such as the vertical rather than horizontal edge of an object), and such columns are 0.4 millimetres across and 2 millimetres long. The Berkeley researchers are reported to be confident of their ability to build machines which can scan down to the 0.4 millimetre target by 2019.

Peering into the brain’s depths
If successful, the new fMRIs would allow researchers to study cortical microcircuits and glimpse the deepest recesses of human brain function so far. The developers of the system are ambitious. They aim to provide “the most advanced view yet of how properties of the mind, such as perception, memory and consciousness, emerge from brain operations.” This will open ways to observe disturbances in brain structures and functions, and it is hoped, radically enhance the diagnosis and understanding of neurological diseases.

Extracting images out of the brain
One of the most far-reaching possibilities of fMRI was recently announced by a team from Japan’s Kyoto University, who used machine learning and artificial intelligence to translate brain activity into images in test subjects.
These ranged from pictures being looked at by the subjects to things they remembered seeing. The images included a lion, a fly, a DVD player, a postbox, letters and geometric shapes, and were recreated pixel by pixel using a deep neural network (DNN).
The images were projected on to a screen in an fMRI scanner, with the heads of subjects secured in place via a bar on which they had to bite down. The subjects, who participated in multiple scanning sessions for a period of more than 10 months, stared at each image for several seconds before taking a rest. After this, they had to recall one of the images seen previously and picture it in their mind.
The DNN was then used to decode the signals recorded by the fMRI scanner and produce a computer-generated reconstructed image of what the participants saw.
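The decoding step can be sketched, very loosely, along the lines reported for such studies: learn a linear mapping from fMRI voxel patterns to DNN feature activations, then reconstruct by searching for an image whose features match the prediction. The code below uses random placeholder data and is not the Kyoto team's implementation:

```python
# Loose sketch of the feature-decoding step: a linear map from fMRI voxel
# patterns to DNN image features. Random placeholder data throughout; this
# is not the Kyoto team's code, data or model.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_trials, n_voxels, n_features = 1200, 4000, 1000

voxels = rng.standard_normal((n_trials, n_voxels))      # fMRI patterns
features = rng.standard_normal((n_trials, n_features))  # DNN activations

decoder = Ridge(alpha=100.0).fit(voxels, features)

# At test time, predict DNN features from a new scan; reconstruction then
# searches for the image whose DNN features best match them (omitted here).
predicted = decoder.predict(voxels[:1])
print(predicted.shape)  # (1, 1000)
```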

Advanced ultrasonic protein removal technology: set to change how surgical instruments are cleaned

Standard methods of decontamination, such as Disinfector Washers (e.g. using hot alkaline solutions and surfactants), are known to be inconsistently effective in removing protein from surgical instruments. Advanced Ultrasonics offers an exciting and tantalizing alternative. Preliminary results suggest that intense cleaning using “advanced ultrasonic technology” can potentially result in disinfection without the need for any thermal or chemical methods.

by David Jones

In the UK, concerns about Creutzfeldt-Jakob Disease (CJD) date back to the mid-1980s, when an outbreak of Bovine Spongiform Encephalopathy (BSE, a similar transmissible neuro-degenerative brain disease) in cattle raised concerns that the disease might be transmissible to humans. Confirmation came in 1996 [1] that BSE can indeed lead to a form of human CJD (variant (v)CJD) that particularly affects younger adults. This resulted in widespread public health concern, heightened again a few years ago when a study in the British Medical Journal [2] suggested that as many as 1 in 2000 Britons may be infected with the abnormal prion protein that causes vCJD. To date there have been 178 deaths due to vCJD in the UK, with a few more elsewhere [3]. Both model experiments and actual human studies have shown that the prion protein is readily transmitted on stainless steel instruments from one animal to another.
vCJD highlighted, to clinicians and decontamination/sterile services professionals alike, the critical requirement to remove protein, as well as other infectious agents, from neurosurgical and other reusable surgical instruments. In addition to the risk of patient-to-patient transfer of vCJD prions, there is a danger that bacteria hidden in or under any residual protein (e.g. biofilms) could also be passed on. A recent study in the journal Acta Neuropathologica [4] also highlighted the potential dangers associated with cross-contamination of neurosurgical instruments with the peptide amyloid beta (Aβ), a substance implicated in brain hemorrhages and Alzheimer’s disease.
Standard methods such as Disinfector Washers (e.g. hot alkaline solutions and surfactants) are known to be inconsistently effective in removing protein from surgical instruments [5,6], and other difficulties in ensuring consistent cleanliness have led to a move towards single-use instruments. However, questions remain as to how manufacturers of single-use instruments can achieve consistent cleanliness and sterility when modern, well-equipped Sterile Service and Decontamination (SSD) units apparently cannot. Unfortunately, single-use instruments are not always clean and sterile, as recent unpublished investigations have shown.
In the UK, concerns about contamination mean that GPs and dentists, who have historically performed minor interventions such as lancing of boils, removal of small cysts and abscesses etc., are now being discouraged from doing so. This, in turn, is funnelling more patients to A&E departments, which are already under tremendous strain. Post-operative infections also add to strain on the health service, leading to extended hospital stays and bed-blocking.
There is a clear need for a new approach to improve the cleaning of surgical devices. “Commercial grade” ultrasonic cleaning systems have been available for a number of years and have been used as a first stage in the cleaning process.
Ultrasonics works via the process of cavitation. Transducers bonded to the base or side of a tank are excited by high-frequency electricity, causing them to expand and contract at very high speed. This mechanical action causes high-speed downward flexure of the radiating tank face. The movement is too fast for the water in the tank to follow, producing low-pressure voids. On the upward flexure these voids are released as vacuum bubbles, which rise through the fluid until they hit an object, whereupon they implode under high pressure, drawing away any contamination on the object's surface.
However, it has been shown that machines used in sterile services departments in the past have an erratic distribution of sound that does not consistently render instruments clear of residual protein. Many felt that a new way of applying ultrasound to a fluid was required. To achieve the safe cleaning of these items, the sound needs to be applied in a way that is both even and intense, with no gaps in activity where cleaning would be ineffective.
 
In order to develop a new cleaning technology, a reliable method for measuring residual protein was needed, along with agreement on acceptable levels. The UK HTM 01-01 Guidance on the Management and Decontamination of Surgical Instruments [7], released in 2016, specifies that “there should be <5µg of protein in situ, on a side of any instrument tested”. In situ testing is specified since “detection of proteins on the surface of an instrument gives a more appropriate indication of cleaning efficacy related to prion risk” than the swabbing techniques used in the past [8,9,10]. Currently the ProReveal system, from Synoptics Health, Cambridge UK, is the only in situ system on the market worldwide. As well as offering high levels of accuracy, the system also identifies the precise location of any remaining protein on the instrument. To comply with UK HTM 01-01 guidance, therefore, any new cleaning system, ultrasonic or otherwise, needs to be validated against the levels of detection offered by ProReveal.
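In code, the HTM 01-01 acceptance criterion reduces to a simple per-side threshold check. The function below is a trivial illustration of that rule, not part of the ProReveal software:

```python
# Trivial pass/fail check against the HTM 01-01 in-situ protein limit.
HTM_01_01_LIMIT_UG = 5.0

def instrument_passes(per_side_protein_ug: list[float]) -> bool:
    """True only if every tested side carries < 5 ug of residual protein."""
    return all(side < HTM_01_01_LIMIT_UG for side in per_side_protein_ug)

print(instrument_passes([0.4, 1.2]))  # True
print(instrument_passes([6.3, 0.9]))  # False: one side exceeds the limit
```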
A second issue to be addressed by any ultrasonic cleaning technology is how to measure the ultrasonic activity. HTM 01-01 states that machines should be periodically tested for ultrasonic activity.
Historically, the only method available to Sterile Services Managers and AEDs for validating the activity in an ultrasonic tank has been to insert a piece of aluminium foil into the fluid for a set time and then visually analyse the indentations in the foil to determine the ultrasonic activity. This is a somewhat inaccurate way of validating what is a critical phase in the decontamination process. Troughs of sound can be either macroscopic or microscopic and, as such, reliance on sight alone is unacceptable when such high levels of consistent cleanliness are expected.
With both these issues in mind, Alphasonics (a Liverpool, UK company with over 25 years’ experience in the field of ultrasonic cleaning systems) launched the ‘Medstar’ project with a view to developing ‘advanced ultrasonic technology’ for cleaning surgical equipment. The project started in 2013, but it was not until 2015, when a ProReveal was purchased, that substantive advances were made. Progress then accelerated quickly and, over a 3-year period, a point was reached whereby instruments could be rendered “completely” free of residual protein, as assessed by ProReveal technology.

To overcome the problems around accurately measuring ultrasonic activity, the world’s first Cavitation Validation Device (CVD) was developed between 2016 and 2018. For the first time, this allows ultrasonic cleaning devices to be validated by listening exclusively for cavitation noise.
CVDs are included within most Medstar systems, and comparative measurements (data on file) show how Medstar devices perform relative to existing ‘commercial grade’ ultrasonic cleaners.
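Although the CVD's internal workings are not described here, the general principle of listening for cavitation can be illustrated: cavitation produces characteristic acoustic emissions, including a subharmonic at half the drive frequency, which can be picked out of a hydrophone recording. The sketch below is a generic textbook indicator, not Alphasonics' method:

```python
# Generic cavitation indicator: look for the subharmonic (f0/2) emission in
# a hydrophone recording. Parameters are assumptions, not the CVD's design.
import numpy as np

FS = 1_000_000      # assumed hydrophone sample rate (Hz)
DRIVE_HZ = 40_000   # assumed ultrasonic drive frequency

def subharmonic_level_db(signal: np.ndarray) -> float:
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1 / FS)
    band = (freqs > 0.45 * DRIVE_HZ) & (freqs < 0.55 * DRIVE_HZ)
    return 20 * np.log10(spectrum[band].max() + 1e-12)

# Synthetic check: pure drive tone vs. tone plus a weak f0/2 component.
t = np.arange(FS // 10) / FS
quiet = np.sin(2 * np.pi * DRIVE_HZ * t)
cavitating = quiet + 0.1 * np.sin(2 * np.pi * (DRIVE_HZ / 2) * t)
print(subharmonic_level_db(quiet), subharmonic_level_db(cavitating))
```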
It is this unique, intense ultrasound technology that is so effective in removing protein residue from medical devices, as measured by the in situ ProReveal method. To assess the effect on removal of bacteria, a UKAS (UK Accreditation Service) accredited laboratory was engaged to carry out independent trials. Instruments were contaminated by the laboratory, first with Enterococcus faecium and Staphylococcus aureus (as specified within ISO 15883 annex N, “test soils and methods for demonstrating cleaning efficacy”) and then under “dirty” conditions (specified in ISO 13727). They were then cleaned in a Medstar device. Since all residual protein was being removed, the question arose: were the (now exposed) bacteria also being removed by the intense ultrasound?
Work is on-going, but preliminary results suggest that intense cleaning using ‘advanced ultrasonic technology’ can potentially result in disinfection without the need for any thermal or chemical methods.
Medstar devices have several other features to allow compliance with UK HTM 01-01 guidance, such as the Generator Output Monitoring System, which constantly monitors the generator output and adjusts the input accordingly, thus ensuring that the system is always performing optimally. The CVD is then used for periodic independent validation.
Advanced Ultrasonics offers an exciting and tantalizing alternative to thermal disinfection devices. The HTM 01-01 UK guidelines are only the start of things to come, and it is already widely recognized that the 5µg limit set out in the guideline is still too high. The many trials undertaken by the manufacturer have clearly shown that the Medstar range of equipment leaves no more than 0.5µg of residual protein per side on an instrument, and as such renders the bacteria fully exposed to the intense, very even action of the ultrasound and enzymatic chemicals.
High throughput systems are also available that would be of great benefit to single-use instrument manufacturers and SSD units alike. These systems will deliver a consistently lower residual protein count and a better log reduction than thermal disinfection devices.

References

1. Collinge J, Sidle KCL, Meads J, Ironside J, Hill AF. Molecular analysis of prion strain variation and the aetiology of “new variant” CJD. Nature 1996; 383(6602): 685. doi:10.1038/383685a0
2. Gill O, Spencer Y, Richard-Loendt A, Kelly C, Dabaghian R, Boyes L, Linehan J, et al. Prevalent abnormal prion protein in human appendixes after bovine spongiform encephalopathy epizootic: large scale survey. British Medical Journal 2013; 347: 11.
3. See www.cjd.ed.ac.uk/sites/default/files/figs.pdf
4. Jaunmuktane Z, Quaegebeur A, Taipa R, Viana-Baptista M, Barbosa R, Koriath C, Sciot R, et al. Evidence of amyloid-β cerebral amyloid angiopathy transmission through neurosurgery. Acta Neuropathologica 2018; 135(5): 671-679. doi:10.1007/s00401-018-1822-2
5. Murdoch H, Taylor D, Dickinson J, Walker JT, Perrett D, Raven NDH, Sutton JM. Surface de-contamination of surgical instruments – an ongoing dilemma. Journal of Hospital Infection 2006; 63: 432-438.
6. Baxter RL, Baxter HC, Campbell GA, Grant K, Jones A, Richardson P, Whittaker G. Quantitative analysis of residual protein contamination on reprocessed surgical instruments. Journal of Hospital Infection 2006; 63: 439-444.
7. Department of Health and Social Care. Health Technical Memorandum (HTM) 01-01: management and decontamination of surgical instruments (medical devices) used in acute care. 2016. Available: https://www.gov.uk/government/publications/management-and-decontamination-of-surgical-instruments-used-in-acute-care. Last accessed July 2018.
8. Nayuni N, Cloutman-Green E, Hollis M, Hartley J, Martin S, Perrett D. A critical evaluation of ninhydrin as a protein detection method for monitoring surgical instrument decontamination in hospitals. Journal of Hospital Infection 2013; 84: 97-102.
9. Nayuni N, Perrett D. A comparative study of methods for detecting residual protein on surgical instruments. Medical Device Decontamination (incorporating the IDSc Journal) 2013; 18: 16-20.
10. Perrett D, Nayuni N. Efficacy of current and novel cleaning technologies (ProReveal) for assessing protein contamination on surgical instruments. Chapter 22 in: Walker JT (ed). Decontamination in Hospitals and Healthcare. Cambridge, UK: Woodhead Publishing, 2014.

The author

David Jones
Alphasonics, Liverpool, UK
www.alphasonics.co.uk
