The ease of use and effectiveness of ultrasound image-guided surgical procedures might be improved by observing single or multiple echographic views properly positioned “within” the patient. Several computer hardware and visualization technologies have to be extended and combined in order to build a system in which such visualizations are available in real time, e.g., during the ultrasound imaging of the patient.
A system we have been developing for several years is now beginning to be used for experiments in needle biopsies on breast models (“breast phantoms”) of a commercial dress mannequin. Although the system is still under development, it has already shown some promise. The operator of the ultrasound study can observe in stereo the ultrasound images overlaid within the live image of the patient/mannequin. By wearing head-gear with a miniature video display for each eye, the physician can see both the outside of the patient/mannequin and the inside, with appropriate continuation of a needle image from the outside live video to the inside image from the ultrasound scanner.
Major problems that still require substantial improvement include: 1) combining the incoming ultrasound data in real time with older, higher-quality imaging data (e.g., previously acquired CT or MRI data of the same region of the patient) to yield more useful combined images; 2) improving tracking calibration, especially our method of dealing with tracking errors due to system latency; 3) improving the compositing operation between the live camera images and the synthetic image volumes; and 4) improving the volume rendering of the region of interest from multiple ultrasound images acquired by free-hand scanning.
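The compositing operation in item 3 can be illustrated with the standard per-pixel alpha “over” operator; the function name, array layout, and toy values below are illustrative assumptions, not the actual implementation used in the system described here.

```python
import numpy as np

def composite_over(live_rgb, synth_rgb, synth_alpha):
    """Blend a rendered synthetic image over a live camera frame
    using per-pixel 'over' compositing. All arrays hold floats in
    [0, 1]; synth_alpha has shape (H, W, 1) and broadcasts across
    the RGB channels."""
    return synth_alpha * synth_rgb + (1.0 - synth_alpha) * live_rgb

# Toy 1x1 'images': a fully opaque synthetic pixel replaces the live pixel,
# while a half-transparent one mixes the two equally.
live  = np.array([[[0.2, 0.2, 0.2]]])
synth = np.array([[[1.0, 0.0, 0.0]]])
opaque = composite_over(live, synth, np.array([[[1.0]]]))
half   = composite_over(live, synth, np.array([[[0.5]]]))
```

In practice the alpha channel would come from the volume renderer of the ultrasound data, so that the synthetic imagery appears “within” the patient rather than simply pasted on top of the video.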
Recent improvements to the UNC system include the development of a more compact, lighter-weight head-mounted display that minimizes parallax distortions, and a new method to track the movement of the needle in the synthetic environment.
Users of HMDs with video cameras mounted on top of the helmet have traditionally reported that the parallax distortion, which results from the offset between the video cameras and the user's eyes, significantly impairs hand-eye coordination for tasks performed while wearing the HMD. Our new prototype HMD uses folded optics to match the locations of the video cameras to the viewer's right- and left-eye locations. We have also recently implemented a method for tracking the progress of the needle through the synthetic environment by tracking the base of the needle with the same type of high-precision, hand-held mechanical tracker that tracks the ultrasound probe. We display the projected path of the needle tip in the HMD in the hope that this will allow the physician to guide the needle more accurately to the target tissue area.
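Because only the needle's base is tracked, the tip's projected path must be extrapolated along the shaft direction reported by the tracker. A minimal sketch follows; the function name, the rigid straight-needle assumption, and the sample values are ours, not taken from the system itself.

```python
import numpy as np

def projected_needle_path(base_pos, direction, needle_length, n_samples=10):
    """Sample points along the needle's projected line, from the
    tracked base position out to the expected tip position.
    base_pos: (3,) tracker-reported base position.
    direction: (3,) vector along the needle shaft (normalized here).
    needle_length: shaft length, in the same units as base_pos."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)               # guard against non-unit input
    t = np.linspace(0.0, needle_length, n_samples)
    return np.asarray(base_pos, dtype=float) + t[:, None] * d

# Base at the origin, needle pointing along +z, 150 mm shaft:
path = projected_needle_path([0.0, 0.0, 0.0], [0.0, 0.0, 1.0], 150.0)
tip = path[-1]
```

The sampled points can then be rendered as a line in the synthetic environment, so the physician sees where the tip will travel before and during insertion.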
In conclusion, image-guided surgical procedures may become more popular with the increasing sophistication and availability of the necessary technologies. Likely future targets are deeper nodules within the abdomen, smaller targets such as foreign body fragments in accident and trauma emergencies, and more effective guidance for complex catheterization procedures. With some luck, a future generation of physicians may consider the current method of observing patient imagery out of context on a cart-mounted display just as quaint and amusing as we now consider individuals communicating in the last century by telegraph instead of by telephone and video teleconferencing.