
Ebook: Interactive Technology and the New Paradigm for Healthcare

This work presents the state of the art in Virtual Reality as applied to Medicine. Interactive technology, used in many research and development programs, can be applied to health care through robotics, computer vision, simulation, artificial intelligence, image manipulation and storage, data gloves, man-machine interfaces, etc. The Health Telematics Application Programme, for example, is advancing virtual reality and its enabling technologies (simulation, visualization and robotics) in health care services for patients, the elderly and persons with disabilities. This book addresses the following topics from the end-user's perspective: technology transfer, telerobotics, telemedicine, education and training, and virtual reality.
We are in the midst of an Awakening, as Medicine makes the transition from the Industrial Age to the Information Age. We have realized that most of the information a health care provider needs can be acquired in electronic form (images, scans, vital signs, the medical record). And with the emergence of teleoperation, we can leverage the power of advanced information tools: software (AI, 3-D visualization and decision support), hardware (high-performance computing) and networking (the information superhighway). All this will extend the skills of the health care provider beyond mere physical limitations, enabling a quality of care previously considered unachievable. Better access will be provided through remote telemedicine. Lower cost will be achieved through flexible manufacturing, just-in-time inventory, and best-in-class business management.
Clearly, now is the most exciting time in the history of Medicine, and the researchers and innovators of today are the prophets of the next generation of health care.
Col. Richard M. Satava, MD, FACS
Arlington, Virginia
1991: Late summer in the San Diego Supercomputer Center seminar room: slightly myopic and self-centered, I am in the front row. Kevin Kunzelman, Cris Baray, Art Grapa and I had just completed a great deal of work on a simulator development system and a language called SLANG. I was ready to sit and listen to talks about multi-media systems. Multimedia was ultra-hip on campus among a small group of innovators, who would only later experience “dean-ly” graces of recognition. Helene Hoffman introduced me to the woman seated behind me: Karen Morgan. We talked computers and education, computers and research, computers and entertainment, Arnie Mandell, psychiatry, computers and medicine. “I'm going to organize a conference, Medicine Meets Virtual Reality. Do you want to come and talk?” “Yeah, sure,” I said.
1992: That year we invented Cybermouse, and I was going to talk about it at MMVRII in January '94. At MMVRI, I had gone immersive, pretending to give a rectal exam on Channel 8. A virtual one, for sure, using the inside of a pepper in place of a real rectum. A good anchorperson, Mitch Duncan hits me with a sudden question: “What do you think is the future of virtual reality applications in Psychiatry?” “Well, uh, uh, the field is actually still too young to think about that.” Next time I see Karen, I ask, “What do you think is the future of virtual reality applications in psychiatry?” “Education, and ... why don't we do a workshop about VR and Psychiatry?”
1994: The year of the Web. Hype-r-multi-media on the Internet. At MMVRIII, I am going to talk about CyberMensch, an Internet tool for planning clinical trials. In 1973, I was still feeding punched cards and tapes to gigantic calculators, yet the transition to 1994, the Web, and VR Markup Language was as smooth as my generation's going from marihuana to Prozac, from Gauloises to Diet Coke. “Reality Bytes”, the MMVRIII workshop on Psychiatry and Virtual Reality, is about VR power users: folks like the military, arcaders, and auto-eroticists, whom you expect to be immersed for 8 hours or more. Besides frying brains, can it treat behavior disorders? When we last talked at the Good Earth in Palo Alto, Ralph Lamson was hopeful. We agreed that the next step is to find a scientific basis, ideally expressed in physiologic measures of brain activity under the influence of VR. Before the FDA declares VR a drug.
Thanks for the journey, Karen.
Hans B. Sieburg, PhD
La Jolla, California
Interactive technology is the subject of several different European research and development programs. Basic programs have dealt with robotics, computer vision, simulation, artificial intelligence, image manipulation and storage, data gloves, man-machine interfaces, etc. This has served as the basis for applications in specific areas, such as health care. Added to the results of programs on advanced telecommunications, the outcome is a solid platform for further work to be supported by the European Union and its member states: multimedia technology and the information superhighway in support of the new Information Society. The Health Telematics Application Programme is one beneficiary of this myriad of modern possibilities. Driven by user needs, and taking legal, standardization, and regulatory aspects into consideration, the Programme will be enhanced by incorporating VR techniques in its next phase, and it will contribute to the further development of virtual reality and its integration into health care and other services for patients, the elderly, and persons with disabilities.
Participation in and contribution to this conference seem natural for us as representatives of the European Union; they make the conference a truly international event from which all parties will benefit. Thanks to advances in telecommunications, in January 1996 the fourth Medicine Meets Virtual Reality conference will be held simultaneously in Brussels and San Diego. Welcome to the Information Age!
Rudy Mattheus, MSc
Jens P. Christensen, MSc MBA
Brussels, Belgium
Virtual reality is explored as a medical and patient education and evaluation tool. The case study presented concerns a person with altered perceptual capabilities due to traumatic brain injury sustained in a car accident.
MAGNOBRAIN is an applied research project which has been designed to integrate known radiological, anatomical and electrophysiological information about the human brain for use in localization of brain activity. The main aspects of MAGNOBRAIN include a mathematical approach to source localization which makes use of a priori information about the brain, the building of the Brain DataBase (BDB) which contains a multitude of data about the brain, and the development of software tools for Magnetic Resonance Image processing and for 3D visualization. It is expected that these tools will evolve into a virtual medical environment which will allow interactive navigation through the brain, as well as easy and inexpensive access to a huge amount of medical information.
East Carolina University is performing telemedicine consultations to the largest prison in North Carolina and two rural hospitals. Originally established to provide only emergency consultations for trauma cases, this network's usage has expanded to include 31 School of Medicine physicians from 15 medical disciplines. Physicians see and talk to the patients via the telemedicine link and then diagnose and prescribe medications, when necessary. Practitioners also have access to a digital stethoscope, a graphics camera and a miniature, handheld dermatology camera to aid in patient examinations. The working model developed for the prison system is now being expanded to six rural hospitals and a large naval hospital. A unique aspect of the ECU program is the hybrid communications network and hardware which have been integrated. With the addition of an ATM network, this will be the only telemedicine program in the world operating over integrated T1 lines, microwave, and ATM links.
This paper reports the results of three studies, each of which investigated the sense of presence within virtual environments as a function of visual display parameters. These included the presence or absence of head tracking and the presence or absence of stereoscopic cues in the design of the visual display. In each study, subjects were required to navigate a virtual environment and to complete a questionnaire designed to ascertain the level of presence experienced by the participant within the virtual world. The results of the studies indicated that the reported level of presence was significantly higher when head tracking and stereoscopic cues were provided, but that the addition of binocular disparity did not increase the rated realism of the virtual environment's appearance.
This paper describes the possibilities of Virtual Environments in the field of orthopaedic surgical planning; the example of osteotomy operations is discussed in detail. Traditionally, planning in orthopaedics is carried out in 2D: a surgeon may draw directly onto an X-ray, neglecting the third dimension entirely. The IAO, together with the Orthopaedic University Clinic Heidelberg (OUK), has developed an application for the planning of osteotomy operations. The surgeon, with the aid of tracked shutter-glasses and a 6D input device, is able to view and rotate the femur and the hip joint. The user can view a simulated X-ray of the bones, determine the CCD angle in 2D, and allow the computer to perform the osteotomy automatically. The user is also able to view a section through the hip joint and thus offer an opinion on the gap in the joint. After the relevant angles and the height for the osteotomy are entered, the computer carries out the simulation and displays all the data on the screen.
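The CCD (caput-collum-diaphyseal) angle mentioned above is the angle between the femoral neck axis and the femoral shaft axis. The following is a minimal sketch of that computation with invented landmark coordinates; it is not the IAO/OUK code, merely an illustration of the geometry.

    #include <cmath>
    #include <cstdio>

    // Minimal sketch: the CCD angle as the angle between the femoral neck
    // axis and the femoral shaft axis, each defined by two landmark points.
    struct Vec3 { double x, y, z; };

    static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    // Angle in degrees between the axes p0->p1 and q0->q1.
    static double axisAngleDeg(Vec3 p0, Vec3 p1, Vec3 q0, Vec3 q1) {
        Vec3 u = sub(p1, p0), v = sub(q1, q0);
        double c = dot(u, v) / std::sqrt(dot(u, u) * dot(v, v));
        if (c > 1.0) c = 1.0;
        if (c < -1.0) c = -1.0;                 // clamp rounding error
        return std::acos(c) * 180.0 / 3.14159265358979323846;
    }

    int main() {
        // Hypothetical landmarks in mm, as if picked with the 6D input device.
        Vec3 neckBase {0, 0, 0},  headCentre  {40.5, 29.4, 0};   // neck axis
        Vec3 shaftTop {0, 0, 0},  shaftBottom {0, -400, 0};      // shaft axis
        std::printf("CCD angle: %.1f degrees\n",
                    axisAngleDeg(neckBase, headCentre, shaftTop, shaftBottom));
        return 0;   // prints about 126 degrees, a typical adult value
    }

The same routine works on the 2D points of the simulated X-ray projection by setting the z components to zero.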
The ergonomic evaluation of instruments in endoscopy has developed into an important research area owing to the many new operative procedures in minimally invasive surgery. The use of Virtual Reality (VR) elements, in this case especially the Dataglove, as a measuring tool; the generation of virtual intra-abdominal working areas; and the virtual design of the working end of the instruments offer distinct advantages and open up new possibilities for the ergonomic evaluation of instruments. The paper describes the possibilities of, and experiences with, using VR elements such as the Dataglove as a means of measurement, and how this device has been integrated into the assessment of the mechanical instruments used in minimally invasive surgery. The paper concludes with a description of plans to expand the system.
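As an illustration of the glove-as-measuring-tool idea, here is a minimal sketch of a two-point calibration that maps a raw flex-sensor reading to a joint angle. The sensor name, readings, and the 0-90 degree range are assumptions for illustration, not the authors' procedure or the actual glove driver.

    #include <cstdio>

    // Hypothetical two-point calibration for one glove flex sensor: raw
    // readings recorded at a known extended (0 degree) and flexed
    // (90 degree) posture are mapped linearly to a joint angle.
    struct SensorCal {
        int rawExtended;   // raw reading with the finger straight (0 deg)
        int rawFlexed;     // raw reading with the finger fully bent (90 deg)
    };

    static double jointAngleDeg(const SensorCal& c, int raw) {
        return 90.0 * (raw - c.rawExtended)
                    / double(c.rawFlexed - c.rawExtended);
    }

    int main() {
        SensorCal indexPip {112, 840};   // assumed calibration readings
        int raw = 476;                   // assumed live sensor value
        std::printf("index PIP flexion: %.1f degrees\n",
                    jointAngleDeg(indexPip, raw));
        return 0;                        // prints 45.0
    }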
This paper describes the development of the “Delivery Room of the Future,” a project which utilizes virtual reality technology to enhance the clinical, educational and research functionalities of birthing rooms. Important features include the use of miniature wireless sensors for physiologic measurements, reconfigurable “virtual instruments” for information display which allow three-dimensional visualization of data, and a labor process monitor which represents the progress of the fetus down the birth canal using computer graphics.
The Fraunhofer Institute for Industrial Engineering has developed a concept of distributed rehabilitation to overcome the difficulties of introducing Virtual Reality into rehabilitation. The concept will be realised in three steps. First, Virtual Reality workplaces will be implemented in large rehabilitation centres for testing and improvement. In the second step, sub-centres will be connected via a network and integrated into orthopaedic, physiotherapy, or medical gymnastics departments. The primary advantage of this solution is closeness to the patient, although maintenance and care of the system will be handled centrally. Step three is to decentralise applications so that they can be used by patients at home.
Commercially available force sensors were investigated with respect to their applicability as force transducers for measuring grasping and pinching forces during hand strength evaluation procedures. The sensors investigated were force-sensitive resistors, ultrasonic force sensors and miniature strain gauges. The ultrasonic force sensor was found to have the most suitable static and dynamic characteristics for our application. An electronic pinchmeter and a Jamar dynamometer, the devices currently used for evaluation of grip and pinch strength, were also investigated. A virtual reality graphics simulation was constructed in order to experiment with hand evaluation procedures.
The AquaThought Foundation is a privately funded research organization dedicated to the exploration of human-dolphin interaction. Since 1989, AquaThought has studied the neurological impact of close contact with dolphins on human subjects and the related healing phenomena. Our research in neurological imaging has led to the development of MindSet, a low-cost neuro-mapping electroencephalograph instrument which makes advanced EEG research affordable. In partnership with the Cancun Convention Center, a new virtual reality theme park, we are developing the “Cyberfin Dolphin Encounter Immersive Simulator” which will bring virtual dolphin contact to a massive audience.
Abstract and Purpose: Just as flight simulators have been valuable to aviators, VR surgical simulators will someday be valuable in medical education, both to reduce training costs and to provide enhanced physician training [1]. Current surgical simulators rely on real-time volume rendering techniques and run on hardware costing several hundred thousand dollars. It is currently impossible for these systems to become widespread, as most physicians and hospitals own PCs which cost 100 times less. We previously created interactive software using stereo-paired images of anatomical dissections to show 3-D structural relationships in gross anatomy more clearly [2]. Subsequently we realized that most surgical procedures could be represented using the “peeling away layers” metaphor [3]; in our new program, the mouse is used as a scalpel or suture to cut away or repair layers of the abdominal wall for a hernia repair.
Materials and Methods: Stereo-paired images of surgical procedures were photographed using a Stereo Realist camera, then digitized for display at 640x480 resolution with 256 colors. Software for displaying the stereo pairs and for cutting away layers was created in Borland C++. We used a 486-66 PC with an 8-bit SVGA graphics card, although the program is compatible with a wide variety of PCs. SEGA shutter-lens glasses with an RS-232 serial adapter (available for ~$100) are used to create the illusion of 3D: the idea is to display the left-eye image and close the right lens of the glasses, then display the right-eye image and close the left lens of the glasses. Our algorithm, sketched in compilable form after the list, is:
• Detect and initialize the graphics card;
• Load two GIF files (the left-eye and right-eye images) into memory;
• Loop continuously:
  • Flip between the left-eye and right-eye images;
  • Toggle the SEGA glasses in step with the display;
  • If a mouse button is clicked, create a new pair of images which displays the new surgical field.
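A minimal compilable sketch of this loop follows. The hardware access is stubbed out, since the original ran under DOS, where the glasses would typically be toggled through the RS-232 adapter's control lines and the images flipped on SVGA display pages; all function names here are illustrative, not the authors' actual routines.

    // Sketch of the stereo display loop with stubbed hardware access.
    enum Eye { LEFT = 0, RIGHT = 1 };

    static void showPage(Eye)       { /* blit this eye's GIF to the screen */ }
    static void toggleShutter(Eye)  { /* open this lens, close the other   */ }
    static bool mouseClicked()      { return false; /* poll mouse driver   */ }
    static void loadNextLayerPair() { /* stereo pair for the next layer    */ }

    int main() {
        Eye eye = LEFT;
        for (;;) {                       // the real program loops until exit
            showPage(eye);               // flip between left and right images
            toggleShutter(eye);          // keep the glasses in sync
            if (mouseClicked())          // "scalpel"/"suture" action
                loadNextLayerPair();     // reveal the new surgical field
            eye = (eye == LEFT) ? RIGHT : LEFT;
        }
    }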
Results: In keeping with the “peeling away layers” metaphor, our program displays a sequence of five stereo pairs representing the skin, muscle layers, and spermatic cord in a typical hernia repair. The mouse is used as a scalpel to cut away a layer, or as a suture to repair a layer of the abdominal wall.
Conclusion: Although the definition of VR is fuzzy, most agree that the use of a headset to provide enhanced 3D visualization is an essential element. We have created a very low-cost, interactive VR application for the IBM PC, which provides a much clearer perception of surgical anatomy than a surgical atlas while remaining comparable in price. We believe that this program could be used to teach hundreds of other surgical procedures. Through VESA programming, this software could be distributed for a wide variety of PC/graphics card combinations. We are excited at the recent availability of low-cost HMDs (head-mounted displays) for the PC, which could easily be adapted to substitute for the SEGA glasses for use with our program.
In producing realistic, animatable models of the human body, we see much to be gained from developing a functional anatomy that links the anatomical and physiological behavior of the body through fundamental causal principles. This paper describes our current Finite Element Method implementation of a simplified lung and chest cavity during normal quiet breathing and then disturbed by a simple pneumothorax. The lung model interacts with the model of the chest cavity through applied forces. The models are modular, and a second lung and more complex chest wall model can be added without disturbing the model of the other lung. During inhalation, a breathing force (corresponding to exertion of the diaphragm and chest wall muscles) is applied, causing the chest cavity to expand. When this force is removed (at the start of exhalation), the stretched lung recoils, applying pressure forces to the chest wall which cause the chest cavity to contract. To simulate a simple pneumothorax, the intrapleural pressure is set to atmospheric pressure, which removes pressure forces holding the lung close to the chest cavity and results in the lung returning to its unstretched shape.
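To make the force interplay concrete, here is a deliberately crude one-degree-of-freedom sketch: a lumped spring model, not the authors' Finite Element implementation, with all constants invented. The chest wall is a damped mass, the stretched lung pulls it inward, and setting coupled to false mimics the simple pneumothorax by removing the lung's recoil force from the chest wall.

    #include <cstdio>

    // Lumped, one-degree-of-freedom caricature of breathing mechanics.
    int main() {
        double x = 0.0, v = 0.0;        // chest-wall displacement, velocity
        const double m = 1.0, damping = 4.0;
        const double kChest = 10.0;     // chest-wall elasticity (assumed)
        const double kLung  = 20.0;     // lung recoil stiffness (assumed)
        const bool coupled  = true;     // set false for the pneumothorax case
        const double dt = 0.001;

        for (int i = 0; i <= 4000; ++i) {
            double t = i * dt;
            double breathing = (t < 2.0) ? 15.0 : 0.0;  // inhale, then exhale
            double recoil = coupled ? -kLung * x : 0.0; // lung pulls wall in
            double f = breathing + recoil - kChest * x - damping * v;
            v += (f / m) * dt;
            x += v * dt;
            if (i % 500 == 0)
                std::printf("t = %.1f s  chest displacement = %.3f\n", t, x);
        }
        return 0;
    }

With the lung coupled, the chest settles at a smaller displacement during inhalation and recoils to rest on exhalation; uncoupled, the same breathing force expands the chest further, echoing the pneumothorax behavior described above.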
Preliminary experience using a low-cost interactive image-directed neurosurgical system in epilepsy surgery is described. The device was made with readily available technology and uses rudimentary artificial reality concepts. It incorporates a partially immersive head-mounted display, bridging contemporary image-directed ideas with virtual reality notions. The system has been successfully used in twenty epilepsy surgery cases to position subdural electrode arrays, place burr holes, locate functional cortex and coregister electrodes. The system has improved surgery by locating lesions and surgical landmarks. Analysis of coregistration error and the clinical utilization of this technology are discussed. Also described are the mathematical fundamentals correlating virtual reality image space to the operating field. Interactive image-directed techniques are a hybrid form of virtual reality which may best be described as augmented reality.
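The image-space-to-operating-field correlation is conventionally written as a rigid-body transform p_field = R p_image + t, with R and t estimated from paired fiducials, for example by a least-squares fit. The sketch below, with invented numbers, applies such a transform and reports the residual at a check fiducial; it illustrates the general technique only, not the paper's own derivation.

    #include <cmath>
    #include <cstdio>

    // Apply a rigid image-to-field transform and measure coregistration error.
    struct Vec3 { double x, y, z; };

    static Vec3 apply(const double R[3][3], Vec3 t, Vec3 p) {
        return { R[0][0]*p.x + R[0][1]*p.y + R[0][2]*p.z + t.x,
                 R[1][0]*p.x + R[1][1]*p.y + R[1][2]*p.z + t.y,
                 R[2][0]*p.x + R[2][1]*p.y + R[2][2]*p.z + t.z };
    }

    int main() {
        // Assumed transform: 90-degree rotation about z plus a translation.
        const double R[3][3] = {{0, -1, 0}, {1, 0, 0}, {0, 0, 1}};
        const Vec3 t {100.0, 50.0, -20.0};

        Vec3 imageFiducial {10.0, 5.0, 0.0};     // marker in image space (mm)
        Vec3 fieldFiducial {95.2, 60.1, -19.8};  // same marker, digitized

        Vec3 p = apply(R, t, imageFiducial);
        double dx = p.x - fieldFiducial.x, dy = p.y - fieldFiducial.y,
               dz = p.z - fieldFiducial.z;
        std::printf("predicted (%.1f, %.1f, %.1f), residual %.2f mm\n",
                    p.x, p.y, p.z, std::sqrt(dx*dx + dy*dy + dz*dz));
        return 0;
    }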
Triage is the assessment of the physical condition of casualties with limited staff and equipment. The critical factor in handling a mass-casualty situation is time. The focus is on the quick and accurate assessment of the physical condition of casualties and the application of life-saving treatment to those who stand a chance of surviving their injuries under the prevailing circumstances; the survival and revitalisation of a group of casualties has a higher priority than the treatment of an individual casualty. In a military triage training programme, expert medical knowledge is conveyed to non-specialists, who should then be able to assess casualty conditions. Such a programme can be facilitated with computer-based training equipment and training in a Virtual Environment, allowing flexible “on-demand” training focused on a military deployment. In this paper we present Virtual Environment technology, including actuator and sensor technology, elaborate on the key building blocks for triage training in a Virtual Environment, and conclude with a future perspective.
The length and rigidity of the instruments used in minimally invasive surgery make it very difficult to handle these instruments with a fine sense of touch. This paper describes a novel system which determines the distribution of pressures and the handling forces (X, Y and Z axes) at the distal end of a laparoscopic forceps, represents them graphically on the PC screen and plotter, and generates tactile feedback to the fingertips of the surgeon. The system makes it possible to measure the handling forces created during surgery and to transmit differences in tissue hardness directly to the fingertips of the surgeon.
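As a hypothetical illustration of the sensing-to-feedback path, not the authors' implementation, the sketch below scales an assumed distal pressure profile into drive levels for fingertip actuators; the array size and full-scale value are invented.

    #include <cstdio>

    // Map an assumed jaw pressure profile to 8-bit tactile actuator levels.
    int main() {
        const int N = 8;                       // assumed number of elements
        double jawPressure[N] = {0.2, 0.9, 3.1, 4.0, 3.4, 1.2, 0.4, 0.1};
        const double fullScale = 5.0;          // sensor full scale (assumed)

        for (int i = 0; i < N; ++i) {
            double level = jawPressure[i] / fullScale;
            if (level > 1.0) level = 1.0;      // saturate at full drive
            int duty = int(level * 255.0 + 0.5);   // 8-bit actuator drive
            std::printf("element %d: pressure %.1f -> duty %d\n",
                        i, jawPressure[i], duty);
        }
        return 0;
    }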
The paper describes the concept of the Responsive Workbench (RW). This virtual environment was designed to provide an adequate human-machine interface for end users who work at desks, workbenches, and tables, such as physicians, architects and scientists. We attempt to construct a task-driven interface for this class of users by working in an interdisciplinary team from the beginning.
The system is explained and evaluated through three medical applications: medical education, a cardiological tutorial with a simulation system for ultrasonographic examinations of the heart, and surgery planning.
Virtual objects are located on a real “workbench”. The objects, displayed as computer-generated stereoscopic images, are projected onto the surface of a table. The participants operate within a non-immersive virtual environment: a “guide” uses the virtual environment while several observers can also watch events by wearing shutter glasses. Depending on the application, various input and output modules can be integrated, such as motion, gesture and speech recognition systems, reflecting the general trend away from the classical multimedia desktop interface.
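The core geometric idea of such a tabletop display can be sketched as follows: for each eye position, derived from the tracked head, a virtual point is projected onto the table plane along that eye's line of sight. This is a minimal illustration with assumed coordinates, not the Responsive Workbench source.

    #include <cstdio>

    // Project a virtual point onto the table plane (z = 0), once per eye.
    struct Vec3 { double x, y, z; };

    // Intersection of the ray eye -> p with the table plane z = 0.
    static Vec3 projectToTable(Vec3 eye, Vec3 p) {
        double s = eye.z / (eye.z - p.z);   // ray parameter at z = 0
        return { eye.x + s * (p.x - eye.x),
                 eye.y + s * (p.y - eye.y),
                 0.0 };
    }

    int main() {
        Vec3 head {0.0, -0.4, 0.6};         // tracked head position (m)
        const double ipd = 0.065;           // interpupillary distance (assumed)
        Vec3 leftEye  {head.x - ipd / 2, head.y, head.z};
        Vec3 rightEye {head.x + ipd / 2, head.y, head.z};

        Vec3 p {0.1, 0.2, 0.15};            // virtual point 15 cm above table
        Vec3 l = projectToTable(leftEye, p), r = projectToTable(rightEye, p);
        std::printf("left image (%.3f, %.3f)  right image (%.3f, %.3f)\n",
                    l.x, l.y, r.x, r.y);
        return 0;   // the two projections differ, producing the stereo cue
    }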
A new method of treatment of urinary incontinence is described. A tape recorder with two outlets generates musical stimuli. One outlet delivers the unconverted musical stimuli to the patient's ear; the other conveys the musical stimuli to a converter, where the musical pulses are converted into electrical stimuli which are then applied to the patient's anus. At first, the stimuli are applied simultaneously to the ear and anus, and the anal pressure response is recorded. After conditioning takes place, music is applied to the patient's ear, stimulation of the anus is disconnected, and only the anal pressure response is recorded. The patients become conditioned enough to generate a good anal response to aural stimulation alone.
The University of California at San Diego (UCSD), School of Medicine is undertaking a multi-year project to create an educational computing environment which integrates elements of Virtual Reality, Multimedia, and communications technologies. The goal of this endeavor, named the Virtual Reality-Multimedia Synthesis project, is to create next-generation educational tools which extend the flexibility and effectiveness of medical teaching, promote the development of lifelong learning, and gain acceptance within the mainstream academic community. The initial phase, a feasibility study which is now complete, resulted in the production of a mock-up and video which communicated this vision and enabled feedback from potential end-users. Phase 2, now underway, involves the development and evaluation of a prototype system. Over the next three to four years, a working version is expected to be available for expanded testing in venues beyond UCSD.
The MONSUN (Manipulator Control System Utilizing Network Technology) concept allows telemanipulators to be controlled via Local Area Networks (LANs). This concept has been implemented and successfully tested for industrial and medical applications, and is currently used at KfK as the basis for the ARTEMIS project (Advanced Robotics and TElemanipulation System for MIS), a telepresence system for minimally invasive surgery now under development. The shortcoming of the LAN-based version of MONSUN (10 Mbit/s Ethernet is currently used) is that, while the LAN can be used to close the control loop, transmission of video and audio signals has to rely on additional transmission media. ATM (Asynchronous Transfer Mode) based high-speed communication systems promise an “all-in-one” solution, at least for in-house applications. The system concept of an ATM-based local communication system for telesurgery is presented and the resulting performance characteristics are discussed.
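A back-of-the-envelope comparison makes the bandwidth argument concrete. All figures below are assumptions chosen only for illustration, not MONSUN's actual traffic: a stream of small control packets fits comfortably in 10 Mbit/s Ethernet, while even one uncompressed video stream does not.

    #include <cstdio>

    // Rough bandwidth estimate: control loop vs. one uncompressed video stream.
    int main() {
        // Control loop: small state packets at a high, fixed rate (assumed).
        const double pktBytes = 64.0, rateHz = 500.0;
        double controlMbps = pktBytes * 8.0 * rateHz / 1e6;      // 0.256

        // One uncompressed video stream (assumed format).
        const double w = 640, h = 480, bitsPerPixel = 16, fps = 25;
        double videoMbps = w * h * bitsPerPixel * fps / 1e6;     // 122.88

        std::printf("control loop: %.3f Mbit/s, raw video: %.1f Mbit/s "
                    "(vs. 10 Mbit/s Ethernet)\n", controlMbps, videoMbps);
        return 0;
    }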
This paper discusses the technical design considerations of high-fidelity haptic interface hardware for medical virtual reality. Though few such systems are commercially available, this type of hardware is critical for the medical VR industry, because many medical procedures rely heavily on haptic cues. A brief listing of new off-the-shelf solutions from Immersion Corp. is included.