Ramesh Makam, CS Rajan, Tulip Brendon, V Shreedhar, K Saleem, Sangeeta Shrivastava, R Sudarshan, CRJ Prakash Naidu
228 - 230
In this article, we present the results of a pilot study that examined the performance of trainees on the Virtual Reality based BEST-IRIS Laparoscopic Surgery Training Simulator. The performance of experienced surgeons was examined and compared to the performance of residents. The purpose of this study was to validate the BEST-IRIS training simulator, which appeared to be a useful training and assessment tool.
Tele-echography may be an interesting alternative to conventional care when the medical expert is not close to the patient to be examined. In a tele-echography system, the medical expert's proprioception and sense of gesture must be preserved so that the ultrasound images remain synchronized with the physician's motion. This is why we are developing a master-slave system for robotic tele-echography integrating force feedback. Because existing haptic devices are not fully adapted to this application (their workspace is often small and their base is fixed), we have developed a one-dimensional free-hand force feedback device. In this paper, we describe the design and a first evaluation of this new haptic device for robotic tele-echography.
Endoscopic imaging for minimal access surgery has many limitations, including 2D and narrow-angle imaging, limited workspace of the endoscope caused by the fulcrum effect of the body wall, and the presence of the endoscope in the incision, which prevents use of the incision for other instrumentation. We have designed a novel stereoscopic 3D imaging device with 5 DOF and remote control that can be inserted and attached in the body cavity. The device, contained within an 11/16-inch tube, includes two miniature cameras and five small motors that position the cameras to provide a stereoscopic view of the surgical site. During insertion, the cameras are retracted and protected by an outer shell. After the device is fixed within the abdominal cavity, a motor rotates an inner shell to expose the cameras. Once exposed, the cameras can tilt in tandem, translate independently along the axis of the tube, and independently pan. The software controls the cameras to create new views for the surgeon, to move along the adjustable baseline, to verge for stereoscopic viewing, and to potentially track moving organs. We have completed a proof-of-concept design, which includes CAD models and animations of the device, and we are currently building a physical prototype. Once the prototype is completed, we will begin testing it in a surgical mock-up, followed by animal and clinical trials.
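The vergence capability mentioned above follows simple geometry: for a symmetric camera pair, each camera pans inward by the angle whose tangent is half the baseline over the target distance. A minimal illustrative sketch (not the authors' control software; the function name and values are our own assumptions):

```python
import math

def vergence_angle(baseline_mm, target_distance_mm):
    """Pan angle (radians) each camera of a symmetric stereo pair must
    rotate inward so both optical axes intersect at the target."""
    return math.atan2(baseline_mm / 2.0, target_distance_mm)

# e.g. a 10 mm baseline verging on tissue 50 mm away
angle = vergence_angle(10.0, 50.0)
```

With an adjustable baseline, as in the device described, the same relation also tells the controller how much to re-verge after the cameras translate along the tube axis.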
Louise Moody, Alan Waterworth, John G. Arthur, Aleksandar Zivanovic, Edward Dibble
241 - 243
There is limited research considering the usability of medical virtual environments. Usability evaluation is an essential validation phase that considers the extent to which a product achieves its specified goals, in terms of effectiveness, efficiency and satisfaction. A four-stage iterative approach is adopted to enhance usability in the development of a knee arthroscopy training system. This process has drawn attention to issues that may impede system usability, for example non-conformity to platform conventions and poor visibility of system status. The process highlights features that computer scientists can overlook when working closely with a system but that are essential to user acceptance and effective application.
Through definition of a comprehensive tutorial model, the Warwick, Imperial and Sheffield Haptic Knee Arthroscopy Training System (WISHKATS) aims to provide independent, flexible and consistent training and assessment. The intention is to achieve user acceptance by minimising the constraints under which the system can be utilised, as well as demonstrating validity and reliability. System use can be guided either by the feedback offered by the system itself or by a senior surgeon. Objective metrics are defined for performance feedback and formal assessment.
K Moorthy, M Mansoori, F Bello, J Hance, S Undre, Y Munz, A Darzi
247 - 252
WebSET is an Internet-based educational tool that can be used on any standard personal computer. It has been developed by a European collaboration and integrates high-quality multimedia learning materials with VR simulation. The aim of this study was to evaluate the benefit of the VR simulation on the learning of procedure-based psychomotor skills. Subjects were divided into three groups. The group that used the entire package, including the VR simulation, was superior to the group that used only the multimedia component in terms of their procedural skills in the post-training assessment. Both groups performed better than the controls.
Optimisation of parameters for elastic models is essential when comparing models or finding equivalent behaviour in cases where parameters cannot simply be transferred or converted. This is the case for a large range of commonly used elastic models. In this paper we present a general method that optimises parameters based on the behaviour of the elastic models over time.
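Optimising parameters against behaviour over time can be illustrated with a toy damped spring-mass model: simulate a reference trajectory with known stiffness and damping, then recover those parameters by least-squares fitting of the trajectories. This is only a sketch of the general idea with assumed values, not the paper's method:

```python
import numpy as np
from scipy.optimize import least_squares

def simulate(k, c, m=1.0, x_init=1.0, dt=0.01, steps=200):
    """Semi-implicit Euler integration of a damped spring-mass system;
    returns the displacement trajectory over time."""
    x, v = x_init, 0.0
    traj = np.empty(steps)
    for i in range(steps):
        a = (-k * x - c * v) / m   # F = -k*x - c*v
        v += a * dt
        x += v * dt
        traj[i] = x
    return traj

# reference behaviour produced by the 'true' (assumed) parameters
reference = simulate(k=4.0, c=0.5)

def residual(params):
    # mismatch between candidate and reference trajectories over time
    k, c = params
    return simulate(k, c) - reference

fit = least_squares(residual, x0=[1.0, 1.0],
                    bounds=([0.0, 0.0], [100.0, 10.0]))
k_opt, c_opt = fit.x
```

The same pattern applies when the two models are of different types (e.g. different mesh topologies or constitutive laws): only the time-behaviour mismatch is minimised, so no direct parameter conversion is required.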
The purpose of the research conducted was to develop a real-time surgical simulator for preoperative planning of surgery in congenital heart disease. The main problem in simulating procedures on cardiac morphology is the need for both a large degree of detail and high simulation speed. Combined with the demand for physically realistic real-time behaviour, this leads to tradeoffs that are not easily balanced. The LR-Spring Mass model handles these constraints through the use of domain-specific knowledge.
Moad Y. Mowafi, Kenneth L. Summers, James Holten, John A. Greenfield, Andrei Sherstyuk, David Nickles, Edward Aalseth, Ward Takamiya, Stanley Saiki, Dale Alverson, Thomas Preston Caudell
259 - 261
Project TOUCH (Telehealth Outreach for Unified Community Health) is a collaborative effort between University of New Mexico and University of Hawaii. The purpose of the project is to demonstrate the feasibility of using advanced technologies to overcome geographical barriers to delivery of medical education and to enhance the learning process within a group setting. This has led to the design and implementation of a new system that addresses the critical requirements for collaborative virtual environments: consistency, networking, scalability, and system integration. The objective of this study is to evaluate the performance of the collaborative system based on use patterns during Project TOUCH sessions.
This paper proposes a hybrid model mixing geometry and volume data to improve the representation of virtual bodies. This model applies object-oriented data models and rendering techniques to virtual organs, and enables both interactive VR simulation and detailed volume visualization of tissue of interest (e.g. coronary arteries). In addition, a physics-based framework interactively simulates estimated surgical fields, which are used in preoperative discussion. Based on the proposed methods, a VR-based strategic planning system is developed. The system does not require high-cost manual segmentation of the patient dataset and efficiently supports planning of surgical approaches in cardiovascular surgery.
Conventional displays in robotic surgery, such as flat or stereoscopic displays, limit the information obtainable around the target tissue. To support manipulation and safe surgery, this paper proposes a haptic navigation method that enables surgeons to avoid collisions with untouchable regions around the target tissue by producing force feedback through a master manipulator. We also developed an input interface for assigning 3D untouchable regions through a 2D device. A simulator-based experiment demonstrates the effectiveness of the proposed haptic navigation for improving the safety of robotic surgery.
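One common way to realise such collision-avoidance feedback is a virtual spring that pushes the tool tip away once it enters a safety margin around a forbidden region. The sketch below assumes a spherical untouchable region; the shape, units and gain are our own illustrative choices, not the paper's implementation:

```python
import numpy as np

def repulsive_force(tip, center, radius, margin=5.0, k=0.8):
    """Force (N) fed back to the master manipulator that pushes the
    tool tip away from a spherical untouchable region (mm) once the
    tip enters the safety margin around it."""
    offset = tip - center
    dist = np.linalg.norm(offset)
    penetration = (radius + margin) - dist
    if penetration <= 0.0:              # outside the margin: no feedback
        return np.zeros(3)
    direction = offset / dist           # unit vector away from the region
    return k * penetration * direction  # spring-like repulsion
```

Because the force grows continuously with penetration depth, the surgeon feels a gradually stiffening wall rather than an abrupt jolt, which is generally preferred for stable haptic rendering.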
Günter Niemeyer, Katherine J. Kuchenbecker, Raymond Bonneau, Probal Mitra, Andrew M. Reid, Jonathan Fiene, Grant Weldon
272 - 274
Telerobotic systems are revolutionizing minimally invasive surgery (MIS), giving the surgeon complete control over precise dexterous movements of tiny robotic instruments. Such ‘surgery-by-wire’ approaches also create unique opportunities for simulation and training, as the surgeon operates at a computer-mediated haptic console. Possible extensions include offline training in simulated environments and advanced guidance and mentoring during actual operations. To explore these options and further improve telerobotic interfaces, we have constructed a two-handed, fully articulating haptic console that provides force and torque feedback as well as a stereoscopic display.
Traditional open surgical techniques require a surgeon to assume a posture of leaning over the patient with a direct eye-to-hand perspective. As new minimally invasive and remote surgical procedures evolve, the surgeon is not required to maintain the same posture as in open techniques. While more ergonomic postures may be facilitated, some current remote systems have maintained surgeon configurations that are small variants of legacy postures (e.g., maintaining the eye-to-hand perspective). While the legacy configuration may be more familiar to some surgeons, studies have indicated that it can result in excessive fatigue. Robotics and human factors researchers have determined that fatigue due to inefficiencies in operator interfaces leads to longer completion times and increased task execution errors. This paper discusses operator interface design issues and guidelines that are relevant to remote and minimally invasive surgery, and presents one possible operator interface solution based on the compact remote console deployed for environmental restoration and remote handling of hazardous nuclear waste.
This pilot study is the first known in-depth case study of the effectiveness of virtual reality therapy (VRT) as a treatment for Test Anxiety (TA). The subject of the study was a 28-year-old male whose anxiety and avoidance behavior were interfering with his normal academic activities. For treatment, he was placed in a virtual classroom and later in a virtual auditorium. The subject was exposed to six virtual situations of gradually increasing difficulty and rated each situation for discomfort. As a simple measure of anxiety, a modified version of the Subjective Units of Disturbance (SUD) scale was administered every five minutes during exposure. This case study showed VRT to be an effective treatment method for reducing self-reported TA. Symptoms experienced by the subject during VRT sessions were just as real to the subject as actual test-taking and general TA situations; they included increased heart rate, mild dizziness, and headaches. This case study of TA indicates that VRT may be used as an effective treatment method for reducing self-reported anxiety and improving the performance of subjects who suffer from TA.
Tobias Obst, Rainer Burgkart, Eugen Ruckhäberle, Robert Riener
281 - 287
This paper presents an elementary overview of the potential of Multimodal Virtual Reality (MVR) techniques in medical education, e.g. in obstetrics. The study shows how to transfer the concept of MVR from a time-independent environment, e.g. the Munich Knee Joint Simulator, to a time-critical simulation environment such as those found in flight simulators.
The simulator consists of a haptic, a graphical and an acoustic user interface, which are connected to a biomechanical model of the birth process itself and a physiological model of both mother and child, in order to simulate, e.g., a cardiotocograph (CTG). The user can either simply watch an uncomplicated birth or act as the responsible obstetrician, with a variety of treatment options during the delivery, including the most relevant medication and forceps/vacuum extraction. During this practical training, an MVR feedback system assists the trainee and exposes errors, thus allowing faster learning without endangering a real mother and her child. For the first time, this concept makes it possible to transfer stored haptic expert knowledge to the trainee without a tool-based feedback approach.
Sadao Omata, Yoshinobu Murayama, Christos E. Constantinou
288 - 290
Surgical practice would be significantly enhanced by robotic systems incorporating tactile sensors. Current tactile sensor technology consists mainly of strain gauge elements having a limited bandwidth. A novel tactile sensor system has been developed using a piezoelectric transducer (PZT) to simulate the properties of the human hand, for use as a surgical support instrument and a palpation probe. Tactile information is rendered as an audio signal, representing tissue properties in terms of an amplitude- and frequency-modulated signal. Representative data measured from pig brain, lung, pancreas, tongue and liver show that the changes in frequency correspond to tissue stiffness and contact pressure. The technology developed in this new surgical support system has potential applications in virtual systems or robotic tele-medical care.
Abhilash Pandya, Mohammad-Reza Siadat, Greg Auner, Mohammad Kalash, R. Darin Ellis
291 - 297
This paper is focused on a human factors analysis comparing a standard neuronavigation system with an augmented reality (AR) system. We use a passive articulated arm (Microscribe, Immersion technology) to track a calibrated end-effector-mounted video camera. In real time, we superimpose the live video view with the synchronized graphical view of CT-derived segmented objects of interest within a phantom skull. Using the same robotic arm, we have developed a neuronavigation system able to show the end-effector of the arm on orthogonal CT scans. Both the AR and the neuronavigation systems have been shown to be accurate to within 3 mm. A human factors study was conducted in which subjects were asked to draw craniotomies and answer questions to gauge their understanding of the phantom objects. The study included 21 subjects and indicated that subjects performed faster, more accurately, and with fewer errors using the augmented reality interface.
This pilot study compares the differences in learning outcomes when students are presented with either an active (student-centered) or passive (teacher-centered) virtual reality-based anatomy lesson. The "active" lesson used UCSD's Anatomic VisualizeR and enabled students to interact with 3D models and control presentation of learning materials. The "passive" lesson used a digital recording of an anatomical expert's tour of the same VR lesson played back as a QuickTime™ movie. Subsequent examination showed that recall and retention of the studied anatomic objects were comparable in both groups. Issues underlying these results are discussed.
Francesco Pinciroli, Marco Masseroli, Livio A. Acerbo, Stefano Bonacina, Roberto Ferrari, Mario Marchente
301 - 303
This paper presents a low-cost software platform prototype supporting health care personnel in retrieving patient referral multimedia data. This information is centralized on a server machine and structured using a flexible eXtensible Markup Language (XML) Bio-Image Referral Database (BIRD). Data are distributed on demand to requesting clients over an Intranet network and transformed via eXtensible Stylesheet Language (XSL) to be visualized in a uniform way on standard browsers. The core server software has been developed in the PHP Hypertext Preprocessor scripting language, which is very versatile and useful for crafting a dynamic Web environment.
Sonia Pujol, Matthieu Pecher, Jean-Luc Magne, Philippe Cinquin
310 - 312
Endovascular surgery provides a minimally invasive solution for the treatment of aortic aneurysms. Fluoroscopic guidance involves X-ray exposure and loss of spatial information. We have developed a navigation system allowing real-time visualisation of the endovascular tools in a 3D model of the vessels without any radiation exposure. A modified endoprosthesis is equipped with a magnetic sensor tracked by the Aurora™ magnetic localizer. The registration step uses 2.5D ultrasonography to replace pre-operative CT data in the operating room reference frame. The virtual reality-based navigation system shows the location of the endoprosthesis inside a 3D CT model of the aorta. The endovascular procedure thus benefits from reduced radiation exposure.
In this paper we discuss a new approach to generating 3D models for a simulation system for angioplasty training. The underlying data for these models are obtained from angiograms that are captured during routine interventions in cardiology. For the extraction of the arteries we use a non-linear classifier with features based on vesselness information (using a scale-space approach), the gray value, and motion information of the arteries. As a result, we correctly find 80% of the arteries in the image, with 4% of pixels incorrectly classified as arteries.
These models serve a virtual catheter laboratory that is based on real instruments such as catheters, wires, control instruments for the X-ray, syringes, and pressure pumps for the balloon catheter, but in place of a patient an input instrument is used. This instrument sends positional and pressure data to a PC that simulates the patient. The cardiologist then obtains visual and haptic feedback as if operating on a real patient.
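The vesselness feature referred to above is typically computed from the eigenvalues of the image Hessian, as in Frangi's filter. A single-scale 2D sketch is shown below; the paper combines such a feature with gray value and motion information in a non-linear classifier and uses a scale-space of multiple sigmas, so the parameter values here are illustrative assumptions only:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def vesselness2d(image, sigma=2.0, beta=0.5, c=15.0):
    """Single-scale Frangi-style vesselness for bright tubular
    structures in a 2D image (x = axis 1, y = axis 0)."""
    img = image.astype(float)
    # scale-normalised second derivatives (Hessian components)
    Hxx = gaussian_filter(img, sigma, order=(0, 2)) * sigma**2
    Hyy = gaussian_filter(img, sigma, order=(2, 0)) * sigma**2
    Hxy = gaussian_filter(img, sigma, order=(1, 1)) * sigma**2
    # eigenvalues of the symmetric 2x2 Hessian at every pixel
    tmp = np.sqrt((Hxx - Hyy) ** 2 + 4.0 * Hxy**2)
    l1 = 0.5 * (Hxx + Hyy + tmp)
    l2 = 0.5 * (Hxx + Hyy - tmp)
    swap = np.abs(l1) > np.abs(l2)          # enforce |l1| <= |l2|
    l1, l2 = np.where(swap, l2, l1), np.where(swap, l1, l2)
    Rb = np.abs(l1) / (np.abs(l2) + 1e-10)  # blob-vs-line measure
    S = np.sqrt(l1**2 + l2**2)              # overall structure strength
    v = np.exp(-(Rb**2) / (2 * beta**2)) * (1 - np.exp(-(S**2) / (2 * c**2)))
    v[l2 > 0] = 0.0                         # bright vessels have l2 < 0
    return v

# a bright vertical 'vessel' on a dark background
img = np.zeros((64, 64))
img[:, 30:33] = 100.0
v = vesselness2d(img)
```

The response is high along the synthetic vessel and near zero elsewhere; taking the maximum of this response over several sigmas yields the scale-space vesselness used as a classifier feature.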
Mark E. Rentschler, Adnan Hadzialic, Jason Dumpert, Stephen R. Platt, Shane Farritor, Dmitry Oleynikov
316 - 322
Laparoscopic techniques have allowed surgeons to perform operations through small incisions. However, the benefits of laparoscopy are still limited to less complex procedures because of losses in imaging and dexterity compared to conventional surgery. This project is developing miniature robots to be placed within the abdominal cavity to assist the surgeon. These remotely controlled in vivo robots provide the surgeon with an enhanced field of view from arbitrary angles as well as provide dexterous manipulators not constrained by small incisions in the abdominal wall.
Robert Riener, Bundit Sae-Kee, Martin Frey, Rainer Burgkart
323 - 326
Force-torque measuring input devices can significantly enhance the performance of classical simulation environments that are, for example, based on pure passive phantoms. Such devices allow not only the determination of force/torque amplitude and direction but also the contact point on the phantom. The force/torque information can be displayed visually or acoustically, drive a realistic graphical animation environment or it can be saved and compared with a haptic library comprising the force/torque history of any medical specialist. In this paper the technical principle is exemplified by an interactive human torso. A plastic phantom model of a human torso is instrumented with a 6-degree-of-freedom force/torque sensor, thus, allowing an intuitive and interactive use for education of human anatomy.
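Recovering the contact point from a 6-degree-of-freedom reading rests on the relation tau = r x F: the measured torque constrains the contact point to the line of action of the measured force, and intersecting that line with the known phantom surface gives the touch location. A minimal sketch of the line recovery (an illustration of the geometry, not the authors' implementation):

```python
import numpy as np

def contact_line(force, torque):
    """Given a 6-DOF sensor reading (force F and torque tau about the
    sensor origin), return a point on the line of action of F and its
    direction. Every contact point r satisfies tau = r x F, so the
    true contact point lies on this line."""
    F = np.asarray(force, float)
    tau = np.asarray(torque, float)
    # F x (r x F) = r*|F|^2 - F*(F.r), so this is the point of the
    # line closest to the sensor origin
    point = np.cross(F, tau) / np.dot(F, F)
    direction = F / np.linalg.norm(F)
    return point, direction
```

In practice the remaining ambiguity along the line is resolved by the phantom's surface model: the contact point is where the recovered line first pierces the torso geometry.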
Bundit Sae-Kee, Robert Riener, Martin Frey, Thomas Pröll, Rainer Burgkart
327 - 332
In this paper, we propose a new interactive simulation system for dental treatment training. The system comprises a virtual reality environment and a force-torque measuring device to enhance the capabilities of a passive phantom of tooth anatomy in dental treatment training processes. The measuring device is connected to the phantom, and provides essential input data for generating the graphic animations of physical behaviors such as drilling and bleeding. The animation methods of those physical behaviors are also presented.
This system is not only able to enhance interactivity and accessibility of the training system compared to conventional methods but it also provides possibilities of recording, evaluating, and verifying the training results.