Development of an Augmented Reality Force Feedback Virtual Surgery Training Platform

Ruei-Jia Chen, Hung-Wei Lin, Yeong-Hwa Chang, Chieh-Tsai Wu, Shih-Tseng Lee

Abstract


To facilitate the training of new medical personnel, this study developed a virtual surgery training platform with force feedback, first establishing a virtual environment and then implementing visual and tactile interaction. The system's augmented reality function modules cover the construction of the augmented reality environment, image loading and model building, and force feedback, together with the required collision detection, object parameter settings, and controller design. Besides combining the virtual environment with force feedback and providing diverse force feedback modes, the project overcomes the single-point sensor restriction of most force feedback hardware and implements a tactile cutting function. Building on these force feedback modules, the project further applies the principle of conservation of energy to the design of an energy estimator and controller, completing a stable virtual surgery training platform.
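The energy-based stabilization mentioned above corresponds to time-domain passivity control: an observer accumulates the net energy the virtual environment delivers to the haptic device, and an adaptive damper dissipates any surplus before it can destabilize contact. A minimal single-axis sketch follows; all class names, parameters, and the penalty-force wall model are illustrative assumptions, not the platform's actual implementation.

```python
# Sketch of the force feedback loop described above: penalty-based
# collision response plus a time-domain passivity observer/controller.
# All names, parameters, and the 1-DOF wall model are illustrative
# assumptions, not the platform's actual implementation.

class VirtualWall:
    """Penalty-based haptic wall at x = 0 with stiffness k (N/m)."""

    def __init__(self, stiffness):
        self.k = stiffness

    def force(self, x):
        # Point-probe collision detection: penetration when x < 0.
        return -self.k * x if x < 0.0 else 0.0


class PassivityController:
    """Time-domain passivity observer/controller for one haptic axis."""

    def __init__(self, dt):
        self.dt = dt           # haptic servo period (s)
        self.energy_out = 0.0  # net energy delivered to the device (J)

    def apply(self, force, velocity):
        # Observer: integrate the power f*v flowing out of the virtual
        # environment; a passive environment keeps this non-positive.
        self.energy_out += force * velocity * self.dt
        if self.energy_out > 0.0 and abs(velocity) > 1e-9:
            # The sampled environment generated energy (the classic
            # stiff-wall instability): add damping sized to dissipate
            # exactly the observed surplus in this servo step.
            alpha = self.energy_out / (velocity ** 2 * self.dt)
            force -= alpha * velocity
            self.energy_out = 0.0
        return force


wall = VirtualWall(stiffness=1000.0)
pc = PassivityController(dt=0.01)  # 100 Hz servo rate, for illustration

# Press into the wall, then withdraw: because the spring force is
# sampled, the wall returns more energy than it stored, and the
# controller damps the excess on the way out.
f_in = pc.apply(wall.force(0.0), velocity=-1.0)    # entering contact
f_out = pc.apply(wall.force(-0.01), velocity=1.0)  # leaving contact
```

In practice the observer runs once per servo tick on each device axis; multi-degree-of-freedom variants, time delay, and sensor noise are treated in the time-domain passivity literature.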

Keywords


Augmented reality; force feedback



Copyright © 2011-2018 AUSMT ISSN: 2223-9766