
Haptic User Interface Integration for 3D Game Engines








Authors: Şengül, Gökhan; Çağıltay, Nergiz E.; Özçelik, Erol

Contributor: Tuner, Emre (Research Responsible)

Published in: Human-Computer Interaction. Applications and Services. 16th International Conference, HCI International 2014, Heraklion, Crete, Greece, June 22-27, 2014, Proceedings, Part III, pp. 654-662



Keywords: Surgical simulation, haptic devices, game engines, interaction.


The human senses of touch provide important information about the environment. When these senses are combined with eyesight, we can obtain all the necessary information about our surroundings. In human-computer interaction, visual information is delivered through displays, while the senses of touch are delivered through special devices called "haptic" devices. Haptic devices are used in many fields, such as computer-aided design, remote surgery, medical simulation environments, and training simulators for both military and medical applications. Besides conveying touch sensations, haptic devices also provide force feedback, which makes it possible to design realistic virtual-reality environments. Haptic devices can be categorized into three classes: tactile, kinesthetic, and hybrid devices. Tactile devices stimulate the skin to create contact sensations. Kinesthetic devices apply forces to guide or inhibit body movement, and hybrid devices combine tactile and kinesthetic feedback. Among these, kinesthetic devices exert controlled forces on the human body and are the most suitable type for applications such as surgical simulation. In educational settings that aim to develop motor skills, the senses of touch are very important, yet in some cases providing such an educational environment is expensive, risky, and may raise ethical issues. Surgical education is one of these fields: traditional training takes place in the operating room on real patients, which is very expensive, requires long periods of time, and does not allow trial-and-error learning. It is stressful for both educators and learners, and it raises several ethical concerns. Simulation environments supported by haptic user interfaces offer a safer educational alternative.
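A common way to realize the kinesthetic force feedback described above is penalty-based rendering: when the haptic tool tip penetrates a virtual surface, the device pushes back with a spring force proportional to the penetration depth. The sketch below is illustrative only; the function name, units, and stiffness value are assumptions, not details from the paper.

```python
def contact_force(tip_depth_mm: float, stiffness_n_per_mm: float = 0.5) -> float:
    """Return the restoring force (N) for a given penetration depth.

    A minimal penalty-based sketch: no contact means no force; inside the
    surface, force grows linearly with depth (Hooke's law, F = k * x).
    """
    if tip_depth_mm <= 0.0:  # tip is outside the surface: no contact
        return 0.0
    return stiffness_n_per_mm * tip_depth_mm  # push back along the surface normal
```

In a real device loop, this scalar would be a vector along the contact normal, recomputed at every servo tick.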
Several studies provide evidence of the educational benefits of this type of training (Tsuda et al. 2009; Sutherland et al. 2006). Similarly, this technology can be successfully integrated into the physical rehabilitation of conditions that require motor-skill improvement (Kampiopiotis & Theodorakou, 2003). Simulation environments therefore offer several opportunities for creating low-cost and more effective training and educational settings. Combining three-dimensional (3D) simulation environments with haptic interfaces is an important step for advancing human-computer interaction. On the other hand, haptic devices alone do not provide a full simulation environment; the environment must be enhanced with software. Game engines offer high flexibility for creating 3D simulation environments, and Unity3D is one tool that provides both a game engine and a physics engine for building them. The literature contains many studies that combine these two technologies to create educational and training environments, but few describe how the two technologies can be integrated to create a simulation environment that also provides haptic interfaces. Several issues must be handled to achieve such an integration. First, the haptic device's control libraries need to be integrated into the game engine. Second, the game engine's simulation representations and real-time interaction features need to be coordinated with the haptic device's degrees of freedom and the speed and characteristics of its force feedback. In this study, an integration architecture for the Unity3D game engine and the PHANToM haptic device is presented for creating a surgical-education simulation environment.
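The coordination problem mentioned above is largely one of update rates: a haptic servo loop typically runs near 1 kHz, while a game engine renders and steps physics at roughly 60 Hz. One standard pattern is to let the slow engine loop publish target forces and let the fast servo loop interpolate between the last two published values. The class below is a hedged, single-threaded sketch of that idea; all names are assumptions and real code would also need thread-safe access.

```python
class ForceBridge:
    """Bridge between a ~60 Hz engine loop and a ~1 kHz haptic servo loop."""

    def __init__(self) -> None:
        self.prev = 0.0  # force published by the previous engine frame
        self.next = 0.0  # force published by the latest engine frame

    def publish(self, force: float) -> None:
        """Called once per engine frame: shift the window of target forces."""
        self.prev, self.next = self.next, force

    def sample(self, alpha: float) -> float:
        """Called once per servo tick. `alpha` in [0, 1] is the fraction of
        the engine frame elapsed since the last publish; the servo loop
        linearly interpolates so the user never feels a 60 Hz step."""
        return self.prev + (self.next - self.prev) * alpha
```

Without some form of interpolation or filtering, the stylus reproduces the engine's coarse update steps as perceptible vibration.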
The methods used to build this integration and to handle the synchronization problems are also described, along with the algorithms developed for better synchronization and user feedback, such as providing a smooth feeling and smooth force feedback during haptic interaction. We believe this study will be helpful for anyone creating simulation environments with Unity3D technology and PHANToM haptic interfaces.
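One simple way to obtain the "smooth feeling" of force feedback referred to above is to low-pass filter the raw force signal, for example with an exponential moving average that damps the discontinuities otherwise felt as buzzing through the stylus. This is a generic illustrative sketch, not the paper's algorithm; the function name and smoothing factor are assumptions.

```python
def smooth_forces(raw, alpha: float = 0.2):
    """Yield an exponentially smoothed version of the raw force sequence.

    Each output blends the new sample with the running average:
    smoothed = alpha * new + (1 - alpha) * smoothed. Smaller alpha means
    smoother output but more lag between commanded and felt force.
    """
    smoothed = None
    for f in raw:
        smoothed = f if smoothed is None else alpha * f + (1 - alpha) * smoothed
        yield smoothed
```

The choice of `alpha` trades responsiveness against smoothness, which is exactly the kind of tuning a haptic servo loop has to get right.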






References:
1. Robles-De-La-Torre, G.: The importance of the sense of touch in virtual and real environments. IEEE Multimedia 13(3), 24-30 (2006)
2. Bar-Cohen, Y.: Haptic devices for virtual reality, telepresence, and human-assistive robotics. Biologically Inspired Intelligent Robots 73 (2003)
3. D'Aulignac, D., Balaniuk, R.: Providing reliable virtual, echographic exam of the human thigh. In: Salisbury, J.K., Srinivasan, M.A. (eds.) Proceedings of the Fourth PHANToM Users Group Workshop. AI Lab Technical Report No. 1675 and RLE Technical Report No. 633. MIT, Cambridge (1999)
4. Mor, A.B.: 5 DOF force feedback using the 3 DOF PHANToM and a 2 DOF device. In: Proceedings of the Third PHANToM Users Group, PUG98. AI Technical Report No. 1643 and RLE Technical Report No. 624. MIT, Cambridge (1998)
5. Shulman, S.: Digital antiquities. Computer Graphics World 21(11), 34-38 (1998)
6. Dillon, P., Moody, W., Bartlett, R., Scully, P., Morgan, R., James, C.: Sensing the fabric. Paper presented at the Workshop on Haptic Human-Computer Interaction, Glasgow (2000)
7. Durbeck, L.J.K., Macias, N.J., Weinstein, M., Johnson, C.R., Hollerbach, J.M.: SCIRun haptic display for scientific visualization. In: Salisbury, J.K., Srinivasan, M.A. (eds.) Proceedings of the Third PHANToM Users Group, PUG98. AI Technical Report No. 1643 and RLE Technical Report No. 624. MIT, Cambridge (1998)
8. Tsuda, S., Scott, D., Doyle, J., Jones, D.: Surgical Skills Training and Simulation. Curr. Probl. Surg. 46, 271-370 (2009)
9. Sutherland, L.M., Middleton, P.F., Anthony, A., Hamdorf, J., Cregan, P., Scott, D., Maddern, G.J.: Surgical Simulation: A Systematic Review. Ann. Surg. 243(3), 291-300 (2006)
10. Kampiopiotis, A., Theodorakou, K.: Virtual Reality in the Teaching of Motor Skills: Literature Review. Pakistan Journal of Applied Sciences 3(1), 36-39 (2003)




