Wataru HASHIMOTO, Hiroo IWATA
Graduate School of Engineering, University of Tsukuba
Tsukuba, Ibaraki, 305 JAPAN
Institute of Engineering Mechanics, University of Tsukuba
Tsukuba, Ibaraki, 305 JAPAN
Force sensation coupled with visual display allows people to interact intuitively with virtual environments. Since haptic devices are implemented in various mechanical configurations, the software of a virtual environment depends on the control program of the haptic interface. This problem is an obstacle to the development of further applications. In this paper, we describe a software tool for the construction of virtual environments with force feedback. Our software tool, called LHX (Library of Haptics), consists of seven modules divided by function. An application of a haptic virtual environment is easily reconfigured by exchanging these modules. This paper also presents our latest application, the "Volume Haptics Library," which is developed on LHX.
Force sensation plays an important role in the manipulation of virtual objects. A haptic interface is a feedback device that generates skin and muscle sensations, including the senses of touch, weight, and rigidity. We have been working on research into haptic interfaces in virtual environments for a number of years, and have developed various force feedback devices and their applications. In most haptic interfaces, the software of the virtual environment is tightly coupled to low-level control programs such as those of force displays. This problem is an obstacle to the development of further applications of haptic virtual environments. Some previous work on software tools for haptic interfaces is shown in . We have been improving our software tools to support various force displays and their applications. Our latest system, "LHX," is composed of seven modules: a device driver for the force display, a haptic renderer, a model manager, a primitive manager, an autonomy engine, a visual display manager, and a communication interface.
The emphasis of this paper is how to support various mechanical configurations of force displays and their applications. Through studies of our previous work on the mechanical design and software structure of applications, we categorize the requirements as follows:
A haptic interface is a mechanical device which generates reaction force from virtual objects. There are three approaches to implementing a haptic interface: tool-handling type, exoskeleton type, and object-oriented type force displays. LHX should transparently support each type of force display.
Modular functions increase the productivity of virtual environment software. We have developed various applications of haptic virtual environments, and LHX includes typical functions extracted from these existing applications.
Our components support the following functions:
In order to deal with the requirements discussed in the previous section, LHX is composed of seven modules. By dividing the system into these modules, force displays and virtual environments can be easily reconfigured. Figure 1 shows the basic structure of LHX.
The device driver manages sensor input and actuator output for the haptic interface. Various types of haptic interfaces can be connected to LHX by changing the device driver. We developed device drivers for the above-mentioned force displays so that they can be connected to LHX.
Currently, "rendering" usually means the generation of visual images. However, force sensation also needs rendering: the hardness, weight, and viscosity of virtual objects are generated by haptic rendering. We have developed a software package for haptic rendering. The haptic renderer of LHX has three categories according to the three types of force display:
A tool-handling type force display is similar to a joystick and is free from the need to fit it to the user's hand. This type of device lets the user interact through a tool such as a grip. On the other hand, an exoskeleton type force display does not require a tool between the user and the virtual environment because of its mechanism.
LHX supports two haptic renderers: a surface renderer and a volume renderer. The surface renderer is implemented with a spring-damper model. The volume renderer is implemented by mapping voxel data to force and torque.
An exoskeleton type force display applies force to multiple fingers. The renderer of this category is composed of multiple surface renderers of the tool-handling type.
The object-oriented type renderer determines the stiffness of the surface of the Haptic Screen according to the physical model of the virtual object.
Models of virtual objects are implemented in the model manager module of LHX. The shapes and attributes of virtual objects are defined in this module. Users of LHX program the methods for interaction between virtual objects and operators.
Primitives of virtual objects are stored in the primitive manager. Primitives include cubes, spheres, cylinders, free-form surfaces, and 3D voxels. Haptic icons for the user interface are also included. This module supervises the ID code of each primitive. Users of LHX interactively generate or erase primitives. Working primitives are placed in shared memory.
The autonomy engine determines the behavior of virtual objects. Physical laws for the virtual world are contained in this module; gravity, elasticity, and viscosity are currently implemented. Collisions between primitives are detected in real time. This module also defines the "time" of the virtual environment, which advances independently of the user. This function enables the autonomous growth of virtual objects.
LHX has a network interface by which multiple force displays are connected to each other. Multiple users can simultaneously interact in the same virtual environment. This function enables easy construction of groupware programs. LHX supports TCP/IP so that the system can use the existing Internet.
The visual display manager generates the graphic image of the virtual environment. This module translates the haptic model into OpenGL format. HMDs, stereo shutter glasses, spherical screens, and polygonal screens are supported as visual displays.
LHX is currently implemented on SGI workstations and Windows NT workstations. Considering the connection of a haptic interface to a visual image generator, SGI and Windows NT workstations are the most promising platforms.
C and C++ are used for its implementation.
Since our force displays are interfaced to a PC, the device driver module is implemented on the PC.
The host workstation and the PC are connected by RS-232C (Figure 1). LHX is composed of two processes: a visual feedback process and a force feedback process.
The visual feedback process runs the visual display manager, the primitive manager, and the autonomy engine; the force feedback process runs the other modules.
Shared memory is used as the communication channel between these processes. The required update rate of force feedback is much higher than that of visual feedback.
Images can be seen continuously at an update rate of 10 Hz.
On the other hand, force feedback requires at least 40 Hz. In LHX, the force feedback process therefore has a higher priority than the visual feedback process. LHX enables a high update rate of the force display even in a complex virtual environment.
Fig.2 Implementation of LHX
Volume visualization is one of the most powerful methods for representing scientific data.
Most of these data occupy a three- or higher-dimensional space.
However, visual information essentially consists of two-dimensional images.
A three-dimensional scene is recognized by binocular parallax cues or motion parallax.
Visualized volumetric data therefore often cause misconceptions because of occlusion.
The major objective of this application is the representation of volumetric data by force sensation.
The basic idea of "haptization" is the mapping of voxel data to force/torque (Figure 3).
LHX includes the "Volume Haptics Library".
The library supports the management of massive volumetric data and methods for mapping parameters to force.
Fig.3 Haptization
The Volume Haptics Library provides the following three function sets in order to realize an environment for volume haptization. Figure 4 shows the structure of these function sets.
Fig.4 Structure of Volume Haptics Library
Volumetric data should be stored in memory in order to represent all of the data visually and haptically at once. To maintain a high update rate in the visual/haptic servo loop, the data management function preloads all of the volumetric data into shared memory and reconstructs the original data before the servo loop begins. The data management function also serves as a preprocessor of the data, for compression or extraction.
We should define how to generate haptic feedback from the stored volumetric data. Several haptic representations of volumetric data have recently been discussed. We implemented a force/torque mapping generated from scalar/vector voxel data in the volume render function; therefore, a minimal change to this function makes it possible to swap in another method of haptic representation. In Figure 4, this function receives the hand location in the voxel's local coordinate system, and then refers to shared memory in order to generate the force vector.
In order to treat volumetric data as a primitive, it is necessary to attach the attributes of a primitive to the volumetric data. The data control function supplies such additional parameters of the primitive so that LHX can utilize functions such as transformation of position or collision detection.
As the user moves or rotates the volumetric data, the data control function updates the transformation parameters, and LHX then looks up these parameters to change the coordinate system of the user's hand.
Two applications have been developed using the three function sets described in Section 6.2.
The first one is a multi-dimensional data browser that presents a virtual 4-D or higher-dimensional space through visual and haptic sensation. In principle, force sensation contains six-dimensional information: three-dimensional force and three-dimensional torque. Therefore, six-dimensional data can be represented by force sensation.
In our multi-dimensional space, the data exposed as volumetric data are geometrically generated by scanning a 3D cube. The user's hand can essentially move only in 3D space; we therefore use rotational motion of the hand to scan the 3D cube through the multi-dimensional cube. The 3D cube is a cutting volume of the multi-dimensional cube which moves with rotational motion around the roll and pitch axes of the user's hand. The force display presents a potential field which indicates the axis of rotation, so the user can easily separate rotational motion from translational motion by this haptic guide.
The second application is a non-invasive surgery support system. This system proposes a method of haptic representation of the non-invasive area to support micro-surgery. When the virtual tool approaches the non-invasive area, force is applied to the operator's hand. The generated force is determined by the volume data of a CT image.
We developed a software infrastructure for haptic interfaces. It improved the productivity of software for virtual environments with force feedback. The software tool has been refined through various applications. Future work will be the development of new methods that enable a higher update rate of force feedback.
Hiroo IWATA, Hiroaki YANO: Artificial Life in Haptic Virtual Environment, Proceedings of ICAT'93, 1993
Hiroo IWATA, Hiroaki YANO: Interaction with Autonomous Free-form Surface, Proceedings of ICAT'94, 1994
Hiroaki YANO, Hiroo IWATA: Cooperative Work in Virtual Environment with Force Feedback, Proceedings of ICAT/VRST'95, 1995
Diego C. RUSPINI, et al.: The Haptic Display of Complex Graphical Environments, Proceedings of SIGGRAPH'97, 1997
Scott RANDOLPH, et al.: Adding Force Feedback to Graphics System, Proceedings of SIGGRAPH'96, 1996
Michitaka HIROSE, et al.: Development of Haptic Interface Platform (HIP), Proceedings of VRSJ, 1997 (in Japanese)
Hiroo IWATA, Haruo NOMA: Volume Haptization, Proceedings of IEEE Symposium on Research Frontiers in Virtual Reality, 1993
Andrew B. MOR, Sarah GIBSON, Joseph T. SAMOSKY: Interacting with 3-Dimensional Medical Data: Haptic Feedback for Surgical Simulation, Proceedings of the First PHANToM User's Group Workshop, 1996
Wataru HASHIMOTO, Hiroo IWATA: Multi-dimensional Data Browser with Haptic Sensation, Transactions of VRSJ, Vol.2 No.3, 1997 (in Japanese)
Wataru HASHIMOTO, Hiroo IWATA: Support System for Surgery by Using Haptic Representation of Non-invasive Region, Proceedings of VRSJ, 1997 (in Japanese)