Team IGG : Computer Graphics and Geometry


Operation 5: 3D Interaction

Head: Dominique Bechmann PR1

Participants: Dominique Bechmann PR1, Antonio Capobianco MC, Caroline Essert MC, Jérôme Grosjean MC, Olivier Génevaux IR CNRS.

Ph.D. candidate: Jonathan Wonner

Postdoc: Manuel Veit

Presentation

In parallel with the modeling of objects, we want to be able to interact with these models at every level: manipulating them, deforming them, and editing their topology as well as their embedding, the high-level constraints and descriptions as well as the proposed solutions. This leads us to study the problem of interaction in virtual reality. Our approach relies on visual feedback, bimanual interaction, and force feedback to interact with the modeled objects.

3D interaction is characterized by a decomposition into elementary tasks [3-SB07]: navigation, selection, and manipulation, with gestural and force-feedback control. Application control complements these elementary tasks. We present our work following this decomposition.


VR platform

[Image: general front view of the Workbench]

VR currently suffers from a lack of standards for designing VR applications. A first problem is the heterogeneity of physical devices, which are still evolving rapidly. A second is that few fundamental concepts in 3D interaction have emerged to date, so interaction techniques are still taking shape. In this context, almost any investment in a VR environment requires developing a VR software platform suited, on the one hand, to the available input/output devices and, on the other, to the targeted applications.

Thus, within the thesis of Ludovic Sternberger, defended in 2006, a first 3D interaction library was designed and developed: the vrLIB [4-SBB07, 8-Ste06]. It does not claim to improve on or replace the many VR libraries known across the globe. Its positioning is to offer a software toolbox that makes it as simple as possible to move from an existing workstation application to a workbench-type VR application.

[Image: close-up view of the vrLIB running on the Workbench]

In order to extend this goal and make the best use of the VR hardware available at LSIIT, software components grouped into a VR platform are being developed within the IGG team. They have two goals: enabling use of the hardware, and providing the interaction tools produced by the team's research. There are, for example, components dedicated to the use of data gloves or of the SPIDAR haptic device. The VR interaction components, with the C3 and Spin menus, are under development and will soon be available.

A toolbox approach, offering isolated components focused on specific techniques and as independent of one another as possible, is favored in order to make integration into client applications as simple and as unintrusive as possible. The reliability of the various components is also a major goal.
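As an illustration of this toolbox approach, the sketch below shows what such an isolated component might look like. It is a hypothetical Python sketch, not the platform's actual API: the class and method names are assumptions, chosen only to show the minimal, non-intrusive contract between a component and a client application.

    from abc import ABC, abstractmethod

    # Hypothetical sketch (not the platform's actual API) of the toolbox
    # approach: each interaction component is self-contained, talks to the
    # client application through a minimal update/render contract, and can
    # be added or removed without touching the other components.

    class VRComponent(ABC):
        """Contract every toolbox component satisfies."""

        @abstractmethod
        def update(self, tracker_state: dict) -> None:
            """Consume raw device input (e.g. a hand pose from a data glove)."""

        @abstractmethod
        def render(self) -> None:
            """Draw the component's own visual feedback, if any."""

    class DataGloveInput(VRComponent):
        """Component wrapping a data glove; other components read its pose."""
        def __init__(self):
            self.hand_pose = None

        def update(self, tracker_state):
            self.hand_pose = tracker_state.get("glove")  # latest hand pose

        def render(self):
            pass  # input-only component, nothing to draw

    # The client application owns a flat list of components and knows
    # nothing about their internals, keeping integration non-intrusive.
    components = [DataGloveInput()]
    for c in components:
        c.update({"glove": (0.0, 1.2, 0.3)})
        c.render()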

This software platform is mainly developed by the CNRS research engineer, alongside his duties managing the VR platform at LSIIT.


Application control

The goal of this activity is to offer new techniques for application control in immersive environments [8-Geb04]. The first research was carried out in the context of Dominique Gerber's thesis, defended in 2005. It deals with the design of a new menu suited to immersive devices. This menu, from the family of pie menus and called the Spin menu [4-GB05, 4-GB04, 6-GBG05], is controlled by a simple hand rotation: a rotation to the right triggers either a rotation of the menu (shifting the selected element to the left) or a rotation of the selected element. These two options correspond to cognitive schemes that vary from user to user and occur with roughly equal frequency. A correction of the user's movements has been introduced to reduce input errors. A hierarchical version of the Spin menu has been designed, making it possible to build control menus with as many elements as needed. Finally, this menu has the advantage of being efficient for a broad class of users, from beginners to advanced. The Spin menu, together with the CCube menu developed by Jérôme Grosjean during his thesis in the INRIA I3D project, is integrated into the VR platform.
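As an illustration, the sketch below shows one plausible way to implement the core of such a rotation-controlled menu. It is a hypothetical Python sketch, not the published Spin menu code: the step size, the dead zone standing in for the movement correction, and all names are assumptions.

    import math

    # Hypothetical sketch of a Spin-menu-like control: incremental wrist
    # roll angles rotate a pie menu in discrete steps; a small angular dead
    # zone absorbs jitter (a simple stand-in for the movement correction
    # mentioned above).

    class SpinMenuSketch:
        def __init__(self, items, step_deg=30.0, dead_zone_deg=10.0):
            self.items = items
            self.step = math.radians(step_deg)     # wrist angle per menu step
            self.dead_zone = math.radians(dead_zone_deg)
            self.selected = 0
            self._accum = 0.0                      # accumulated wrist rotation

        def on_wrist_roll(self, delta_rad):
            """Feed incremental wrist roll angles from the tracker."""
            self._accum += delta_rad
            if abs(self._accum) < self.dead_zone:  # ignore small jitter
                return self.items[self.selected]
            while self._accum >= self.step:        # right rotation: the menu
                self._accum -= self.step           # turns, the selection
                self.selected = (self.selected + 1) % len(self.items)
            while self._accum <= -self.step:       # left rotation: reverse
                self._accum += self.step
                self.selected = (self.selected - 1) % len(self.items)
            return self.items[self.selected]

    menu = SpinMenuSketch(["select", "deform", "navigate", "undo"])
    print(menu.on_wrist_roll(math.radians(35)))    # -> "deform"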


Selection and manipulation

[Images: the immersive geological pilot (piloting, selection, and Workbench views) and DogmeRV]

Contrary to the early hopes raised by VR, selecting and manipulating virtual objects in an immersive environment currently generates more problems than it solves. The need to support manipulation, through visual cues and/or constrained manipulation, quickly became apparent. Such support only makes sense within a given application context, which is why we explored a number of support mechanisms across diverse immersive applications.

Immersive geological pilot

The first immersive application [4-HBB03, 2-BSP05], on which we worked in cooperation with the French Petroleum Institute (IFP) and the engineering school École des Mines de Paris, offered the possibility of interactively labeling surfaces representing geological layers or faults, in order to build a surface-based graph of the geological evolution of the subsoil. This first application above all made us realize how difficult it is to work in an immersive environment.

3D deformation (DogmeRV)

The second immersive application, DogmeRV [4-GB04], developed within Dominique Gerber's thesis, aimed at taking advantage of the immersive environment to control object deformations via the Dogme deformation model [2-BG03]. Here again we ran into numerous difficulties, the first being application control, which led us to develop the Spin menu described above. Nevertheless, some of the manipulations we set up showed us the potential of working in an immersive environment: direct input of the deformation trajectory by hand gesture, and control of the deformed area by moving handles.

Radio-frequency planning

The third application is medical [4-VSCG04]: it makes it possible to plan an operation that destroys a tumor by thermal heating (radio-frequency). The immersive environment allowed the needle to be positioned interactively on the abdomen.

4D modeling

The fourth application, StigmaVR [4-BRA07], aims at building on the experience acquired in manipulation to push the immersive experiments even further: this time we manipulate 4D objects through several 3D views of the object.

Notes on 3D interaction

These four immersive applications led us to the following observation concerning 3D interaction: a task in an immersive environment needs to be completely redesigned from scratch, since a simple port is less efficient and less ergonomic than the same task performed on a workstation. There are several reasons for this:

  • In the virtual environment, the user's partial viewpoint on the object does not allow him/her to visualize all the effects produced. Moreover, a semi-immersive, workbench-type working environment limits the user's ability to change viewpoint by moving around the object.
  • The perception of the depth of virtual objects in the scene can be imperfect and create an offset between the real hand's position and the point indicated in the virtual world, which poses real gesture-accuracy problems.

Starting from these observations, our work consists in offering real improvements toward an efficient immersive working environment and navigation around virtual mock-ups, in particular through gesture control and the addition of modalities, especially force feedback.

Complex terrain modeling

Immersive systems with VR and direct 3D interaction have not yet been used for applications dealing with natural phenomena. Within the ANR project "DNA" (Details in Natural Scene) on large data, we are working with our partners (LIRIS in Lyon and XLIM in Limoges) on the development of an immersive modeling platform and of new interaction paradigms for modeling and editing ecosystems rich in natural detail. The current platform, set up on the Workbench, allows the immersive editing of a multi-layer complex terrain model (with overhangs, caves, and arches), on which the new interaction techniques are being tested. We are studying the problems of moving quickly around the digital model, of visibility during manipulation, and of shape-editing tools based on a new, more efficient approach to the 3D pointing task.


Gesture control

One of the strengths of immersive environments is the possibility of imagining and designing efficient gesture control. A first study [4-FSSB06, 4-FSSB05] on improving gestures in immersive environments was conducted around the specification of 3D geometric constraints. A second, more advanced study, in the scope of a geometric shape deformation application, enables deformation constraints on virtual volumes to be specified simply by reproducing an "intuitive" deformation gesture (input of the constraint's application point, then the deformation gesture: stretching, twisting, etc.). We also proposed several improvements to the basic deformation paradigm in virtual environments.
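As an illustration of this idea, the sketch below turns a grab-and-pull gesture into a simple stretching deformation. It is a hypothetical Python sketch under assumed conventions (the quadratic falloff, the radius, and all names are ours), not the Dogme model itself.

    import numpy as np

    # Hypothetical sketch of a gesture-specified deformation constraint:
    # the user pins a constraint application point on the shape, then the
    # hand trajectory becomes a displacement constraint (a stretch along
    # the pull direction), weighted by a smooth falloff.

    def deformation_from_gesture(vertices, grab_point, hand_start, hand_end,
                                 radius=0.5):
        """Displace vertices near grab_point by the hand's motion; the
        effect fades out smoothly and vanishes beyond `radius`."""
        pull = hand_end - hand_start                  # the deformation gesture
        d = np.linalg.norm(vertices - grab_point, axis=1)
        w = np.clip(1.0 - d / radius, 0.0, 1.0) ** 2  # quadratic falloff
        return vertices + w[:, None] * pull

    verts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [1.0, 0.0, 0.0]])
    out = deformation_from_gesture(verts,
                                   grab_point=np.array([0.0, 0.0, 0.0]),
                                   hand_start=np.array([0.0, 0.0, 0.0]),
                                   hand_end=np.array([0.0, 0.2, 0.0]))
    print(out)   # vertices inside the radius move upward; the far one stays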

The most advanced study offers a bimanual interaction mode [4-VCB07] that increases the user's freedom in interacting with the system: the user can choose to express constraints alternately with one hand or the other, perform simultaneous deformations (making it possible to obtain complex results that are difficult to achieve with sequential deformations), or grab and move an object with one hand while specifying a deformation with the other. Introducing a physical reference frame into the environment, through the use of the non-dominant hand and the added proprioceptive information, gives the user an increased feeling of control. Nevertheless, the actual gains in precision and execution speed remain uncertain. Our preliminary study shows that bimanual interaction leads to longer execution times. Moreover, it seems that the expected precision gains only appear when the environment's visual cues are insufficient for fine control of the interaction.

This work on bimanual interaction techniques led us to question how control is integrated across the several dimensions of the interaction task at hand (i.e., the fusion and simultaneous control of several task dimensions in a single command action, e.g. simultaneously controlling the position and orientation of the manipulated object). The notion of degrees of freedom (DOF) highlights these task dimensions. We are currently working on defining measures to precisely evaluate the degree of DOF integration during a manipulation task. We propose the MDS measure (Magnitude of Degrees-of-freedom Separation), which gives the number of simultaneously manipulated DOF at any time during the task, and the TFS (Task Fulfilment State), which gives the degree of completion of a task. Their purpose is to provide quantitative indices for analyzing how users exploit the DOF during VR manipulation tasks. As a first step, we applied these two measures to simple tasks: positioning (3 DOF) or orienting (3 DOF) objects. These first studies allowed us to validate the proposed measures by verifying their behavior. They also brought to light complex task-completion strategies, especially for orientation tasks in VR. These measures, together with the results of the experimental studies on orientation and positioning, are currently under submission.
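To make the idea concrete, the sketch below computes, from a recorded motion trace, how many DOF are manipulated simultaneously at each time step. This is only an illustration in the spirit of MDS: the velocity-threshold definition is an assumption, and the published measure may be defined differently.

    import numpy as np

    # Illustrative DOF-integration count in the spirit of MDS: at each time
    # step, a DOF counts as "manipulated" if its velocity exceeds a noise
    # threshold; the result is the number of DOF moving simultaneously.

    def simultaneous_dof(trace, dt, threshold):
        """trace: (T, n_dof) array of DOF values over time (e.g. x, y, z).
        Returns, for each step, the number of simultaneously moving DOF."""
        velocity = np.abs(np.diff(trace, axis=0)) / dt
        return (velocity > threshold).sum(axis=1)

    # A positioning task (3 DOF): move along x alone first, then along y
    # and z together; the counts separate sequential from integrated control.
    t = np.linspace(0.0, 1.0, 6)
    trace = np.stack([np.minimum(t, 0.4),            # x moves, then stops
                      np.maximum(t - 0.4, 0.0),      # y starts later
                      np.maximum(t - 0.4, 0.0)], 1)  # z moves with y
    print(simultaneous_dof(trace, dt=0.2, threshold=0.1))  # -> [1 1 2 2 2]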


Force feedback

In order to allow accurate movements along privileged axes or directions in 3D, the targeted improvement of gestural interaction also consists in physically constraining the movements of the hand, or of the hand-held tool, by means of a force feedback device.

[Images: a Phantom force feedback device; the Spidar mounted on the Workbench]

A first study [2-VSG05], conducted within a medical radio-frequency application, showed the great value of a 3D localization tool and of force feedback for the realistic simulation of the surgical gesture. The main goal is the training of practitioners, since their experience plays a non-negligible role in the operation's chances of success. This application used a commercial force feedback system coupled with a classical workstation.

We are currently working on the definition of a formalism for automatically adding dimension-reduction constraints to selection tasks. Applied to VR manipulation, these techniques can improve the user's performance in application control tasks (selection in a menu) or when pointing at and selecting points of interest in a point cloud. Preliminary studies (in the process of publication) seem encouraging: they show that large performance gains can be obtained while, at the same time, reducing the cognitive load associated with the task.
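As an illustration of such a dimension-reduction constraint, the sketch below renders a classic "virtual fixture" with force feedback: the 3D pointing task is reduced to one dimension by pulling the haptic handle toward a guide axis, so the hand remains free only along that axis. The spring model and its parameters are illustrative assumptions, not the formalism under development.

    import numpy as np

    # Virtual-fixture sketch: a spring force pulls the haptic handle onto
    # the line axis_origin + t * axis_dir, physically constraining the
    # gesture to one privileged direction.

    def fixture_force(hand_pos, axis_origin, axis_dir, stiffness=200.0):
        """Restoring force (zero when on the axis); axis_dir must be a
        unit vector."""
        rel = hand_pos - axis_origin
        closest = axis_origin + np.dot(rel, axis_dir) * axis_dir  # projection
        return stiffness * (closest - hand_pos)

    hand = np.array([0.10, 0.02, -0.01])        # handle strays off the axis
    print(fixture_force(hand, np.zeros(3), np.array([1.0, 0.0, 0.0])))
    # -> [ 0. -4.  2.], pushing the hand back toward the x axis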

A collaboration between LSIIT and the laboratory of Professor Makoto Sato at the Tokyo Institute of Technology made it possible to install a Spidar-type force feedback system on the Workbench at the University of Strasbourg. Sylvain Thery spent one month in Japan in 2006 to acquire the necessary experience with such a force feedback system. This dedicated Spidar has the advantage of offering a larger workspace and allows us to apply the haptic techniques we have designed in immersive VR environments.