== Operation 5: 3D Interaction ==
  
'''Head''': Dominique Bechmann PR1
  
'''Participants''': Dominique Bechmann PR1, Antonio Capobianco MC, Caroline Essert MC, Jérôme Grosjean MC, Olivier Génevaux IR CNRS.
  
'''Ph.D. candidate''': Jonathan Wonner
  
'''Postdoc''': Manuel Veit
  
'''Presentation'''
  
In parallel with the modeling of objects, we want to be able to interact with these models at every level: to manipulate, deform and edit their topology as well as their embedding, and the high-level constraints and descriptions as well as the proposed solutions. This leads us to the problem of interaction in virtual reality. Our approach relies on visual feedback, bimanual interaction and force feedback to interact with the modeled objects.
  
3D interaction is characterized by a decomposition of interaction into elementary tasks [3-SB07]: navigation, selection and manipulation, with gesture control and force feedback. Application control complements these elementary tasks. We present our work following this decomposition.
 
  
===VR platform===

[[Image:WBGeneralFront.jpg|left|300px]]
Virtual reality currently suffers from a lack of standards for designing applications in virtual environments. A first problem is the heterogeneity of the physical devices, which are still evolving rapidly. A second is that few fundamental concepts of 3D interaction have emerged so far, so interaction techniques themselves are still taking shape. Under these conditions, almost any investment in virtual environments requires the development of a VR software platform suited both to the available input/output devices and to the targeted applications.
  
Thus, in the context of Ludovic Sternberger's thesis, defended in 2006, a first 3D interaction library was designed and developed: vrLIB [4-SBB07, 8-Ste06]. It does not claim to improve on or replace the many VR libraries developed around the world; its aim is to offer a software toolbox that makes it as simple as possible to move from an existing workstation application to an immersive, workbench-type application.
  
[[Image:WBVRLibCloseup.jpg|right|200px]]
To extend this goal and make the best use of the VR hardware available at LSIIT, software components grouped into a VR platform are being developed within the IGG team. They serve two purposes: giving access to the hardware and providing the interaction tools resulting from the team's research. There are, for example, components dedicated to data gloves and to the SPIDAR haptic device. The VR interaction components, including the ''C<sup>3</sup>'' and ''Spin'' menus, are under development and will soon be available.
  
A toolbox approach is favored, with isolated components focused on specific techniques and kept as independent as possible, so that integration into client applications is as simple and unintrusive as possible. The reliability of the various components is also a major goal.

This software platform is mainly developed by the CNRS research engineer, alongside his management of the [[Virtual Reality | VR platform]] at LSIIT.
===Application control===

The goal of this activity is to propose new techniques for controlling applications in an immersive environment [8-Geb04]. The first work was carried out in the context of Dominique Gerber's thesis, defended in 2005, and deals with the design of a new menu suited to immersive devices. This menu, which belongs to the family of circular menus and is called the Spin menu [4-GB05, 4-GB04, 6-GBG05], is controlled by a simple rotation of the hand: a rotation to the right produces either a rotation of the menu (shifting the selected element to the left) or a rotation of the selected element. The two options correspond to cognitive schemes that differ from one user to another and are roughly equally frequent. A correction of the user's movements was added to reduce input errors. A hierarchical version of the Spin menu was designed, making it possible to build control menus with as many elements as needed. Finally, this menu has the advantage of being efficient for a wide range of users, from beginners to experienced ones. The Spin menu, together with the CCube menu developed by Jérôme Grosjean during his thesis in the INRIA I3D project, is integrated into the VR platform.
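
The sketch below illustrates the selection principle just described. It is only an illustration, not the vrLIB code: the function name, the angle convention and the rounding behaviour are our own simplifications.

<pre>
// Illustrative sketch (not the vrLIB implementation): mapping an
// accumulated hand rotation angle to the index of the selected item
// of a circular, Spin-style menu.
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

// Menu of `itemCount` entries; `angleRad` is the accumulated hand yaw
// in radians. A rotation to the right (negative angle, by our own
// convention) shifts the selection to the left, as described above.
int spinMenuSelection(double angleRad, int itemCount)
{
    const double slice = 2.0 * kPi / itemCount;  // angular width of one item
    int slot = static_cast<int>(std::lround(angleRad / slice)) % itemCount;
    if (slot < 0) slot += itemCount;             // wrap into [0, itemCount)
    return slot;
}

int main()
{
    double yaw = -50.0 * kPi / 180.0;            // hand rotated ~50 degrees to the right
    std::printf("selected item: %d\n", spinMenuSelection(yaw, 8));
    return 0;
}
</pre>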
===Selection and manipulation===

[[Image:Geolo_pilote.jpg|right|thumb|120px]]
[[Image:Geolo_select.jpg|right|thumb|120px]]
[[Image:Geolo_wb.jpg|right|thumb|120px]]
[[Image:Dogmerv.jpg|right|thumb|120px]]
Contrary to the hopes that virtual reality raised in its early days, the selection and manipulation of virtual objects in an immersive environment currently create more problems than they solve. The need to assist manipulation, through visual cues and/or constrained manipulation, quickly became apparent. Such aids only make sense in a given application context, which is why we have considered a number of aids for various immersive applications.
  
'''Immersive geological pilot'''
  
The first immersive application [4-HBB03, 2-BSP05], on which we worked in cooperation with the French Petroleum Institute (IFP) and the Ecole des Mines de Paris, offered the possibility of interactively labelling surfaces representing geological layers or faults, in order to build a graph of the geological evolution of the subsoil represented by these surfaces. Above all, this first application made us realize how difficult it is to work in an immersive environment.
  
'''3D deformation (DogmeRV)'''
  
The second immersive application, DogmeRV [4-GB04], developed in the context of Dominique Gerber's thesis, aimed at taking advantage of the immersive environment to control object deformations via the Dogme deformation model [2-BG03]. Here again we ran into many difficulties, the first of which was application control; this led us to develop the Spin menu described above. Nevertheless, some of the manipulations we set up gave us a glimpse of the possibilities of working in an immersive environment: direct input of the deformation trajectory with a hand movement, and control of the size of the deformed area by moving handles.
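
As a purely illustrative sketch of this kind of handle-based control (it does not implement the Dogme model; the Gaussian falloff and all names are our own simplification), a point handle displaced by a hand gesture can drive a space deformation whose extent is set by a radius:

<pre>
// Illustrative sketch only (not the Dogme model): a point-handle space
// deformation with a Gaussian falloff. The handle displacement would be
// given by a hand gesture, and `radius` controls the deformed area.
#include <array>
#include <cmath>
#include <cstdio>
#include <vector>

using Vec3 = std::array<double, 3>;

// Displace each vertex by the handle motion, weighted by its distance
// to the handle.
void deform(std::vector<Vec3>& vertices, const Vec3& handle,
            const Vec3& displacement, double radius)
{
    for (Vec3& v : vertices) {
        double dx = v[0]-handle[0], dy = v[1]-handle[1], dz = v[2]-handle[2];
        double w = std::exp(-(dx*dx + dy*dy + dz*dz) / (2.0 * radius * radius));
        v[0] += w * displacement[0];
        v[1] += w * displacement[1];
        v[2] += w * displacement[2];
    }
}

int main()
{
    std::vector<Vec3> verts = {{0, 0, 0}, {0.5, 0, 0}, {2, 0, 0}};
    deform(verts, {0, 0, 0}, {0, 1, 0}, 0.5);   // pull up around the origin
    for (const Vec3& v : verts)
        std::printf("(%.2f, %.2f, %.2f)\n", v[0], v[1], v[2]);
    return 0;
}
</pre>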
 
   
 
   
'''Radio-frequency planning'''
  
The third application is medical [4-VSCG04]: it allows the planning of an operation in which a tumor is burned by thermal heating (radio-frequency). The immersive environment makes it possible to place the needle interactively on the abdomen.
  
'''4D modeling'''
  
The fourth application, StigmaVR [4-BRA07], aims to capitalize on the experience acquired in terms of manipulation and to push experiments in immersive environments even further: this time, 4D objects are manipulated through several 3D views of the object.
  
'''Notes on 3D interaction'''
  
These four immersive applications mainly allowed us to draw the following conclusion about 3D interaction: a task must be completely rethought before attempting it in an immersive environment, because a simple port loses efficiency and ergonomics compared with the same task performed on a workstation, for several reasons:
  
* In the virtual environment, the user's partial viewpoint on the object does not allow him or her to see all of the effects produced. Moreover, a semi-immersive, workbench-type working environment limits the user's ability to change viewpoint by moving around the object.
* The perception of the depth of virtual objects in the scene can be imperfect and create an offset between the position of the real hand and the point designated in the virtual world, which notably causes gesture accuracy problems.
  
Starting from these observations, our subsequent work consists in proposing real improvements for building an efficient immersive working environment and for navigating around virtual sketches, in particular through gesture control and the addition of modalities, notably force feedback.
  
'''Complex terrain modeling'''
  
Immersive VR systems with direct 3D interaction have not yet been used for applications dealing with natural phenomena. Within the ANR project "DNA" (Details in Natural Scene) on large data, we are working with our partners (LIRIS in Lyon and XLIM in Limoges) on the development of an immersive modeling platform and of new interaction paradigms for modeling and editing ecosystems with a large amount of natural detail. The current platform, set up on the Workbench, allows the immersive editing of a complex multi-layer terrain model (with overhangs, caves and arches), on which the new interaction techniques are tested. We are studying the problems of moving quickly around the digital model, of visibility during manipulation, and of shape editing tools, based on a new and more efficient approach to the 3D pointing task.
  
===Gesture control===
  
One of the strengths of immersive environments is the possibility of imagining and designing efficient gesture control. A first study [4-FSSB06, 4-FSSB05] on improving gestures in immersive environments dealt with the placement of geometric constraints in 3D. A second, more advanced study, in the context of an application for deforming geometric shapes, allows deformation constraints to be specified on virtual volumes simply by reproducing an "intuitive" deformation gesture (grabbing the constraint's application point and making the deformation gesture: stretching, twisting, etc.). We have also proposed several improvements to this basic deformation paradigm in immersive environments.
 
The most advanced of these improvements offers a bimanual interaction mode [4-VCB07] that increases the user's freedom in interacting with the system: constraints can be expressed alternately with either hand, simultaneous deformations can be performed (yielding complex results that are difficult to obtain with successive deformations), and the object can be grabbed and moved in the virtual world with one hand while a deformation is imposed with the other. Introducing a physical reference frame into the environment, through the use of the non-dominant hand and the addition of proprioceptive information, gives the user an increased feeling of control. Nevertheless, the effective gains in terms of precision and execution speed remain uncertain: our preliminary study shows that bimanual interaction leads to longer execution times, and it seems that the expected precision gains only appear when the visual cues in the environment are insufficient for fine control of the interaction.
 
This work on bimanual interaction techniques led us to question the integration of control over the several dimensions of the interaction task at hand (i.e. the fusion and simultaneous control of several dimensions of the task in a single command action, for example controlling the position and the orientation of the manipulated object at the same time). The notion of degrees of freedom (DOF) makes these task dimensions explicit. We are currently working on the definition of measures for precisely evaluating the degree of DOF integration during a manipulation task. We propose the MDS measure (Magnitude of Degrees of freedom Separation), which gives the number of simultaneously manipulated DOF at any time of the task, and the TFS (Task Fulfilment State), which gives the degree of completion of the task. Their purpose is to provide quantitative indices for analysing how users exploit the DOF during manipulation tasks in VR. As a first step, we applied these two measures to simple tasks: positioning (3 DOF) or orienting (3 DOF) objects. These first studies allowed us to validate the proposed measures by checking their behavior; they also revealed complex task-completion strategies, especially for orientation tasks in VR. These measures, together with the results of the experimental studies on orientation and positioning, are currently under submission.
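
To give a rough idea of what such a measure captures, the sketch below counts, at each sample of a recorded manipulation, how many DOF are varying at the same time. This threshold-based counting is only our simplified reading, not the definition used in the papers under submission.

<pre>
// Simplified illustration, in the spirit of MDS: for each sample of a
// recorded 6-DOF trajectory, count the DOF manipulated simultaneously.
// This is our own reading, not the published definition.
#include <array>
#include <cmath>
#include <cstdio>
#include <vector>

using Pose = std::array<double, 6>;   // x, y, z, roll, pitch, yaw

// For each pair of consecutive samples, count the DOF whose variation
// exceeds a small threshold (i.e. the DOF being manipulated).
std::vector<int> simultaneousDofCount(const std::vector<Pose>& samples,
                                      double threshold = 1e-3)
{
    std::vector<int> counts;
    for (size_t i = 1; i < samples.size(); ++i) {
        int n = 0;
        for (size_t d = 0; d < 6; ++d)
            if (std::fabs(samples[i][d] - samples[i - 1][d]) > threshold)
                ++n;
        counts.push_back(n);
    }
    return counts;
}

int main()
{
    // Toy trajectory: translation only, then translation plus rotation.
    std::vector<Pose> traj;
    traj.push_back({0, 0, 0, 0, 0, 0});
    traj.push_back({0.01, 0, 0, 0, 0, 0});        // 1 DOF manipulated
    traj.push_back({0.02, 0.01, 0, 0.05, 0, 0});  // 3 DOF manipulated
    for (int n : simultaneousDofCount(traj))
        std::printf("%d DOF manipulated\n", n);
    return 0;
}
</pre>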
===Force feedback===

To allow precise movements to be performed along privileged axes or directions of three-dimensional space, another targeted improvement of gestural interaction consists in physically constraining the movements of the hand, or of a hand-held tool, using a force feedback device.
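
As an illustration of this kind of guidance, the device-independent sketch below pulls the hand-held tool back onto a privileged axis with a spring-like force. It is a sketch under our own simplifying assumptions, not the code running on our haptic devices.

<pre>
// Illustrative sketch (device-independent, not tied to a specific haptic
// API): a spring-like guidance force constraining the tool to an axis.
#include <array>
#include <cstdio>

using Vec3 = std::array<double, 3>;

static Vec3 sub(const Vec3& a, const Vec3& b) { return {a[0]-b[0], a[1]-b[1], a[2]-b[2]}; }
static Vec3 add(const Vec3& a, const Vec3& b) { return {a[0]+b[0], a[1]+b[1], a[2]+b[2]}; }
static Vec3 mul(const Vec3& a, double s)      { return {a[0]*s, a[1]*s, a[2]*s}; }
static double dot(const Vec3& a, const Vec3& b) { return a[0]*b[0]+a[1]*b[1]+a[2]*b[2]; }

// Force pulling the tool tip back onto the line (origin, direction),
// with `direction` assumed to be a unit vector and `stiffness` in N/m.
Vec3 axisConstraintForce(const Vec3& tip, const Vec3& origin,
                         const Vec3& direction, double stiffness)
{
    Vec3 rel = sub(tip, origin);
    Vec3 onAxis = add(origin, mul(direction, dot(rel, direction))); // closest point on the axis
    return mul(sub(onAxis, tip), stiffness);                        // Hooke-like restoring force
}

int main()
{
    Vec3 f = axisConstraintForce({0.02, 0.05, 0.0},   // tool position (m)
                                 {0.0, 0.0, 0.0},     // a point on the axis
                                 {1.0, 0.0, 0.0},     // privileged direction (unit)
                                 200.0);              // stiffness (N/m)
    std::printf("force = (%.2f, %.2f, %.2f) N\n", f[0], f[1], f[2]);
    return 0;
}
</pre>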
  
 
[[Image:Caro_phantom_small.jpg|right|thumb|120px]]
[[Image:SpidarWB.jpg|right|thumb|120px]]

A first study [2-VSG05], conducted in the context of the medical radio-frequency application, showed the great value of a 3D localization tool and of force feedback for the realistic simulation of the surgical gesture. The main goal is the training of practitioners, since their experience plays a significant role in the operation's success rate. This application used a commercial force feedback system coupled with a classical workstation.
 
We are currently working on the definition of a formalism for automatically adding dimension-reduction constraints to selection tasks. Applied to manipulation in VR, these techniques can improve user performance for application control tasks (selection in a menu) or for pointing at and selecting points of interest in a point cloud. Preliminary studies (being prepared for publication) are encouraging: they show that large performance gains can be obtained while also reducing the cognitive load associated with the task.
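
As a minimal sketch of the idea of dimension reduction for selection (the helper below is hypothetical and much simpler than the formalism under study), a free 3D pointing task can be reduced to a discrete choice by snapping the cursor to the nearest point of interest within a given radius:

<pre>
// Illustrative sketch (hypothetical helper, not the formalism under
// study): reduce free 3D pointing to a discrete choice by snapping the
// cursor to the nearest point of interest within a radius.
#include <array>
#include <cstdio>
#include <vector>

using Vec3 = std::array<double, 3>;

static double dist2(const Vec3& a, const Vec3& b)
{
    double dx = a[0]-b[0], dy = a[1]-b[1], dz = a[2]-b[2];
    return dx*dx + dy*dy + dz*dz;
}

// Returns the index of the closest cloud point within `snapRadius`
// of the cursor, or -1 if none is close enough.
int snapToNearest(const Vec3& cursor, const std::vector<Vec3>& cloud,
                  double snapRadius)
{
    int best = -1;
    double bestD2 = snapRadius * snapRadius;
    for (size_t i = 0; i < cloud.size(); ++i) {
        double d2 = dist2(cursor, cloud[i]);
        if (d2 <= bestD2) { bestD2 = d2; best = static_cast<int>(i); }
    }
    return best;
}

int main()
{
    std::vector<Vec3> interestPoints = {{0, 0, 0}, {0.1, 0, 0}, {0, 0.2, 0}};
    int hit = snapToNearest({0.09, 0.01, 0.0}, interestPoints, 0.05);
    std::printf("snapped to point %d\n", hit);
    return 0;
}
</pre>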
A collaboration between LSIIT and the laboratory of Professor Makoto Sato at the Tokyo Institute of Technology made it possible to install a Spidar force feedback system on the workbench at the University of Strasbourg. Sylvain Thery spent one month in Japan in 2006 to acquire the experience needed to operate such a force feedback system. This dedicated Spidar has the advantage of offering a larger workspace and allows us to apply the haptic techniques we design in immersive VR environments.

[[fr:THEME_2_OPERATION5]]
