InVirtuo

{{PAGE_Begin}}
InVirtuo is now a pole of the GAIA ICube platform (Graphics, Artificial Intelligence and data Analysis).
Our platform InVirtuo participates in the ANR EQUIPEX+ project CONTINUUM (Collaborative continuity from digital to human), led by Michel Beaudouin-Lafon (Professor at Paris-Saclay University). The project is funded with 15.5 M€ over 8 years, starting January 1st, 2021.
__NOTOC__
 
__NOEDITSECTION__
 
= ''The virtual experience'' =
[[Image:WBGeneralSide.jpg|right|250px]]
The IN VIRTUO (''the virtual experience'') virtual-reality platform offered by ICube mainly provides the following hardware resources:
* an immersive workbench,
* an immersive wall augmented with a large-scale haptic device (parallel cable robot),
* personal desktop-sized haptic devices,
* head-mounted virtual-reality headsets (2 HTC Vive, 2 Oculus Rift, 4 Oculus Quest 2).
  
[[Image:IncaPyramid.JPG|right|250px]]
 
These devices are supplemented by software tools designed to ease their use. Indeed, beyond hardware-related complications, such devices require a complete redesign of the application's HCI to make the best use of their capabilities. To fill the gap left by the lack of existing tools, standards, and techniques, software developments resulting from the IGG team's research on HCI are made available along with the hardware.
<br style="clear: both" />
  
== Workbench ==
[[Image:WBGeneralFrontRight.jpg|left|200px]]
The workbench is an immersive virtual-reality device with a configuration well-adapted to interaction and manipulation tasks within human reach. The available hardware is based on two large screens (approximately 2 meters diagonal) featuring active stereoscopic display. Immersion is also provided by an optical tracking system that captures the user's movements.
  
Along with this display device come multiple peripherals (a wand and joysticks with buttons, and two data gloves able to sense finger bending and fingertip contacts). A SPIDAR parallel cable robot is also available to the user as a force-feedback device.
  
Initially installed and managed by the ''Centre d'Etude du Calcul Parallèle et de la Visualisation'' (Center for the study of parallel computation and visualization) of UDS in February 2002, the available hardware has steadily evolved and is now mainly operated by [[Main_Page|IGG]]. Indeed, at the end of 2007, the projection system was entirely renewed thanks to funding from the CPER IRMC Virtual Reality theme. Access remains free for the whole community of researchers at Université de Strasbourg.
<br style="clear: both" />
  
== Visual and haptic immersive wall ==
[[Image:IncaHardware.JPG|left|200px]]
This hybrid device is designed to combine visual immersion and force feedback around the user. It features a large immersive wall (3m&nbsp;x&nbsp;2.25m) with active stereoscopic display and motion tracking, combined with a large-scale parallel cable robot (from the Haption company) acting as a force-feedback device. The user is thus able to feel full haptic feedback (forces and torques) in a large manipulation space (approximately 1 cubic meter) on top of visual immersion.
  
This immersive virtual-reality hardware was installed in early 2011. Its acquisition was made possible by a CPER IRMC grant. The parallel robot was bought and is operated jointly with the [http://lsiit-cnrs.unistra.fr/avr-en/index.php/Main_Page Control, Vision & Robotics team] of the LSIIT.
<br style="clear: both" />
  
== Haptic peripherals ==
Traditional haptic peripherals targeted at desktop use are also available. They mainly consist of 6-DOF and 3-DOF Phantom devices from SensAble.
<gallery widths=200px>
Image:PhantomPremium.png
Image:PhantomOmni.png
Image:PhantomOmni2.png
</gallery>
  
  
== Software & specific developments ==
 
 
Most of the software developed uses the [http://code.google.com/p/vrjuggler/ VRJuggler] library in order to avoid directly managing the hardware configuration (data distribution, synchronization if required, and interfacing with the peripherals). The two devices (workbench and immersive wall) offer access to GNU/Linux and Microsoft Windows operating systems. Note that any application can take advantage of the devices directly, without complying with any framework, provided it handles stereoscopy, user tracking, and all the required peripherals.
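As a purely illustrative sketch (not code from the platform itself), the example below shows the typical skeleton of a VR Juggler OpenGL application, assuming the VR Juggler 2.x C++ API: the kernel loads the site configuration files and takes care of windows, stereoscopy and cluster distribution, while the application only implements its callbacks and reads devices through <code>gadget</code> interfaces. The proxy names (<code>VJHead</code>, <code>VJWand</code>, <code>VJButton0</code>) are placeholders that must match the local <code>.jconf</code> files, not names taken from this platform.

<syntaxhighlight lang="cpp">
// Minimal sketch of a VR Juggler OpenGL application (VR Juggler 2.x API assumed).
// Proxy names below are placeholders: they must match the site .jconf configuration.
#include <vrj/Kernel/Kernel.h>
#include <vrj/Draw/OGL/GlApp.h>
#include <gadget/Type/PositionInterface.h>
#include <gadget/Type/DigitalInterface.h>
#include <gmtl/Matrix.h>
#include <GL/gl.h>

class DemoApp : public vrj::GlApp
{
public:
   DemoApp(vrj::Kernel* kern) : vrj::GlApp(kern) {}

   virtual void init()
   {
      mHead.init("VJHead");      // head tracker proxy (placeholder name)
      mWand.init("VJWand");      // wand tracker proxy (placeholder name)
      mButton.init("VJButton0"); // wand button proxy (placeholder name)
   }

   virtual void preFrame()
   {
      // Read the wand pose and button state once per frame.
      gmtl::Matrix44f wandPose = mWand->getData();
      const bool pressed = (mButton->getData() == gadget::Digital::ON);
      // ... update the application state here ...
   }

   virtual void draw()
   {
      // Called once per viewport and per eye; the kernel has already set
      // the stereoscopic projection and view matrices for this context.
      glClear(GL_DEPTH_BUFFER_BIT);
      // ... render the scene here ...
   }

private:
   gadget::PositionInterface mHead;
   gadget::PositionInterface mWand;
   gadget::DigitalInterface  mButton;
};

int main(int argc, char* argv[])
{
   vrj::Kernel* kernel = vrj::Kernel::instance();
   DemoApp* app = new DemoApp(kernel);

   // The .jconf files passed on the command line describe the displays,
   // the devices and, if needed, the cluster nodes.
   for (int i = 1; i < argc; ++i)
   {
      kernel->loadConfigFile(argv[i]);
   }

   kernel->start();              // launch the kernel threads
   kernel->setApplication(app);  // hand the application over to the kernel
   kernel->waitForKernelStop();  // block until the kernel shuts down
   return 0;
}
</syntaxhighlight>

With this organization, the same binary can in principle run on the workbench or on the immersive wall simply by loading different configuration files.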
  
Multiple specific software components have been developed in the IGG team of ICube, some of which are offered to take advantage of the platform hardware. This toolbox is organized as a set of independent modules, each offering a high-level functionality, such as an interaction technique or the communication with a specific type of peripheral device. As examples, one can find the VRLIB library, dedicated to providing 3D user interfaces in immersive applications, and the C<sup>3</sup> menu designed for application control in such an immersive environment.
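The programming interfaces of these modules are not detailed on this page; purely as a hypothetical illustration of the modular organization described above (and not the actual VRLIB or C<sup>3</sup> API), a toolbox of this kind can expose each high-level functionality behind a small common interface:

<syntaxhighlight lang="cpp">
// Hypothetical illustration of a modular toolbox: each module wraps one
// high-level functionality (an interaction technique, a device binding, ...).
// This is NOT the actual VRLIB or C^3 interface, only a sketch of the idea.
#include <memory>
#include <string>
#include <vector>

struct FrameContext
{
   float headPose[16];   // tracked head pose, row-major 4x4 matrix
   float wandPose[16];   // tracked wand pose
   bool  wandButton;     // main wand button state
};

class InteractionModule
{
public:
   virtual ~InteractionModule() = default;
   virtual std::string name() const = 0;
   virtual void update(const FrameContext& ctx) = 0;  // called once per frame
   virtual void render() const = 0;                   // draw widgets, menus, ...
};

// An application assembles the modules it needs and drives them every frame.
class Toolbox
{
public:
   void add(std::unique_ptr<InteractionModule> module)
   {
      mModules.push_back(std::move(module));
   }

   void update(const FrameContext& ctx)
   {
      for (auto& m : mModules) { m->update(ctx); }
   }

   void render() const
   {
      for (const auto& m : mModules) { m->render(); }
   }

private:
   std::vector<std::unique_ptr<InteractionModule>> mModules;
};
</syntaxhighlight>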
 
<gallery widths=200px>
Image:WBVRLibGeneral.jpg|Modeling application based on the VRLib
 
Image:WBVRLibCloseup.jpg|Some controls available in the VRLIB
 
Image:CCubeTerrainEdit.jpg|C<sup>3</sup> menu
</gallery>
  
= Applications - Links =  
  
These platforms are mostly used in the context of the [[Visualization_and_Interactions#Interaction | HCI research]] conducted in the IGG team.
  
They are also made available to students from the various computer science curricula offered by the University of Strasbourg, in particular to students of the [https://mathinfo.unistra.fr/formations/master/informatique/ "Image and 3D" courses of the Computer Science Master].
  
Contacts also exist with the VR platform set up by the Alsace region and operated by [http://www.holo3.com/ Holo3].
  
  
 
= Contact =
 
For any additional information, please contact Thierry Blandet, research engineer attached to the IGG team and technical head of the platform.
 
{{PAGE_End}}
[[fr:InVirtuo]]
