{{PAGE_Begin}}
InVirtuo is now a pole of the GAIA ICube platform (Graphics, Artificial Intelligence and data Analysis).

Our platform InVirtuo participates in the ANR EquipEx+ project CONTINUUM (Collaborative continuity from digital to human), led by Michel Beaudouin-Lafon (Professor at Université Paris-Saclay). The project is funded with 15.5 M€ over 8 years, starting January 1st, 2021.
 
__NOTOC__
 
 
__NOEDITSECTION__
 
= ''The virtual experience'' =
[[Image:WBGeneralSide.jpg|right|250px]]
The IN VIRTUO (''the virtual experience'') virtual-reality platform offered by ICube mainly relies on the following hardware resources:
* an immersive workbench,
* an immersive wall augmented with a large-scale haptic device (parallel cable robot),
* personal desktop-sized haptic devices,
* head-mounted virtual-reality headsets (2 HTC Vive, 2 Oculus Rift, 4 Oculus Quest 2).
  
[[Image:IncaPyramid.JPG|right|250px]]
These devices are supplemented by software tools designed to ease their use. Indeed, beyond hardware-related complications, such devices require the application HCI to be completely redesigned to make the best use of their capabilities. To compensate for the lack of existing tools, standards, and techniques, software resulting from the IGG team's research on HCI is made available along with the hardware.
<br style="clear: both" />
  
== Workbench ==
[[Image:WBGeneralFrontRight.jpg|left|200px]]
The workbench is an immersive virtual-reality device whose configuration is well suited to interaction and manipulation tasks within human reach. The hardware is based on two large screens (approximately 2 meters diagonal each) featuring active stereoscopic display. Immersion is further supported by an optical tracking system that captures the user's movements.
  
Along with this display device come multiple peripherals: a wand and joysticks with buttons, and two data gloves able to measure finger bending and fingertip contacts. A SPIDAR parallel cable robot is also available to the user as a force-feedback device.
  
Initially installed and managed by the ''Centre d'Etude du Calcul Parallèle et de la Visualisation'' (Study center of parallel computation and visualization) of UDS in February 2002, the hardware has evolved steadily and is now mainly operated by [[Main_Page|IGG]]. At the end of 2007, the projection system was entirely renewed thanks to funding from the CPER IRMC Virtual Reality theme. Access remains free for the whole community of researchers at the Université de Strasbourg.
<br style="clear: both" />
  
== Visual and haptic immersive wall ==
[[Image:IncaHardware.JPG|left|200px]]
This hybrid device is designed to immerse the user in combined visual and force-feedback rendering. It features a large immersive wall (3m&nbsp;x&nbsp;2.25m) with active stereoscopic display and motion tracking, combined with a large-scale parallel cable robot (from the Haption company) acting as a force-feedback device. The user can thus feel full haptic feedback (forces and torques) within a large manipulation space (approximately one cubic meter) on top of the visual immersion.
  
This immersive virtual-reality hardware was installed early in 2011. Its acquisition was made possible by a CPER IRMC grant. The parallel robot was bought and is operated jointly with the [http://lsiit-cnrs.unistra.fr/avr-en/index.php/Main_Page Control, Vision & Robotics team] of the LSIIT.
<br style="clear: both" />
  
== Haptic peripherals ==
Traditional haptic peripherals targeted at desktop use are also available. They mainly consist of 6-DOF and 3-DOF Phantom devices from Sensable.
<gallery widths=200px>
Image:PhantomPremium.png
Image:PhantomOmni.png
Image:PhantomOmni2.png
</gallery>
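For reference, desktop haptic devices of this kind are usually driven from a high-frequency servo loop (around 1 kHz) that reads the stylus position and sends a force back to the device at every cycle. The sketch below illustrates this pattern with the OpenHaptics HD API, Sensable's toolkit commonly used to program Phantom devices; it is only an illustrative example (a simple spring force pulling the stylus toward the workspace origin), not necessarily the software stack used on the platform.

<pre>
// Illustrative 1 kHz haptic servo loop using the OpenHaptics HD API
// (Sensable's toolkit for Phantom devices). This is a sketch only; the
// platform's own software stack may differ.
#include <cstdio>
#include <HD/hd.h>

// Servo-loop callback, called by the scheduler at haptic rates (~1 kHz).
HDCallbackCode HDCALLBACK springForceCallback(void* /*userData*/)
{
    const HDdouble stiffness = 0.15;   // N/mm, kept low to stay within device limits
    HDdouble position[3];
    HDdouble force[3];

    hdBeginFrame(hdGetCurrentDevice());
    hdGetDoublev(HD_CURRENT_POSITION, position);   // stylus position, in mm

    // Simple spring pulling the stylus toward the origin of the workspace.
    for (int i = 0; i < 3; ++i)
    {
        force[i] = -stiffness * position[i];
    }
    hdSetDoublev(HD_CURRENT_FORCE, force);

    hdEndFrame(hdGetCurrentDevice());
    return HD_CALLBACK_CONTINUE;   // keep the callback scheduled
}

int main()
{
    HDErrorInfo error;
    HHD device = hdInitDevice(HD_DEFAULT_DEVICE);
    if (HD_DEVICE_ERROR(error = hdGetError()))
    {
        std::fprintf(stderr, "Unable to initialize the haptic device.\n");
        return 1;
    }

    hdEnable(HD_FORCE_OUTPUT);
    hdStartScheduler();
    hdScheduleAsynchronous(springForceCallback, NULL, HD_DEFAULT_SCHEDULER_PRIORITY);

    std::printf("Spring force active; press Enter to quit.\n");
    std::getchar();

    hdStopScheduler();
    hdDisableDevice(device);
    return 0;
}
</pre>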
  
  
== Software & specific developments ==
 
 
Most of the software developed uses the [http://code.google.com/p/vrjuggler/ VRJuggler] library in order to avoid managing the hardware configuration directly (data distribution, synchronization when required, and interfacing with the peripherals). Both devices (workbench and immersive wall) offer access to GNU/Linux and Microsoft Windows operating systems. Note that any application can also take advantage of the devices directly, without complying with any framework, provided it handles stereoscopy, user tracking, and all the required peripherals itself.
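For illustration, the sketch below shows the typical shape of a VRJuggler-based OpenGL application: an application object is handed to the library kernel, which is configured at run time with the files describing the target device (displays, tracking, wand). This is only a minimal sketch in the spirit of the VR Juggler 2.x API; the exact header paths, class names and device proxy aliases ("VJWand", "VJButton0") are assumptions that depend on the installed version and configuration files.

<pre>
// Minimal, illustrative VR Juggler 2.x OpenGL application skeleton.
// Header paths, class names and proxy aliases below are assumptions that
// depend on the installed VR Juggler version and on the .jconf files used.
#include <vrj/Kernel/Kernel.h>
#include <vrj/Draw/OGL/GlApp.h>
#include <gadget/Type/PositionInterface.h>
#include <gadget/Type/DigitalInterface.h>
#include <gmtl/Matrix.h>
#include <GL/gl.h>

class SimpleApp : public vrj::GlApp
{
public:
   SimpleApp(vrj::Kernel* kern) : vrj::GlApp(kern) {}

   // Bind device proxies declared in the configuration files (names are examples).
   virtual void init()
   {
      mWand.init("VJWand");
      mButton.init("VJButton0");
   }

   // Called once per frame before rendering: read the tracked wand and a button.
   virtual void preFrame()
   {
      if (mButton->getData() == gadget::Digital::ON)
      {
         gmtl::Matrix44f wandPose = mWand->getData();   // wand pose in VR coordinates
         // ... application logic (selection, manipulation, menu control, ...)
      }
   }

   // Rendering callback: VR Juggler takes care of stereoscopy, viewports
   // and, when needed, distribution over a rendering cluster.
   virtual void draw()
   {
      glClear(GL_DEPTH_BUFFER_BIT);
      // ... draw the scene for the current eye / projection surface
   }

private:
   gadget::PositionInterface mWand;    // 6-DOF tracked wand
   gadget::DigitalInterface  mButton;  // one wand button
};

int main(int argc, char* argv[])
{
   vrj::Kernel* kernel = vrj::Kernel::instance();
   SimpleApp* app = new SimpleApp(kernel);

   // Configuration files describing the device (e.g. workbench displays,
   // tracking system, wand) are passed on the command line, so the same
   // binary can target the workbench, the immersive wall or a simulator.
   for (int i = 1; i < argc; ++i)
   {
      kernel->loadConfigFile(argv[i]);
   }

   kernel->start();
   kernel->setApplication(app);
   kernel->waitForKernelStop();
   return 0;
}
</pre>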
  
Several pieces of software have been developed within the IGG team of ICube, some of which are offered to take advantage of the platform hardware. This toolbox is organized as a set of independent modules, each offering a high-level functionality, such as an interaction technique or the communication with a specific type of peripheral device. As examples, one can find the VRLIB library, dedicated to providing 3D user interfaces in immersive applications, and the C<sup>3</sup> menu, designed for application control in such immersive environments.
<gallery widths=200px>
Image:WBVRLibGeneral.jpg|Modeling application based on the VRLib
 
Image:WBVRLibCloseup.jpg|Some controls available in the VRLIB
 
Image:CCubeTerrainEdit.jpg|C<sup>3</sup> menu
</gallery>
  
= Applications - Links =  
  
These platforms are mostly used in the context of the [[Visualization_and_Interactions#Interaction | HCI research]] conducted in the IGG team.
  
They are also made available to students from several computer science classes offered by the University of Strasbourg, in particular to students from the [https://mathinfo.unistra.fr/formations/master/informatique/ "Image and 3D" courses of the Computer Science Master].
  
Contacts also exist with the VR platform set up by the Alsace region and operated by [http://www.holo3.com/ Holo3].
  
  
 
= Contact =
For any additional information, please contact Thierry Blandet, research engineer in the IGG team and technical head of the platform.
 
{{PAGE_End}}
[[fr:InVirtuo]]
