Gallery
Latest revision as of 10:21, 19 September 2022
Contraintes et Preuves
Modélisation et Interaction
Experiment on emotion in virtual reality: Excerpt from the experimental setup used to evaluate the link between emotion and embodiment in virtual reality. Participants are exposed to diverse emotional stimuli and avatars. [4-FLC22]
Design of a virtual environment for cue exposure therapy: A cocaine consumption situation modeled in virtual reality for therapeutic purposes. The craving triggered in virtual reality allows patients with a consumption disorder to apprehend this sense of craving. [4-GLLA22]
Multi-scale Space-time Registration of Growing Plants: Result of our framework on one tomato plant (left) and one maize plant (right): corresponding points share the same colour. Some landmarks have been sampled and are shown connected to their corresponding points. [3DV-2021]
Multi-scale Space-time Registration of Growing Plants: Point-wise matching of the whole plant as it grows. [3DV-2021]
Tubular shape scaffolding: Hexahedral reconstruction of the Metatron mesh using the tubular shape scaffolding algorithm. [VISIGRAPP-2021]
Tubular shape scaffolding: Using the red scaffold as a support, a hexahedral mesh is produced, then optimized and fitted to the input surface. [VISIGRAPP-2021]
Hex Meshes: Construction of hexahedral meshes (yellow) from an input surface (blue) and its curve skeleton using the scaffold method. [VISIGRAPP-2021]
Interactive 3D Shape Modeling from a Single Descriptive Sketch: The constraints specified by the user are shown in red for positional constraints, green for corner points, and green–red for both. [CAD-2020]
Analyzing Clothing Layer Deformation Statistics of 3D Human Motions: Comparison to ground truth. First row: our predicted clothing deformation. Second row: ground truth colored with per-vertex error. Blue: 0cm; red: 10cm. [ECCV-2018]
Reconstructing Flowers from Sketches: Reconstruction of a flower model from a sketch: input sketch (a); guide strokes provided by the user (b); segmented sketch into petals and other botanical elements (c); reconstructed model (d and e). [PG-2018]
Reconstructing Flowers from Sketches: Results obtained with our modeler. [PG-2018]
Reconstructing Flowers from Sketches: More flower model examples from our modeler. [PG-2018]
Indoor Scene reconstruction from a sparse set of 3D shots: Three shots containing no overlapping area are given as input (top row). The second row shows two different results provided by our method. [CGI-2017]
Handling Topological Changes during Elastic Registration: Augmented reality on cut and deformed kidneys 1 (top) and 2 (bottom), overlaid with the virtual organ. Initial registration (left); final registrations: uncut (middle left), cut (middle right), and reference registration (right). [IJCARS-2016]
Mesh Sequence Morphing: A galloping Camel gradually changes into a galloping Dino. The Dino sequence (indicated in blue) is obtained by transferring the deformations of the Camel to the compatibly remeshed Dino in the rest pose. [CGF-2016]
Animation of bronchi: Construction of a hexahedral volume mesh of a bronchial model.
Interactions with a Hybrid Map for Navigation in Virtual Reality: (A, B) interactions using the Oculus hand controller; (C, D) interactions using the smartphone. The red circle represents the fingertip position in the VE. [ACM-ISS-2020]
UMI3D: A Unity3D Toolbox to support CSCW Systems Properties in Generic 3D User Interfaces: UMI3D case study - social interactions. User 2 (on the left) waits for the character controlled by User 1 (on the right) to cross the road before continuing to drive the red car. [ACM-HCI-2018]
A Unified Model for Interaction in 3D Environment: A new model for designing VR, AR, and MR applications independently of any device. [ACM-VRST-2017]
A Hybrid Projection to Widen the Vertical Field of View with Large Screens: The left image shows a view with a perspective projection; the right image shows an example of the hybrid projection. In the left image, the user cannot see the chair behind him. [3DUI-2016]
Synthetic reality equipment: INCA.
Apparence et Mouvement
Importance Sampling of Glittering BSDFs based on Finite Mixture Distributions: A glittering coloured glass sphere with a spatially varying microfacet density. Left: our sampling scheme. Centre: reference. Right: Gaussian mono-lobe approximation is used for sampling. [EGSR-2021]
Edge-based procedural textures: Edges of a texture are extracted and encoded into an edge-based procedural texture (EBPT). New textures are generated either automatically or by controlling the EBPT generation by the user. [VC-2021]
Cyclostationary Gaussian noise: theory and synthesis: We convey existing stationary noises to a cyclostationary context enabling the synthesis of cyclostationary textures controlled by spectra (left) and by an exemplar (right). [EG-2021]
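For readers unfamiliar with spectrum-controlled Gaussian noise, the general idea behind the stationary case can be sketched as follows: white noise is shaped in the Fourier domain by a target amplitude spectrum. This is a generic illustrative sketch, not the paper's cyclostationary construction; the function name and the ring-shaped example spectrum are assumptions for illustration only.

```python
import numpy as np

def noise_from_spectrum(size=128, peak_freq=8.0, bandwidth=2.0, seed=0):
    """Gaussian noise with a prescribed power spectrum: multiply the
    Fourier transform of white noise by a target amplitude spectrum."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal((size, size))
    # Frequency coordinates (in cycles per image) for each Fourier bin.
    f = np.fft.fftfreq(size) * size
    fx, fy = np.meshgrid(f, f)
    radius = np.hypot(fx, fy)
    # Ring-shaped target spectrum: a band-limited example centred on peak_freq.
    spectrum = np.exp(-((radius - peak_freq) ** 2) / (2 * bandwidth ** 2))
    # Shaping in Fourier space, then back to the spatial domain.
    shaped = np.fft.ifft2(np.fft.fft2(white) * spectrum).real
    return shaped

texture = noise_from_spectrum()
```

Because the ring spectrum is nearly zero at the DC bin, the result is close to zero-mean; changing `peak_freq` and `bandwidth` moves and widens the dominant feature frequency of the synthesized texture.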
Procedural Physically based BRDF for Real-Time Rendering of Glints: Left: sparkling fabrics are rendered (3.0 ms/frame). Right: a plane with increasing specular microfacet density. Top: our physically based BRDF (2.5 ms/frame). Bottom: the non-physically-based model of [ZK16] (1.3 ms/frame). [PG-2020]
Real-Time Geometric Glint Anti-Aliasing with Normal Map Filtering: Arctic landscape with a normal mapped surface. (a) glinty BRDF of Chermain et al. [2020] prone to geometric glint aliasing. Our geometric glint anti-aliasing (GGAA) without and with normal map filtering (b and c). (d) Reference. [i3D-2020]
Semi-Procedural Textures Using Point Process Texture Basis Functions: (a) Input texture map(s) and a binary structure (b) are used to generate a semi-procedural output (d). A rendered view of the input material is shown for comparison (c). [EGSR-2020]
Modeling Rocky Scenery using Implicit Blocks: Different styles of blocks generated on a cliff and arches. From left to right: tabular block style, equidimensional blocks, and finally rhombohedral block style. [VC-2020]
Content-aware texture deformation with dynamic control: Our deformation model makes it possible to mimic non-uniform physical behaviors at texel resolution. Top: the parameterization is advected in a static flow field. Bottom: the deformation can be controlled dynamically. [C&G-2020]
Anisotropic Filtering for On-the-fly Patch-based Texturing: Our filtering method (right) is compared to the ground truth (middle) and no filtering (left). The ground truth is computed by an exact filtering of the high-resolution texture. The leftmost view indicates the MIP-map levels used. [EG-2019]
Bi-Layer textures: Our noise model decomposes an input exemplar as a structure layer and a noise layer. Large outputs are synthesized on-the-fly by synchronized synthesis of the layers. Variety can be achieved at the synthesis stage by deforming the structure layer while preserving fine scale appearance encoded in the noise layer. [EGSR-2017]
Multi-Scale Label-Map Extraction for Texture Synthesis: The input non-stationary texture (a). Hierarchy of labeled clusters: coarse scale (b) includes finer scales (c). Interactive texture editing (d) and content selection for creating new non-stationary textures (e). [SIGGRAPH-2016]
Texture modeling and synthesis: Multi-scale label maps are obtained with our texture analysis method. One possible application is interactive texture editing. [SIGGRAPH-2016]
Volumetric Spot Noise for Procedural 3D Shell Texture Synthesis: Left: uniform density. Middle: user controls density. Right: user controls orientation. In all cases control maps can be painted interactively. [CGVC-2016]
Volumetric Spot Noise for Procedural 3D Shell Texture Synthesis: Bunny1 with the "ring" kernel profile (a) and a semi-regular distribution profile (b). Bunny2 with a Gaussian kernel profile (c), a random distribution profile (d), and a density map (e). Dragon and plane: use of a color map (f), a density map (g), a kernel (h), and a random distribution (i). [CGVC-2016]
Procedural Texture Synthesis by Locally Controlled Spot Noise: Examples of near-regular feature reproduction by a single spot noise. Left: kernel profiles and distribution profiles. Right: blue fabric pattern applied on a 3D model. [WSCG-2016]
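The spot-noise family of methods referenced in the two captions above evaluates noise as a sum of kernels placed at distributed point positions. A minimal illustrative sketch (not the authors' implementation; names, kernel choice, and parameters are assumptions) of 2D spot noise with a Gaussian kernel:

```python
import numpy as np

def spot_noise(size=128, n_spots=200, sigma=3.0, seed=0):
    """Minimal spot-noise sketch: sum a Gaussian kernel with random
    signs at uniformly distributed point positions (sparse convolution)."""
    rng = np.random.default_rng(seed)
    img = np.zeros((size, size))
    ys, xs = np.mgrid[0:size, 0:size]
    for _ in range(n_spots):
        cx, cy = rng.uniform(0, size, 2)        # random spot position
        w = rng.choice([-1.0, 1.0])             # random sign keeps the noise near zero-mean
        img += w * np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))
    return img

noise = spot_noise()
```

Local control of the kind shown in the captions corresponds to making the kernel profile, the point distribution, and the density vary spatially (e.g. driven by painted control maps) instead of keeping them uniform as in this sketch.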