3D Gesture Interaction and Fusion of 3D Images

Many topics within the 3D Sensation alliance deal with the fusion of 3D image data from different sources. Augmented Reality (AR) applications, for example, face the challenge of integrating virtual content into a real-world image in a realistic and consistent way. Fusing 3D image data from different sources can also be used to create 3D content for Virtual Reality applications.

The fused 3D image data should ensure a non-disturbing and comfortable depth perception. The Visual Computing Group will work on methods for a consistent geometric and photometric fusion of 3D data from different sources. The geometric fusion registers the 3D data (translation, rotation, scale) and ensures a consistent depth ordering and occlusion handling. When dealing with dynamic data, the geometric fusion process must also ensure a time-consistent registration. Methods for photometric fusion harmonize the image properties (blur, noise, color temperature, imaging-pipeline artifacts) and adapt the scene illumination (light sources, shadows).
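The registration step of the geometric fusion (translation, rotation, scale) can be sketched with a standard similarity-transform estimation between corresponding 3D point sets, e.g. the closed-form least-squares solution of Umeyama (1991). The sketch below is illustrative only and assumes known point correspondences; the function name and interface are not taken from the project itself.

```python
import numpy as np

def similarity_transform(src, dst):
    """Estimate scale s, rotation R, translation t such that
    dst ~= s * R @ src + t in the least-squares sense (Umeyama 1991).

    src, dst: (N, 3) arrays of corresponding 3D points.
    Illustrative sketch; not the project's actual implementation.
    """
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    sc, dc = src - mu_s, dst - mu_d          # centered point sets
    cov = dc.T @ sc / len(src)               # cross-covariance matrix
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)                            # reflection correction
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0
    R = U @ S @ Vt                           # optimal rotation
    var_src = (sc ** 2).sum() / len(src)     # variance of source points
    s = np.trace(np.diag(D) @ S) / var_src   # optimal scale
    t = mu_d - s * R @ mu_s                  # optimal translation
    return s, R, t
```

For dynamic data, the time-consistent registration mentioned above would additionally smooth or constrain the estimated (s, R, t) over successive frames rather than solving each frame independently.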

It is planned to create a dataset containing scenes that combine 3D image data from different sources. For each scene, multiple instances will be generated with varying parameters of the fusion process. This dataset can then be used in a user study to evaluate the impact of the fusion parameters on perceived quality and viewing comfort.

Principal Investigators
Eisert, Peter, Prof. Dr.-Ing. (Visual Computing)

Duration of Project
Start date: 12/2014
End date: 01/2016

Research Areas
Interactive and Intelligent Systems, Image and Language Processing, Computer Graphics and Visualisation

Human-Computer Interaction (HCI), Computer Science

Last updated on 2020-03-20 at 23:15