Image-based Representation of Clothes for Realistic Virtual Try-On


The project addresses the photo-realistic rendering of faces. Like clothing, faces exhibit very complex appearance properties and are difficult to render photorealistically. Moreover, because the human eye is highly sensitive to faces, computer-generated faces often appear uncanny to the observer. Recent years have seen tremendous progress towards realistic facial rendering. However, current methods either require sophisticated capturing setups, e.g. high-resolution camera arrays and light stages, to capture and model the reflectance field of the face, or rely on computationally demanding simulation of facial characteristics during rendering. Extending the pose-space image-based rendering methods developed in this project to the visualization of faces will enable photorealistic rendering and synthesis of facial expressions directly from a database of example expression images, without complex and computationally expensive simulation of skin bulging, wrinkling, and reflection properties, as these details are already captured in the images. One possible application of the proposed methods is performance-driven facial animation, i.e. transferring the facial expressions of one person to another.
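To illustrate the core idea behind pose-space image-based rendering of expressions, the following minimal sketch blends the example images whose expression parameters lie closest to a target point in a low-dimensional expression/pose space. This is not the project's actual pipeline (which involves image registration, warping, and more elaborate blending); all names and parameters (synthesize_expression, k, sigma, the parameterization itself) are hypothetical illustrations.

```python
import numpy as np

def synthesize_expression(target_params, example_params, example_images,
                          k=4, sigma=0.5):
    """Blend the k nearest example images, weighted by their distance
    to the target expression parameters (Gaussian kernel).

    example_params: array of shape (N, D), one parameter vector per image
    example_images: array of shape (N, H, W, 3), the example database
    """
    target = np.asarray(target_params, dtype=float)
    params = np.asarray(example_params, dtype=float)
    images = np.asarray(example_images, dtype=float)

    # Distances in the expression/pose-parameter space.
    dists = np.linalg.norm(params - target, axis=1)
    nearest = np.argsort(dists)[:k]

    # Gaussian weights over the selected examples, normalized to sum to one.
    weights = np.exp(-(dists[nearest] ** 2) / (2 * sigma ** 2))
    weights /= weights.sum()

    # Weighted blend of the selected example images; fine-scale details such
    # as wrinkles come directly from the captured images rather than from
    # physical simulation.
    return np.tensordot(weights, images[nearest], axes=1)
```

In practice the examples would first be registered to a common reference (e.g. by warping to a shared face mesh) so that blending does not introduce ghosting; the sketch omits this step for brevity.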


Principal investigators
Eisert, Peter, Prof. Dr.-Ing. (Visual Computing)

Financer
DFG Individual Research Grant

Duration of project
Start date: 08/2014
End date: 07/2015

Research Areas
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing, Computer Science
