TUTCRIS - Tampere University of Technology

ICface: Interpretable and controllable face reenactment using GANs

Research output: peer-reviewed

Details

Original language: English
Title: 2020 IEEE Winter Conference on Applications of Computer Vision, WACV 2020
Publisher: IEEE
Pages: 3374-3383
Number of pages: 10
ISBN (electronic): 9781728165530
DOI - permanent links
Status: Published - 1 March 2020
OKM publication type: A4 Article in conference proceedings
Event: IEEE/CVF Winter Conference on Applications of Computer Vision - Snowmass Village, United States
Duration: 1 March 2020 - 5 March 2020

Publication series

Name: IEEE Winter Conference on Applications of Computer Vision
ISSN (print): 1550-5790

Conference

Conference: IEEE/CVF Winter Conference on Applications of Computer Vision
Country: United States
City: Snowmass Village
Period: 1/03/20 - 5/03/20

Abstract

This paper presents a generic face animator that can control the pose and expressions of a given face image. The animation is driven by human-interpretable control signals consisting of head pose angles and Action Unit (AU) values. The control information can be obtained from multiple sources, including external driving videos and manual controls. Due to the interpretable nature of the driving signal, one can easily mix information from multiple sources (e.g. pose from one image and expression from another) and apply selective postproduction editing. The proposed face animator is implemented as a two-stage neural network model trained in a self-supervised manner on a large video collection. The proposed Interpretable and Controllable face reenactment network (ICface) is compared to state-of-the-art neural-network-based face animation techniques on multiple tasks. The results indicate that ICface produces better visual quality while being more versatile than most of the comparison methods. The introduced model could provide a lightweight and easy-to-use tool for a multitude of advanced image and video editing tasks. The program code will be publicly available upon the acceptance of the paper.
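
To make the interpretable driving signal concrete, the following is a minimal sketch of how such a control vector might be assembled and mixed between sources. This is not the authors' implementation: the FaceControls container, the mix_controls helper, the assumed 17 Action Units, and the value ranges are all illustrative assumptions; the abstract only states that the signal consists of head pose angles and AU values.

# A minimal sketch (not the authors' implementation) of an ICface-style
# interpretable control vector. FaceControls, mix_controls, and the number
# of Action Units are illustrative assumptions.

from dataclasses import dataclass

import numpy as np


@dataclass
class FaceControls:
    # Human-interpretable driving signal: head pose angles plus AU intensities.
    pose: np.ndarray  # shape (3,): yaw, pitch, roll, assumed normalized to [-1, 1]
    aus: np.ndarray   # shape (n_aus,): Action Unit activations, assumed in [0, 1]

    def to_vector(self) -> np.ndarray:
        # Flat conditioning vector that would be fed to the generator
        # together with the source identity image.
        return np.concatenate([self.pose, self.aus])


def mix_controls(pose_source: FaceControls, expr_source: FaceControls) -> FaceControls:
    # Take head pose from one driving source and expression (AUs) from another.
    # Because every dimension has a fixed meaning, mixing is selective copying.
    return FaceControls(pose=pose_source.pose.copy(), aus=expr_source.aus.copy())


if __name__ == "__main__":
    n_aus = 17  # assumed number of tracked Action Units

    frame_a = FaceControls(pose=np.array([0.2, -0.1, 0.0]), aus=np.zeros(n_aus))
    frame_b = FaceControls(pose=np.zeros(3), aus=np.random.uniform(0.0, 1.0, n_aus))

    # Pose from frame A, expression from frame B.
    mixed = mix_controls(frame_a, frame_b)
    print(mixed.to_vector().shape)  # (20,) = 3 pose angles + 17 AU values

The point of the sketch is that mixing is plain selective copying of named fields rather than arithmetic in an opaque latent space, which is what enables the selective postproduction editing described in the abstract.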