Tampere University of Technology

TUTCRIS Research Portal

Depth Assisted Composition of Synthetic and Real 3D Scenes

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Standard

Depth Assisted Composition of Synthetic and Real 3D Scenes. / Cortes, Santiago; Suominen, Olli; Gotchev, Atanas.

Proceedings of the IS&T International Symposium on Electronic Imaging. 2016.


Harvard

Cortes, S, Suominen, O & Gotchev, A 2016, Depth Assisted Composition of Synthetic and Real 3D Scenes. in Proceedings of the IS&T International Symposium on Electronic Imaging. https://doi.org/10.2352/ISSN.2470-1173.2016.21.3DIPM-399

APA

Cortes, S., Suominen, O., & Gotchev, A. (2016). Depth Assisted Composition of Synthetic and Real 3D Scenes. In Proceedings of the IS&T International Symposium on Electronic Imaging. https://doi.org/10.2352/ISSN.2470-1173.2016.21.3DIPM-399

Vancouver

Cortes S, Suominen O, Gotchev A. Depth Assisted Composition of Synthetic and Real 3D Scenes. In Proceedings of the IS&T International Symposium on Electronic Imaging. 2016. https://doi.org/10.2352/ISSN.2470-1173.2016.21.3DIPM-399

Author

Cortes, Santiago ; Suominen, Olli ; Gotchev, Atanas. / Depth Assisted Composition of Synthetic and Real 3D Scenes. Proceedings of the IS&T International Symposium on Electronic Imaging. 2016.

BibTeX

@inproceedings{3b355510862841338e11a17c9c6e71fb,
title = "Depth Assisted Composition of Synthetic and Real 3D Scenes",
abstract = "In media production, previsualization is an important step. It allows the director and the production crew to see an estimate of the final product during the filmmaking process. This work focuses on a previsualization system for composite shots involving real and virtual content. The system visualizes a correct perspective view of how the real objects in front of the camera operator would look when placed in a virtual space. The aim is to simplify the workflow, reduce production time and allow more direct control of the end result. The real scene is shot with a time-of-flight depth camera, whose pose is tracked using a motion capture system. Depth-based segmentation is applied to remove the background and content outside the desired volume, the geometry is aligned with a stream from an RGB color camera, and a dynamic point cloud of the remaining real scene contents is created. The virtual objects are then also transformed into the coordinate space of the tracked camera, and the resulting composite view is rendered accordingly. The prototype camera system is implemented as a self-contained unit with local processing; it runs at 15 fps and produces a 1024x768 image.",
author = "Santiago Cortes and Olli Suominen and Atanas Gotchev",
year = "2016",
month = "2",
doi = "10.2352/ISSN.2470-1173.2016.21.3DIPM-399",
language = "English",
booktitle = "Proceedings of the IS&T International Symposium on Electronic Imaging",

}
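
The core pipeline stated in the abstract (keep only depth pixels inside a capture volume, back-project them into a point cloud, then move the cloud into the tracked camera's coordinate frame) can be sketched as below. This is an illustrative sketch only, not the paper's implementation; the intrinsics (`fx`, `fy`, `cx`, `cy`), volume thresholds, and pose values are assumed placeholder numbers.

```python
import numpy as np

def depth_segment(depth, near=0.5, far=3.0):
    """Keep only pixels whose depth lies inside the desired capture volume."""
    return (depth > near) & (depth < far)

def backproject(depth, mask, fx, fy, cx, cy):
    """Back-project masked depth pixels into a 3D point cloud (camera frame)."""
    v, u = np.nonzero(mask)          # pixel coordinates of kept depth samples
    z = depth[v, u]
    x = (u - cx) * z / fx            # pinhole model: x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=1)

def to_world(points, R, t):
    """Apply the tracked camera pose (rotation R, translation t) to the cloud."""
    return points @ R.T + t

# Illustrative example: a synthetic 4x4 depth frame with a far background
# and one foreground object sitting inside the capture volume.
depth = np.full((4, 4), 10.0)        # background, outside the volume
depth[1:3, 1:3] = 1.5                # foreground object, inside the volume
mask = depth_segment(depth)
cloud = backproject(depth, mask, fx=2.0, fy=2.0, cx=2.0, cy=2.0)
world = to_world(cloud, np.eye(3), np.zeros(3))  # identity pose for the demo
```

In the described system the same transform would also be applied (inverted) to the virtual objects, so that real and synthetic geometry share one coordinate space before the composite view is rendered.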

RIS (suitable for import to EndNote)

TY - GEN

T1 - Depth Assisted Composition of Synthetic and Real 3D Scenes

AU - Cortes, Santiago

AU - Suominen, Olli

AU - Gotchev, Atanas

PY - 2016/2

Y1 - 2016/2

N2 - In media production, previsualization is an important step. It allows the director and the production crew to see an estimate of the final product during the filmmaking process. This work focuses on a previsualization system for composite shots involving real and virtual content. The system visualizes a correct perspective view of how the real objects in front of the camera operator would look when placed in a virtual space. The aim is to simplify the workflow, reduce production time and allow more direct control of the end result. The real scene is shot with a time-of-flight depth camera, whose pose is tracked using a motion capture system. Depth-based segmentation is applied to remove the background and content outside the desired volume, the geometry is aligned with a stream from an RGB color camera, and a dynamic point cloud of the remaining real scene contents is created. The virtual objects are then also transformed into the coordinate space of the tracked camera, and the resulting composite view is rendered accordingly. The prototype camera system is implemented as a self-contained unit with local processing; it runs at 15 fps and produces a 1024x768 image.

AB - In media production, previsualization is an important step. It allows the director and the production crew to see an estimate of the final product during the filmmaking process. This work focuses on a previsualization system for composite shots involving real and virtual content. The system visualizes a correct perspective view of how the real objects in front of the camera operator would look when placed in a virtual space. The aim is to simplify the workflow, reduce production time and allow more direct control of the end result. The real scene is shot with a time-of-flight depth camera, whose pose is tracked using a motion capture system. Depth-based segmentation is applied to remove the background and content outside the desired volume, the geometry is aligned with a stream from an RGB color camera, and a dynamic point cloud of the remaining real scene contents is created. The virtual objects are then also transformed into the coordinate space of the tracked camera, and the resulting composite view is rendered accordingly. The prototype camera system is implemented as a self-contained unit with local processing; it runs at 15 fps and produces a 1024x768 image.

U2 - 10.2352/ISSN.2470-1173.2016.21.3DIPM-399

DO - 10.2352/ISSN.2470-1173.2016.21.3DIPM-399

M3 - Conference contribution

BT - Proceedings of the IS&T International Symposium on Electronic Imaging

ER -