Modeling the timing of cuts in automatic editing of concert videos
Research output: Contribution to journal › Article › Scientific › peer-review
Number of pages: 25
Journal: Multimedia Tools and Applications
Publication status: Published - 20 Feb 2016
Publication type: A1 Journal article-refereed
An increasing amount of video content is being recorded by people at public events. However, editing such videos can be challenging for the average user. We describe an approach for modeling the shot cut timing of professionally edited concert videos. We analyze the temporal positions of cuts in relation to the music meter grid and form Markov chain models from the observed switching patterns and their occurrence frequencies. The stochastic Markov chain models are combined with audio change point analysis and cut deviation models to automatically generate temporal editing cues for unedited concert video recordings. Videos edited according to the model are compared in a user study against a baseline automatic editing method as well as against videos edited by hand. The study results show that users prefer the cut timing from the proposed system over the baseline by a clear margin, whereas a much smaller difference is observed in the preference for hand-made videos over the proposed method.
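The core idea of the abstract — learning a Markov chain over cut intervals measured on the music meter grid, then sampling new cut positions from it — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the interval representation (cut-to-cut gaps in beats), the function names, and the training data are all assumptions made here for clarity.

```python
import random
from collections import defaultdict

def train_markov_chain(interval_sequences):
    """Estimate transition probabilities between successive cut intervals
    (measured in beats) from example sequences taken from edited videos.
    NOTE: a simplified stand-in for the models described in the paper."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in interval_sequences:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1
    chain = {}
    for prev, nxts in counts.items():
        total = sum(nxts.values())
        chain[prev] = {nxt: c / total for nxt, c in nxts.items()}
    return chain

def sample_cut_beats(chain, start_interval, n_cuts, rng=None):
    """Generate cut positions (beat indices) by a random walk on the chain."""
    rng = rng or random.Random(0)
    beat, interval = 0, start_interval
    cuts = []
    for _ in range(n_cuts):
        beat += interval
        cuts.append(beat)
        probs = chain.get(interval)
        if not probs:          # no observed follow-up for this interval
            break
        nxts, weights = zip(*probs.items())
        interval = rng.choices(nxts, weights=weights)[0]
    return cuts

# Hypothetical training data: per-video lists of cut intervals in beats.
chain = train_markov_chain([[4, 4, 8, 4], [8, 4, 4, 8]])
cuts = sample_cut_beats(chain, start_interval=4, n_cuts=6)
```

In the full system these sampled cut positions would additionally be adjusted by the audio change point analysis and the cut deviation models mentioned above; those steps are omitted here.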
- Automatic video editing, Cut timing, Example-based modeling, Live music content analysis