Metadata-Based Adaptive Assembling of Video Clips on the Web
Main Authors:
Format: Conference Proceeding
Language: English
Summary: Video content remixing - generic or personalised - is available on the Web in various forms. Most platforms target users who upload and rearrange content and share it with the community. Although several implementations exist, to the best of our knowledge no solution uses metadata to its full extent to, e.g., dynamically render a video stream. With the research presented in this paper, we propose a new approach to dynamic video assembly in which consumers describe the desired content using a set of domain-specific parameters. Based on the metadata with which the video clips are annotated, the system then selects clips fitting the user's criteria and aligns them in an aesthetically pleasing manner, while the user can also interactively influence content selection during playback. A fictitious showcase from the sports domain illustrates the applicability of our approach. The implementation is demonstrated using an available non-linear, interactive movie production environment.
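The record does not describe the paper's actual data model or selection algorithm, but the core idea in the summary - choosing annotated clips that satisfy user-supplied, domain-specific parameters - can be sketched as a simple metadata filter. Everything below (the clip fields, the `required`/`tags` criteria structure, and the tag-overlap ranking) is an illustrative assumption, not the authors' method:

```python
# Illustrative sketch only: clip metadata fields and the scoring rule
# are assumptions, not taken from the paper.

def select_clips(clips, criteria):
    """Return clips whose metadata satisfies every required criterion,
    ordered by how many optional tags they share with the request."""
    def matches(clip):
        # Hard constraints: every required key/value must be present.
        return all(clip.get(k) == v for k, v in criteria["required"].items())

    def affinity(clip):
        # Soft ranking: overlap between clip tags and requested tags.
        return len(set(clip.get("tags", [])) & set(criteria.get("tags", [])))

    return sorted((c for c in clips if matches(c)), key=affinity, reverse=True)

clips = [
    {"id": "c1", "sport": "soccer", "event": "goal", "tags": ["slow-motion"]},
    {"id": "c2", "sport": "soccer", "event": "foul", "tags": []},
    {"id": "c3", "sport": "soccer", "event": "goal", "tags": ["crowd", "replay"]},
]
chosen = select_clips(clips, {"required": {"sport": "soccer", "event": "goal"},
                              "tags": ["replay"]})
print([c["id"] for c in chosen])  # → ['c3', 'c1']
```

In this sketch the interactive influence the summary mentions would amount to re-running the selection with updated criteria during playback; the paper's actual alignment and rendering pipeline is not captured here.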
DOI: 10.1109/SMAP.2007.42