Light source estimation of outdoor scenes for mixed reality
Published in: | The Visual Computer, 2009-05, Vol. 25 (5-7), p. 637-646 |
Format: | Article |
Language: | English |
Summary: | Illumination consistency is important for photorealistic rendering in mixed reality. However, it is usually difficult to acquire the illumination conditions of natural environments. In this paper, we propose a novel method for estimating the light conditions of a static outdoor scene without knowing its geometry, material, or texture. In our method, we separate the shading effects of the scene due to sunlight and skylight, respectively, by learning from a set of sample images captured with the same sun position. A fixed illumination map of the scene under sunlight or skylight is then derived, reflecting the scene geometry, surface material properties, and shadowing effects. These maps, one for sunlight and the other for skylight, are therefore referred to as basis images of the scene for the specified sun position. We show that the illumination of the same scene under different weather conditions can be approximated as a linear combination of the two basis images. We further extend this model to estimate the lighting conditions of scene images under deviated sun positions, enabling virtual objects to be seamlessly integrated into images of the scene at any time. Our approach can be applied to online video processing and handles both cloudy and sunny conditions. Experimental results verify the effectiveness of our approach. |
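The core model described in the abstract is that an image of the static scene under a fixed sun position is approximated as a linear combination of two basis images, one for sunlight and one for skylight, so estimating the lighting of a new frame reduces to fitting two scalar coefficients. The sketch below is only an illustration of that linear model, not the authors' implementation; the function names and the plain per-pixel least-squares fit are assumptions.

```python
import numpy as np

def fit_illumination_coeffs(frame, basis_sun, basis_sky):
    """Estimate sunlight/skylight mixing coefficients for one frame.

    frame, basis_sun, basis_sky: float arrays of the same shape (e.g. H x W
    or H x W x 3) for the same static scene, aligned pixel-wise.
    Returns (a, b) such that frame ≈ a * basis_sun + b * basis_sky
    in the least-squares sense.
    """
    # Hypothetical per-pixel linear model: each observed pixel is a weighted
    # sum of the corresponding sunlight and skylight basis pixels.
    A = np.stack([basis_sun.ravel(), basis_sky.ravel()], axis=1)  # N x 2 design matrix
    y = frame.ravel()
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(coeffs[0]), float(coeffs[1])

def relight(basis_sun, basis_sky, a, b):
    """Reconstruct the scene illumination under the estimated mixture."""
    return a * basis_sun + b * basis_sky
```

With the two coefficients in hand, the same mixture could in principle be applied when shading a virtual object so that it matches the weather conditions of the current frame; handling deviated sun positions, as the paper does, would additionally require adapting the basis images themselves.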
ISSN: | 0178-2789 (print); 1432-2315 (electronic) |
DOI: | 10.1007/s00371-009-0342-4 |