Abstract
Illumination consistency is important for photorealistic rendering in mixed reality. However, it is usually difficult to acquire the illumination conditions of natural environments. In this paper, we propose a novel method for estimating the lighting conditions of a static outdoor scene without knowledge of its geometry, material, or texture. Our method separates the shading effects of sunlight and skylight by learning from a set of sample images captured under the same sun position. A fixed illumination map of the scene under sunlight or skylight is then derived, reflecting the scene geometry, surface material properties, and shadowing effects. These maps, one for sunlight and the other for skylight, are referred to as the basis images of the scene for the specified sun position. We show that the illumination of the same scene under different weather conditions can be approximated as a linear combination of the two basis images. We further extend this model to estimate the lighting conditions of scene images captured under deviated sun positions, enabling virtual objects to be seamlessly integrated into images of the scene at any time. Our approach can be applied to online video processing and handles both cloudy and sunny conditions. Experimental results verify the effectiveness of our approach.
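The core model in the abstract can be sketched numerically: given per-pixel basis images for sunlight and skylight, an observed image of the same scene is approximated as their linear combination, and the two weights can be recovered by least squares. This is a minimal illustrative sketch, assuming grayscale images and a simple least-squares fit; the function names and fitting step are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def fit_weights(image, basis_sun, basis_sky):
    """Solve image ~= a * basis_sun + b * basis_sky for scalars (a, b).

    Assumption: a plain least-squares fit over all pixels; the paper's
    actual estimation procedure may differ.
    """
    # Stack the two flattened basis images as columns of the design matrix.
    A = np.stack([basis_sun.ravel(), basis_sky.ravel()], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, image.ravel(), rcond=None)
    return coeffs  # array([a, b])

def relight(basis_sun, basis_sky, a, b):
    """Synthesize the scene under new sunlight/skylight weights."""
    return a * basis_sun + b * basis_sky

# Toy demo with synthetic 4x4 "basis images".
rng = np.random.default_rng(0)
sun = rng.random((4, 4))
sky = rng.random((4, 4))
observed = 0.7 * sun + 0.3 * sky   # scene under some weather condition
a, b = fit_weights(observed, sun, sky)
```

Because the toy observation is an exact linear combination, the fit recovers the weights (0.7, 0.3); on real images the residual would absorb noise and model error.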
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 637-646 |
| Number of pages | 10 |
| Journal | Visual Computer |
| Volume | 25 |
| Issue number | 5-7 |
| DOIs | |
| State | Published - May 2009 |
| Externally published | Yes |
All Science Journal Classification (ASJC) codes
- Software
- Computer Vision and Pattern Recognition
- Computer Graphics and Computer-Aided Design
Keywords
- Augmented reality
- Light source recovery
- Outdoor scenes