Colour correction in theory and practice

Hordley, S. D., Paul, J. F. and Finlayson, G. D. (2005) Colour correction in theory and practice. In: 5th IASTED International Conference on Visualization, Imaging and Image Processing, 2005-09-01.



Most digital cameras transform their raw RGB values into a device-independent colour space such as CIE XYZ space so that images can be accurately reproduced on a display device. This transform is derived based on prior knowledge about the typical colour stimuli the device will encounter. One way to obtain this knowledge is by calibration: a transform is derived from measurements of a set of reflectance functions imaged under a known illuminant. Alternatively, the transform can be derived by making theoretical assumptions about the statistical distribution of colour stimuli. In this paper we propose a correction procedure which is a compromise between these two approaches. We posit that accurate correction is best achieved by making use of prior knowledge about both surfaces and illuminants. We argue that our prior knowledge of scene illuminants is best modelled by a theoretical approach which only constrains illuminants to be physically realisable, but that accurate knowledge of surface reflectance can be obtained by calibration. We then show how these two independent models can be combined to form a model of colour stimuli, and from this, an appropriate colour correction transform. Finally, we present an empirical analysis which shows that our compromise solution affords very good colour correction and performs slightly better than previous approaches.
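The calibration route the abstract mentions is commonly realised as a least-squares fit of a 3×3 matrix mapping camera RGB to CIE XYZ over the measured samples. The sketch below illustrates that idea only; the ground-truth matrix, sample count, and noise level are hypothetical stand-ins, not data or methods from the paper.

```python
import numpy as np

# Hypothetical calibration data: camera RGB responses and measured CIE XYZ
# values for a set of reflectance samples imaged under a known illuminant.
# Shapes: (n_samples, 3). All numbers here are illustrative, not real data.
rng = np.random.default_rng(0)
true_M = np.array([[0.41, 0.36, 0.18],
                   [0.21, 0.72, 0.07],
                   [0.02, 0.12, 0.95]])
rgb = rng.uniform(0.0, 1.0, size=(24, 3))
xyz = rgb @ true_M.T + rng.normal(0.0, 0.005, size=(24, 3))  # noisy readings

# Least-squares colour correction: find the 3x3 matrix M minimising
# ||rgb @ M.T - xyz||^2 over the calibration set.
X, residuals, rank, _ = np.linalg.lstsq(rgb, xyz, rcond=None)
M = X.T  # so that xyz_est = M @ rgb_vector

# Apply the derived transform to a new camera response.
xyz_est = M @ np.array([0.5, 0.4, 0.3])
```

The paper's contribution is in how the calibration set is modelled jointly with a physically constrained illuminant model before a transform like `M` is derived, rather than fitting directly to raw measurements as above.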

Item Type: Conference or Workshop Item (Paper)
Faculty \ School: Faculty of Science > School of Computing Sciences
UEA Research Groups: Faculty of Science > Research Groups > Interactive Graphics and Audio
Faculty of Science > Research Groups > Colour and Imaging Lab
Depositing User: Vishal Gautam
Date Deposited: 20 Jul 2011 12:43
Last Modified: 22 Apr 2023 02:45
