Definition: Process of dividing or sampling a continuously varying signal or quantity into a series of discrete levels or quanta for the purpose of creating an accurate copy or facsimile of the signal.
* Equivalently, it is the process of reducing the number of bits needed to describe a variable by reducing the precision with which the variable is stored.
* It is the crucial step in creating a digital copy of an image.
* The frequency at which the signal is sampled before quantisation should be at least double that of the highest frequency in the signal: if not, the reconstruction of the original signal is likely to contain artefacts, or aliasing.
* Note: the quantisation may be linear -- i.e. equal steps in the source are matched to equal steps in the quantised data -- or non-linear -- i.e. more steps in the quantised data are allocated to some parts of the source signal than to others, e.g. more quanta are assigned to low levels (shadows) than to high levels.
Quantum zones You might never have thought of it, but the Good Ole Zone System is a quantisation of continuous tonal or luminance variation into ten rather widely spaced zones. It is just as saying someone is 'x years old' quantises the continuous variable of a person's exact age into the quanta of whole years.
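The linear and non-linear cases in the note above can be sketched in a few lines of Python. This is a minimal illustration, not taken from the source: the function names `quantise_linear` and `quantise_gamma`, the 8-level example, and the gamma value of 2.2 are all illustrative assumptions.

```python
def quantise_linear(x, levels):
    """Snap x in [0.0, 1.0] to the nearest of `levels` equally spaced values.

    Equal steps in the source map to equal steps in the quantised data.
    """
    step = 1.0 / (levels - 1)
    return round(x / step) * step


def quantise_gamma(x, levels, gamma=2.2):
    """Non-linear quantisation: quantise in a gamma-companded domain.

    Equal steps in the companded domain correspond to finer steps at
    low (shadow) values of x, so more quanta are assigned to shadows.
    The gamma of 2.2 is an illustrative choice.
    """
    y = x ** (1.0 / gamma)            # compand: stretch the shadows
    y_quantised = quantise_linear(y, levels)
    return y_quantised ** gamma       # map back to the linear domain


# With 8 levels, a mid-tone value snaps to the nearest multiple of 1/7.
print(quantise_linear(0.37, 8))
print(quantise_gamma(0.05, 8))
```

The gamma-companded version mirrors how image encodings typically spend their limited code values: shadow detail is preserved at the cost of coarser steps in the highlights.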
Related Terms: alias