analogue-to-digital converter

Definition: The process of converting a continuously varying (analogue) signal into a set of discrete or digital values.
* For conversion to take place: (a) the source signal must be sampled at regular intervals - the sampling rate or interval: the higher the rate, or the finer the interval, the more accurately the digital record can represent the analogue original; (b) the amplitude or scale of the analogue signal must be represented by a binary code - the bit depth: the longer the code, or the greater the bit depth, the more precisely the values can be represented.
* Analogue-to-digital conversion can take place largely in hardware, e.g. DAT recorders, or in a mixture of hardware and software, e.g. scanners.
* Also known as ADC.
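The two steps above (regular sampling, then quantization to a binary code) can be sketched in a few lines of Python. This is an illustrative simulation only, not a description of any particular converter; the function name `adc_sample` and the choice of a signed integer code are assumptions for the example.

```python
import math

def adc_sample(signal, duration_s, sample_rate_hz, bit_depth):
    """Sample an analogue signal (a function of time, returning values
    in [-1, 1]) at regular intervals and quantise each sample to a
    signed integer code of the given bit depth (an assumed encoding)."""
    levels = 2 ** bit_depth                 # number of quantization levels
    n_samples = int(duration_s * sample_rate_hz)
    codes = []
    for n in range(n_samples):
        t = n / sample_rate_hz              # regular sampling interval
        v = signal(t)                       # instantaneous analogue value
        code = round(v * (levels / 2 - 1))  # map amplitude to a binary code
        codes.append(code)
    return codes

# A 1 kHz sine sampled at 8 kHz with a 3-bit depth: only 8 levels are
# available, so the digital record is a coarse approximation.
codes = adc_sample(lambda t: math.sin(2 * math.pi * 1000 * t), 0.001, 8000, 3)
print(codes)  # → [0, 2, 3, 2, 0, -2, -3, -2]
```

Raising the sample rate captures the waveform's shape more faithfully; raising the bit depth reduces the rounding (quantization) error of each sample.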

Related Terms: aliasing, DAC

