Definition: Technique of projecting images on television and computer monitor screens as two parts, or fields, in such a way that the scan lines of one alternate (i.e. interlace) with those of the other.
* Improves perceived image quality without requiring more data or bandwidth. In particular, it reduces flicker: a TV showing a movie at 25 non-interlaced frames per second (fps) will appear to flicker, but if the same material is presented as 50 interlaced fields per second (the overall frame rate remains 25 fps), flicker is reduced to acceptable levels provided the image is not static.
* Cannot be used on LCD, plasma or other displays that do not build the image from scan lines.
* Invented in the 1930s at RCA.
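
The field-splitting idea above can be sketched in a few lines of Python. This is an illustrative model only (the function names are hypothetical, not from any video API): a frame is treated as a list of scan lines, the even- and odd-numbered lines form the two fields, and "weaving" the fields back together recovers the full frame.

```python
def split_fields(frame):
    """Split a frame (a list of scan lines) into even and odd fields."""
    even_field = frame[0::2]  # scan lines 0, 2, 4, ...
    odd_field = frame[1::2]   # scan lines 1, 3, 5, ...
    return even_field, odd_field

def weave(even_field, odd_field):
    """Recombine two fields into one full frame (a simple 'weave')."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)
        frame.append(odd_line)
    return frame

frame = [f"line {i}" for i in range(6)]
even, odd = split_fields(frame)
# Displaying even and odd fields alternately doubles the refresh rate
# seen by the eye while transmitting the same total number of lines.
assert weave(even, odd) == frame
```

Note that weaving only reconstructs the original frame cleanly when the scene is static; with motion, the two fields sample different instants, which is why interlacing artifacts appear on moving images.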

Related Terms: field

