Intensity images are of limited use for estimating surfaces: pixel values are related to surface geometry only indirectly. Range images, in contrast, encode the position of surface points directly, so shape can be recovered comparatively easily. Range images are a special class of digital images in which each pixel expresses the distance between a known reference frame and a visible point in the scene; a range image therefore reproduces the 3D structure of a scene. Range images are also referred to as depth images, depth maps, xyz maps, surface profiles and 2.5D images.
Range images can be represented in two basic forms. One is a list of 3D coordinates in a given reference frame (a cloud of points), for which no specific order is required. The other is a matrix of depth values of points along the directions of the x, y image axes, which makes the spatial organisation explicit.
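The two forms are easily related: each pixel of a depth matrix defines a viewing ray, and back-projecting along it yields an unordered point list. A minimal sketch, assuming a simple pinhole camera model with illustrative focal lengths (fx, fy) and principal point (cx, cy), none of which come from any particular sensor:

```python
import numpy as np

def depth_map_to_point_cloud(depth, fx, fy, cx, cy):
    """Turn a depth matrix into a cloud of 3D points.

    Assumes a pinhole camera: pixel (u, v) with depth z back-projects to
    x = (u - cx) * z / fx, y = (v - cy) * z / fy. The parameters here are
    hypothetical, for illustration only.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx   # back-project each pixel along its viewing ray
    y = (v - cy) * depth / fy
    # Stack into an N x 3 list of coordinates; the matrix ordering is
    # discarded, which is exactly the "cloud of points" form.
    return np.column_stack((x.ravel(), y.ravel(), depth.ravel()))

# Example: a 2x2 depth map with all points 1 unit away
cloud = depth_map_to_point_cloud(np.ones((2, 2)),
                                 fx=500.0, fy=500.0, cx=0.5, cy=0.5)
```

Going the other way (cloud to matrix) requires projecting each point back into the image, and is only lossless if every pixel receives exactly one point.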
Range images are acquired with range sensors; in computer vision, optical range sensors are normally used. We can distinguish between active and passive range sensors. Active range sensors project energy (e.g. light) onto the scene and detect its return to measure range, or exploit the effect of controlled changes of some sensor parameters (e.g. focus). Passive range sensors, on the other hand, rely only on intensity images to reconstruct depth.
Active range sensors exploit a variety of physical principles. The most common techniques are triangulation, radar/sonar, moiré interferometry and active focusing/defocusing.

Triangulation uses a light projector and an intensity camera placed at a certain distance from the projector. The projector emits a light pattern, most commonly a plane of light or a single beam. We shall use a projected plane for illustration: its intersection with the scene surface is a planar curve, called the stripe, which is observed by the camera. Triangulation then yields the depth of the surface points under the stripe.

Radar/sonar sensors emit a short electromagnetic or acoustic wave and detect the return (echo) reflected from surrounding surfaces. Distance is obtained as a function of the time taken by the wave to hit a surface and come back.

Moiré sensors project two gratings with regularly spaced patterns onto the surface and measure the phase differences of the observed interference pattern; other phase-difference sensors measure the phase shift of the returned beam. In both cases distance is a function of the phase difference.

Active focusing/defocusing sensors acquire two or more images of the same scene under varying focus settings. Once the best-focused image is determined, a model linking focus values and distance yields the depth.
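Two of these principles reduce to very short computations. For plane-of-light triangulation, the observed stripe pixel defines a viewing ray through the camera centre, and intersecting that ray with the known light plane gives the 3D point. For radar/sonar, distance follows from the round-trip time at the known propagation speed. A sketch under assumed calibration (hypothetical pinhole intrinsics, and a light plane given as n · X = d in the camera frame):

```python
import numpy as np

def triangulate_stripe_point(u, v, fx, fy, cx, cy, plane_n, plane_d):
    """3D point of a stripe pixel by plane-of-light triangulation.

    The pixel (u, v) defines a viewing ray from the camera centre; the
    point is where that ray meets the calibrated plane n . X = d.
    Intrinsics and plane are illustrative assumptions.
    """
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])  # viewing ray direction
    t = plane_d / np.dot(plane_n, ray)                   # ray parameter at the plane
    return t * ray                                       # 3D surface point

def time_of_flight_distance(round_trip_time, wave_speed):
    """Radar/sonar range: the wave travels out and back, hence the half."""
    return 0.5 * wave_speed * round_trip_time

# Example: light plane z = 2 in front of the camera; the principal ray
# (pixel at the principal point) meets it at (0, 0, 2).
p = triangulate_stripe_point(320, 240, 500.0, 500.0, 320.0, 240.0,
                             np.array([0.0, 0.0, 1.0]), 2.0)
```

In practice the plane parameters change as the projector sweeps the scene, so each camera frame is triangulated against the plane pose for that instant.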