I. Introduction
We have previously proposed a natural 3D display system based on the reconstruction of parallax rays [1]–[4]. Its notable feature is the ability to display natural 3D images, as in holography, that are visible to multiple viewers at the same time without the need for special glasses. In this display system, 3D images are formed by the intersection of discrete reconstructed parallax rays emitted from the display screen. To display a 3D image of a real object by applying the principle of ray-space representation [5], which is one type of Image-Based Rendering technique [5]–[7], we have also proposed a camera system that captures the light-ray data needed to reconstruct 3D images from images taken at multiple viewpoints [8]. However, because the cameras in that system are arranged along a straight line, ray information is acquired from only one side of the object.

To reconstruct a 3D image of a real object that can be observed from viewpoints all around it, images captured from multiple viewpoints surrounding the object are required. We therefore propose a circular camera system that captures the light-ray data of a real 3D object from multiple viewpoints arranged on a circle around it. However, because a large number of images is needed to reconstruct the 3D image, such a circular camera system is difficult to implement in practice owing to the physical size and cost of the cameras. To reduce the number of required images, we also propose an interpolation algorithm. This paper describes the interpolation algorithm of the circular camera system for the omnidirectional 3D display.
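As a rough illustration of the circular viewpoint arrangement described above, the following Python sketch places a set of hypothetical cameras evenly on a circle around the object and computes each camera's position and viewing direction toward the object center. The function name and parameters (number of cameras, radius, center) are illustrative assumptions and do not correspond to the actual capture hardware described later in the paper.

```python
import numpy as np

def circular_viewpoints(num_cameras=36, radius=1.0, center=(0.0, 0.0, 0.0)):
    """Place hypothetical cameras evenly on a circle around an object.

    Returns a list of (position, direction) pairs, where each camera
    looks toward the object center. All names and values here are
    illustrative assumptions, not the parameters of the real system.
    """
    cx, cy, cz = center
    viewpoints = []
    for k in range(num_cameras):
        theta = 2.0 * np.pi * k / num_cameras       # angular position on the circle
        position = np.array([cx + radius * np.cos(theta),
                             cy + radius * np.sin(theta),
                             cz])                    # cameras lie in the z = cz plane
        direction = np.array([cx, cy, cz]) - position
        direction /= np.linalg.norm(direction)       # unit vector toward the object
        viewpoints.append((position, direction))
    return viewpoints

# Example: 36 viewpoints spaced every 10 degrees around the object
cams = circular_viewpoints(num_cameras=36, radius=1.0)
```

In such an arrangement, increasing the number of viewpoints improves the angular density of the captured rays but directly increases the number of physical cameras, which is the practical limitation the proposed interpolation algorithm is intended to relax.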