Conventional vision systems and algorithms assume that the camera has a single viewpoint. However, cameras need not maintain a single viewpoint. For instance, a misaligned imaging system can result in a non-single viewpoint. Imaging systems may also be designed to deliberately deviate from a single viewpoint in order to trade off image characteristics such as resolution and field of view. In these cases, the locus of viewpoints forms a surface called a caustic. In this project, we are interested in the analysis, implementation and calibration of non-single viewpoint imaging systems.

In our initial work in this area, we conducted an in-depth analysis of the caustics of catadioptric cameras with conic reflectors, and studied the properties of these caustics with respect to field of view and resolution. We then developed methods to calibrate conic catadioptric systems and estimate their caustics from known camera motion.
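To make the notion of a caustic concrete, the sketch below numerically estimates the caustic of a simple catadioptric configuration. It is an illustration, not the analysis from our work: it assumes a 2D spherical (circular) mirror viewed by a pinhole camera, a configuration known to have a non-single viewpoint. Each reflected ray is computed via the law of reflection, and caustic points are approximated as the intersections of neighboring reflected rays (the envelope of the reflected ray family). The function names and parameter values are hypothetical.

```python
import numpy as np

def reflect(d, n):
    """Reflect direction d about unit normal n (law of reflection)."""
    return d - 2.0 * np.dot(d, n) * n

def caustic_points(R=1.0, pinhole=(0.0, 3.0), n_rays=200):
    """Approximate the caustic of a circular mirror of radius R (centered at
    the origin) imaged by a pinhole camera, as the envelope of reflected rays.
    Returns an (n_rays - 1, 2) array of estimated caustic points."""
    p = np.asarray(pinhole, dtype=float)
    thetas = np.linspace(0.1, 1.0, n_rays)  # angular positions on the mirror
    pts, dirs = [], []
    for th in thetas:
        m = R * np.array([np.sin(th), np.cos(th)])  # point on the mirror
        n = m / np.linalg.norm(m)                   # outward unit normal
        d = m - p
        d /= np.linalg.norm(d)                      # incoming ray direction
        pts.append(m)
        dirs.append(reflect(d, n))                  # reflected ray direction
    caustic = []
    for i in range(n_rays - 1):
        # Intersect neighboring reflected rays: pts[i] + t0*dirs[i] = pts[i+1] + t1*dirs[i+1].
        A = np.column_stack([dirs[i], -dirs[i + 1]])
        b = pts[i + 1] - pts[i]
        t = np.linalg.solve(A, b)
        caustic.append(pts[i] + t[0] * dirs[i])
    return np.array(caustic)
```

Because the mirror is convex as seen from the pinhole, the reflected rays diverge and the recovered caustic is virtual, lying inside the mirror; the fact that the caustic points are spread out rather than coincident is exactly what makes the system non-single viewpoint.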

In the second phase of this project, we developed a general imaging model that can represent an arbitrary imaging system, whether or not it has a single viewpoint. Every imaging system performs a mapping from incoming scene rays to photosensitive elements on the image detector. This mapping can be conveniently described using a set of virtual sensing elements called raxels. Each raxel has geometric, radiometric and optical properties. We have developed a simple calibration method that uses structured light patterns to extract the raxel parameters of an arbitrary imaging system. Calibration experiments have been conducted for both perspective and non-perspective imaging systems.
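The geometric part of this idea can be sketched in a few lines. The snippet below is a simplified illustration, not our exact implementation: it assumes that structured light patterns have already been decoded at two known plane positions, giving one 3D point per pixel on each plane; the ray of each raxel is then the line through its two points. The names `Raxel` and `raxels_from_planes` are hypothetical, and the radiometric and optical parameters of a raxel are omitted.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Raxel:
    point: np.ndarray      # a point on the scene ray (on the near plane)
    direction: np.ndarray  # unit direction of the scene ray
    # radiometric and optical parameters omitted in this sketch

def raxels_from_planes(pts_near, pts_far):
    """Estimate one raxel per pixel from decoded structured-light
    correspondences on two known plane positions.
    pts_near, pts_far: (N, 3) arrays of 3D points, one pair per pixel."""
    d = pts_far - pts_near
    d = d / np.linalg.norm(d, axis=1, keepdims=True)
    return [Raxel(p, u) for p, u in zip(pts_near, d)]
```

For a perspective camera, all recovered rays pass through one point (the single viewpoint); for a non-single viewpoint system, they do not, and their envelope is the caustic.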