Conventional vision systems are designed to perform in clear weather. Needless to say, in any outdoor application there is no escape from "bad" weather. Images taken in poor weather conditions suffer from severe color and contrast degradation. Furthermore, this degradation worsens exponentially with distance, making it impossible to acquire meaningful images of scenes that are far from the imaging system. Thus, computer vision systems must include mechanisms that enable them to function (even if somewhat less reliably) in the presence of haze, fog, rain, hail and snow. In this project, we are studying the visual manifestations of different weather conditions. For this, we draw on what is already known about atmospheric optics, and identify effects caused by bad weather that can be turned to our advantage. Since the atmosphere modulates the information carried from a scene point to the observer, it can be viewed as a mechanism of visual information coding. We exploit two fundamental scattering models, attenuation and airlight, to describe the colors, contrasts and polarizations of scene points observed through bad weather. Then, we use these models to develop methods for recovering pertinent scene properties, such as three-dimensional structure, from one or two images taken under poor weather conditions.
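To make the two scattering models concrete, here is a minimal sketch of the standard formulation from atmospheric optics: direct transmission decays as exp(-beta*d) (attenuation), while airlight grows toward the horizon brightness as (1 - exp(-beta*d)). The symbols (scene radiance E0, scattering coefficient beta, horizon airlight A_inf) and the function names are illustrative, not taken from the project's own code; the depth-inversion step simply inverts this model and assumes E0, beta and A_inf are known.

```python
import math

def observed_radiance(scene_radiance, depth, beta, airlight_inf):
    """Observed radiance of a scene point seen through a scattering medium.

    Attenuation model:  E0 * exp(-beta * d)        (direct transmission)
    Airlight model:     A_inf * (1 - exp(-beta*d)) (light scattered into the path)
    All parameter names are illustrative placeholders.
    """
    t = math.exp(-beta * depth)  # fraction of light surviving the path
    return scene_radiance * t + airlight_inf * (1.0 - t)

def depth_from_observation(observed, scene_radiance, beta, airlight_inf):
    """Invert the model above to recover depth from a single observation,
    assuming the clear-day radiance and atmospheric parameters are known."""
    t = (observed - airlight_inf) / (scene_radiance - airlight_inf)
    return -math.log(t) / beta
```

At depth zero the model returns the clear-day radiance, and as depth grows the observation tends to the airlight value, which is the exponential degradation with distance described above; the inversion recovers depth up to the accuracy of the assumed parameters.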