Osafumi NAKAYAMA Morito SHIOHARA Shigeru SASAKI Tomonobu TAKASHIMA Daisuke UENO
From dusk to dark, when it is difficult for drivers to see other vehicles, or when visibility is poor due to rain, snow, and the like, the contrast between nearby vehicles and the background is low. Under such conditions, conventional surveillance systems have difficulty detecting the outlines of nearby vehicles and may thus fail to recognize them. To solve this problem, we have developed a rear- and side-surveillance system for vehicles that uses image processing. The system uses two stereo cameras to monitor the areas to the rear and sides of a vehicle, i.e., the driver's blind spots, and to detect the positions and relative speeds of other vehicles. The proposed system can estimate the shape of a vehicle from a partial outline, identifying the vehicle by filling in the missing parts of its outline. Testing under various environmental conditions showed that the error rate (false and missed detections) in detecting approaching vehicles was reduced to less than 10%, even under conditions that are problematic for conventional processing.
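The abstract does not give implementation details, but the way a stereo pair yields the position and relative speed of another vehicle can be illustrated with the standard pinhole-stereo relation. A minimal sketch follows; the focal length and baseline values are hypothetical, not taken from the paper:

```python
def vehicle_distance(disparity_px, focal_length_px=700.0, baseline_m=0.3):
    """Distance from stereo disparity via the pinhole model:
    depth = focal_length * baseline / disparity.
    focal_length_px and baseline_m are illustrative values only."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

def relative_speed(dist_t1_m, dist_t2_m, dt_s):
    """Relative speed from two range measurements dt_s apart
    (positive means the other vehicle is approaching)."""
    return (dist_t1_m - dist_t2_m) / dt_s

# A vehicle at 21 px disparity is 10 m away; closing 2 m in 0.2 s
# corresponds to a relative speed of 10 m/s.
d1 = vehicle_distance(21.0)
d2 = vehicle_distance(26.25)
speed = relative_speed(d1, d2, 0.2)
```

Tracking disparity over successive frames in this way gives both quantities the abstract mentions (position and relative speed) from the same measurement.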
Takahiro AOKI Osafumi NAKAYAMA Morito SHIOHARA Shigeru SASAKI Yoshishige MURAKAMI
We have developed an airport monitoring system that traces the movement of airplanes in the parking areas of airports. For this system, we developed an image-processing method, two-stage normalized background subtraction, that can detect moving objects and determine their sizes under the illumination changes that are inevitable in outdoor monitoring. The method consists of local and global normalized subtraction stages. With it, airplanes can be detected stably even when the brightness of individual pixels varies with atmospheric phenomena such as cloud shadows. False detections caused by boarding bridges are eliminated by exploiting differences in motion between an airplane and a boarding bridge, such as the direction of movement. We evaluated the method using 140 hours of video covering a variety of conditions, including cloud shadows, lights turning on and off, night, and rainfall, and confirmed a 95% accuracy of airplane detection. The system is now in operation at Kansai International Airport and is performing satisfactorily.
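The paper's exact formulation is not given in the abstract, but the idea of normalized background subtraction — cancelling a gain/offset illumination change by normalizing both images before differencing, applied once globally and once per local block — can be sketched as follows. The block size and threshold are hypothetical parameters for illustration:

```python
import numpy as np

def normalized_subtraction(frame, background, eps=1e-6):
    """Subtract after normalizing both images to zero mean / unit std,
    so a linear illumination change (gain and offset) cancels out."""
    f = (frame - frame.mean()) / (frame.std() + eps)
    b = (background - background.mean()) / (background.std() + eps)
    return np.abs(f - b)

def two_stage_detection(frame, background, block=16, thresh=1.0):
    """Global stage over the whole image, then a local stage per block,
    so spatially varying changes (e.g. a cloud shadow covering part of
    the scene) are also compensated.  block/thresh are illustrative."""
    h, w = frame.shape
    mask = np.zeros((h, w), dtype=bool)
    global_diff = normalized_subtraction(frame, background)
    for y in range(0, h, block):
        for x in range(0, w, block):
            fb = frame[y:y + block, x:x + block]
            bb = background[y:y + block, x:x + block]
            local_diff = normalized_subtraction(fb, bb)
            mask[y:y + block, x:x + block] = (
                (local_diff > thresh)
                & (global_diff[y:y + block, x:x + block] > thresh)
            )
    return mask
```

Because the normalization is invariant to `a * image + b`, a frame that differs from the background only by a uniform brightness change produces an empty detection mask, which is the property the abstract attributes to the method.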
Daisuke ABE Eigo SEGAWA Osafumi NAKAYAMA Morito SHIOHARA Shigeru SASAKI Nobuyuki SUGANO Hajime KANNO
In this paper, we present a robust small-object detection method, which we call Frequency Pattern Emphasis Subtraction (FPES), for wide-area surveillance of sites such as harbors, rivers, and plant premises. To achieve robust detection under changes in environmental conditions such as illuminance, weather, and camera vibration, the method distinguishes target objects from background and noise based on the differences in their frequency components. The evaluation results demonstrate that the method detected more than 95% of target objects in images of large surveillance areas ranging from 30 to 75 meters wide at their center.
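The abstract does not specify how FPES emphasizes frequency patterns; a generic version of the underlying idea — suppressing low-frequency illumination drift and high-frequency noise so that object-scale structure dominates the subtraction — can be sketched with an FFT band-pass filter. The band limits and threshold below are hypothetical, not the paper's values:

```python
import numpy as np

def bandpass_emphasis(image, low=0.05, high=0.30):
    """Keep spatial frequencies in [low, high] cycles/pixel (Nyquist = 0.5):
    the low cut suppresses slow illumination changes, the high cut
    suppresses pixel noise, emphasizing object-scale structure."""
    spectrum = np.fft.fft2(image)
    fy = np.fft.fftfreq(image.shape[0])[:, None]
    fx = np.fft.fftfreq(image.shape[1])[None, :]
    radius = np.hypot(fy, fx)
    keep = (radius >= low) & (radius <= high)
    return np.real(np.fft.ifft2(spectrum * keep))

def detect(frame, background, low=0.05, high=0.30, thresh=0.5):
    """Subtract the frequency-emphasized images and threshold the result."""
    diff = bandpass_emphasis(frame, low, high) - bandpass_emphasis(background, low, high)
    return np.abs(diff) > thresh
```

With this construction, a uniform brightness shift between frame and background lies entirely below the low cutoff and produces no detections, while a small object of the right spatial scale survives the band-pass and is flagged.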