Eigo SEGAWA Morito SHIOHARA Shigeru SASAKI Norio HASHIGUCHI Tomonobu TAKASHIMA Masatoshi TOHNO
We developed a system that detects the vehicle driving immediately ahead of one's own car in the same lane and measures the distance to and relative speed of that vehicle to prevent accidents such as rear-end collisions. The system is the first in the industry to combine non-scanning millimeter-wave radar with a sturdy stereo image sensor, which keeps the cost low. It operates stably in adverse weather conditions such as rain, which could not easily be achieved with previous sensors. Testing of the system's vehicle detection performance showed that it correctly detects vehicles driving 3 to 50 m ahead in the same lane with higher than 99% accuracy in clear weather. Detection performance in rainy weather, where water drops and splashes notably degraded visibility, was higher than 90%.
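The two quantities the system reports, distance and relative speed, can be illustrated with a minimal sketch of the standard stereo ranging relation Z = fB/d. The focal length and baseline values here are assumed for illustration, not the parameters of the actual sensor.

```python
# Illustrative sketch only: textbook stereo ranging, not the actual system.
# focal_px and baseline_m are assumed values for the example.

def stereo_distance(disparity_px, focal_px=800.0, baseline_m=0.3):
    """Distance from stereo disparity: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def relative_speed(dist_prev_m, dist_curr_m, dt_s):
    """Relative speed of the lead vehicle from two successive ranges;
    negative means the gap is closing."""
    return (dist_curr_m - dist_prev_m) / dt_s
```

With these assumed parameters, a disparity of 8 pixels corresponds to a range of 30 m, and two ranges 0.1 s apart give the closing speed directly.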
Osafumi NAKAYAMA Morito SHIOHARA Shigeru SASAKI Tomonobu TAKASHIMA Daisuke UENO
During the period from dusk to dark, when it is difficult for drivers to see other vehicles, or when visibility is poor due to rain, snow, etc., the contrast between nearby vehicles and the background is low. Under such conditions, conventional surveillance systems have difficulty detecting the outlines of nearby vehicles and may thus fail to recognize them. To solve this problem, we have developed a rear and side surveillance system for vehicles that uses image processing. The system uses two stereo cameras to monitor the areas to the rear and sides of a vehicle, i.e., the driver's blind spots, and to detect the positions and relative speeds of other vehicles. The proposed system can estimate the shape of a vehicle from a partial outline, identifying the vehicle by filling in the missing parts of its outline. Testing of the system under various environmental conditions showed that the error rate (false and missed detections) in detecting approaching vehicles was reduced to less than 10%, even under conditions that are problematic for conventional processing.
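The idea of filling in the missing parts of a partially visible outline can be illustrated with a hypothetical sketch. The rectangle model and the aspect-ratio heuristic below are assumptions made for this example, not the authors' actual shape-estimation algorithm.

```python
# Hypothetical illustration of "filling in" a partial vehicle outline.
# The bounding-rectangle model and aspect ratio are assumptions for this
# sketch, not the authors' method.

def complete_outline(points, aspect_ratio=0.8):
    """Estimate a full bounding box (left, top, right, bottom) from partial
    outline points (x, y), with y increasing downward. If the visible
    outline is shorter than a typical height/width ratio suggests, assume
    the upper edge was lost to low contrast and extend the box upward."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    left, right = min(xs), max(xs)
    top, bottom = min(ys), max(ys)
    expected_height = (right - left) * aspect_ratio
    if bottom - top < expected_height:
        top = bottom - expected_height  # fill in the missing upper part
    return left, top, right, bottom
```

For instance, if only the lower edge of a vehicle 10 pixels wide is detected, the box is extended upward to the height the assumed aspect ratio predicts.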
Takahiro AOKI Osafumi NAKAYAMA Morito SHIOHARA Shigeru SASAKI Yoshishige MURAKAMI
We have developed an airport monitoring system that traces the movement of airplanes in the parking areas of airports. For this system, we developed an image processing method, a two-stage normalized background subtraction method, that can detect moving objects and determine their sizes under the illumination changes that are inevitable for outdoor monitoring systems, i.e., where the brightness of each pixel varies with atmospheric phenomena such as the shadows of clouds. The two-stage method consists of local and global normalized subtraction, which allows airplanes to be detected stably under such illumination changes. In addition, false detections caused by the presence of boarding bridges are eliminated by utilizing differences in motion between an airplane and a boarding bridge, such as the direction of movement. We evaluated this method using 140 hours of video containing scenes under a variety of conditions, such as cloud shadows, lights being turned on and off, nighttime, and rainfall, and confirmed a 95% level of accuracy of airplane detection. The system is now in operation at Kansai International Airport and is performing most satisfactorily.
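The general idea behind normalized background subtraction can be sketched minimally: a gain factor aligns the frame's brightness to the background model before subtracting, globally for the whole image or locally per block, so that a uniform illumination change cancels out. The gain formulation and block size below are illustrative assumptions, not the published two-stage method.

```python
import numpy as np

# Illustrative sketch, assuming a simple mean-ratio gain; not the published
# two-stage normalized background subtraction method.

def global_normalized_subtraction(frame, background, eps=1e-6):
    """One gain for the whole image (the 'global' stage): a uniform
    brightness change (e.g., a cloud shadow over the scene) cancels out."""
    gain = background.mean() / (frame.mean() + eps)
    return np.abs(frame * gain - background)

def local_normalized_subtraction(frame, background, block=8, eps=1e-6):
    """Independent gain per block (the 'local' stage), which also tolerates
    illumination changes affecting only part of the image."""
    diff = np.zeros(frame.shape, dtype=float)
    h, w = frame.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            f = frame[y:y + block, x:x + block].astype(float)
            b = background[y:y + block, x:x + block].astype(float)
            gain = b.mean() / (f.mean() + eps)
            diff[y:y + block, x:x + block] = np.abs(f * gain - b)
    return diff
```

Uniformly dimming a frame then yields near-zero differences, while a genuinely new object still produces a large response.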
Daisuke ABE Eigo SEGAWA Osafumi NAKAYAMA Morito SHIOHARA Shigeru SASAKI Nobuyuki SUGANO Hajime KANNO
In this paper, we present a robust small-object detection method, which we call "Frequency Pattern Emphasis Subtraction (FPES)", for wide-area surveillance of sites such as harbors, rivers, and plant premises. To achieve robust detection under changes in environmental conditions, such as illuminance level, weather, and camera vibration, our method distinguishes target objects from the background and noise based on the differences in frequency components between them. The evaluation results demonstrate that our method detected more than 95% of target objects in images of large surveillance areas ranging from 30 to 75 meters at their centers.