TY - GEN
T1 - Multi-sensor fusion for cost-effective precise vehicle positioning
AU - Kim, Hojun
AU - Lee, Impyeong
PY - 2016
Y1 - 2016
N2 - Owing to their advantages in safety and convenience, autonomous cars and advanced driver assistance systems (ADAS) are being actively researched. One of the main challenges for these systems is to precisely determine the position of the vehicle. To solve this problem, many recent studies adopt sensor fusion methods. In this work, we design a workflow for a vehicle position estimation system based on a sensor fusion approach and evaluate the accuracy of the proposed algorithm. The algorithm uses in-vehicle sensors, GPS, image sensors, and road characteristics information for position estimation. The proposed sensor fusion method determines the vehicle positions by the following procedure. First, a relative trajectory is calculated using in-vehicle sensors only; this is the dead reckoning step. Then, a bundle adjustment algorithm estimates the position and direction of the vehicle using images and the initial values derived from the previous step, which refines the vehicle position. The sensor fusion procedure is performed with an extended Kalman filter (EKF), which updates the vehicle position whenever sensory data are acquired from the in-vehicle sensors, the GPS, or the camera. If road characteristics information is available from other sensors, it can also be incorporated for more accurate position estimation. For the experiment, we designed a sensory data acquisition system and installed it on a vehicle, together with precise position measuring equipment to evaluate the proposed algorithm. The estimation is performed with an in-vehicle-sensor-only method and with the proposed sensor fusion method. The RMS error of the positions estimated by the proposed method is about 1.6 m, an improvement in accuracy of nearly 90% over the in-vehicle-sensor-only method. The algorithm may be used for applications requiring accurate driving route estimation, such as autonomous cars and ADAS.
AB - Owing to their advantages in safety and convenience, autonomous cars and advanced driver assistance systems (ADAS) are being actively researched. One of the main challenges for these systems is to precisely determine the position of the vehicle. To solve this problem, many recent studies adopt sensor fusion methods. In this work, we design a workflow for a vehicle position estimation system based on a sensor fusion approach and evaluate the accuracy of the proposed algorithm. The algorithm uses in-vehicle sensors, GPS, image sensors, and road characteristics information for position estimation. The proposed sensor fusion method determines the vehicle positions by the following procedure. First, a relative trajectory is calculated using in-vehicle sensors only; this is the dead reckoning step. Then, a bundle adjustment algorithm estimates the position and direction of the vehicle using images and the initial values derived from the previous step, which refines the vehicle position. The sensor fusion procedure is performed with an extended Kalman filter (EKF), which updates the vehicle position whenever sensory data are acquired from the in-vehicle sensors, the GPS, or the camera. If road characteristics information is available from other sensors, it can also be incorporated for more accurate position estimation. For the experiment, we designed a sensory data acquisition system and installed it on a vehicle, together with precise position measuring equipment to evaluate the proposed algorithm. The estimation is performed with an in-vehicle-sensor-only method and with the proposed sensor fusion method. The RMS error of the positions estimated by the proposed method is about 1.6 m, an improvement in accuracy of nearly 90% over the in-vehicle-sensor-only method. The algorithm may be used for applications requiring accurate driving route estimation, such as autonomous cars and ADAS.
KW - Bundle adjustment
KW - Kalman filter
KW - Navigation
KW - Positioning
KW - Sensor fusion
UR - http://www.scopus.com/inward/record.url?scp=85018266646&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85018266646
T3 - 37th Asian Conference on Remote Sensing, ACRS 2016
SP - 431
EP - 438
BT - 37th Asian Conference on Remote Sensing, ACRS 2016
PB - Asian Association on Remote Sensing
T2 - 37th Asian Conference on Remote Sensing, ACRS 2016
Y2 - 17 October 2016 through 21 October 2016
ER -