Inside GNSS Media & Research

JUL-AUG 2019

Issue link: https://insidegnss.epubxp.com/i/1148308

Perceived Environment Aided GNSS Single Point Positioning: An Example using LiDAR Scanner

WEISONG WEN, LI-TA HSU, HONG KONG POLYTECHNIC UNIVERSITY, HONG KONG

GNSS positioning is strongly challenged in urban canyons. Signal reflections induce multipath and non-line-of-sight (NLOS) reception, and these blockages and reflections are caused by obstacles in the signal path between the satellites and the receiver: buildings, trees, and even tall vehicles such as double-decker buses. Interestingly, these are the same obstacles found in urban traffic scenes. Inspired by this, the authors propose an innovative sensor integration scheme to aid GNSS single point positioning (SPP). Taking autonomous driving as an example, instead of simply using LiDAR odometry to provide the receiver's movement between two data epochs, we use the objects detected by LiDAR and describe them by their relative azimuth and elevation angles with respect to the receiver. According to the experimental results, the proposed perceived environment aided GNSS SPP improves positioning accuracy by 35% compared with the conventional weighted least squares (WLS) solution.

SINGLE POINT POSITIONING

Autonomous driving introduces a high demand on GNSS in all driving environments. Currently, GNSS performance is heavily challenged in deep urban canyons. The positioning error can reach 100 meters, due to the notorious non-line-of-sight (NLOS) receptions that dominate GNSS positioning errors in dense-building areas (see Hsu 2018 in Additional Resources). The conventional solution is to integrate GNSS with other on-board sensors, including inertial navigation systems (INS), vehicular odometry, and vehicular steering. More recently, additional sensors are being installed on the future intelligent vehicle, as shown in Figure 1. Thus, visual odometry and LiDAR odometry are now also integrated with GNSS. The level of integration is usually classified by how "raw" the measurement provided by GNSS is: position and velocity are used in loosely coupled integration, pseudoranges and their rates in tightly coupled integration, and I/Q correlator outputs in ultra-tightly coupled integration. However, these integration schemes only consider that the other sensors can provide the system propagation of the vehicle's position and orientation.

One opportunity has been neglected. Beyond LiDAR and vision odometry, LiDAR scanners and cameras are also used to detect surrounding objects to avoid collisions. In other words, they can perceive the surrounding environment of a GNSS receiver in real time. This means the vehicle can obtain the locations of surrounding objects, including trees, buildings and vehicles, as shown in Figure 2. This article dem-
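
To make the object representation concrete, the following minimal Python sketch converts a LiDAR-detected object position, given in the vehicle body frame, into relative azimuth and elevation angles with respect to the GNSS receiver. The body-frame convention, the heading input and the function name are illustrative assumptions, not the authors' implementation.

    import math

    def object_to_az_el(obj_xyz, heading_deg):
        """Convert a LiDAR-detected object position into azimuth/elevation
        angles relative to the GNSS receiver, both in degrees.

        obj_xyz     : (x, y, z) of the object in an assumed vehicle body
                      frame (x forward, y left, z up, origin at the antenna)
        heading_deg : vehicle heading, clockwise from north, assumed to be
                      available from the navigation solution
        """
        x, y, z = obj_xyz
        horizontal_range = math.hypot(x, y)

        # Bearing of the object measured clockwise from the vehicle's
        # forward axis, then rotated into a north-referenced azimuth.
        bearing_body = math.degrees(math.atan2(-y, x))
        azimuth = (heading_deg + bearing_body) % 360.0

        # Elevation of the object above the receiver's horizontal plane.
        elevation = math.degrees(math.atan2(z, horizontal_range))
        return azimuth, elevation

    # Example: a building corner 20 m ahead, 5 m to the left and 12 m above
    # the antenna, with the vehicle heading due east.
    print(object_to_az_el((20.0, 5.0, 12.0), 90.0))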
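
For reference, the conventional WLS solution used as the comparison baseline solves for the receiver position and clock bias from pseudoranges by iterating a linearized weighted least-squares step. The sketch below is a generic single-epoch WLS solver, not the authors' code; weights (commonly elevation-dependent) and corrected pseudoranges are assumed to be supplied by the caller.

    import numpy as np

    def wls_spp(sat_pos, pseudoranges, weights, x0=None, iters=10):
        """Single-epoch single point positioning by weighted least squares.

        sat_pos      : (N, 3) satellite ECEF positions, meters
        pseudoranges : (N,)   corrected pseudoranges, meters
        weights      : (N,)   measurement weights (e.g. elevation-dependent)
        Returns [x, y, z, receiver_clock_bias] in meters.
        """
        x = np.zeros(4) if x0 is None else np.asarray(x0, dtype=float)
        W = np.diag(weights)
        for _ in range(iters):
            vectors = sat_pos - x[:3]
            ranges = np.linalg.norm(vectors, axis=1)
            residuals = pseudoranges - (ranges + x[3])
            # Design matrix: negative unit line-of-sight vectors plus a
            # column of ones for the receiver clock bias.
            G = np.hstack((-vectors / ranges[:, None], np.ones((len(ranges), 1))))
            dx = np.linalg.solve(G.T @ W @ G, G.T @ W @ residuals)
            x += dx
            if np.linalg.norm(dx[:3]) < 1e-4:   # position update below 0.1 mm
                break
        return x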
