Inside GNSS Media & Research

NOV-DEC 2017

Press articles in the 1950s and 1960s predicted that autonomous cars and "electronic highways" would become widely available by 1975. Major milestones in the use of new sensor, computation, and communication technology have recently reenergized the eagerness for HAVs. This first started with the 2005 "DARPA Grand Challenge", where four different HAVs designed by teams of engineers from industry and academia completed a 132-mile trip across the Mojave Desert in less than 7.5 hours with no human intervention. The 2007 DARPA "Urban Challenge" saw six teams autonomously complete a 60-mile course in an urban environment while following traffic laws. Most teams used a combination of LiDAR, cameras, differential GPS, and computation power that is multiple orders of magnitude higher than what is typically needed for a commercial passenger vehicle. In 2009, Google (now Waymo) began designing and testing "self-driving" cars, which have since accumulated more than three million miles in autonomous mode. Currently, most car manufacturers have HAV prototype systems, and Google, Uber, and NuTonomy have HAV pilot testing programs, including fully autonomous systems for public transportation, which, for now, are confined to segregated lanes and geo-fenced areas. Multiple Tier-2 supplier companies specializing in autonomous car technology have emerged. In early 2017, 36 companies were registered to test prototype HAV systems on public roads in the state of California.

However, Gartner's "2016 Hype Cycle for Emerging Technologies" (Figure 1) shows that HAV technology might be at the "peak of inflated expectations", approaching the "trough of disillusionment". Hype cycle curves are non-scientific tools that have been empirically verified for multiple example technologies over many years. Two example emerging technologies, commercial unmanned aircraft systems (UAS) and virtual reality, are included in Figure 1 for illustration purposes. The curve's time scale may differ for each technology. Indicators of decreasing expectations for HAVs include a reduction in press coverage and the emergence of the first negative news stories, in particular following the May 2016 crash of a Tesla Model S whose autopilot failed to distinguish a white trailer truck from the bright Florida sky. The Model S ran under the trailer, causing its roof to be torn off and the operator to lose his life. The car kept going at full speed on the side of the road through two fences until it hit a pole and came to a stop.

In parallel, until the end of 2016, Google was providing detailed performance reports on their self-driving cars, which were designed to operate in real-world urban environments. These reports contain records of millions of miles driven autonomously, but also acknowledge "disengagements", i.e., events where the operator needed to take over control to avoid collisions. The data shows that HAVs are much more likely to be involved in collisions, even though these collisions are often of lower severity than in conventional human driving (HAVs typically get rear-ended because of their unusual road behavior) (see B. Schoettle and M. Sivak, "A Preliminary Analysis of Real-World Crashes Involving Self-Driving Vehicles," Additional Resources). Also, Uber's autonomous taxis in Pittsburgh have a reported rate of one disengagement per mile autonomously driven.
Moreover, the first fielded autonomous systems have revealed new safety threats. In particular, the technology's functionality, as perceived by the human operator, does not always match the intended operational domain: for example, there have been cases of highway autopilots being used in urban areas and passing red lights without slowing down. In addition, human-machine interaction is at the heart of role confusion (is the operator or the HAV in charge?), of mode confusion (is the HAV in autonomous or manual mode?), and of the operator's trust in this multimodal system. Misinterpretation may grow even wider because a given functionality will not achieve the same level of performance across models and manufacturers, and operators may not be aware of the systems' independently verified safety ratings. And, within the next few years, operators will be expected to anticipate hazardous situations and take over control. Thus, operating an HAV may require more education and different training than driving a car manually.

Current Safety Assessment Efforts

To focus this article, first consider the Society of Automotive Engineers (SAE) International's classification of driving autonomy levels in Table 1. Under Levels 0 to 2, the human driver is responsible at all times, either for driving by himself or for supervising the HAV in autonomous mode and taking control if needed. Under Levels 3 to 5, the system is self-monitoring and the driver is expected to take control, but only if requested by the system. Levels 0-4 provide partial automation under predefined driving modes and circumstances, whereas Level 5 is full autonomy. The most advanced private car systems are currently Level 2, and pilot pro-

FIGURE 1 Gartner's "Hype Cycle for Emerging Technologies", as of July 2016 [16]. (Axes: Expectations vs. Time; curves shown for Commercial UAVs, HAVs, and Virtual Reality; phases: Innovation Trigger, Peak of Inflated Expectations, Trough of Disillusionment, Slope of Enlightenment, Plateau of Productivity.)
