Autonomous Vehicles: Moving Closer to the Driverless Future

by EOS Intelligence

An Uber self-driving car was involved in an accident in Arizona last month. But as the saying goes, “any publicity is good publicity”, and this holds true for autonomous vehicles as well. The news sparked a discussion and shed some light on the challenges the technology may face before it becomes available for commercial use. At the same time, it spread awareness of the level of safety testing being done to improve the technology before it is rolled out to the public. We take a look at what is potentially in store for users waiting to see streets filled with driverless vehicles.

Autonomous, self-driving vehicles have been the talk of the industry for some time now, with some of the initial attempts to create a modern autonomous car dating back to the 1980s. However, major advancements have only been made during the last decade, coinciding with progress in supporting technologies, such as advanced sensors, real-time mapping, and cognitive intelligence, which are perhaps the most crucial to the success of any autonomous vehicle.

Early advancements in the segment were led by technology companies focused on developing software to automate or assist the driving of cars. Prime examples include nuTonomy, which has recently partnered with Grab (a ride-hailing rival to Uber) to test its self-driving cars in Singapore, Cruise Automation (acquired by GM in 2016), and Argo AI, which has recently received a US$1 billion investment from Ford. These companies primarily use regular cars and vans retrofitted with sensors, high-definition mapping, and software systems.

However, software alone is not enough to offer self-driving functionality, and automotive OEMs are therefore taking the front seat when it comes to driving advancements in the autonomous vehicle segment. New cars and vans tuned to work seamlessly with this software are likely to adapt better to the algorithms and to meet the stringent performance and safety standards required before commercial roll-out. Navigant Research believes that with its investment in Argo AI, Ford has taken the lead among automotive OEMs in the race to produce an autonomous, self-driving vehicle.

Advanced levels of autonomy still to be achieved

In a nutshell, there are five levels of autonomy for cars. Levels 1 through 3 require human intervention in some form or other. The most basic level comprises only driver-assistance systems, such as steering or acceleration control. The most common form of autonomy currently in use is Level 2, in which the driver is disengaged from physically operating the vehicle for some time, relying on automation such as cruise control and lane-centering. Tesla’s current Autopilot system can be categorized as Level 2.

Level 3 involves the car completely undertaking the safety-critical functions, under certain traffic or environmental conditions, while requiring a driver to intervene if necessary.

Most OEMs developing autonomous cars target launching their vehicles in the next three to five years. Tesla is probably the closest, with its Model 3 car featuring the Autopilot 3 system expected to be unveiled in 2018 (although this depends on whether the regulations are in place by then). Nissan, Toyota, Google, and Volvo plan to achieve this by 2020, while BMW and Ford have set a deadline of 2021. Most of these companies are working toward cars with Level 3 autonomy, with a driver sitting behind the steering wheel to take over from the car’s programming as and when required.

Level 4 and Level 5 vehicles are deemed fully autonomous, which means they do not require a driver and all driving functions are undertaken by the car. The difference is that while Level 4 vehicles are limited to the most common roads and general traffic conditions, Level 5 vehicles are able to match human driving in every scenario, including extreme environments such as off-road terrain.

Some OEMs, Ford in particular, are against the practice of using a human as a back-up, on the understanding that a person sitting idle behind the wheel often loses the situational awareness required to take over from the car’s programming. Ford is therefore planning to skip Level 3 autonomy and target the development of Level 4 autonomous vehicles instead.

Google is currently the only company focusing on developing a Level 5 autonomous car (or robot car). The company has already showcased a prototype with no steering wheel or manual controls – a prototype that could, in the true sense, be the first fully autonomous car. Tesla also plans to work toward the highest level of autonomy and intends to fit its cars with all the hardware necessary for a fully autonomous vehicle.

High costs continue to be challenging

While the plans are in place, one massive roadblock in the development of these cars of the future is cost. These cars rely on multiple sensors, including SONAR and LIDAR. Ongoing research has helped reduce sensor costs – Google’s Waymo has cut the cost of its LIDAR sensors by about 90%, from roughly US$75,000 in 2009 to roughly US$7,000 in 2016 – but they remain very expensive. The fact that a driverless car requires about four of these sensors makes such cars largely unaffordable for consumers, which puts off any discussion of the feasibility of commercial production at this stage.

EOS Perspective

The first three months of 2017 have been particularly eventful, with several prototypes launched or tested. This activity is expected to increase further as companies try to meet their ambitious plans to roll out self-driving cars by 2020.

Initial adoption is likely to come from companies investing in commercial fleets, particularly those focused on on-demand taxi or ride-hailing services, similar to what Uber and Lyft offer. A series of investments by large bus manufacturers, such as Scania, Iveco, and Yutong, also indicates that this technology is set to shape the future of public transport.

It is too soon to say how and when exactly autonomous vehicles will impact the way people choose to travel and how they may redefine society’s mobility. Much will depend on how the regulatory environment evolves to allow driverless cars in active traffic. The current regulatory environment for driverless cars is still at a nascent stage and, for the most part, allows only for testing of these cars in isolated environments. Some US states, particularly California, Arizona, and Pennsylvania, have opened up to testing of these cars among the general public. However, recent accidents and cases of autonomous cars breaking traffic rules have put pressure on authorities to reconsider their stance until the cars become more advanced and better tested to handle the nuances of public traffic. We may need to wait another decade or two before driverless cars become a reality in many markets. As things stand, extensive efforts continue behind the curtain, as companies strive to win the race to develop highly autonomous and safe vehicles.