Are you excited or scared about the future of self-driving cars?
Once more, technology seems set to bring sci-fi movies to the real world. But while the idea of self-driving cars — or autonomous vehicles as they are also known — might seem like a dream come true, many still believe the technology cannot be trusted.
In March 2018, the news broke that the first recorded pedestrian fatality involving a self-driving car had occurred. A pedestrian was struck by a self-driving Uber car travelling at 40 mph. A later report by The Guardian revealed that the car’s sensors had actually detected the pedestrian, but because the system was “tuned too far” towards ignoring potential false positives, the algorithm dismissed the detection. The safety driver in the vehicle was also noted as not having paid enough attention to take control quickly enough.
Unsurprisingly, this incident has affected the public’s perception of self-driving cars. A study by Intelligent Car Leasing found that, of 1,750 people surveyed, 61% said they would feel safest in a human-controlled car. Only 17% said they would feel safest in a self-driving car.
Do self-driving vehicles have a future? And if so, can the public’s mind be changed regarding their safety? To look at the matter further, we’ve teamed up with Pass ‘N’ Go, which offers driving lessons in Middlesbrough and across the North East.
The levels of autonomy
There are six levels of self-driving ability, ranging from no automation at all to full machine control without human input. These levels have been outlined by SAE International in its J3016 standard and are often referred to as the SAE levels:
- Level 0: No Automation
- Level 1: Driver Assistance
- Level 2: Partial Automation
- Level 3: Conditional Automation
- Level 4: High Automation
- Level 5: Full Automation
At level three, the system is able to monitor the driving environment, though the human driver is still the main fallback. Current systems are capable of level two automation, says The Guardian, which means the car’s systems can keep the car in the right lane, but they cannot be relied upon to execute all driving tasks. There is growing concern, however, that the way car manufacturers market their technology misleads drivers, such as branding a driver-assistance system “Autopilot”. While these features can help as driving aids, they do not have full automation capabilities.
The BBC reported that motor insurance companies are requesting that car manufacturers avoid using the word “autonomous” in their marketing — until the technology has reached legitimate full autonomy.
Of course, the goal for many technology developers in the industry is to reach level five autonomy. There are even claims that fully autonomous cars could reduce accidents dramatically. A study by McKinsey & Company stated that road accidents could be reduced by up to 90% with self-driving cars.
But while some suggest this will lead to the eradication of fatalities on the road, an ABC article suggests this would merely give self-driving cars the responsibility of life-or-death decisions.
The German transport ministry has stated that driverless cars must treat all human life equally and make decisions based on harming the fewest possible people, with humans taking priority over animals and property. Yet studies show a tension in public attitudes: more than 75% of people supported the logic of saving the many, even if it meant injuring the driver or passengers, but when presented with the same scenario with themselves as the driver, the majority wanted the car to protect the passengers and themselves as a priority.
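To make those rules concrete, here is a minimal, purely hypothetical sketch (in Python, not any manufacturer’s real code) of the priority logic described above: all human lives count equally, the option harming the fewest humans wins, and humans outrank animals and property. The function name and data shapes are invented for illustration.

```python
# Hypothetical illustration of the stated rules: minimise harmed humans
# first, then animals, then property. All humans are weighted equally.

def choose_option(options):
    """Each option is a dict counting what it would harm,
    e.g. {"humans": 1, "animals": 0, "property": 2}."""
    # Tuple ordering means fewer harmed humans always wins,
    # regardless of how many animals or how much property is at risk.
    return min(options, key=lambda o: (o["humans"], o["animals"], o["property"]))

swerve = {"humans": 1, "animals": 0, "property": 1}       # injure the driver
continue_on = {"humans": 5, "animals": 0, "property": 0}  # hit five pedestrians
print(choose_option([swerve, continue_on]))  # picks the swerve option
```

Even this toy version shows why the ethics are hard: the rule happily sacrifices the occupant whenever the numbers say so, which is exactly the outcome most surveyed drivers said they would not accept for themselves.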
This sounds logical on paper, but as ABC points out, some areas are a little greyer. For example, swerving to avoid hitting a group of five pedestrians, at the risk of injuring the driver, seems understandable. But swerving to avoid three pedestrians who crossed without looking, when you and your child are in the car, isn’t so cut and dried. By Germany’s rules, the car would swerve to save the greater number (the pedestrians) at the cost of injuring you and your child.
ABC also points out the issue of scenarios where there is no “greater number” of lives to save — if a child runs out onto the road, would the car swerve to injure you and save the child, or hit the child and save you?
Driving assistance is without a doubt very useful, but it seems we’re quite a way from achieving fully automated cars. Until then, we’re still going to need to learn to drive and pass our tests!