It’d be an understatement to say that there’s been a lot of debate about the safety of self-driving cars lately. People have been dreaming about cars that can drive themselves for a long time—probably since the very first automobile accident happened. But we can also foresee some possible risks and drawbacks to this technology.
Can autonomous vehicles make moral decisions?
Most computer programmers would agree that encoding human morals and judgment into a machine is likely impossible. The reason for this limitation goes back to an old ethics thought experiment called the trolley problem. In this macabre scenario, you’re asked to decide between letting a trolley continue on course toward five people on one track or taking an action that kills one person in order to save those five. A computer could only be expected to operate on the arithmetic of one versus five; it is incapable of feeling guilt for actively taking a life.
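To make that limitation concrete, here is a minimal sketch (in Python, with hypothetical function and parameter names) of the purely utilitarian rule a machine might follow. It weighs only headcounts; the moral distinction between letting people die and actively killing someone never enters the calculation:

```python
# Hypothetical, purely utilitarian decision rule: compare casualty counts only.
# A machine following this logic cannot distinguish "letting die" from
# "actively killing" -- the distinction at the heart of the trolley problem.

def choose_track(stay_course_casualties: int, divert_casualties: int) -> str:
    """Return the action with the lower expected casualty count."""
    if divert_casualties < stay_course_casualties:
        return "divert"
    return "stay on course"

# The classic scenario: five people ahead, one person on the side track.
print(choose_track(stay_course_casualties=5, divert_casualties=1))  # prints "divert"
```

However the human dilemma is framed, this program always gives the same answer for the same numbers, which is exactly the point: the hard part of the trolley problem is everything the code leaves out.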
Of course, that thought experiment has also repeatedly shown that not all people share the same views on morality. The real issue seems to be the potential legal implications for automakers, because human juries will still render the verdicts assigning fault in driverless car accidents. On the other hand, the pure robotic logic of these automobiles should make road rage a thing of the past.
Who’s responsible if the car crashes?
If the automaker designs and programs the robotic automobiles of the future, then the individual may no longer be held fully responsible for car accidents. As such, auto insurance could become the carmaker’s responsibility rather than something the consumer purchases. Much like malpractice insurance fees drive up medical costs today, we could expect future automobile prices to increase accordingly.
Although people will still be expected to supervise and override the self-driving system as warranted, it wouldn’t be too hard to wriggle out of that responsibility after events unfold. All anyone would need to argue is that they were unable to make a corrective maneuver quickly enough to avoid crashing.
Will users still need drivers’ licenses?
Being able to make isolated corrective maneuvers is very different from having to control a vehicle from point A to point B. The recent trend in some places, such as New York City, has been to raise the driving age from 15 or 16 to 18. However, those decisions are based on increased risk that wouldn’t be as much of a factor when teens are merely taking on a supervisory role in self-driving cars. In fact, there could be two different levels of licensing in the future: manual vs. autonomous, perhaps?
Would certain roads, such as highways, ban human-driven automobiles?
We can imagine that commutes could be shortened as speed limits increase based on improving response times from autonomous vehicles’ computer systems. However, human drivers would be ill-equipped to perform at those speeds. Thus, it may be more difficult to get around in non-autonomous cars at that point since human drivers might not be allowed on highways anymore. Or, at least for a while, maybe there’d be special lanes for them.
What’ll happen to buses, subways, and trains?
Since traffic could move faster and more efficiently, public transit may become unnecessary in most places. Ride-sharing could be available for anyone who can’t afford a car. On the other hand, commute times would need to be staggered somewhat for ride-sharing to work well—most people still won’t want to share their cars with strangers while they’re using them. Of course, buses could also be autonomous.
To be sure, many of today’s cars already feature some level of automation. For example, automatic parallel parking is now helping all those drivers who barely passed their maneuverability tests to stop circling the block repeatedly. Lane assist is another function that can help keep less attentive drivers alive by sounding alarms and braking when the sensors detect drifting or too-close neighboring vehicles. All things considered, the next progression of autonomous automobiles will likely solve far more problems than it brings up.