Autonomous cars are not yet a fixture on the roads, but as they become more common, they will have to make life-and-death decisions, much as human drivers do in the heat of the moment.
Autonomous systems will need the capability to make moral choices that could place pedestrians, other road users, or the driver at greater risk, depending on how the vehicle decides whose life takes priority. For example, if an autonomous vehicle detects a child crossing the street, it will have to decide whether to endanger the adult occupant of the car rather than the child.
This process raises further questions, such as who decides whose safety should be prioritized when these systems are programmed. MIT researchers have been exploring these ethical and moral considerations, publishing results from a massive online survey of more than 2 million people across 200 countries.
Most respondents agreed on three common principles: the lives of the many should be prioritized over the lives of the few, the young should have priority over the old, and animals should be sacrificed rather than humans.
On other considerations, however, respondents could not reach a consensus about what should be prioritized. These questions will have to be addressed in the coming years as autonomous cars become a regular presence on the roads. While self-driving cars are part of the future, most accidents today are caused by human error. If you have already been hurt in a crash, hire a San Francisco lawyer to help you with your claim.