At a recent meal with friends, someone shared that Elon Musk plans for all Tesla electric vehicles to feature fully autonomous driving by the end of 2017.
I was asked, “Would you ride in a driverless car?”
Good question! Self-driving cars seem to be the way of the future. Every major automaker, along with tech companies such as Google, Apple and Baidu, is heavily invested in the research and development of vehicles capable of navigating without human input.
Though the discussion that ensued focused on safety, my mind went to questions of morality. Will it be possible to program autonomous cars with instructions on how to react in the case of an unavoidable collision?
I stayed up late that night to answer the Moral Machine survey from MIT researchers on how self-driving cars should be programmed in the case of a collision (moralmachine.mit.edu). The survey presented various hypothetical scenarios in which the survey-taker must choose to program the car for one of two fatal outcomes.
One scenario proposed the following: If the self-driving car has sudden brake failure, should it continue ahead through the pedestrian crossing, killing two men and one woman? Or should the car swerve and hit three pedestrians who were breaking the law by crossing against the red light?
What if the choice of casualties was between hitting a female executive and a male executive crossing on a red signal, or swerving and crashing into a concrete barrier, resulting in the deaths of bystanders who happen to be a homeless person and a criminal?
It was a gut-wrenching process for me. It was also revealing, since at the conclusion of the survey, MIT compared my answers with those from the wider public. The Torah (Devarim 30:19) tells us, “You shall choose life.” What right do we have to decide the fate of someone else’s life? Who allowed me to “play” the role of judge, jury and even executioner?
The Talmud (Yoma 82b) tells of a Jewish man who was given an ultimatum by authorities to either murder his neighbor or be killed himself. He presented this dilemma to the well-known scholar Rava, who replied: “What did you see to make you think that your blood is redder and more important than his?”
Indeed, I am not content with all of the answers I gave to MIT. That is why I am looking forward to presenting the new course from the Rohr Jewish Learning Institute (JLI) titled “The Dilemma.” Together, we will debate six modern situations with legal and ethical ramifications, like the Tesla Autopilot dilemma. We will explore relevant precedents from the Talmud as well as secular law.
As to the question of whether I’d ride in a self-driving car: Since I can’t afford a Tesla Model X at this point, I’m open to trying a self-driving Uber the next time I’m in San Francisco.
Rabbi Yehuda Ceitlin is the outreach director of Chabad Tucson and associate rabbi at Congregation Young Israel of Tucson. He will be teaching “The Dilemma” over six weeks beginning Thursday, Feb. 2, at the Tucson Jewish Community Center. The course is co-hosted by Chabad Tucson, the Tucson J and the Jewish Federation of Southern Arizona’s Cardozo Society.