At the Questacon science museum in Canberra there was an exhibition on robotics, which included quizzes asking people how they felt robots should behave. One of those, probing the value of different kinds of human lives, asked what the software controlling an autonomous car should do if the brakes failed approaching a pedestrian crossing and the choices were to run over a child or an old person — alternative routes to the sides were shown crashing into brick walls.
Now the correct answer, though it wasn't an option, is that the car should run itself into one of the brick walls. Assuming a speed of 30 mph (roughly 50 km/h) — and pedestrian crossings are not recommended where traffic is much faster — there's very little chance of the occupants of any modern car suffering more than minor injuries from such a crash, while hitting a pedestrian, old or young, poses a high risk of killing or seriously injuring them.
Now no one expects a human driver to drive into a wall; that would require great strength of will. But perhaps autonomous cars should be considering broader social outcomes, and perhaps that is something this kind of quiz could probe... In any event, the blindness of the question-framing here reflects the blindness of transport policy, where the convenience of people driving is continually prioritised over the safety of vulnerable road users.