Driverless Dilemma

What should the car do?


It is silly to say that because a driverless car cannot save everyone's life in every scenario it is therefore wrong. The software will minimise, not eradicate, death and injury. That will still be a massive leap forward.

I would imagine it would be programmed with @User's hierarchy:

1. My safety
2. Your safety
3. My convenience
4. Your convenience

To add to this, the cyclists are on the wrong side of the road. The car would be driven at a safe speed so that it can stop safely within the distance it can see to be clear.
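For what it's worth, "stop in the distance you can see" is just stopping-distance arithmetic. A minimal Python sketch, assuming an illustrative 1 s reaction time and 7 m/s² braking deceleration (placeholder figures, not values from any real driverless-car system):

[CODE]
# Minimal sketch: can the car stop within the distance it can see?
# REACTION_TIME_S and DECEL_MS2 are illustrative assumptions, not
# figures from any real system.

REACTION_TIME_S = 1.0   # assumed sensing + actuation delay
DECEL_MS2 = 7.0         # assumed full-braking deceleration on dry tarmac

def stopping_distance(speed_ms: float) -> float:
    """Distance travelled while reacting plus distance while braking."""
    return speed_ms * REACTION_TIME_S + speed_ms ** 2 / (2 * DECEL_MS2)

def speed_is_safe(speed_ms: float, sight_distance_m: float) -> bool:
    """True if the car could stop inside its current sight line."""
    return stopping_distance(speed_ms) <= sight_distance_m

print(stopping_distance(18.0))    # 40 mph is about 18 m/s -> roughly 41 m
print(speed_is_safe(18.0, 30.0))  # False: too fast for a 30 m sight line
[/CODE]

At 40 mph (about 18 m/s) that works out to roughly 41 m, so on a bend with only 30 m of clear view the car should already have slowed.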
 

winjim

Straddle the line, discord and rhyme
Stop. The primary duty of care is to the occupant of the vehicle, so their safety takes priority. Also, the decision to actively take life is worse than passively allowing life to be taken. But the car shouldn't have been going that fast in the first place anyway.

Wasn't this sort of thing covered pretty comprehensively by Isaac Asimov in the forties and fifties?
 

Deleted member 26715

Guest
Stop. The primary duty of care is to the occupant of the vehicle, so their safety takes priority. Also, the decision to actively take life is worse than passively allowing life to be taken. But the car shouldn't have been going that fast in the first place anyway.

Wasn't this sort of thing covered pretty comprehensively by Isaac Asimov in the forties and fifties?
But the car wasn't going too fast; the OP says the car could stop safely and it was the cyclists who hit the car.
 

Tin Pot

Guru
[QUOTE 4250754, member: 9609"]Presumably the moral code of the designers will be incorporated into how the car deals with various scenarios. I am sure we would all deal differently with the following:
travelling at 40mph with another car tailgating you, 1/ a rabbit runs out in front of you, 2/ a barn owl swoops down in front of your car, 3/ someone's pet Labrador runs out, 4/ a child runs out. Braking violently will cause some damage to the cars and maybe some minor injuries.
Has the autonomous car's programmer chosen the course of action that you would have - or will these morals be available within the car's settings?

but back to the OP - I think the car would just be programmed to stop[/QUOTE]

I thought it would be obvious I was joking...

They are not programmed with anything other than collision avoidance - there is no judgement to be made. The OP is a purely hypothetical question; read Asimov's Robot series, as this whole thing was done 60 years ago.
 
Could it happen? Why couldn't the cyclists stop? Why is it inevitable that they hit the car and are killed?
And if they were cycling on the wrong side of the road and couldn't stop in the distance they could safely see, why should the car kill its passenger(s) to save these cyclists?
 
[QUOTE 4250819, member: 9609"]Can you elaborate on the collision stuff? Presumably there is a limit to the size of the object it will brake for. It would need to be able to do a full emergency stop for a toddler, and a toddler is not much bigger than a large hare. Does the software take into consideration what is following you and how close it is?
Would it bring you to a halt halfway round a bend on a trunk road for a rabbit, with the danger of a 44-tonner sitting on his limiter wiping you out?

I really am curious as to how these things are going to work.[/QUOTE]
I would imagine that, rather than teaching it rules, they will run millions of test scenarios and it will learn from the best outcomes.

They'll program thousands and thousands of combinations, simulate them, judge the outcomes, then build an algorithm that produces the best outcome on average.

Programming AI is fascinating. For example, researchers wanted to teach a computer to recognise male v female faces from photos. They taught it nothing, just showed it thousands of examples of faces without hair. It learnt by itself and created its own algorithm, then achieved better results than humans. I doubt the algorithm is even translatable into if/then/elses. It would measure thousands of variables, weight them according to its learnt scenarios, and act along the line of most likely best outcome.
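That "learn the weights from examples" idea can be shown in a few lines. A toy Python sketch, assuming made-up features and labels (nothing here comes from a real face-recognition or driving system); the point is that the learnt weights replace hand-written if/then/else rules:

[CODE]
# Toy perceptron: instead of hand-coded rules, the program learns
# feature weights from labelled examples. Features, data and learning
# rate below are illustrative placeholders.

import random

def train_perceptron(examples, epochs=100, lr=0.1):
    """examples: list of (feature_vector, label) with label +1 or -1."""
    n = len(examples[0][0])
    weights = [0.0] * n
    bias = 0.0
    for _ in range(epochs):
        random.shuffle(examples)
        for features, label in examples:
            score = bias + sum(w * x for w, x in zip(weights, features))
            if label * score <= 0:  # misclassified: nudge the weights
                weights = [w + lr * label * x for w, x in zip(weights, features)]
                bias += lr * label
    return weights, bias

# Made-up "scenario" data: two arbitrary features per example.
data = [([0.9, 0.1], +1), ([0.8, 0.3], +1), ([0.2, 0.9], -1), ([0.1, 0.7], -1)]
w, b = train_perceptron(data)
print(w, b)  # the learnt weights ARE the "algorithm" - no explicit rules
[/CODE]

After training, classification is just a weighted sum compared against a threshold, which is why the "rules" are so hard to read back out of the weights.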
 

Deleted member 26715

Guest
Could it happen?
I think so. Can you think of a reason why a car couldn't or wouldn't come around a bend to find a tanker parked at the side of the road with two cyclists overtaking said tanker?
Why couldn't the cyclists stop?
I have no idea; I wasn't/am not there in this fictitious scenario.
Why is it inevitable that they hit the car and are killed?
Again, I have no idea. Maybe they were cycling with reckless endangerment.
 

winjim

Straddle the line, discord and rhyme
It's a version of the trolley problem, although in this case it fails because the car is in full control of its braking system. It also has the added complication of the car having a direct duty of care towards its occupant.

Death is inevitable. Is it better to actively kill the few in order to save the many, or to passively allow the many to die in order to avoid becoming an active participant in the killing?

In the case of a driverless car, I think we could modify Asimov's First Law to give precedence to the prevention of active killing (a rough code sketch follows the list):

  • A driverless car may not injure a human being.
  • A driverless car may not, through inaction, allow a human being to come to harm, except where this would conflict with the first part of this law.
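As a thought experiment, that precedence order is easy to express as a filter: satisfy the first clause absolutely, then use the second to break ties. A minimal Python sketch, assuming hypothetical action names and boolean harm flags (a real system would estimate continuous risk, not booleans):

[CODE]
# Hedged sketch of the modified First Law above as a decision filter.
# Action names and harm flags are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class Action:
    name: str
    actively_injures_human: bool   # would the car itself strike someone?
    allows_harm_by_inaction: bool  # would holding back let harm happen?

def permitted(actions):
    """Apply clause 1 absolutely, then prefer actions satisfying clause 2."""
    # Clause 1: never choose an action that actively injures a human.
    lawful = [a for a in actions if not a.actively_injures_human]
    # Clause 2 (subordinate): among lawful actions, avoid harm by inaction.
    preferred = [a for a in lawful if not a.allows_harm_by_inaction]
    return preferred or lawful

options = [
    Action("swerve into pedestrian", True, False),
    Action("emergency stop", False, True),  # cyclists may still hit the car
]
print([a.name for a in permitted(options)])  # -> ['emergency stop']
[/CODE]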
 

John the Monkey

Frivolous Cyclist
Location
Crewe
Yup, because cyclists are never killed when there is a human at the wheel.
I forget who said it, but I remember reading in another article: "If driverless cars are a disaster waiting to happen, driven cars are a catastrophe happening right now."

(Although, to be fair, they were talking about it in terms of road safety, not the apocalyptic effect that these technologies are going to have on the labour market).
 