Vehicle Automation: Moved from Charlie Alliston Thread


theclaud

Openly Marxist
Location
Swansea
I didn't mean automatic cars in that sense, but can't be arsed to edit it - before anyone points that out...
 

srw

It's a bit more complicated than that...
What you are describing is not, as GC points out, how we got to where we are now
I think GC is wrong and wearing a tinfoil helmet. But I've got a massive hangover, so I can't be bothered to take his post apart.
 

theclaud

Openly Marxist
Location
Swansea
I think GC is wrong and wearing a tinfoil helmet. But I've got a massive hangover, so I can't be bothered to take his post apart.
^_^
We'll all just have to be patient then, while you recover. Then you can explain how we can learn to welcome our new self-driving overlords.
 

Wobblers

Euthermic
Location
Minkowski Space
I think GC is wrong and wearing a tinfoil helmet. But I've got a massive hangover, so I can't be bothered to take his post apart.

For my sins, I used to be a software developer in a previous career. There are two things you need to know about software development, and software developers.

The first is that they actually aren't much good at edge cases. And this is a problem, because cyclists are very much an edge case, especially to software developers in the US. Vulnerable road users may be a more familiar concept to the German software developers writing the code for autonomous vehicles. So we could conceivably find that the most courteously operated vehicles in our autonomous future are German, in a deliciously ironic reversal of the present perception.

But no autonomous vehicle can operate any better than its software. And if the developer decides that overtaking a cyclist with 10 cm to spare at speed is perfectly acceptable, that's exactly what will happen. It's even quite conceivable that the sensor returns indicating the presence of a cyclist or pedestrian will be removed as noise by some digital signal processing algorithm. The bottom line is that the software for these things will be written by fallible humans, so that software will in turn be fallible. And complex systems fail in surprising ways...
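
To make that last point concrete, here's a toy sketch in Python (nothing to do with any real vehicle's software - the threshold, numbers and labels are all invented) of how an innocent-looking "noise floor" pre-filter can silently throw away the weak return from a cyclist:

from dataclasses import dataclass

@dataclass
class Return:
    label: str        # labels are for illustration only; a real sensor has no idea what it is seeing
    amplitude: float  # arbitrary units

NOISE_FLOOR = 5.0     # a hypothetical threshold, tuned to make the test data look clean

def prefilter(returns):
    # Keep only returns above the noise floor - everything below it is treated as "noise"
    return [r for r in returns if r.amplitude >= NOISE_FLOOR]

raw = [
    Return("lorry", 80.0),
    Return("car", 40.0),
    Return("cyclist", 3.5),   # weak return, below the threshold
    Return("roadside clutter", 1.2),
]

print([r.label for r in prefilter(raw)])   # ['lorry', 'car'] - the cyclist has vanished

Nobody there set out to ignore cyclists; one number, chosen to make the test data look tidy, did it for them.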

The second is that the path of least resistance is invariably the preferred choice. That is at least part of the reason why we spend so much of our time fighting recalcitrant applications - it is so much easier for the developers to force the users to adapt their ways to the software than it is for them to learn how people actually want to use it.

Legislating vulnerable road users off the road is very much the path of least resistance: you can expect heavy lobbying from the motoring industry for exactly that to happen: "for their own good", naturally. After all, it will be rather difficult to sell an autonomous vehicle if it cedes priority to every lowly pedestrian or cyclist, won't it? It is possible to make this prediction because this is exactly what happened in the US with the passing of jaywalking laws. The motoring lobby is large and well funded - it is unwise to ignore it.
 

PK99

Legendary Member
Location
SW19
For my sins, I used to be a software developer in a previous career. There are two things you need to know about software development, and software developers.

The first is that they actually aren't much good at edge cases. And this is a problem, because cyclists are very much an edge case, especially to software developers in the US. Vulnerable road users may be a more familiar concept to the German software developers writing the code for autonomous vehicles. So we could conceivably find that the most courteously operated vehicles in our autonomous future are German, in a deliciously ironic reversal of the present perception.

But no autonomous vehicle can operate any better than its software. And if the developer decides that overtaking a cyclist with 10 cm to spare at speed is perfectly acceptable, that's exactly what will happen. It's even quite conceivable that the sensor returns indicating the presence of a cyclist or pedestrian will be removed as noise by some digital signal processing algorithm. The bottom line is that the software for these things will be written by fallible humans, so that software will in turn be fallible. And complex systems fail in surprising ways...

The second is that the path of least resistance is invariably the preferred choice. That is at least part of the reason why we spend so much of our time fighting recalcitrant applications - it is so much easier for the developers to force the users to adapt their ways to the software than it is for them to learn how people actually want to use it.

Legislating vulnerable road users off the road is very much the path of least resistance: you can expect heavy lobbying from the motoring industry for exactly that to happen: "for their own good", naturally. After all, it will be rather difficult to sell an autonomous vehicle if it cedes priority to every lowly pedestrian or cyclist, won't it? It is possible to make this prediction because this is exactly what happened in the US with the passing of jaywalking laws. The motoring lobby is large and well funded - it is unwise to ignore it.

Way too sensible a post for here!

You do echo some of my concerns, but put them far better than I could hope to do.
 
OP

Dan B

Disengaged member
For my sins, I used to be a software developer in a previous career. There are two things you need to know about software development, and software developers.

The first is [...]

The second is [...]

And the third is they're all suckers for metacircular jokes involving off-by-one errors
 

Bollo

Failed Tech Bro
Location
Winch
For my sins, I used to be a software developer in a previous career. There are two things you need to know about software development, and software developers.

The first is that they actually aren't much good at edge cases. And this is a problem, because cyclists are very much an edge case, especially to software developers in the US. Vulnerable road users may be a more familiar concept to the German software developers writing the code for autonomous vehicles. So we could conceivably find that the most courteously operated vehicles in our autonomous future are German, in a deliciously ironic reversal of the present perception.

But no autonomous vehicle can operate any better than its software. And if the developer decides that overtaking a cyclist with 10 cm to spare at speed is perfectly acceptable, that's exactly what will happen. It's even quite conceivable that the sensor returns indicating the presence of a cyclist or pedestrian will be removed as noise by some digital signal processing algorithm. The bottom line is that the software for these things will be written by fallible humans, so that software will in turn be fallible. And complex systems fail in surprising ways...

The second is that the path of least resistance is invariably the preferred choice. That is at least part of the reason why we spend so much of our time fighting recalcitrant applications - it is so much easier for the developers to force the users to adapt their ways to the software than it is for them to learn how people actually want to use it.

Legislating vulnerable road users off the road is very much the path of least resistance: you can expect heavy lobbying from the motoring industry for exactly that to happen: "for their own good", naturally. After all, it will be rather difficult to sell an autonomous vehicle if it cedes priority to every lowly pedestrian or cyclist, won't it? It is possible to make this prediction because this is exactly what happened in the US with the passing of jaywalking laws. The motoring lobby is large and well funded - it is unwise to ignore it.

There are higher level issues than software bugs (although based on my Qashqai there's some way to go). We're now getting into the bizarre world of machine ethics, where a vehicle may be presented with scenarios in which it would have to make what in human terms is a moral choice. Mercedes have popped in and out of the news with this story. I'm sure the reporting has been through several sensation-enhancing filters, but the core dilemma is still there. It's all very 'I, Robot'!
 

theclaud

Openly Marxist
Location
Swansea
There are higher level issues than software bugs (although based on my Qashqai there's some way to go). We're now getting into the bizarre world of machine ethics, where a vehicle may be presented with scenarios in which it would have to make what in human terms is a moral choice. Mercedes have popped in and out of the news with this story. I'm sure the reporting has been through several sensation-enhancing filters, but the core dilemma is still there. It's all very 'I, Robot'!
'Tricky moral question' my arse.
 

Bollo

Failed Tech Bro
Location
Winch
'Tricky moral question' my arse.
I agree with you for the car-centric question in the linked article. If these things become commonplace, I'd like to see a law that said the source of the danger - the car and its occupants - bore the brunt of any consequences. Passing a law and then persuading the AI to follow it is another thing, though.
The properly tricky questions come when you consider choices that only have consequences for non-occupants. You can stage the scenarios for yourself, but how would the vehicle determine the 'best' outcome if each choice is likely to lead to at least some death or injury? The reference to Asimov is relevant because his Robot books deal with exactly this, even if they're dated technologically. Sooner or later we're going to need robot ethicists.
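
To show where the difficulty really lies, here's a toy "minimise expected harm" chooser - entirely my own invention, with made-up options, probabilities and weights, not anything any manufacturer has published:

# Two hypothetical evasive options, each with made-up probabilities of harm
OPTIONS = {
    "brake_straight": {"occupant_injury": 0.6, "pedestrian_injury": 0.1},
    "swerve_left":    {"occupant_injury": 0.1, "pedestrian_injury": 0.7},
}

# Who sets these weights? Equal weighting looks neutral, but a manufacturer
# selling to occupants has every incentive to nudge the occupant weight up.
WEIGHTS = {"occupant_injury": 1.0, "pedestrian_injury": 1.0}

def expected_harm(outcome):
    return sum(WEIGHTS[kind] * probability for kind, probability in outcome.items())

choice = min(OPTIONS, key=lambda name: expected_harm(OPTIONS[name]))
print(choice)   # 'brake_straight' with equal weights; raise the occupant weight to 2.0 and it swerves instead

The code is trivial; deciding who gets to set WEIGHTS is the bit that needs the robot ethicists.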
 

jarlrmai

Veteran
I agree with you for the car-centric question in the linked article. If these things become commonplace, I'd like to see a law that said the source of the danger - the car and its occupants - bore the brunt of any consequences. Passing a law and then persuading the AI to follow it is another thing, though.
The properly tricky questions come when you consider choices that only have consequences for non-occupants. You can stage the scenarios for yourself, but how would the vehicle determine the 'best' outcome if each choice is likely to lead to at least some death or injury? The reference to Asimov is relevant because his Robot books deal with exactly this, even if they're dated technologically. Sooner or later we're going to need robot ethicists.

No-one would buy a car that doesn't save them over someone else; but would someone buy a car that paralysed or seriously injured them rather than killing someone else? These are the sticky moral issues locked inside the self-driving car design conundrum.
 

jarlrmai

Veteran
Why are we discussing self-driving software on page 97 (!?!) of a thread about a collision that didn't even involve a car?

Can I suggest someone kicks off a new thread if they want to go into detail about this issue?

There is a suggestion that the somewhat overstated response to this incident in the media may in some way be part of a bid by a lobby group, funded by the motoring industry, to remove cyclists from the roads in order to smooth the way for self-driving cars. This may seem like a conspiracy theory, but it is certainly the case that jaywalking laws in the US were a direct result of a similar campaign against pedestrians (or, as I like to call them, people just going about their business).
 