originally posted in: Secular Sevens
I'm almost positive that Google's self-driving cars can be switched to manual driving if the program screws up, which makes a lot of the author's concerns moot.
-
Edited by Ryan: 10/9/2013 9:56:06 PM[quote]I'm almost positive that Google's self-driving cars can be switched to manual driving if the program screws up, which makes a lot of the author's concerns moot.[/quote]Not at all. The "program screwing up" is only one of many possible problems. For instance, it doesn't take into account situations where the car needs to decide what to do in morally ambiguous situations, which has nothing to do with the "program screwing up". Programmers would still need to write code to determine how the car will operate under those circumstances. Also, it's important to understand that if the program starts to screw up while you are going 70+ mph on the highway, it can be difficult to react fast enough to switch to manual and prevent an accident, depending on the severity of the screw-up. On a final note, you presuppose that at least one passenger of an autonomous car has to know how to drive, which isn't explicitly required.
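To illustrate the point about morally ambiguous situations: a minimal sketch, entirely hypothetical (the maneuver names and cost numbers are invented for illustration, not any manufacturer's actual system). Even when every option is bad, the program must resolve the choice in advance, because someone had to write the ranking down.

```python
# Hypothetical sketch: in a "no good option" scenario, the code must still
# pick something. The cost values below encode a moral judgment that a
# programmer had to commit to before the car ever left the lot.

def choose_maneuver(options):
    """Return the option with the lowest assigned harm cost."""
    return min(options, key=lambda o: o["cost"])

scenario = [
    {"action": "brake_hard",   "cost": 2},  # risk of being rear-ended
    {"action": "swerve_left",  "cost": 5},  # risk to oncoming traffic
    {"action": "swerve_right", "cost": 3},  # risk of hitting a barrier
]

chosen = choose_maneuver(scenario)
print(chosen["action"])  # the policy, not the passenger, makes the call
```

The interesting part isn't the `min` call; it's that the cost table exists at all. Whoever writes it has answered the moral question ahead of time.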
-
Edited by King Dutchy: 10/9/2013 10:16:33 PM[quote]Also, it's important to understand that if the program starts to screw up while you are going 70+ mph on the highway, it can be difficult to react fast enough to switch to manual and prevent an accident, depending on the severity of the screw-up.[/quote]You do programming, right? If so, I probably sound really stupid saying this: why can't the car alert you if the programming -blam!-s up? My computer does it all the time with the blue screen of death, so why can't the car alert you in a similar way? Maybe it could tell you to switch it off and drive manually until the problem has been resolved. [quote]On a final note, you presuppose that at least one passenger of an autonomous car has to know how to drive, which isn't explicitly required.[/quote]I'd imagine that every driving law that exists would apply to self-driving cars, so I'd say it would be explicitly required by law. (Assuming you didn't mean that it wasn't explicitly required by the program.)
-
Edited by Ryan: 10/9/2013 10:25:22 PM[quote][quote]Also, it's important to understand that if the program starts to screw up while you are going 70+ mph on the highway, it can be difficult to react fast enough to switch to manual and prevent an accident, depending on the severity of the screw-up.[/quote]You do programming, right? If so, I probably sound really stupid saying this: why can't the car alert you if the programming -blam!-s up? My computer does it all the time with the blue screen of death, so why can't the car alert you in a similar way? Maybe it could tell you to switch it off and drive manually until the problem has been resolved.[/quote]There are also cases where your computer crashes immediately, without warning. Not all errors are easily predictable; if they were, it would be easier to prevent them in the first place. [quote]I'd imagine that every driving law that exists would apply to self-driving cars, so I'd say it would be explicitly required by law. (Assuming you didn't mean that it wasn't explicitly required by the program.)[/quote]I know, but it doesn't currently exist. I omitted the paragraph of the article on the legal principle nullum crimen sine lege, or "no crime without law."
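The distinction about predictable vs. unpredictable errors can be made concrete. A standard watchdog pattern (sketched below with invented names and timeout values, not any real vehicle's code) catches one class of failure well: a control loop that stalls or hangs stops sending heartbeats, and an independent monitor notices and alerts the driver. But a loop that keeps running while producing plausible-looking wrong outputs beats the watchdog on schedule, so this mechanism never fires; that's the kind of error no alert can promise to catch.

```python
import threading
import time

class Watchdog:
    """Alert when a control loop stops heartbeating (hypothetical sketch).

    Detects hangs/stalls only. A loop that runs on time but computes the
    wrong answer still heartbeats normally and is invisible to this check.
    """

    def __init__(self, timeout):
        self.timeout = timeout            # seconds of silence before alerting
        self._last_beat = time.monotonic()
        self._lock = threading.Lock()
        self.alerted = False

    def beat(self):
        # Called by the control loop on every successful iteration.
        with self._lock:
            self._last_beat = time.monotonic()

    def check(self):
        # Called periodically by an independent supervisor.
        with self._lock:
            stale = time.monotonic() - self._last_beat > self.timeout
        if stale and not self.alerted:
            self.alerted = True
            self.alert_driver()
        return self.alerted

    def alert_driver(self):
        # Placeholder: a real car would chime and prompt a manual takeover.
        print("CONTROL LOOP STALLED - take manual control")

# Simulate: the loop beats normally, then hangs.
dog = Watchdog(timeout=0.1)
dog.beat()
fresh_alert = dog.check()   # fresh heartbeat: no alert
time.sleep(0.2)             # control loop "hangs"
stale_alert = dog.check()   # stale heartbeat: driver alerted
```

This is why "just make it alert you" only half-works: the alert path itself depends on the failure being of a detectable kind, which is exactly the assumption that fails in the BSOD-style sudden-crash cases.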