My self-driving cars may lead to human driver ban, says Tesla's Musk

Self-driving cars are "almost a solved problem," Tesla Motors boss Elon Musk told the crowds at Nvidia's GPU Technology Conference in San Jose, California. But he fears the on-board computers may be too good, and ultimately encourage laws that force people to give up their steering wheels. He added: "We’ll take autonomous cars …

  1. Lobrau

    Not entirely sure I'd want cars learning to drive from some of the nuggets I regularly encounter on the roads.

    Not saying I'm a saint. We all make poor judgements sometimes. Hopefully the aggregate of good drivers will outweigh the bad.

    1. James Hughes 1

      Given the number of people driving around today in the fog with either just sidelights or no lights at all, I tend to agree. And the same goes for the honk I got from a guy driving at about 80 on a 60 road, in the fog. Note to you: if I cannot see you in the fog, and you are driving that fast, what do you expect will happen when I need to overtake a cyclist?

    2. Anonymous Coward
      Anonymous Coward

      AI - As Bender would say

      "Kill all humans."

      So what happens if I set up my AI car to watch the movie Deathrace a few hundred times?

      1. The Crow From Below

        Re: AI - As Bender would say

        "So what happens if I set up my AI car to watch the movie Deathrace a few hundred times?"

        Ok Google. If they scatter, go for the baby and the mother.

        1. Little Mouse

          Re: AI - As Bender would say

          watch the movie Deathrace a few hundred times? That's just cruel. Once was one time too many for me.

          Anyway, I'd prefer to train them on those movies from the 70s and early 80s featuring evil possessed vehicles, and see what they pick up from those.

          The ability to use their bonnets as a large mouth and a taste for human blood, most likely.

  2. Ralph B

    Self-driving cars are "almost a solved problem"

    Yeah, but don't forget the old ninety-ninety rule, Elon.

    1. ItsNotMe
      Thumb Up

      Re: Self-driving cars are "almost a solved problem"

      And old Elon also has to remember that his "proclamation" is only valid if people actually buy his cars. Which from the looks of his most recent earnings reports...there aren't a whole lot doing so.

  3. Anonymous Coward
    Anonymous Coward

    When this really does become a thing there are all kinds of interesting points to iron out

    - Accident liability. Are you responsible if your car is at fault in a crash, or is your car's AI?

    - Security. The possibility of people being able to tamper with things that drive you around is pretty scary (especially if you're a person of note).

    - City centre traffic. London and New York are bad; Mumbai and Beijing are worse. I'd worry that any AI risk averse enough to avoid an accident would grind to a halt without a significant amount of rather scary real world testing.

    1. Anonymous Coward
      Anonymous Coward

      Real world testing

        So far, all I ever see is talk about these self-driving cars in nice sunny weather on dry roads. My dog can drive under those conditions.

        I want to see tests of these things on unploughed snow-covered roads where no lane markings are visible. I want to see tests of these things on unploughed snow-covered hilly roads in blizzard conditions. I want to see tests of these things negotiating an icy steep hill in a narrow urban street, with cars parked on both sides half buried in snow. I want to see tests of these things on a curve at freeway speeds when they hit black ice. I want to see tests of these things when the snow is so high that every corner is a blind corner, as all the drivers in New England have had to deal with for a couple of months this winter. I also want to see how these things behave hitting a pothole at high speed on an icy freeway while it's sleeting, blowing a tire out. Another fun experience I've had in the last year. Those are things I have to face every winter.

      I want to see how these things behave in blinding rain. I want to see how these things behave when they hydroplane in blinding rain. I want to see how these things behave hitting a pothole at freeway speeds in blinding rain, and blowing a tire out. Those are things I have to face every summer.

        And given how, in the US at least, a lot of roads are in the boonies with no connectivity, I don't want to hear any BS about "AI in the cloud". The car should handle everything on its own, just like a human.

      Training the car by having it watch what normal people do (crash a lot) probably isn't the way to go.

      1. Anonymous Coward
        Anonymous Coward

        Re: Real world testing

        It sounds like you drive quickly in poor conditions a lot and you're not very good at identifying hazards - you claim to hit at least two potholes every year, hard enough to burst a tyre.

        I think the key defensive tactic a computer and a human would employ in these situations is not to drive like a bloody nutter in the first place.

        Maybe you'd be safer in a self-driving car?

        1. Public Citizen

          Re: Real world testing

          Your spelling of the round rubber thingies that go on the wheels leads me to suspect you've never driven on poorly maintained rural roads in the USA, or even in most cities, which keep spending the pothole money on some elected official's pet vote-getting project.

      2. Anonymous Coward
        Anonymous Coward

        Re: Real world testing

        > I want to see how these things behave in blinding rain. I want to see how these things behave when they hydroplane in blinding rain. I want to see how these things behave hitting a pothole at freeway speeds in blinding rain, and blowing a tire out. Those are things I have to face every summer.

        Humans are spectacularly bad at driving in these kinds of conditions, as evidenced by the sharp increase in accidents when bad weather comes our way.

        Most accidents in bad weather are caused by poor driving, such as driving too fast, not paying attention or driving too close to the car in front.

        Once you've sorted out the basic logic and image processing, there's every reason to believe that computer driven cars would *far* exceed the capabilities of even the best drivers.

        Given that humans can't see in all directions at the same time (as a computer car could), nor in infra-red or ultra-violet, I really don't see any practical argument that automated cars wouldn't be *much* safer than their meaty alternatives.

        1. lucki bstard

          Re: Real world testing

          I think the point the original commentator was making is that the weather in North America can be very hard to predict. Ice and snow can be hard for the human and could be impossible for the electronic driver.

          It's not all about speed and driving style, although they are important. Does the vehicle have winter tyres? Are the roads cleared? Has the snow been polished at the intersection so you have to pull away very slowly or you'll spin, and how will the car detect this when it is covered in fresh snow? In the summer your car may be able to easily get up a certain hill; in the winter the 5 cars in front of you may have polished the ice; how does the electronic car detect this?

          My opinion is that there will be electronic zones for driving and non-electronic zones. Combine this with different driving licenses: with one type you can only 'drive' a car in an electronic zone, while an advanced license allows you to drive in a non-electronic zone.

          1. Charles 9
            WTF?

            Re: Real world testing

            "I think the point the original commentator was making is that the weather in North America can be very hard to predict. Ice and snow can be hard for the human and could be impossible for the electronic driver."

            Why would it be impossible for an electronic driver? Unless you can describe in detail situations no sensor would be able to see and where the only way one can survive intact is by instinct or even blind luck? The article notes being able to see through rain, and if snow is blinding, perhaps the prudent course a computer would take is to slow to a crawl or even stop (something humans are averse to doing).

            The nightmare scenario I keep thinking about is rush hour in an overcrowded Asian city such as downtown Manila, where pedestrians and vehicles of all sorts are everywhere (including many where automation is impossible, like bicycles), road markings aren't really honored, and time is of the essence (perhaps because fuel is low).

            1. lucki bstard

              Re: Real world testing

              'even stop' - Yeah, like that will work at -40C. AI says the car will stop, the car stops, and the driver freezes.

          2. Public Citizen

            Re: Real world testing

            We already have that in the USA, where a higher class of license is required for heavy trucks or for motorcycles over 150cc [different licensing classes].

            Personally I'd like to see ~everybody~ have to start out with a Scooter License [under 150cc] so they can learn the rules of the road without having a 2000lb-plus vehicle "under their control" that can become a lethal weapon when it gets out of control.

            Having to spend a few months as the most vulnerable vehicle on the road tends to focus a teenager's mind on the task at hand much more effectively than the modern crash-cage/entertainment cocoon on 4 wheels.

      3. NinjaTheVanish

        Re: Real world testing

        @ AC

        Thank you for reminding me why I will never take a job in the North.

      4. Anonymous Coward
        Anonymous Coward

        Re: Real world testing

        "I want to see how these things behave in blinding rain. I want to see how these things behave when they hydroplane in blinding rain. I want to see how these things behave hitting a pothole at freeway speeds in blinding rain, and blowing a tire out. Those are things I have to face every summer."

        I feel sorry for you that you have to buy a new tyre every summer

      5. Public Citizen
        Facepalm

        Re: Real world testing

        Two Words:

        Tire Chains

        1. lucki bstard

          Re: Real world testing

          Great, until you enter a location that snow chains are not allowed, but the city doesn't clear the snow properly anyway.

    2. DaLo

      Liability

      "Accident liability. Are you responsible if your car is at fault in a crash, or is your car's AI?"

      It doesn't really matter; you will just have an insurance policy which will pay out for the damage caused by your car. Firstly, insurance should become massively cheaper when self-driving cars become universal, due to the reduced accident rate. Secondly, any manufacturer with accident-prone cars will have its insurance rates hiked right up until it either fixes the issue or goes out of business. Market economics will determine reliable self-driving cars. The same will happen with manufacturers' liability insurance.

      1. earl grey
        Mushroom

        Re: Liability

        Yeah, swell. Except it won't be the manufacturers who will be picking up the tab. It will be the "insured" driver (oh, you are going to actually make sure everyone on the road has insurance, right?). And just because there MIGHT be fewer accidents doesn't mean that rip-off insurance will be any less expensive than it is now. Market economics will perhaps determine whether people actually BUY or can AFFORD a self-driving car; but if the costs are not realistic, people simply won't.

    3. Nextweek

      > - Accident liability. Are you responsible if your car is at fault in a crash, or is your car's AI?

      Liability is already established in law.

      Car manufacturers have large legal departments which decide when to pay out and when to recall cars. Your AI will be no different from a fuel line or braking system.

      1. Mark 85

        There is also "no-fault", where each driver's/car's insurance takes care of its own claims rather than having lawyers sue the other guy and his insurance company. Lawyers don't like this, so maybe we can use them for crash test dummies?

      2. Anonymous Coward
        Anonymous Coward

        No fault insurance

        There are already some US states that have no fault auto insurance, I expect this will become universal when people are no longer driving. Fault is unimportant to an individual, they just want their losses to be covered.

        Fault, and what remedies are required should be a question for regulators. I see autocar accidents being investigated like airplane accidents. Figure out whether the fault was a mechanical failure, software failure, how much conditions or lax maintenance contributed, etc. and order fixes/recalls where necessary.

    4. Justthefacts Silver badge

      Molehill mountains

      Accident liability: That's a perennial complaint, but I'm struggling to see it...

      Everyone has 3rd party insurance by law, based on make of car, driver details and driver history. In future, insurance companies will have much more accurate data for Volvo self-driving accidents per mile than for 17-year-old little Johnnie with his bird by his side. Where's the difference in process?

      If you are saying "but who am I going to lock up for dangerous driving", why would you? Dangerous means without due care and attention. The car conforms to safety testing; sometimes it will fail, just like brakes sometimes fail; that doesn't mean anyone goes to jail.

      Security: I know of three friends who had brake or oil lines cut by vandals. Bad neighbourhoods. And?

      City centre traffic: OK, I agree, it's harder for an unconstrained AI. But it's lots easier and more free-flowing to platoon along the high street, and even easier to coordinate at traffic lights using long-range 802.11p. Swings, meet roundabouts :)

    5. T. F. M. Reader

      City centres

      Forget about city centre traffic - it's relatively easy. I'd like to see an AI trying to find a parking spot in a city centre - in traffic. How will it navigate without a specified destination?

  4. Nigel Brown

    Am I the only one...

    I am deeply, deeply uneasy about this. I know driving standards are pretty poor, but I still don't trust a 'puter to do it instead of a human.

    1. Crisp

      Re: Am I the only one...

      We drive around in cars built by robots and that seemed to work out well enough.

      1. Anonymous Coward
        WTF?

        Re: Am I the only one...

        "We drive around in cars built by robots and that seemed to work out well enough."

        Last I saw, robots don't move around at 70mph

        1. James Hughes 1

          Re: Am I the only one...

          Hasn't the Google car driven more miles without an accident than the average driver already?

          1. Anna Logg

            Re: Am I the only one...

            Average drivers have to drive with lots of other traffic and pedestrians, and don't tend to drive the same few miles over and over again (well OK, apart from the daily commute)

        2. Stuart 22

          Re: Am I the only one...

          "Last I saw, robots don't move around at 70mph"

          The one driving my tube train is rated to do 75mph. The one flying my plane cruises at 500mph and can land safely in fog. As we know, it's tube drivers and pilots who fail catastrophically and kill. But, somehow, we feel uneasy if there isn't a person up front who can open the doors or give us the weather forecast for our destination.

          1. JustNiz

            Re: Am I the only one...

            Trains and planes are both in a fairly predictable environment where the rules and conventions are pretty much always followed by other users. And the other users are usually miles away. Things don't generally suddenly appear in front of you.

            Cars are in an environment where the rules and conventions are often broken, other users are trying to share almost the same space, and things can and do suddenly jump out in front of you.

            1. Danny 14

              Re: Am I the only one...

              It would also be interesting to see what it does in a no-win situation such as black ice or white van man side-swiping you. I bet an Audi/BMW driver will find some way to confuse the sensors by sitting up your arse flashing their lights.

            2. Anonymous Coward
              Anonymous Coward

              Re: Am I the only one...

              > Trains and planes are both in a fairly predictable environment where the rules and conventions are pretty much always followed by other users.

              Speaking as a former commercial pilot: the systems do not rely at all on any supposed "predictability" of the environment. What makes automation safe in that context is that we, the pilots, were trained and were in theory thoroughly familiar with the systems, their capabilities, their behaviour, failure modes, etc., so we could supervise them effectively. But even if it is the autopilot sending the commands to the control surfaces, etc., ultimately it is the pilots who are always in control (and we respond with our licences, if not our lives, if something goes seriously wrong).

              The article makes a mention of the possibility of a special licence being required to drive cars above a certain level of automation. That makes a lot of sense. If you think of the technology in current cars (ACC being perhaps the most obvious example), it already requires a degree of familiarity to know when to let it do its thing, and know if it's working correctly, and when to take over.

      2. Paul Crawford Silver badge

        Re: @Crisp

        Robots in a factory doing precisely defined work are one thing, and they work really well. It's the uncertainty in what a real road will throw at the system that matters, and how it copes.

        Also, I think it is moronic to assume "phone home" operation. What if you lose connectivity or the central servers go down for whatever reason? Does your car just stop?

        So then what if someone simply jams the radio for a short while to stop you and rob you?

        1. Anonymous Coward
          Anonymous Coward

          Re: @Crisp

          > Also I think it is moronic to have the assumption of "phone home" operation. What if you loose connectivity or the central servers go down for whatever reason? Does your car just stop?

          My reading was the "phone home" part was for the processing of experience data for bulk improvement of the training of these things, not for the actual running of the machine.

          Anyone seriously suggesting implementing that I think would be laughed out of the room.

        2. (AMPC) Anonymous and mostly paranoid coward

          Re: @Crisp

          Well, if it resembles auto-pilot systems (such as those on the Airbus), the correct fallback would be manual control by the driver. Autonomous just means that it can drive by itself, not that it must.

          Of course, it should really broadcast a "meatbag controlled" signal to all other cars in the area, just as a courtesy.

          1. Paul Crawford Silver badge

            Re: @Crisp

            "Well, if it resembles auto-pilot systems (such as those on the Airbus), the correct fall-back would be manual control by the driver"

            Yes, and look how well that worked out for AF447 after all!

            See, that is the problem: if it can't cope near-perfectly with anything on the roads, you're screwed. You won't be sitting there with full concentration all the time "just in case" - otherwise you might as well be driving. And in the event of an unhandled exception, a car has seconds to impact, not the minute or two the startled pilots of AF447 had.

            1. Anonymous Coward
              Anonymous Coward

              Re: @Crisp

              > Yes, and look how well that worked out for AF447 after all!

              Paul, unless you are a qualified airline pilot, type rated in the Airbus family, and with the requisite experience, you do not understand what happened in that incident. No matter how many newspaper articles you read, how many documentaries you've watched, how much Microsoft sim flying you've done, or how clever you think you are in general. You simply do not have the necessary background to understand what went on and how it happened.

              Take that from a former airline pilot, but the same thing applies to any sufficiently complex technical field.

              1. Paul Crawford Silver badge

                Re: @AC w.r.t AF447

                "You simply do not have the necessary background to understand what went on and how it happened."

                I did not claim that I would have done any better, nor that I understand the details of how the pilots' reactions to various conflicting warnings and instrument inconsistencies led them to not recover the plane from stalling.

                But what I am absolutely certain of is that having an autonomous system throw the controls back to humans under "difficult" conditions is a recipe for disaster. And equally for cars, the conditions that are unlikely to be handled well, such as an unexpected conflict of sensors while approaching a junction, blind bend, etc, will leave the human operator with bugger-all time to come to terms with being in control, let alone to appraise the situation and react accordingly.

                So why even consider that case? Maybe so the car manufacturers can pin the blame for out-of-capability accidents upon the meat sack failing to drive correctly...

                1. Terry Barnes

                  Re: @AC w.r.t AF447

                  "But what I am absolutely certain of is that having an autonomous system throw the controls back to humans under "difficult" conditions is a recipe for disaster. And equally for cars, the conditions that are unlikely to be handled well, such as an unexpected conflict of sensors while approaching a junction, blind bend, etc, will leave the human operator with bugger-all time to come to terms with being in control, let alone to appraise the situation and react accordingly."

                  I believe drivers are taught a manoeuvre known as the "emergency stop" to deal with such incidents. An advantage a car has over a flying thing is that such a thing is even possible. Why would a self-driving car not just implement an emergency stop in such situations?
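                  For what it's worth, some back-of-envelope physics suggests why an emergency stop is no panacea. A minimal sketch, assuming the standard stopping-distance formula d = v²/(2μg) and typical friction coefficients of about 0.7 for dry tarmac and 0.1 for ice (both figures are assumptions, and reaction time is ignored, which flatters the machine):

                  ```python
                  # Idealised stopping distance: d = v^2 / (2 * mu * g).
                  # Assumptions: mu = 0.7 (dry tarmac), mu = 0.1 (ice); zero
                  # reaction time, level road, constant friction.

                  def stopping_distance_m(speed_mph: float, mu: float, g: float = 9.81) -> float:
                      speed_ms = speed_mph * 0.44704  # convert mph to m/s
                      return speed_ms ** 2 / (2 * mu * g)

                  print(round(stopping_distance_m(70, 0.7)))  # dry motorway at 70mph: ~71 m
                  print(round(stopping_distance_m(70, 0.1)))  # black ice at 70mph: ~499 m
                  ```

                  On black ice the "emergency stop" needs roughly seven times the road of a dry stop, so whether a stop saves you depends entirely on how early the hazard is detected.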

                2. Anonymous Coward
                  Anonymous Coward

                  Re: @AC w.r.t AF447

                  > I did not claim that I would have done any better, nor that I understand the details [....]

                  Then why bother posting in the first place?

              2. JLV

                Re: >Paul, unless you are a qualified airline pilo

                Oh, don't be so condescending, please.

                You are right, you need to be very good in a field to understand the fine details & implications of technical issues. However, the general idea, as analyzed by experts, is usually good enough to form an opinion which isn't totally unreasonable. Managers have to do this all the time with techies and some of them are actually good at it (many are not, so your point remains valid as well).

                Far as I understand, AF447 had the following problems: sensor failure, pilots unaware of that particular possibility and not trained to compensate for it in a context of limited situational awareness with conflicting sensor readings. Both aspects probably needed addressing. Is that a totally unwarranted conclusion?

                Now, I happen to agree with the OP's contention. If the AI knows that it is entering failure mode and throws it back to you well in advance, then OK, by all means the driver can be tapped. She can either park the car by the side of the road & call a taxi. Or she can drive it home. Let's say something like "conditions are too cluttered with pedestrians, can't resolve" in an after-match situation where pedestrians are streaming out of a stadium.

                If on the other hand the AI has a split second indication of failure, as in "oh crap, there's no way I am dodging that pedestrian who leaped off the sidewalk", then, no, the OP is correct and there is no benefit to fall back to the driver. She won't have time. (Doesn't mean she shouldn't be allowed to drive the car the rest of time).

                But in a car, he's correct that you can't shunt off out-of-envelope conditions to the driver-passenger at the last split second; the AI would have to know it's out of its depth and request manual control well in advance.

                Commercial pilots may have to take over from autopilot in a split second, but they are already well in the loop when entering critical phases such as takeoff and landing. If it is an unexpected emergency then they are usually at high enough altitude that they have some time to react. I agree with you, he's wrong about his AF447 conclusions, the pilots are the safety fallback, and an isolated disaster does not invalidate the pilots' role. But he's right that civilian drivers shouldn't be put in the same position of critical fallback at short notice, both by timing and by their training.

        3. earl grey
          FAIL

          Re: @Crisp

          They don't have to jam the radio to stop you. The AI cars are designed to always stop for an object in front of them, so all a crim has to do is step out in front of your (ignorant) AI car and it will very nicely stop so you can be robbed or kidnapped. I can imagine that executives and big-wigs everywhere are going to have fun with this concept.

          1. phil dude
            Pint

            Re: @Crisp

            I wonder if FUD can be used to power the car?

            Seriously, everyone seems to be so focused on the edge cases that they ignore that a great deal of the uncertainty in human driving is the other humans.

            @earl grey : I had thought about this, and it seems that initially these cars will drive only where there are not *supposed* to be humans, e.g. motorways, large roads. Any person "jumping in front of a car" will likely be arrested or (more likely) sent directly to hospital.

            I have proposed this on El Reg before but I expect these cars will come with "manual" vs "auto" operating modes.

            Specifically, if you are in "auto" mode and grab the wheel the car will try and do the absolute safest thing - stop or remove vehicle from traffic etc... More importantly, the insurance for the car will go from $30/mth to $3000/mth.

            Hence, rich people will have cars that don't stop for humans in the road as they'll pay $3000/mth to have a chauffeur.

            I'm all for the tech, but it is clearly dual-use...

            P.

            1. Paul Crawford Silver badge

              Re: @Phil Dude

              Folk who care about edge cases are the sort you want working on safety-critical stuff! Typically they are the ones to trust your well-being to. As for reliability, the current US death rate is around 1-2 per 100 million miles driven, or about 150-250 per million vehicle-years:

              http://www.census.gov/compendia/statab/2012/tables/12s1103.pdf

              So an autonomous car has to be pretty good to match that. Sure humans do really dumb things, and they are easily distracted, etc, which probably covers a good 90% or so of those deaths. But cars have to at least match that 2E-8 fault/mile figure under real-world conditions to be taken seriously.
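              The two rates quoted are consistent with each other, which can be checked with a quick sketch (the ~13,000 miles per vehicle per year is an assumed figure, roughly the commonly cited US average):

              ```python
              # Sanity check: do "1-2 deaths per 100 million miles" and
              # "150-250 deaths per million vehicle-years" agree?
              # Assumption: ~13,000 miles driven per vehicle per year.

              deaths_per_mile = 1.5e-8               # midpoint of 1-2 per 100 million miles
              miles_per_vehicle_year = 13_000        # assumed US annual mileage

              deaths_per_million_vehicle_years = (
                  deaths_per_mile * miles_per_vehicle_year * 1_000_000
              )
              print(deaths_per_million_vehicle_years)  # 195.0, inside the 150-250 range
              ```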

          2. Terry Barnes

            Re: @Crisp

            "all a crim has to do is step out in front of your (ignorant) AI car and it will very nicely stop so you can be robbed or kidnapped. "

            Your argument being that a human would just mow them down and kill them?

          3. Public Citizen

            Re: @Crisp

            Doesn't require somebody stepping in front of the vehicle, just a truck with somebody in the back to act as the "kicker", a trash bin, and enough weight in the bin to make sure that it sticks when it lands. As an alternative, a large bag full of wet leaves would probably do the same "stop the vehicle" trick.
