Cruise CEO says SF ‘should be rolling out the red carpet’ for robotaxis, threatens to maybe leave town::In his first major public interview since the DMV cut Cruise’s San Francisco fleet in half, CEO Kyle Vogt said “we cannot expect perfection” from the self-driving cars, and vaguely threatened to leave town if regulators curtail them any further.

  • just another dev@lemmy.my-box.dev
    1 year ago

    As the other guy said, demanding perfection is insane - we don’t demand that from human drivers either. As long as it’s better than humans (preferably by a long shot), I’m all in favour.

    • chakan2@lemmy.world
      1 year ago

      We demand perfection in a lot of fields, and we are a hell of a lot closer to it than the wild west of AI alphas we have driving around.

      Aviation, medicine, space travel, etc.

      We can get to extreme levels of quality when lives are at risk. Driverless cars put lives at risk.

      Humans are a terribly low bar to use for a quality measure. Also, a human will (usually) do its best to mitigate damage in an accident.

      In the case of Tesla…fuck it…I’m going through that parked semi at 80mph.

      • just another dev@lemmy.my-box.dev
        1 year ago

        None of those fields have achieved perfection. Airplanes crash; people die in hospitals and on space shuttles. If anything, computer assistance has made those fields safer than before.

        If (when) robocars are safer than human drivers, fewer people will die in traffic accidents. It’s not a perfect bar to settle on, but it’s better than the current standard.

        Again, denying improvements because they’re less than perfect is just insane.

        • chakan2@lemmy.world
          1 year ago

          Denying “improvements” that cost innocent bystanders their life is the only responsible choice.

          I was game for the great experiment 10 years ago. But the tech just hasn’t gotten better, and arguably is worse today.

          It’s time to say enough is enough and restrict driverless tech to controlled areas.

          Being simply better than the average human isn’t enough here.

          • just another dev@lemmy.my-box.dev
            1 year ago

            I never said better than the average driver, I said better than human drivers (preferably by a long shot).

            So let’s say that means… Better than 90% of all drivers. That isn’t going to cost lives, it’s going to save them. Not to mention improve traffic flow.

            • chakan2@lemmy.world
              1 year ago

              Unlikely… making an AI car safer than 90% of human drivers means it will respect the speed limit.

              That alone causes traffic jams and unsafe conditions around the car as people try to get around it.

              A human driver will somewhat go with the flow of traffic.

              An AI vehicle just won’t work until it’s a nearly perfect driver that can make human decisions.

              That’s not going to happen for a long time. Musk, with his revolving door of low-cost engineers, is actually making it all worse.

              Pull the plug on this experiment and put it back on the test track.

    • supercriticalcheese@feddit.it
      1 year ago

      We don’t even know if they are better than humans in a more challenging, actual driving environment: higher-speed roads, etc.

      It is insane to think the slow-speed tests are representative of all possible scenarios. They might fail at things like roundabouts or merging onto motorways much more often than humans do, or in who knows what edge cases.

      • just another dev@lemmy.my-box.dev
        1 year ago

        I agree. That will need to be proven. But when they are better than, say, 90% of all drivers, it would make sense to switch. Waiting until they’re “perfect” (which is the requirement I object to) just needlessly wastes lives.

        • supercriticalcheese@feddit.it
          1 year ago

          Depends on what happens when they make errors. Are their errors comparable to human errors, or are they prone to making mistakes with worse consequences than humans make on average?

          They might be 99.99% perfect but in 0.01% of cases cause massive car pileups on motorways (for example) due to reasons.

          A proper risk analysis based on a controlled transition should be done first.

          • just another dev@lemmy.my-box.dev
            1 year ago

            Yups, fully agreed.

            When it all comes down to it, I’d much rather have the mass pileup you describe once every few years (which can then be analysed and remedied thanks to the telemetry involved) than the over 3,000 traffic deaths a day we have now.
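            The trade-off in the comment above is essentially an expected-value calculation. A back-of-envelope sketch (the pileup figures are illustrative assumptions, not real data; only the ~3,000 deaths/day figure comes from the comment):

```python
# Back-of-envelope comparison of two accident regimes.
# The AV pileup numbers below are hypothetical assumptions for illustration.

def yearly_deaths_human(daily_deaths=3000):
    """Status quo: roughly steady daily traffic deaths (figure from the comment)."""
    return daily_deaths * 365

def yearly_deaths_av(pileup_deaths=50, years_between_pileups=3):
    """Hypothetical AV regime: rare mass pileups, assumed negligible deaths otherwise."""
    return pileup_deaths / years_between_pileups

human = yearly_deaths_human()  # 3000 * 365 = 1,095,000 deaths/year
av = yearly_deaths_av()        # ~17 deaths/year under these assumptions
print(f"human: {human:,.0f}/yr, AV: {av:,.1f}/yr, ratio: {human / av:,.0f}x")
```

            Even if the assumed AV numbers are off by orders of magnitude, the comparison still favours the rare-catastrophe regime, which is the commenter's point.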

      • just another dev@lemmy.my-box.dev
        1 year ago

        Yeah, legislation and requirements for a self-driving car to be allowed on the road will have to be updated. But an automated car can’t drink and drive, or intentionally decide to run someone over out of hatred. I don’t see how vehicular homicide would apply.

        If somebody reprograms a car to murder someone, they are at fault. In all other cases - accidents - liability insurance would have to shift from the driver to the car’s manufacturer.