New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times::Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

  • daikiki@lemmy.world · ↑97 ↓13 · 1 year ago

    I have a lot of trouble understanding how the NTSB (or whoever’s ostensibly in charge of vetting tech like this) is allowing these not-quite self driving cars on the road. The technology doesn’t seem mature enough to be safe yet, and as far as I can tell, nobody seems to have the authority or be willing to use that authority to make manufacturers step back until they can prove their systems can be integrated safely into traffic.

    • SpaceNoodle@lemmy.world · ↑88 ↓6 · 1 year ago

      It’s just ADAS - essentially fancy cruise control. There are a number of autonomous vehicle companies carefully and successfully developing real self-driving technology, and Tesla should be censured and forbidden from labeling their assistance software as “full self-driving.” It’s damaging the real industry.

    • RushingSquirrel@lemm.ee · ↑40 ↓2 · 1 year ago

      That’s similar to cruise control. Cruise control can be dangerous because someone could fall asleep (not having to manage your speed can lull you toward sleepiness) and the car wouldn’t slow down.

      In my opinion, it’s the driver’s responsibility to know their own limits and understand that the tool is just a tool; you are responsible for making sure your driving is safe for others. Tesla Autopilot adds a ton of safety features that avoid a lot of collisions caused by inattention, sleepiness, and other drivers’ mistakes. But it’s still just a tool, and the driver is responsible for their own car and driving.

      • daikiki@lemmy.world · ↑38 ↓7 · 1 year ago

        The difference is that cruise control will maintain your speed, but ‘autopilot’ may avoid or slow down for obstacles. Maybe it avoids obstacles 90% of the time or 99% of the time. It apparently avoids obstacles enough that people can get lulled into a false sense of security, but once in a while it slams into the back of a stationary vehicle at highway speed.

        It’s easy to say it’s the driver’s responsibility, and ultimately it is, of course. But in practice, a system that works almost all of the time but occasionally kills somebody is very dangerous indeed, and saying it’s all the driver’s fault isn’t really realistic or fair.

        • abhibeckert@lemmy.world · ↑20 ↓4 · edited · 1 year ago

          A lot of modern cruise control systems will match the speed of the car in front of you and stop if they stop. They’ll also keep the car in the current lane. And even without cruise control, most modern cars will stop if a pedestrian steps onto the road.

          It’s frustrating that Tesla’s system can’t detect a stationary police car in the middle of the road… but at the same time apparently that’s quite a difficult thing to do and it’s not unique to Tesla.

          It’s honestly not too much to ask a driver to step on the brakes if there’s a cop car stopped on the road.

          • SpaceNoodle@lemmy.world · ↑3 ↓1 · edited · 1 year ago

            It’s actually not that hard to do, but Tesla is not willing to spend the necessary time and resources to solve the hard problems.

        • NeoNachtwaechter@lemmy.world · ↑6 ↓1 · 1 year ago

          Maybe it avoids obstacles 90% of the time or 99% of the time.

          99 is not enough!

          99 means many many more dead people.

          You need to go for 99.99%
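          The gap between those rates is easy to underestimate. A back-of-envelope sketch (the encounter count is invented purely for illustration; none of this is real crash or Autopilot data) shows how failure rate scales per driver:

          ```python
          # Back-of-envelope only: the encounter frequency is a made-up
          # assumption for illustration, not real data.
          EVENTS_PER_YEAR = 2 * 52  # assume ~2 obstacle encounters per week

          for success_rate in (0.90, 0.99, 0.9999):
              missed = EVENTS_PER_YEAR * (1 - success_rate)
              print(f"{success_rate:.2%} success -> "
                    f"~{missed:.2f} missed obstacles per driver per year")
          ```

          Under that assumption, 99% still means roughly one missed obstacle per driver every year, while 99.99% means roughly one per century.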

        • ilickfrogs@lemmy.world · ↑13 ↓10 · edited · 1 year ago

          Actually it’s absolutely realistic and fair. I don’t like Musk, or Tesla for that matter. But they make it pretty damn clear that you’re 100% responsible for the vehicle when using that feature. Anyone who assumes they don’t need to pay attention is a moron and should be held responsible. If a 747 autopilot system starts telling the pilot to take control of the plane and they don’t… we wouldn’t blame the manufacturer, we’d blame the shitty pilot that didn’t do their job.

          • ShittyBeatlesFCPres@lemmy.world · ↑25 ↓4 · 1 year ago

            I can’t wait to get smacked by a Tesla beta tester and have everyone debate whether the car or the driver is responsible for my innards being spread across 4 lanes. Progress!

          • daikiki@lemmy.world · ↑9 · 1 year ago

            If the driver gets lulled into a false sense of security by a convenience system like this and the automation fails, it’s one thing to blame the driver, and that may or may not be fair depending on how much trust you place in the average driver’s competence. But the (hypothetical) victim is still dead, and who we decide to blame won’t make one iota of difference to that.

    • Not_Alec_Baldwin@lemmy.world · ↑20 ↓3 · 1 year ago

      It’s not “not-quite-self-driving” though, it’s literal garbage. It’s cruise control, lane assist and brake assist. The robot vision in use is horrible.

      There are Tesla engineers openly badmouthing the system.

      Musk is a scammer and they need to issue an apology for all of the claims around autopilot, probably pay a great deal of money, and then change the name and advertising around it.

      Oh, and also this guy should never drive again.