TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:

  • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
  • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
  • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

  • SkunkWorkz@lemmy.world

    It’s because the system has to rely on visual cues, since Teslas have no radar. The system looks at the tail lights when it’s dark to gauge the distance to the vehicle ahead. And since some bikes have two tail lights, the system thinks it’s a car far ahead, when in reality it’s a bike up close. Also remember the AI is trained on human driving behavior, which Tesla records from its customers. And we all know how well the average human drives around two-wheeled vehicles.
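    A rough sketch of why that failure mode is so dangerous, using a simple pinhole-camera model (all numbers here are hypothetical, chosen only for illustration):

```python
# Monocular distance from tail-light spacing: d = f * W / w,
# where f is the focal length in pixels, W the real-world light spacing,
# and w the spacing observed in the image. All numbers are hypothetical.

FOCAL_PX = 1000.0  # assumed camera focal length, in pixels

def estimated_distance_m(assumed_spacing_m: float, observed_px: float) -> float:
    """Distance the vision system infers from the pixel gap between two tail lights."""
    return FOCAL_PX * assumed_spacing_m / observed_px

# A motorcycle 20 m ahead with twin lights 0.3 m apart subtends:
observed_px = FOCAL_PX * 0.3 / 20.0  # 15 px in the image

bike_distance = estimated_distance_m(0.3, observed_px)  # correct reading: 20 m
car_distance = estimated_distance_m(1.5, observed_px)   # misread as a car: 100 m

print(bike_distance, car_distance)
```

    Under that (hypothetical) misclassification, the same 15-pixel gap reads as a car five times farther away than the bike actually is, so the system sees no reason to brake.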

  • Ledericas@lemm.ee

    The Cybertruck is sharp enough to cut a deer in half; surely a biker is just as vulnerable.

  • captainastronaut@seattlelunarsociety.org

    Tesla self-driving is never going to work well enough with cameras alone - more sensors are needed. It’s fundamentally dangerous and should not be driving unsupervised (or maybe at all).

    • Ledericas@lemm.ee

      They originally had lidar, or radar, but Musk had them disabled in the older models.

      • NotMyOldRedditName@lemmy.world

        They had radar. Tesla has never had lidar, but they do use lidar on test vehicles to ground truth their camera depth / velocity calculations.

    • scarabic@lemmy.world

      These fatalities are a Tesla business advantage. Every one is a data point they can use to program their self-driving intelligence. No one has killed as many as Tesla, so no one knows more about what kills people than Tesla. We don’t have to turn this into a bad thing just because they’re killing people /s

    • KayLeadfoot@fedia.ioOP

      Accurate.

      Each fatality I found where a Tesla kills a motorcyclist is a cascade of 3 failures.

      1. The car’s cameras don’t detect the biker, or it just doesn’t stop for some reason.
      2. The driver isn’t paying attention to detect the system failure.
      3. The Tesla’s driver alertness tech fails to detect that the driver isn’t paying attention.

      Taking out the driver will make this already-unacceptably-lethal system even more lethal.

      • br3d@lemmy.world

        There are at least two steps before those three:

        -1. Society has been built around the needs of the auto industry, locking people into car dependency.

        0. A legal system exists in which the people who build, sell, and drive cars are not meaningfully liable when the car hurts somebody.
      • jonne@infosec.pub
        1. Self-driving turns itself off seconds before a crash, giving the driver an impossibly short timespan to rectify the situation.
        • KayLeadfoot@fedia.ioOP

          … Also accurate.

          God, it really is a nut punch. The system detects the crash is imminent.

          Rather than automatically try to evade… the self-driving tech turns off. I assume it is to reduce liability or make the stats look better. God.

          • jonne@infosec.pub

            Yep, that one was purely about hitting a certain KPI of ‘miles driven on autopilot without incident’. If it turns off before the accident, technically the driver was in control and to blame, so it won’t show up in the stats and probably also won’t be investigated by the NTSB.

              • KayLeadfoot@fedia.ioOP

                NHTSA collects data if self-driving tech was active within 30 seconds of the impact.

                The companies themselves do all sorts of wildcat shit with their numbers. Tesla’s claimed safety factor right now is 8x human: to drive with FSD is supposedly eight times safer than your average human driver, and that’s what they say on their stock earnings calls. Of course, that’s not supported by any data I’ve seen; they haven’t published data that makes it externally verifiable (unlike Waymo, which has excellent academic articles and insurance papers written about its 12x-safer-than-human system).

                • NotMyOldRedditName@lemmy.world

                  So to drive with FSD is 8x safer than your average human driver.

                  WITH a supervising human.

                  Once it reaches a certain quality, it should be safer if a human is properly supervising it, because if the car tries to do something really stupid, the human takes over. The vast vast vast majority of crashes are from inattentive drivers, which is obviously a problem and they need to keep improving the attentiveness monitoring, but it should be safer than a human with human supervision because it can also detect things the human will ultimately miss.

                  Now, if you take the human entirely out of the equation, I very much doubt that FSD is safer than a human in its current state.

    • ascense@lemm.ee

      Most frustrating thing is, as far as I can tell, Tesla doesn’t even have binocular vision, which makes all the claims about humans being able to drive with vision only even more blatantly stupid. At least humans have depth perception. And supposedly their goal is to outperform humans?

      • TheGrandNagus@lemmy.world

        Tesla’s argument of “well human eyes are like cameras therefore we shouldn’t use LiDAR” is so fucking dumb.

        Human eyes have good depth perception and absolutely exceptional dynamic range and focusing ability. They also happen to be linked up to a rapid and highly efficient super computer far outclassing anything that humanity has ever devised, certainly more so than any computer added to a car.

        And even with all those advantages humans have, we still crash from time to time and make smaller mistakes regularly.

        • NABDad@lemmy.world

          They also happen to be linked up to a rapid and highly efficient super computer far outclassing anything that humanity has ever devised

          A neural network that has been in development for 650 million years.

        • bluGill@fedia.io

          Anyone who has driven (or walked) into a sunrise/sunset knows that human vision is not very good. I’ve also driven in blizzards, heavy rain, and fog - all times when human vision is terrible. I’ve also not seen green lights (I’m colorblind).

          • TheGrandNagus@lemmy.world

            Human vision is very, very, very good. If you think a camera installed to a car is even close to human eyesight, then you are extremely mistaken.

            Human eyes are so far beyond it’s hard to even quantify.

            And bullshit on you not being able to see the lights. They’re specifically designed so that’s not an issue for colourblind people.

            • bluGill@fedia.io

              And bullshit on you not being able to see the lights. They’re specifically designed so that’s not an issue for colour blind people

              Some lights are, but not all of them are. I often say I go when the light turns blue. However not all lights have that blue tint and so I often cannot tell the difference between a white light and a green light by color. (but white is not used in a stoplight and I can see red/yellow just fine) Where I live all stoplights have green on the bottom so that is always a cheat I use, but that only works if I can see the relative position - in an otherwise dark situation I only see a light in front of me and not the rest of the structure and so I cannot tell. I have driven where stoplights are not green on bottom and I can never remember if green is left/right.

              Even when they try, though, not all colorblindness is the same. There may not be a mitigation that works for two different people with different forms of colorblindness.

            • bluGill@fedia.io

              Human vision is very, very, very good. If you think a camera installed to a car is even close to human eyesight, then you are extremely mistaken.

              Why are you trying to limit cars to just vision? That is all I have as a human. However, robots have radar, lidar, radio, and other options; there is no reason they can’t use them and get information eyes cannot. Every option has limits.

  • Redex@lemmy.world

    Cuz other self driving cars use LIDAR so it’s basically impossible for them to not realise that a bike is there.

  • AnimalsDream@slrpnk.net

    I imagine bicyclists must be affected as well if they’re on the road (as we should be, technically). As somebody who has already been literally inches away from being rear-ended, this makes me never want to bike in the US again.

    Time to go to the Netherlands.

  • Gork@lemm.ee

    Lidar needs to be a mandated requirement for these systems.

    • TrackinDaKraken@lemmy.world

      How about we disallow it completely until it’s proven to be SAFER than a human driver? Because why even allow it if it’s only as safe?

      • explodicle@sh.itjust.works

        As an engineer, I strongly agree with requirements based on empirical results rather than requiring a specific technology. The latter never ages well. Thank you.

        • scarabic@lemmy.world

          It’s hardly either/or, though. What we have here is empirical data showing that cars without lidar perform worse, so mandating lidar is itself based on empirical results. You can build a clear, robust requirement around a tech spec. You cannot build a clear, robust law around fatality-statistics targets.

      • scarabic@lemmy.world

        This sounds good until you realize how unsafe human drivers are. People won’t accept a self-driving system that’s only 50% safer than humans, because that will still be a self-driving car that kills 20,000 Americans a year. Look at the outrage right here, and we’re nowhere near those numbers. I also don’t see anyone comparing these numbers to human drivers on any per-mile basis. Waymos compared favorably to human drivers in their most recently released data. Does anyone even know where Teslas stand compared to human drivers?

        • NotMyOldRedditName@lemmy.world

          There have been 54 reported fatalities involving their software over the years in the US.

          That’s around 10 billion AP miles (9 billion at the end of 2024), and around 3.6 billion on the various versions of FSD (beta / supervised). Most of the fatal accidents happened on AP, though, not FSD.

          Let’s just double those fatal accidents to 108 to account for the rest of the world, but that probably skews high. Most of the fatal stuff I’ve seen is always in the US.

          That equates to 1 fatal accident every 125.9 million miles.

          The US average is 1.33 deaths per 100 million miles, so even with the doubled death count it’s less than the current national average: the equivalent of 1.33 deaths every 167 million miles with Tesla’s software.

          Edit: I couldn’t math; fixed it. Also, for FSD specifically, very few places have it: mainly North America and, just recently, China. I wish we had fatality numbers for FSD specifically.
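          Spelling out that back-of-the-envelope math (these are the rough estimates above, not official statistics):

```python
# Back-of-the-envelope fatality-rate math from the figures above.
# All inputs are rough estimates, not official statistics.

us_fatalities = 54
world_fatalities = us_fatalities * 2   # doubled for the world; probably skews high

ap_miles = 10e9                        # ~10 billion Autopilot miles
fsd_miles = 3.6e9                      # ~3.6 billion FSD miles
total_miles = ap_miles + fsd_miles     # 13.6 billion combined

miles_per_fatality = total_miles / world_fatalities
print(miles_per_fatality / 1e6)        # ~125.9 million miles per fatality

# US average: 1.33 deaths per 100 million miles driven.
us_rate = 1.33 / 100e6                 # deaths per mile
tesla_rate = world_fatalities / total_miles
print(1.33 / tesla_rate / 1e6)         # ~167 million miles per 1.33 deaths
```

          The comparison only holds if the mileage estimates are in the right ballpark, and it makes no adjustment for where and in what conditions those miles were driven.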

    • ℍ𝕂-𝟞𝟝@sopuli.xyz

      Honestly, emergency braking with LIDAR is mature and cheap enough at this point that it should be mandated for all new cars.

      • Nastybutler@lemmy.world

        No, emergency braking with radar is mature and cheap. Lidar is very expensive and relatively nascent.

  • keesrif@lemmy.world

    On a quick read, I didn’t see the struck motorcycles listed. Last I heard, a few years ago, was that this mainly affected motorcycles with two rear lights that are spaced apart and fairly low to the ground. I believe this is mostly true for Harleys.

    The theory I recall was that this rear-light configuration made the Tesla assume it was looking (remember: only cameras, no depth data) at a car that was further down the road, and that accelerating was safe as a result. It miscategorised the motorcycle so badly that it misjudged its position entirely.

    • KayLeadfoot@fedia.ioOP

      I also saw that theory! That’s in the first link in the article.

      The only problem with the theory: Many of the crashes are in broad daylight. No lights on at all.

      I didn’t include the motorcycle make and model, but I did find it. Because I do journalism, and sometimes I even do good journalism!

      The models I found are: Kawasaki Vulcan (a cruiser bike, just like the Harleys you describe), Yamaha YZF-R6 (a racing-style sport bike with high-mount lights), and a Yamaha V-Star (a “standard” bike, fairly low lights, and generally a low-slung bike). Weirdly, the bike models run the full gamut of the different motorcycles people ride on highways; every type is represented (sadly) in the fatalities.

      I think you’re onto something with the faulty depth sensors. Sensing distance is difficult with optical sensors. That’s why Tesla would be alone in the motorcycle fatality bracket, and that’s why it would always be rear-end crashes by the Tesla.

      • littleomid@feddit.org

        At least in the EU, you can’t turn off motorcycle lights; they’re always on. In the EU since 2003, and in the US, according to the internet, since the ’70s.

        • pirat@lemmy.world

          I assume older motorcycles built before 2003 are still legal in the EU today, and that the drivers are responsible for turning on the lights when riding those.

        • KayLeadfoot@fedia.ioOP

          Point taken: Feel free to amend my comment from “No lights at all” to “No lights visible at all.”

    • jonne@infosec.pub

      Whatever it is, it’s unacceptable and they should really ban Tesla’s implementation until they fix some fundamental issues.

    • Not_mikey@lemmy.dbzer0.com

      Robots don’t get drunk, or distracted, or text, or speed…

      Anecdotally, I think the Waymos are more courteous than human drivers. Though Waymo seems to be the best one out so far; idk about the other services.

        • dogslayeggs@lemmy.world

          They have remote drivers that CAN take control in very corner-case situations that the software can’t handle. The vast majority of driving is done without humans in the loop.

          • NotMyOldRedditName@lemmy.world

            They don’t even do that, according to Waymo’s claims.

            They can suggest what the car should do, but they aren’t actually doing it. The car is in complete control.

            It’s a nuanced difference, but it is a difference. A Waymo employee never takes control of or operates the vehicle.

    • Bytemeister@lemmy.world

      Because the march of technological advancement is inevitable?

      In light of recent (and, let’s face it, long-ago) cases, Tesla’s “Full Self Driving” needs to be downgraded to Level 2 at best.

      Level 2: Partial Automation

      The vehicle can handle both steering and acceleration/deceleration, but the driver must remain engaged and ready to take control.

      Pretty much the same level as other brands’ self-driving features.

      • AngryCommieKender@lemmy.world

        The other brands, such as Audi and VW, work much better than Tesla’s system. Their LIDAR systems aren’t blinded by fog and rain the way the Tesla is. Someone recently tested an Audi’s system against a Tesla’s: the Tesla failed either 3/5 or 4/5 tests, and the Audi passed 3/5 or 4/5. Neither system is perfect, but the one that doesn’t rely on just cameras is clearly superior.

        Edit: it was Mark Rober.

        https://youtu.be/IQJL3htsDyQ

        • Bytemeister@lemmy.world

          It’s hard to tell, but after about 15 minutes of searching, I was unable to locate any consumer vehicles that include a LIDAR system. Lots of cars include RADAR for object detection, even multiple RADAR systems for parking. There may be some that include a time-of-flight sensor, which is like LIDAR but static and lacking the resolution/fidelity. My Mach-E, which has Level 2 automation, uses a combination of computer vision, RADAR, and GPS. I was unable to locate a LIDAR sensor for the vehicle.

          The LIDAR system in Mark’s video is quite clearly a pre-production device that is not affiliated with the manufacturer of the vehicle it was being tested on.

          Adding, after more searching: it looks like the Polestar 3, some trim levels of the Audi A8, and the Volvo EX90 include a LiDAR sensor. Curious to see how the consumer-grade tech works out in the real world.

          Please do not mistake this comment as AI/computer-vision evangelism. I currently have a car that uses those technologies for automation, and I would not and do not trust my life or anyone else’s to that system.

  • spacesatan@leminal.space

    Unless it’s a higher rate than human drivers per mile or hours driven, I do not care. The article doesn’t have those stats, so it’s clickbait as far as I’m concerned.

  • NotMyOldRedditName@lemmy.world

    For what it’s worth, it really isn’t clear whether this is FSD or AP: the article constantly says “self-driving” even for older collisions that would definitely have been AP, and the crashes are even listed as AP if you click through the links.

    So these may all be AP, or one or two might be FSD; it’s unclear.

    Every Tesla has AP as well, so the likelihood of that being the case is higher.

    • AA5B@lemmy.world

      In this case, does it matter? Both are supposed to follow a vehicle at a safe distance.

      I’d be more interested in how it changes over time, as new software is pushed. While it’s important to know that it had problems judging distance to a motorcycle, it’s more important to know whether it still does.

      • NotMyOldRedditName@lemmy.world

        In this case, does it matter? Both are supposed to follow a vehicle at a safe distance

        I think it does matter. While both are supposed to follow at safe distances, the FSD stack does it in a completely different way. They haven’t really made any major updates to AP for many years now; all focus has been on FSD. I think the only real changes it’s had for quite a while have been around making sure people are paying attention better.

        AP is looking at the world frame by frame, each individual camera on its own, while FSD takes the input of all cameras, turns it into a 3D vector space, and then drives based off that. Doing that on city streets and highways is only a pretty recent development. Updates doing it this way on highways went out to all cars with FSD only in the past few months; for a long time it was city streets only.

        I’d be more interested in how it changes over time, as new software is pushed.

        I think that’s why it’s important to make a real distinction between AP and FSD today (and specifically which FSD versions).

        They’re wholly different systems: one that gets older every day, and one that keeps getting better every few months. Making an article like this that groups them together over a span of years muddies the water on what progress, if any, has been made.

        • KayLeadfoot@fedia.ioOP

          Fair enough!

          At least one of the fatalities was Full Self-Driving (it was cited by name in the police reports). The remainder were Autopilot. So both systems kill motorcyclists. Tesla requests that this data be redacted from its NHTSA reporting, which specifically makes it difficult for consumers to measure which system is safer, or whether incremental safety improvements are actually being made.

          You’re placing a lot of faith in the incremental updates being improvements without equivalent regressions. That data is specifically being concealed from you, and I think you should probably ask why. If there were good news behind those redactions, they wouldn’t be redactions.

          I didn’t publish the software version data point because I agree with AA5B: it doesn’t matter. I honestly don’t care how it works. I care that it works well enough to safely cohabit the road with my manual-transmission Cro-Magnon self.

          I’m not a “Tesla reporter,” and I’m not trying to cover the incremental changes in their software versions; plenty of Tesla fans are doing that already. It only has my attention at all because it’s killing vulnerable road users, and for that analysis we don’t actually need to know which self-driving system version is killing people, just the make of car it is installed on.

          • NotMyOldRedditName@lemmy.world

            I’d say it’s a pretty important distinction to know whether one or both systems have a problem, and how bad that problem is.

            Also are you referencing the one in Seattle in 2024 for FSD? The CNBC article says FSD, but the driver said AP.

            And especially back then, there’s also an important distinction of how they work.

            FSD on highways wasn’t released until November 2024, and even then not everyone got it right away. So even if FSD was enabled, the crash may have been under AP.

            Edit: Also, if it really was FSD (that 2024 crash would have had to happen on city streets, not a highway), then that’s 1 motorcycle fatality in 3.6 billion miles. The other 4 happened over 10 billion miles. Is that not an improvement? (edit again: I should say we can’t tell it’s an improvement yet, as we’d have to pass 5 billion, so the jury is still out I guess, IF that crash was really on FSD)

            Edit: I will cede though that as a motorcyclist, you can’t know what the Tesla is using, so you’d have to assume the worst.

            Edit: Just correcting myself: I was wrong about FSD in 2024. The changeover to neural nets happened in November, but FSD was still FSD on highways when this accident happened. Even earlier than that, FSD would become AP when you transitioned to highways.

            • KayLeadfoot@fedia.ioOP

              Police report for 2024 case attached, it is also linked in the original article: https://www.opb.org/article/2025/01/15/tesla-may-face-less-accountability-for-crashes-under-trump/

              It was Full Self Driving, according to the police. They know because they downloaded the data off the vehicle’s computer. The motorcyclist was killed on a freeway merge ramp.

              All the rest is beyond my brief. Thought you might like the data to chew on, though.

              • NotMyOldRedditName@lemmy.world

                The motorcyclist was killed on a freeway merge ramp.

                I’d say that means there’s a very good chance that yes, while FSD was enabled, the crash happened under the older AP mode of driving, as it wasn’t until November 2024 that it was moved over to the new FSD neural-net driving code. I was wrong here: it actually was FSD then, it just wasn’t end-to-end neural nets like it is now.

                Also yikes… the report says the AEB kicked in, and the driver overrode it by pressing on the accelerator!

                • KayLeadfoot@fedia.ioOP

                  No shit on that yikes. That blew my fucking mind.

                  Half the time when your AEB activates, you are unconscious or dazed and you’re just flailing around your cabin like a rag doll, because you’ve crashed. If your foot happens to flail into the accelerator, get ready for a very exciting (if short-lived) application of that impressive 0 to 60 time.