TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:

  • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
  • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
  • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

  • Ulrich
    4 points · 3 months ago

    I’m not sure how that’s possible considering no one manufactures self-driving cars that I know of. Certainly not Tesla.

    • bluGill
      9 points · 3 months ago

      Humans are terrible drivers. The open question is whether self-driving cars are, overall, safer than human-driven cars. So far, the only people talking either don’t have data or have reason to cherry-pick only the parts of the data that make self-driving look good. This is the one exception, where someone seemingly independent has done the analysis - the question is whether they are unbiased, or cherry-picking data to make self-driving look bad (I’m not familiar with the source, so I can’t answer that).

      Either way more study is needed.

      • KayLeadfootOP
        6 points · 3 months ago

        I am absolutely biased. It’s me, I’m the source :)

        I’m a motorcyclist, and I don’t want to die. Also just generally, motorcyclists deserve to get where they are going safely.

        I agree with you. Self-driving cars will overall greatly improve highway safety.

        I disagree with you when you suggest that pointing out flaws in the technology is evidence of bias, or “cherry picking to make self driving look bad.” I think we can improve on the technology by pointing out its systemic defects. If it hits motorcyclists, take it off the road, fix it, and then save lives by putting it back on the road.

        That’s the intention of the coverage, at least: I am hoping to apply pressure to improve rather than remove. Read my Waymo coverage; I’m actually a big automation enthusiast, because fewer crashes is a good thing.

        • bluGill
          2 points · 3 months ago

          I wasn’t trying to suggest that you are biased, only that I have no clue and so it is possible you are somehow unfairly doing something.

          • KayLeadfootOP
            1 point · 3 months ago

            Perfectly fair. Sorry, I jumped the gun! Good on you for being incredulous and inspecting the piece for manipulation, that’s smart.

      • Rhaedas
        6 points · 3 months ago

        Humans are terrible. The human eyes and brain are good at detecting certain things, though, that allow a reaction where computer vision - especially when using only one method of detection - often fails. There are times when an automated system will prevent a problem before a human could even see it. So far neither is the clear winner; human driving just has a legacy that automation has to beat by a wide margin, not just be “good enough.”

        On the topic of human drivers, I think most on the road drive reactively, not based on prediction and anticipation. Given the speed and possible detection methods, a well-designed automated system should excel at this. It costs more and is more complex to design such a thing, so we’re getting the bare minimum that tech can give us right now, which again is not a replacement for all cases.

    • @[email protected]
      12 points · 3 months ago

      Robots don’t get drunk, or distracted, or text, or speed…

      Anecdotally, I think the Waymos are more courteous than human drivers. Waymo seems to be the best one out so far, though; idk about the other services.

        • @[email protected]
          7 points · 3 months ago

          They have remote drivers that CAN take control in rare corner-case situations that the software can’t handle. The vast majority of driving is done without humans in the loop.

          • @[email protected]
            3 points · edited · 3 months ago

            They don’t even do that, according to Waymo’s claims.

            They can suggest what the car should do, but they aren’t actually doing it. The car is in complete control.

            It’s a nuanced difference, but it is a difference. A Waymo employee never takes control of or operates the vehicle.

            • KayLeadfootOP
              2 points · 3 months ago

              Interesting! I did not know that - I assumed the teleoperators took direct control, but that makes much more sense for latency reasons (among others)

              • @[email protected]
                1 point · 3 months ago

                I always just assumed it was their way to ensure the vehicle was really autonomous. If you have someone remotely driving it, you could argue it isn’t actually an AV. Your latency idea makes a lot of sense as well though. Imagine taking over and causing an accident due to latency? This way even if the operator gives a bad suggestion, it was the car that ultimately did it.

    • @[email protected]
      13 points · edited · 3 months ago

      Because muh freedum, EU are a bunch of commies for not allowing this awesome innovation on their roads

      (I fucking love living in the EU)

    • @[email protected]
      5 points · 3 months ago

      Because the march of technological advancement is inevitable?

      In light of recent (and, let’s face it, long-ago) cases, Tesla’s “Full Self Driving” needs to be downgraded to Level 2 at best.

      Level 2: Partial Automation

      The vehicle can handle both steering and acceleration/deceleration, but the driver must remain engaged and ready to take control.

      Pretty much the same level as other brands’ self-driving features.

      • @[email protected]
        10 points · 3 months ago

        The other brands, such as Audi and VW, work much better than Tesla’s system. Their LIDAR systems aren’t blinded by fog and rain the way the Tesla is. Someone recently tested an Audi with its system against a Tesla with its system. The Tesla failed either 3/5 or 4/5 tests. The Audi passed 3/5 or 4/5. Neither system is perfect, but the one that doesn’t rely on just cameras is clearly superior.

        Edit: it was Mark Rober.

        https://youtu.be/IQJL3htsDyQ

        • @[email protected]
          6 points · 3 months ago

          It’s hard to tell, but from about 15 minutes of searching, I was unable to locate any consumer vehicles that include a LIDAR system. Lots of cars include RADAR for object detection, even multiple RADAR systems for parking. There may be some which include a time-of-flight sensor, which is like LIDAR but static, and lacks the resolution/fidelity. My Mach-E, which has Level 2 automation, uses a combination of computer vision, RADAR, and GPS. I was unable to locate a LIDAR sensor for the vehicle.

          The LIDAR system in Mark’s video is quite clearly a pre-production device that is not affiliated with the vehicle manufacturer it was being tested on.

          Adding, after more searching: it looks like the Polestar 3, some trim levels of the Audi A8, and the Volvo EX90 include a LiDAR sensor. Curious to see how the consumer-grade tech works out in the real world.

          Please do not mistake this comment as “AI/computer vision” evangelism. I currently have a car that uses those technologies for automation, and I would not and do not trust my life or anyone else’s to that system.

          • KayLeadfootOP
            2 points · 3 months ago

            Mercedes uses LiDAR. They also operate the sole Level 3 driving automation system in the USA, on two models only: the new S-Class and EQS sedans.

            Tesla alleges they’ll be Level 4+ in Austin in 60 days, and just skip Level 3 altogether. We’ll see.

            • @[email protected]
              3 points · 3 months ago

              Yeah, keep in mind that Elon couldn’t get Level 3 working on a closed, pre-mapped circuit. The robotaxis were just remotely operated.

          • @[email protected]
            4 points · edited · 3 months ago

            The way I understand it is that Audi, Volvo, and VW have had the hardware in place for a few years. They are collecting real-world data about how we drive before they allow the systems to be used at all. There are also legal issues with liability.

  • Redex
    12 points · 3 months ago

    Cuz other self-driving cars use LIDAR, so it’s basically impossible for them to not realise that a bike is there.

  • @[email protected]
    14 points · 3 months ago

    It’s because the system has to rely on visual cues, since Teslas have no radar. When it’s dark, the system looks at the tail lights to gauge the distance to the vehicle ahead. And since some bikes have a double light, the system thinks it’s a car that is far away, when in reality it’s a bike up close. Also remember the AI is trained on human driving behavior, which Tesla records from their customers. And we all know how well the average human drives around two-wheeled vehicles.
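
    A toy sketch of that geometry, assuming a simple pinhole-camera model - the focal length, pixel spans, and light spacings below are made-up illustrative numbers, not anything from Tesla’s actual pipeline:

    ```python
    # Monocular distance from tail-light spacing (pinhole model):
    #   distance = focal_length * real_light_separation / pixel_separation
    # The catch: a single camera must ASSUME the real separation.

    FOCAL_LENGTH_PX = 1000.0  # assumed focal length, in pixels

    def inferred_distance_m(pixel_sep_px: float, assumed_sep_m: float) -> float:
        return FOCAL_LENGTH_PX * assumed_sep_m / pixel_sep_px

    # Two lights appear 30 px apart in the image.
    # Assumed to be a car's tail lights (~1.5 m apart): looks far away.
    print(inferred_distance_m(30.0, 1.5))  # 50.0 m -> "safe to keep going"

    # Actually a motorcycle with twin lights ~0.3 m apart:
    print(inferred_distance_m(30.0, 0.3))  # 10.0 m -> should already be braking
    ```

    Same pixels, a 5x error in inferred distance - exactly the “far-away car vs. close-up bike” confusion described above.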

  • @[email protected]
    143 points · 3 months ago

    Tesla self-driving is never going to work well enough without more sensors - cameras are not enough. It’s fundamentally dangerous and should not be driving unsupervised (or maybe at all).

    • @[email protected]
      16 points · 3 months ago

      The most frustrating thing is that, as far as I can tell, Tesla doesn’t even have binocular vision, which makes all the claims about humans being able to drive with vision only even more blatantly stupid. At least humans have depth perception. And supposedly their goal is to outperform humans?

      • @[email protected]
        25 points · 3 months ago

        Tesla’s argument of “well human eyes are like cameras therefore we shouldn’t use LiDAR” is so fucking dumb.

        Human eyes have good depth perception and absolutely exceptional dynamic range and focusing ability. They also happen to be linked up to a rapid and highly efficient super computer far outclassing anything that humanity has ever devised, certainly more so than any computer added to a car.

        And even with all those advantages humans have, we still crash from time to time and make smaller mistakes regularly.

        • @[email protected]
          11 points · 3 months ago

          They also happen to be linked up to a rapid and highly efficient super computer far outclassing anything that humanity has ever devised

          A neural network that has been in development for 650 million years.

        • bluGill
          2 points · 3 months ago

          Anyone who has driven (or walked) into a sunrise/sunset knows that human vision is not very good. I’ve also driven in blizzards, heavy rain, and fog - all times when human vision is terrible. I’ve also not seen green lights (I’m colorblind).

          • @[email protected]
            5 points · edited · 3 months ago

            Human vision is very, very, very good. If you think a camera installed to a car is even close to human eyesight, then you are extremely mistaken.

            Human eyes are so far beyond that it’s hard to even quantify.

            And bullshit on you not being able to see the lights. They’re specifically designed so that’s not an issue for colourblind people.

            • bluGill
              1 point · 3 months ago

              Human vision is very, very, very good. If you think a camera installed to a car is even close to human eyesight, then you are extremely mistaken.

              Why are you trying to limit cars to just vision? That is all I have as a human. However, robots have radar, lidar, radio, and other options; there is no reason they can’t use them and get information eyes cannot. Every option has limits.

            • bluGill
              2 points · 3 months ago

              And bullshit on you not being able to see the lights. They’re specifically designed so that’s not an issue for colour blind people

              Some lights are, but not all of them. I often say I go when the light turns blue. However, not all lights have that blue tint, so I often cannot tell the difference between a white light and a green light by color (though white is not used in a stoplight, and I can see red/yellow just fine). Where I live, all stoplights have green on the bottom, so that is always a cheat I use, but it only works if I can see the relative position - in an otherwise dark situation I only see a light in front of me, not the rest of the structure, so I cannot tell. I have driven where stoplights are not green-on-bottom, and I can never remember if green is left or right.

              Even when they try, though, not all colorblindness is the same. A mitigation that works for one person may not work for another with a different kind of colorblindness.

          • @[email protected]
            1 point · 3 months ago

            Bro I’m colorblind too and if you’re not sure what color the light is, you have to stop. Don’t put that on the rest of us.

            • bluGill
              2 points · 3 months ago

              I can see red clearly, so “not sure” means I can go.

              I’ve only noticed issues in a few situations. One was driving at night, when a weirdly aimed streetlight suddenly turned yellow - until it changed, I didn’t even know there was a stoplight there. The second was making a left turn at sunset (sun behind me): the green arrow came on but the red light remained lit, so I couldn’t see it was time/safe to go until my wife alerted me.

    • KayLeadfootOP
      86 points · 3 months ago

      Accurate.

      Each fatality I found where a Tesla killed a motorcyclist was a cascade of three failures:

      1. The car’s cameras don’t detect the biker, or it just doesn’t stop for some reason.
      2. The driver isn’t paying attention to detect the system failure.
      3. The Tesla’s driver alertness tech fails to detect that the driver isn’t paying attention.

      Taking out the driver will make this already-unacceptably-lethal system even more lethal.
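
      That cascade is the classic Swiss-cheese safety model: a fatality requires every layer to fail at once. A toy sketch with completely invented probabilities, just to show why removing a layer matters:

      ```python
      # Swiss-cheese model of the three failure layers listed above.
      # All probabilities are invented for illustration, not real estimates.

      p_vision_miss  = 0.01  # 1: cameras fail to react to the motorcycle
      p_driver_miss  = 0.10  # 2: driver isn't watching at that moment
      p_monitor_miss = 0.50  # 3: alertness tech misses the distraction

      # All three layers must fail for a fatality.
      print(p_vision_miss * p_driver_miss * p_monitor_miss)  # 0.0005

      # Remove the driver (and with them layers 2 and 3): one layer remains.
      print(p_vision_miss)  # 0.01 -> 20x more likely in this toy model
      ```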

      • @[email protected]
        66 points · 3 months ago
        1. Self-driving turns itself off seconds before a crash, giving the driver an impossibly short timespan to rectify the situation.
        • KayLeadfootOP
          63 points · 3 months ago

          … Also accurate.

          God, it really is a nut punch. The system detects the crash is imminent.

          Rather than automatically try to evade… the self-driving tech turns off. I assume it is to reduce liability or make the stats look better. God.

          • @[email protected]
            37 points · edited · 3 months ago

            Yep, that one was purely about hitting a certain KPI of ‘miles driven on autopilot without incident’. If it turns off before the accident, technically the driver was in control and to blame, so it won’t show up in the stats and probably also won’t be investigated by the NTSB.

            • @[email protected]
              15 points · 3 months ago

              so it won’t show up in the stats

              Hopefully they wised up by now and record these stats properly…?

              • @[email protected]
                9 points · 3 months ago

                If they ever fixed it, I’m sure Musk fired whoever is keeping score now. He’s going to launch the robotaxi stuff soon, and it’s going to kill a bunch of people.

              • KayLeadfootOP
                21 points · 3 months ago

                NHTSA collects data if self-driving tech was active within 30 seconds of the impact.

                The companies themselves do all sorts of wildcat shit with their numbers. Tesla’s claimed safety factor right now is 8x human - so to drive with FSD is 8x safer than your average human driver; that’s what they say on their stock earnings calls. Of course, that’s not true based on any data I’ve seen, and they haven’t published data that makes it externally verifiable (unlike Waymo, which has excellent academic articles and insurance papers written about its 12x-safer-than-human system).
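
                For what it’s worth, the 30-second rule is simple enough to state in code. A minimal sketch - the field names are hypothetical stand-ins, not the actual NHTSA Standing General Order schema:

                ```python
                # NHTSA rule (paraphrased): a crash is reportable if the
                # driver-assist system was engaged within 30 s of impact.

                REPORTING_WINDOW_S = 30.0

                def is_reportable(impact_t: float, last_engaged_t: float | None) -> bool:
                    if last_engaged_t is None:  # system never engaged on this trip
                        return False
                    return impact_t - last_engaged_t <= REPORTING_WINDOW_S

                # Autopilot hands off 1 second before impact: still counts.
                print(is_reportable(impact_t=100.0, last_engaged_t=99.0))   # True
                # Driver took over two minutes earlier: outside the window.
                print(is_reportable(impact_t=100.0, last_engaged_t=-20.0))  # False
                ```

                Which is why “the system turned itself off right before the crash” doesn’t keep a crash out of the federal dataset.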

                • bean
                  2 points · 3 months ago

                  Fascinating! I didn’t know all this. Thanks!

                • @[email protected]
                  2 points · edited · 3 months ago

                  So to drive with FSD is 8x safer than your average human driver.

                  WITH a supervising human.

                  Once it reaches a certain quality, it should be safer if a human is properly supervising it, because if the car tries to do something really stupid, the human takes over. The vast majority of crashes are from inattentive drivers, which is obviously a problem, and they need to keep improving the attentiveness monitoring; but supervised, it should be safer than a human alone, because it can also catch things the human would ultimately miss.

                  Now, if you take the human entirely out of the equation, I very much doubt that FSD is safer than a human in its current state.

        • @[email protected]
          18 points · 3 months ago

          Even when it is just milliseconds before the crash, the computer turns itself off.

          Later, Tesla brags that the autopilot was not in use during this (terribly, overwhelmingly) unfortunate accident.

      • @[email protected]
        9 points · edited · 3 months ago

        There are at least two steps before those three:

        -1. Society has been built around the needs of the auto industry, locking people into car dependency

        1. A legal system exists in which the people who build, sell and drive cars are not meaningfully liable when the car hurts somebody
        • @[email protected]
          6 points · 3 months ago
          1. A legal system exists in which the people who build, sell and drive cars are not meaningfully liable when the car hurts somebody

          That’s a good thing, because the alternative would be flipping the notion of property rights on its head. Making the owner not responsible for his property would be used to justify stripping him of his right to modify it.

          You’re absolutely right about point -1 though.

          • @[email protected]
            2 points · 3 months ago

            build, sell and drive

            You two don’t seem to strongly disagree. The driver is liable but should then sue the builder/seller for “self driving” fraud.

            • @[email protected]
              2 points · 3 months ago

              Maybe, if that two-step determination of liability is really what the parent commenter had in mind.

              I’m not so sure he’d agree with my proposed way of resolving the dispute over liability, which would be to legally require that all self-driving systems (and software running on the car in general) be forced to be Free Software and put it squarely and completely within the control of the vehicle owner.

                • @[email protected]
                  2 points · 3 months ago

                  I mean, maybe, but previously when I’ve said that it’s typically gone over like a lead balloon. Even in tech forums, a lot of people have drunk the kool-aid that it’s somehow suddenly too dangerous to allow owners to control their property just because software is involved.

    • @[email protected]
      4 points · 3 months ago

      These fatalities are a Tesla business advantage. Every one is a data point they can use to program their self-driving intelligence. No one has killed as many as Tesla, so no one knows more about what kills people than Tesla. We don’t have to turn this into a bad thing just because they’re killing people /s

      • @[email protected]
        3 points · edited · 3 months ago

        They had radar. Tesla has never had lidar, but they do use lidar on test vehicles to ground-truth their camera depth/velocity calculations.

  • @[email protected]
    25 points · 3 months ago

    On a quick read, I didn’t see the struck motorcycles listed. Last I heard, a few years ago, was that this mainly affected motorcycles with two rear lights that are spaced apart and fairly low to the ground. I believe this is mostly true for Harleys.

    The theory I recall was that this rear-light configuration made the Tesla assume it was looking (remember, only cameras without depth data) at a car that was further down the road - and acceleration was safe as a result. It miscategorised the motorcycle so badly that it misjudged its position entirely.

    • @[email protected]
      17 points · edited · 3 months ago

      The ridiculous thing is, it has 3 cameras pointing forward, and you only need 2 to get stereoscopic depth perception with cameras… why the fuck are they not using that!?

      Edit: I mean, I know why - it’s because the three cameras use different lenses for different things (normal, wide-angle, and telescopic), so they’re not suitable for it - but it just seems stupid not to utilise that concept when you insist on a camera-only solution.
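
      For reference, once two cameras share a view, depth really is a one-line relation. A minimal sketch, assuming an idealized rectified stereo pair with made-up numbers (not Tesla’s camera specs):

      ```python
      # Stereo depth from disparity: depth = focal_length * baseline / disparity

      FOCAL_LENGTH_PX = 1000.0  # assumed focal length, in pixels
      BASELINE_M = 0.2          # assumed spacing between the two cameras

      def depth_m(disparity_px: float) -> float:
          return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

      # An object shifted 4 px between the left and right images is far away...
      print(depth_m(4.0))   # 50.0 m
      # ...while a 20 px shift means it's close. No tail-light guessing needed.
      print(depth_m(20.0))  # 10.0 m
      ```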

      • amorpheus
        1 point · 3 months ago

        That seems like a spectacular oversight. How is it supposed to replicate human vision without depth perception?

        • @[email protected]
          1 point · 3 months ago

          The video 0x0 linked to in another comment describes the likely method used to infer distance to objects without a stereoscopic setup, and why it (likely) had issues determining distance in the cases where they hit motorcycles.

        • KayLeadfootOP
          1 point · 3 months ago

          Little known fact: the Model S (P) actually stands for Polyphemus Edition, not Plaid Edition.

    • @[email protected]
      4 points · 3 months ago

      Still probably a good idea to keep an eye on that Tesla behind you. Or just let them past.

    • @[email protected]
      28 points · 3 months ago

      Whatever it is, it’s unacceptable, and they should really ban Tesla’s implementation until they fix some fundamental issues.

    • KayLeadfootOP
      29 points · 3 months ago

      I also saw that theory! That’s in the first link in the article.

      The only problem with the theory: Many of the crashes are in broad daylight. No lights on at all.

      I didn’t include the motorcycle make and model, but I did find it. Because I do journalism, and sometimes I even do good journalism!

      The models I found are: Kawasaki Vulcan (a cruiser bike, just like the Harleys you describe), Yamaha YZF-R6 (a racing-style sport bike with high-mount lights), and a Yamaha V-Star (a “standard” bike, fairly low lights, and generally a low-slung bike). Weirdly, the bike models run the full gamut of the different motorcycles people ride on highways, every type is represented (sadly) in the fatalities.

      I think you’re onto something with the faulty depth perception. Sensing distance is difficult with optical sensors alone. That’s why Tesla would be alone in the motorcycle fatality bracket, and why it would always be rear-end crashes by the Tesla.

      • @[email protected]
        11 points · 3 months ago

        At least in the EU, you can’t turn off motorcycle lights; they’re always on. In the EU since 2003, and in the US, according to the internet, since the 70s.

        • @[email protected]
          2 points · edited · 3 months ago

          I assume older motorcycles built before 2003 are still legal in the EU today, and that the drivers are responsible for turning on the lights when riding those.

        • KayLeadfootOP
          1 point · 3 months ago

          Point taken: Feel free to amend my comment from “No lights at all” to “No lights visible at all.”

      • @[email protected]
        4 points · 3 months ago

        Because I do journalism, and sometimes I even do good journalism!

        In that case, you wouldn’t happen to know whether or not Teslas are unusually dangerous to bicycles too, would you?

        • KayLeadfootOP
          3 points · 3 months ago

          Surprisingly, there is a data bucket for accidents with bicyclists, but hardly any bicycle crashes are reported.

          That either means they are not occurring (woohoo!), or they are being lumped into one of the multiple pedestrian buckets (not woohoo!), or they are in the absolutely fucking vast collection of “severity: unknown” accidents, where we have no details and Tesla requested redaction to make finding the details very difficult.

    • KayLeadfootOP
      1 point · 3 months ago

      They call it the Model 3 because the Tesla Organ-Harvester didn’t translate well to Chinese

  • @[email protected]
    24 points · 3 months ago

    The Cybertruck is sharp enough to cut a deer in half; surely a biker is just as vulnerable.

  • @[email protected]
    4 points · 3 months ago

    Unless it’s a higher rate than human drivers per mile or hours driven, I do not care. The article doesn’t have those stats, so it’s clickbait as far as I’m concerned.

    • @[email protected]
      8 points · 3 months ago

      The fact that the other self driving brands logged zero motorcyclist fatalities means the technology exists to prevent more deaths. Tesla has chosen to allow more people to die in order to reduce cost. The families of those five dead motorcyclists certainly care.

      • KayLeadfootOP
        1 point · 3 months ago

        [Edit: oh, my bad, I replied to you very cattily when I meant to reply to Satan. Sorry! Friendly fire! XD ]

    • @[email protected]
      1 point · 3 months ago

      Same goes for the other vehicles. They didn’t even try to cover miles driven, and it’s quite likely Tesla has far more miles of self-driving than anyone else.

      I’d even go so far as to speculate that the zero accidents of other self-driving vehicles could just be zero information - that we don’t have enough information to call it zero.

      • KayLeadfootOP
        1 point · 3 months ago

        No, the zero accidents for other self-driving vehicles is actually zero :) You may have heard of this little boutique automotive manufacturer, Ford Motor Company. They’re one of the primary competitors, and they are far above the mileage where you would expect a fatal accident if they were as safe as a human.

        Ford has reported self-driving crashes (many of them!). Just no fatal crashes involving motorcycles, because I guess they don’t fucking suck at making self-driving software.

        I linked the data, it’s all public governmental data, and only the Tesla crashes are heavily redacted. You could… IDK… read it, and then share your opinion about it?

        • @[email protected]
          1 point · 3 months ago

          And how does it compare on self-driving time or miles? Because on the surface, if Tesla is responsible for 5 such accidents and Ford zero, but Tesla has significantly more than five times the self-driving time or miles, then we just don’t have data yet… and I see an announcement that Ford expects full self-driving in 2026, so it can’t have been used much yet.

          • KayLeadfootOP
            1 point · 3 months ago

            I don’t think anyone has reliable public data on miles travelled. If it existed, I would use it. The fact that it doesn’t exist tells you what you need to know about Level 2 ADAS system safety ;)

            The only folks who are being real open with their data, near as I can tell, are Waymo. And Waymo has zero motorcycle fatalities, operating mostly in California, where the motorcycle driving culture is… ~~absolutely fucking nuts~~ uniquely risk-accepting.

    • KayLeadfootOP
      10 points · 3 months ago

      Thanks, 'Satan.

      Do you know the number of miles driven by Tesla’s self-driving tech? Because I don’t, Tesla won’t say, they’re a remarkably non-transparent company where their tech is concerned. Near as I can tell, nobody does (other than folks locked up tight with NDAs). If the ratio of accidents-per-mile-driven looked good, you know as a flat fact that Elon would be Tweeting all about it.

      Sorry you didn’t find the death of 5 Americans newsworthy. I’ll try harder for the next one.

      • @[email protected]
        1 point · edited · 3 months ago

        You’re right, 5 deaths isn’t newsworthy in the context of tens of thousands killed by human drivers each year.

        Whether it’s worse than human drivers is the only relevant point of comparison, and the article doesn’t make it.

  • @[email protected]
    13 points · 3 months ago

    I’m wondering how that stacks up to human drivers. Since the data is redacted I’m guessing not well at all.

    • KayLeadfootOP
      8 points · 3 months ago

      If it were good, we’d be seeing regular updates on Twitter, I imagine.

    • Echo Dot
      19 points · edited · 3 months ago

      Or at least something other than just cameras. Even just adding ultrasonic sensors to the front would be an improvement.

      • @[email protected]
        1 point · 3 months ago

        The range on ultrasonics is too short. They only ever get used for parking type situations, not driving on the roadways.

    • ℍ𝕂-𝟞𝟝
      15 points · 3 months ago

      Honestly, emergency braking with LIDAR is mature and cheap enough at this point that it should be mandated for all new cars.

      • @[email protected]
        4 points · 3 months ago

        No, emergency braking with radar is mature and cheap. Lidar is very expensive and relatively nascent.
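
        The core of radar-based AEB is simple, which is part of why it’s cheap. A minimal sketch - the threshold and numbers are illustrative assumptions, not from any regulation or production system:

        ```python
        # Radar gives range and closing speed directly, so automatic
        # emergency braking reduces to a time-to-collision (TTC) check.

        BRAKE_TTC_S = 2.0  # assumed threshold; real systems stage warnings first

        def should_emergency_brake(range_m: float, closing_mps: float) -> bool:
            if closing_mps <= 0:  # not closing on the target
                return False
            return range_m / closing_mps < BRAKE_TTC_S

        print(should_emergency_brake(40.0, 25.0))  # True: TTC = 1.6 s
        print(should_emergency_brake(80.0, 25.0))  # False: TTC = 3.2 s
        ```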

    • TrackinDaKraken
      8 points · 3 months ago

      How about we disallow it completely until it’s proven to be SAFER than a human driver? Because why even allow it if it’s only as safe?

      • @[email protected]
        3 points · 3 months ago

        As an engineer, I strongly agree with requirements based on empirical results rather than requiring a specific technology. The latter never ages well. Thank you.

        • @[email protected]
          1 point · edited · 3 months ago

          It’s hardly either/or, though. What we have here is empirical data showing that cars without lidar perform worse, so mandating lidar is itself based on empirical results. You can build a clear, robust requirement around a tech spec. You cannot build a clear, robust law around fatality-statistics targets.

          • @[email protected]
            1 point · 3 months ago

            We frequently build clear, robust laws around mandatory testing. Like that recent YouTube video where the Tesla crashed through a wall, but with crash test dummies.

            • @[email protected]
              1 point · edited · 3 months ago

              Those are ways to gather empirical results, though they rely on artificial, staged situations.

              I think it’s fine to have both. Seat belts save lives. I see no problem mandating them. That kind of thing can still be well founded in data.

      • @[email protected]
        1 point · 3 months ago

        This sounds good until you realize how unsafe human drivers are. People won’t accept a self-driving system that’s only 50% safer than humans, because that will still be a self-driving car that kills 20,000 Americans a year. Look at the outrage right here, and we’re nowhere near those numbers. I also don’t see anyone comparing these numbers to human drivers on any per-mile basis. Waymos compared favorably to human drivers in their most recently released data. Does anyone even know where Teslas stand compared to human drivers?

        • @[email protected]
          2 points · edited · 3 months ago

          There have been 54 reported fatalities involving their software over the years in the US.

          That’s around 10 billion AP miles (9 billion at the end of 2024), and around 3.6 billion on the various versions of FSD (beta/supervised). Most of the fatal accidents happened on AP, though, not FSD.

          Let’s just double those fatal accidents to 108 to account for the rest of the world, though that probably skews high. Most of the fatal stuff I’ve seen is always in the US.

          That equates to 1 fatal accident every 125.9 million miles.

          The USA average per 100 million miles is 1.33 deaths, so even doubling the deaths it’s less than the current national average. That’s the equivalent of 1.33 deaths every 167 million miles with Tesla’s software.

          Edit: I couldn’t math; fixed it. Also, FSD is only available in a few places - mainly North America and, just recently, China. I wish we had fatality numbers for FSD specifically.
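
          Checking the arithmetic (these mileage and fatality figures are estimates, not verified data):

          ```python
          ap_miles  = 10e9    # estimated Autopilot miles
          fsd_miles = 3.6e9   # estimated FSD miles
          deaths    = 54 * 2  # 54 US fatalities, doubled as a worldwide upper bound

          miles_per_death = (ap_miles + fsd_miles) / deaths
          print(miles_per_death / 1e6)         # ~125.9 -> one death per ~126M miles

          # US average: 1.33 deaths per 100 million miles.
          # At the estimated Tesla rate, 1.33 deaths take ~167 million miles:
          print(1.33 * miles_per_death / 1e6)  # ~167.5
          ```

          The comparison holds under these assumptions, but it hinges entirely on the unverified mileage estimates and the arbitrary doubling.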

        • KayLeadfootOP
          5 points · 3 months ago

          Bahaha, that one is new to me.

          Back when I worked on an ambulance, we called the no helmet guys organ donors.

          This comment was brought to you by PTSD, and has been redacted in a rare moment of sobriety.

        • @[email protected]
          3 points · 3 months ago

          I remember finding a motorcycle community on reddit that called themselves “squids” or “squiddies” or something like that.

          Their whole thing was putting road tyres on dirtbikes and riding urban environments like they were offroad obstacles. You know, ramping things, except on concrete.

          They loved to talk about how dumb & short-lived they were. I couldn’t ever find that group again, so maybe I misremembered the “squid” name, but I wanted to find them again, not to ever try it - fuck that - but because the bikes looked super cool. I just have a thing for gender-bent vehicles.

          • @[email protected]
            4 points · 3 months ago

            Calamari Racing Team. It’s mostly a counter-movement to r/Motorcycles, where most of the posters are seen as anti-fun. That’s their whole thing, not just a specific way to ride; they also have a legendary commenter who pays money for pics in full leather.

            • @[email protected]
              1 point · edited · 3 months ago

              That’s the one! Thanks, that was un-googleable for me.

              I guess the road-tyres-on-dirt-bikes thing was maybe a trend when I saw the sub.

  • That Weird Vegan
    2 points · 3 months ago

    If only there were a government department to investigate these kinds of things… Too soon?