Tesla: Failure of the FSD's degradation detection system [pdf]

(static.nhtsa.gov)

135 points | by doener 2 hours ago

8 comments

  • Animats 1 hour ago
    "In the crashes that ODI has reviewed, the system did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred."

    Does it not detect them at all, or does it fail to deal adequately with detected sensor degradation? Does "Full Self Driving (assisted)" slow down under conditions of poor visibility?

    Does Tesla even look for the road surface? One big advantage of those up-top LIDAR units is that you have a good scan of the pavement ahead. If you're not sensing flat pavement ahead, don't go there. That's basic. Vision-only systems, going back to Mobileye, have been overly dependent on looking for known kinds of obstacles. Original Mobileye could only detect car rear ends.

    • dstroot 1 hour ago
      I own two Teslas. When conditions are adverse (e.g. fog, heavy rain), the system simply shuts off and reverts back to manual driving. Elon has said several times that humans can drive with two eyes, so a Tesla should be able to drive with X number of cameras. However, it suffers from the same problem humans do: if it can't see, it can't drive, and ironically that's when it reverts back to human control.
      • recursivecaveat 1 hour ago
        I definitely agree that in principle a computer can drive with cameras alone. I just don't know whether that's a useful statement. A human can likewise determine the genre of a movie merely by watching it, but I wouldn't have suggested to Blockbuster in 1990 that they collect no genre metadata because the database server should sort it out on its own. (Nowadays somewhat feasible with ML, of course, but 20+ years later.) What sensors/data you need is a question of where computers are now or will shortly be, and it seems that for now they need the extra structure of LIDAR for best effectiveness.
        • carlmr 1 hour ago
          >I definitely agree that in principle a computer can drive with cameras alone.

          Obvious things first, cameras have way worse contrast and low light sensitivity than human eyes.

          Humans have a much more evolved logical thinking capacity; even the stupid ones can figure out things that modern AI struggles with.

          Humans have other sensors, too, which they use to plausibility-check the picture they see: one of the best sensor fusion systems on the planet.

          When in doubt, humans can figure out whether something is a lens occlusion or some other artifact in their vision simply by moving their head around.

          There are probably other things I'm not thinking of. In any case, to make full self-driving work, we should first use all available tech to make it safe. Once you have safe tech, you can slowly start removing individual sensors while verifying that safety remains high. As experience and the system evolve, there will be optimization potential.

          And until we have low-light sensitivity and high contrast figured out, cameras alone don't cut it.
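
          As a toy illustration of what that fusion buys you (a minimal sketch, not anyone's actual pipeline; all numbers hypothetical): two noisy range readings can be combined by inverse-variance weighting, and a disagreement beyond a few standard deviations is itself a signal to distrust the sensors.

              import math

              def fuse_ranges(z1, var1, z2, var2, k=3.0):
                  # Consistency gate: the difference of two independent
                  # Gaussian estimates has variance var1 + var2.
                  if abs(z1 - z2) > k * math.sqrt(var1 + var2):
                      return None  # implausible; flag it rather than trust either sensor
                  w1, w2 = 1.0 / var1, 1.0 / var2
                  fused = (w1 * z1 + w2 * z2) / (w1 + w2)  # inverse-variance weighting
                  return fused, 1.0 / (w1 + w2)            # fused variance beats either input

              print(fuse_ranges(42.0, 4.0, 40.0, 0.25))  # consistent -> (~40.1 m, ~0.24)
              print(fuse_ranges(42.0, 4.0, 25.0, 0.25))  # inconsistent -> None: hand back control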

          • Eridrus 39 minutes ago
            Unrelated to FSD, what's a good example where frontier AI struggles with logical thinking that even stupid humans can figure out?

            I personally feel like that isn't really true any more.

            • kerridge0 29 minutes ago
              A recent one was whether I should drive my car to the car wash if it's only 300 feet from my house, although that one wasn't a slam dunk.
      • renlo 1 hour ago
        > When conditions are adverse, i.e. fog, heavy rain, the system simply shuts off and reverts back to manual driving.

        I also own a Tesla, and there is no indication shown to the user that FSD's vision is degraded. They need to add this in.

        For example, numerous times I have been driving my Tesla with FSD activated and an ostensibly clean and clear windshield when suddenly the car will run its "clean the windshield in front of the camera" routine without any indication that the camera's view is degraded. If you haven't seen this routine: wiper fluid is dispensed and the wiper vigorously wipes in front of the camera only; the rest of the windshield gets just a cursory wipe.

        This indicates to me that the camera has poor visibility and that I, as the driver, am not informed or aware of it, which is concerning. I am often curious whether there is a thin occluding film on the windshield inside the camera box, or something else degrading FSD's vision, but Tesla neither gives you the ability to view the camera feed nor notifies you that vision is degraded. I suspect a thin occluding film, because my windshield outside the camera box started to show a thin chemical film after a couple of months, which apparently (according to a Google search) happens when a new car off-gasses, depositing a film of chemical byproduct on the glass. This is my first new car, so I have no idea whether that's normal.

        • dawnerd 24 minutes ago
          It absolutely could be a clouded windshield on the inside (where it's really hard for normal people to clean). When I got my last Model Y, I brought up that it was foggy and they said it was "fine". I took it in for service over a year ago and noticed they had cleaned it. Clearly it's a problem, but they're not being very transparent about it. I suspect they don't want to be, because the camera cover isn't easy for normal people to remove for cleaning.
        • bradfox2 50 minutes ago
          Maybe you have not received an alert, but yes, it does, and it's annoying as all hell. Dirt, sun, etc. all pop an alert about degraded performance.
      • 9dev 1 hour ago
        Birds can fly with two wings and humans should be able to fly with X number of limbs.
      • lateforwork 1 hour ago
        > humans can drive with two eyes and Tesla should be able to drive with X number of cameras

        Systems built from cameras that are only nearly as capable as human eyes, and software that is only nearly as capable as the human brain, will fall short overall. To match or surpass human performance, the individual components need to exceed human abilities where possible, and that's where LiDAR provides an advantage.

      • conductr 1 hour ago
        My Lexus does this too. I rarely get it due to weather; mostly it's how I know I'm past due for a car wash (dust on the sensors).

        In any case, it seems reasonable to me that the human should be making the decisions once conditions become adverse. It's a simple liability issue for the car company, but also I'd rather trust my own judgment than a system that's only 80% certain it's not driving me off a cliff.

      • XorNot 1 hour ago
        Well, that, but Elon is also downplaying the quality of the human vision system compared to the cameras Teslas have.

        They're just not that good: nowhere near human vision performance. And a human in a car has a surprisingly good view of the road and a very fast pan-tilt system for looking around.

        Teslas do not actually have 360-degree full binocular vision coverage, nor can a camera lean left or right to resolve an ambiguous sensor picture.

        So while I fully believe that vision-only self-driving could work, within the constraints of automobile platforms, and particularly Tesla's current camera deployment, it is not remotely close enough to human visual fidelity for that argument to hold.

        • buildbot 1 hour ago
          Literally some of the highest-dynamic-range sensors, attached to probably the best generally intelligent model we know of.

          Humans are hard to compete with! I'd want LIDAR & RADAR just to give me an edge.

      • giantrobot 1 hour ago
        > Elon has said several times that humans can drive with two eyes and Tesla should be able...

        And this is an amazingly stupid statement. Humans drive with most of their senses, not just vision. In fact, our proprioception plays an important role in driving.

        Even Tesla's use of cameras is poor because they're monocular and fixed in place. Most humans have binocular vision and those visual sensors have multiple degrees of freedom and the ability to adjust focus.

        Even if you wanted to use only vision for navigation, it's irresponsible not to use binocular configurations, which provide more reliable depth sensing.
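
        To put rough numbers on the depth point (a back-of-the-envelope sketch with a hypothetical rig, not Tesla's actual geometry): in the pinhole stereo model, depth is Z = f*B/d for focal length f in pixels, baseline B, and disparity d, and first-order error propagation says the depth error grows with Z squared.

            def stereo_depth(f_px, baseline_m, disparity_px):
                # Pinhole stereo model: Z = f * B / d
                return f_px * baseline_m / disparity_px

            def depth_sigma(f_px, baseline_m, z_m, disp_sigma_px=0.5):
                # First-order error propagation: sigma_Z = Z^2 * sigma_d / (f * B)
                return z_m ** 2 * disp_sigma_px / (f_px * baseline_m)

            # Hypothetical rig: 1000 px focal length, 30 cm baseline, half-pixel matching noise.
            print(stereo_depth(1000, 0.3, 10))  # 10 px of disparity -> 30 m away
            for z in (10, 30, 60):
                print(z, "m ->", round(depth_sigma(1000, 0.3, z), 2), "m depth error")
            # 0.17 m at 10 m, 1.5 m at 30 m, 6.0 m at 60 m: error grows as Z^2,
            # and a longer baseline divides it, which is what a single fixed camera gives up.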

      • michaelmrose 1 hour ago
        Remember that Elon isn't actually an engineer.
        • hn_acc1 1 hour ago
          This. He makes a lot of unfounded assertions.
      • UltraSane 1 hour ago
        "Elon has said several times "

        At this point I truly don't understand why anyone cares what that liar says.

        • maxerickson 1 hour ago
          It's a normative claim.

          It might be dishonest (if he doesn't believe it is possible), but I don't think he's saying that the current systems have reached the mark.

    • altairprime 1 hour ago
      > Does "Full Self Driving (assisted)" slow down under conditions of poor visibility?

      Under conditions of poor camera visibility?

      Human drivers can see under conditions that cameras cannot, and people will otherwise interpret “visibility” unpredictably (and with personal bias) as referring to human sight, camera processing, or lidar processing.

    • Sohcahtoa82 1 hour ago
      I think vision-only can certainly work for 99.9% of driving.

      But it's that 0.1% of situations where the results will be catastrophic. Sure, you can detect vehicles, traffic cones, bikes (both bicycles and motorcycles), people, mopeds, traffic lights, lane markings, everything you'd expect on a road.

      But what about the mattress that fell out of someone's truck? If the car doesn't know what a mattress is and what it looks like, it can't adequately determine its size from the monocular vision that Tesla has. Sure, maybe it could use motion vectors between video frames to make a guess, but I'm not convinced that will work well, especially relative to LIDAR.
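
      For what it's worth, the motion-vector idea is the classic "time to contact from looming" trick: closing on an object at constant speed, tau = s / (ds/dt) for image size s, with no need to know the object's true size. A toy sketch, numbers hypothetical:

          def time_to_contact(size_px, growth_px_per_s):
              # tau = s / (ds/dt); exact for a constant closing speed
              return size_px / growth_px_per_s

          tau = time_to_contact(80.0, 20.0)  # object spans 80 px, growing at 20 px/s
          speed = 25.0                       # m/s closing speed, known from odometry
          print(tau, "s to impact,", tau * speed, "m away")  # 4.0 s, 100.0 m

      The catch is that ds/dt comes from frame-to-frame matching, which is exactly what degrades on a big, low-texture object like a mattress; LIDAR measures the range directly.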

      Steering back to the subject at hand...

      > "In the crashes that ODI has reviewed, the system did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred."

      I don't think I've ever had my Tesla disable Autopilot based on road conditions, though maybe it's because when conditions are bad, I've just taken manual control preemptively. I've let it go through construction areas where cones are guiding traffic outside the painted lines, and surprisingly, it's handled it fine, though I've only done this at low speeds (~20 mph).

      Camera visibility is another story. In heavy rain at night, I've had it not allow me to enable AP, though I've never had it disable AP and tell me to take control. However, it HAS limited the cruise speed based on visibility.

      All this to say...

      ...anybody buying Tesla's FSD is being swindled, as far as I'm concerned. "FSD (Supervised)" is a scam. If you have to supervise it, it's not self-driving. It's just a party trick that you have to watch to make sure nothing goes wrong.

      • hn_acc1 1 hour ago
        99.9%? I'm not an expert on climate, but I would guess that at minimum 50% of the world faces snow, fog, or heavy rain while driving at times. In some places, 30%+ of all driving year-round could be in snowy conditions.

        99.9% of driving in sea-level, non-rainy, near-the-coast California/Austin weather, maybe. I would guess it's a no-go in inland foggy conditions in CA, for example, or in freezing rain in TX.

        In terms of ALL conditions? 60-70% maybe.

      • cornell532 1 hour ago
        Mine shuts off because of conditions regularly. It's very annoying.

        FSD is a better driver than me 99% of the time.

  • starkeeper 1 hour ago
    "Cameras only" is a cost cutting for profit only feature that is subject to Wile E. Coyote attacks.

    It is a shameful engineering design to leave out LIDAR and it has cost human lives.

    Let's hope Musk does not leave out something important for the moon landing. His proposal for it is absolutely ridiculous; it looks like a children's book fantasy, and many smaller, top-heavy craft have already toppled over on the moon!

    • shmoe 1 hour ago
      NASA already let 'em off the hook for testing an elevator to get down to the surface from the ship. The best part is no part/elevator!

      One giant leap?

      • jedmeyers 1 hour ago
        > One giant leap?

        The gravity is weaker so just jump down /s

        • xnx 1 hour ago
          40 foot drops would be survivable on the moon.
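
          Back-of-the-envelope check (simple kinematics; ignores suit mass and landing posture):

              import math

              g_moon, g_earth = 1.62, 9.81      # m/s^2
              h = 40 * 0.3048                   # 40 ft = ~12.2 m
              v = math.sqrt(2 * g_moon * h)     # impact speed: ~6.3 m/s
              h_equiv = v ** 2 / (2 * g_earth)  # ~2.0 m: like a 6.6 ft drop on Earth
              print(round(v, 1), "m/s, equivalent to a", round(h_equiv, 1), "m drop on Earth")
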
    • IncreasePosts 1 hour ago
      What are wile e coyote attacks? Painting a tunnel entrance on a wall?

      If Tesla FSD is better than the average driver using it, isn't it still a net win, even if it might crash in different scenarios than a human? Especially because a human has a window to override FSD, but FSD doesn't really get a chance to override a human, except in limited scenarios like automatic emergency braking. And it gets more people using it by providing FSD at a lower cost?

      • starkeeper 33 minutes ago
        >> What are wile e coyote attacks? Painting a tunnel entrance on a wall?

        Yes!!! Thank you, hopefully I will get credit for inventing this attack :)

        >> IncreasePosts 39 minutes ago | parent | context | flag | on: Tesla: Failure of the FSD's degradation detection ...

        What are wile e coyote attacks? Painting a tunnel entrance on a wall?

        >> If Tesla FSD is better than the average driver using it, isn't it still a net win, even if it might crash in different scenarios than a human?

        I don't think so, because it is fooled by simple things that could easily be prevented, and counting on a human to override is very risky because the human is simply not alert in passive mode.

        I think cameras are great, but there is no excuse not to also use LIDAR.

      • kakacik 26 minutes ago
        What if I am a better-than-average driver? That's a very low bar for success anyway, and it's not going to fly with the general population, with laws, and so on.

        Tesla cars killing people would be all over the news each time, and nobody would care that a similar or even marginally smaller number of people would have died anyway. People simply expect far more in exchange for giving up control.

        Is it really that hard to see?

      • ModernMech 50 minutes ago
        > What are wile e coyote attacks?

        https://www.thedrive.com/news/tesla-autopilot-fails-wile-e-c...

        > If Tesla FSD is better than the average driver using it, isn't it still a net win, even if it might crash in different scenarios than a human?

        That was the analysis when the industry was in its infancy. Now that the driverless car industry has been operating for a decade-plus, a lot more work has to go into that argument for people to accept it; it's not really clear that it pans out.

        For example, today you can look at a car and predict how it's going to behave, because you have a good model for how people drive. But say that in the future driverless cars are much "safer" on paper than human drivers, yet behave so differently from them that it's hard for people to predict their behavior.

        Now you've created a highly dynamic system where you don't have a good model for all the actors, because some behave one way and others behave a completely different way. Does this increase the overall safety of the system or decrease it, despite the new actors being statistically safer than the current ones?

        I don't think you can say with great confidence what will happen just by looking at crash rates and comparing them to the current system. You're going to change the whole system by introducing large numbers of actors who "crash in different scenarios than a human".

    • delabay 1 hour ago
      The dreaded Wile E Coyote attack, the bane of every commuter's existence.
    • DonHopkins 1 hour ago
      His moon lander will deploy a parachute that keeps the lander suspended for as long as it takes for the AI to grok the fact that there is no air on the moon, and then it finally falls according to cartoon physics with a whistling sound you can hear through the vacuum.

      https://www.youtube.com/watch?v=wRSHzenjiNA

    • ModernMech 1 hour ago
      Yes, this was something the industry figured out in 2007. But because Musk has a lot of money, people do whatever he says, no matter how ignorant, deadly, and dangerous. He never even had a cogent rationale, just absurd amounts of money. The shame is so profound and widespread that it's hard to fathom, really.
      • starkeeper 31 minutes ago
        The renderings look like the cover of a young-adult sci-fi novel by Robert A. Heinlein. Have Space Suit, Will Travel comes to mind. Probably the first true science fiction I ever read!
    • loxodrome 1 hour ago
      The introduction of self-driving technology at scale will inevitably result in a few accidents no matter how many sensors are used. It's the same with every new technology deployed in high-risk situations, including motor vehicles themselves. Even malfunctioning airbags have caused fatalities. The important thing is to identify the issue early so the company can address it before more people get hurt, which the ODI in this case is thankfully doing.
      • starkeeper 29 minutes ago
        There is still no excuse for not using LIDAR in addition to anything else.
  • darkwizard42 1 hour ago
    There's not too much to specifically take away yet, but it appears that the degradation detection system did not function well. That is pretty egregious for FSD, given that a human won't be able to tell whether FSD is confident or needs the human to intervene. I'd expect this to be a VERY important test case with a high reliability of passing, but who knows.

    Overall, yikes.

    • himata4113 1 hour ago
      A reflection on a white truck makes it confident that it can move into that truck. I really have nothing else to say (this happened to my friend).
  • jonthepirate 1 hour ago
    Tesla is a premium product; if someone is going to use FSD, they know it's a luxury feature and should pay for the most comprehensive safety features available, which in my mind would (of course) require lidar.
    • mbreese 1 hour ago
      > Tesla is a premium product

      I'm not sure that's the case anymore. Each Tesla model has gotten more spartan over the years, and the interiors have never been all that "premium" compared with other manufacturers'. They should still offer the most comprehensive safety features, but whether or not that's because of "luxury", I'm not sure.

    • ModernMech 1 hour ago
      Tesla is an expensive product, but it's never been premium as in "high quality". The interior has always felt as bare as a portapotty, the body panels have always been misaligned, and products like the Cybertruck have trim that is glued on and can be ripped off with a pinky finger. And of course there are the driver-assist systems that have been known to decapitate multiple people and get fooled by a cartoon wall.
  • jedberg 1 hour ago
    Ironic to see this on the front page just next to the report about Waymos being 13 times safer than humans.
    • kirubakaran 1 hour ago
      Not surprising at all though, considering the "safety first" vs "yolo" approaches, right?
      • jedberg 55 minutes ago
        Not in the least. Cutting costs by only using cameras and then claiming "well humans can do it so why not us" is one of the dumbest things I've ever heard.
  • ares623 1 hour ago
    This moved from the top spot on the front page so fast
    • haunter 48 minutes ago
      Tesla / Elon is one of the topics where HN can't have a meaningful discussion.

      Some people upvote everything slightly negative about the topic: "see how bad it is!!!"

      Some people flag everything slightly negative about the topic: "we'd rather not let you see how bad it is"

    • kirubakaran 1 hour ago
      Looks like you're right: https://hnrankings.info/47445175/
  • mrguyorama 1 hour ago
    >In each of these crashes, FSD also lost track of or never detected a lead vehicle in its path.

    Oh good, Tesla vehicles apparently struggle with the task of "Hey, there's a car there" in degraded conditions.

    Probably don't need to worry about that while driving though.

    >Tesla also described internal data and labeling limitations that prevented a uniform identification and analysis of crash events with the subject system engaged. ODI believes this limitation could have led to under-reporting of subject crashes over portions of the defined time-period.

    I thought Tesla was a "Software" company!

    This report is insanely vague, though. It's very preliminary; the investigation was opened yesterday.

    • kvuj 1 hour ago
      > This report is insanely vague though. It's very preliminary, opened yesterday.

      Yeah I think posting this here is premature without any details.

      Maybe I'm misremembering things, but I feel like 4-5 years ago we didn't have these clickbait headlines that fed political discourse. It feels like Reddit culture has permeated this place for a while.

      Anytime one of Elon Musk's companies has a misstep, the headline violently shoots to the top of the front page.

      • whoknowsidont 1 hour ago
        It's not premature. Every single expert in this field has warned about these issues since ~2012, when these types of platforms were first being publicly discussed.

        This is an expected and understood result given the hardware and software involved.

        You will not get past these issues without a RADICAL improvement in camera technology, paired with specialized, dedicated processing hardware matched against several (and I mean several, several) "common" environment profiles.

        FSD is a scam. It's not safe. It is not technically sound.

        The fact that there aren't many more accidents with the system is a byproduct of consistent and well-thought-out road standards, car standards, other safety systems present on cars, and driver education.

        • resfirestar 1 hour ago
          You’re just reciting your priors, which I think supports GP’s point: no one is getting new information out of the posted link, so it’s probably premature to comment on it.
          • whoknowsidont 53 minutes ago
            You are misusing some of those words, and I'm not sure how to interpret them even with a hefty dose of good-faith reading.

            The report is not premature, and it's not premature to comment on it.

            Can you clearly and explicitly state why you feel the report or the commentary is premature?

            • resfirestar 22 minutes ago
              I was agreeing with kvuj and mrguyorama: the original link is an announcement that an investigation is happening, and it's too early in the process to productively discuss it. People have very strong, emotional pro or anti stances on the Tesla Vision system in general, and love an excuse to have the debate again, but you might notice that the commenters here stating their stances don't reference any specific facts from the linked report to support their arguments. That's because the report is still vague at this stage and doesn't provide any specifics that would inform the discussion.
      • corygarms 1 hour ago
        To be fair, Tesla vehicles are recalled more than those of any other automaker, and it isn't close: https://www.autoweek.com/news/industry-news/a43625242/tesla-...
      • dkenyser 1 hour ago
        > clickbait titles that fed political discourse.

        Eh, while I agree with you about the permeation of Reddit culture on this board, this post is in no way clickbait or political in nature.

        In fact, the title of this post is literally copied and pasted from the problem description.

      • ModernMech 1 hour ago
        People have been warning about this for over a decade, others have died as a result of the lack of action, and yet we're still sitting on our hands waiting for the government to catch up to what experts have known for years: Tesla autonomy is fundamentally busted/cooked/broken and needs to be outlawed.

        The reason this stuff shoots to the top is that Elon Musk and his companies are red-alert menaces to society. People are sick of him and the damage he causes with his money, and wish he and his cars would just fuck off for good. Whether it's his cars slamming into people and property, his website spreading hate, his starships raining fiery debris, or him personally taking an axe to government programs we rely on, everyone has cause to be absolutely done with his antics.

        But since businesses can apparently unleash autonomous murderbots onto the public roadways with zero repercussions for 10+ years, I guess we'll have to settle for endless flamewars about Musk's campaign of destruction on HackerNews instead.

        • sonofhans 35 minutes ago
          You know, I agree with everything you said, and I still wish this discussion weren’t happening on HN.