• ado__dev 16 hours ago

    I've had FSD since the very first beta and honestly even the 13-mile number is generous. Maybe with freeway-only driving it's every 13 miles. On city streets, it's more like every 1-2 miles requires a manual intervention, unless you want to be the biggest nuisance on the road and a total jerk to everyone around you.

    • Someone1234 15 hours ago

      That's really why "miles" is a poor measure. It implies going faster is safer.

      It would be better to list intervention per hour and then list the categories of driving (city, rural, highway).

      • buran77 15 hours ago

        Isn't it exactly the opposite of how you describe it?

        > That's really why "miles" is a poor measure. It implies going faster is safer.

        A mile is a mile no matter how fast you drive. It means that in a 13mi trip statistically you will have to intervene once regardless of the speed.

        > It would be better to list intervention per hour

        And this means that if you have a 13mi trip you'll just floor it to reduce the trip time, and with it the expected number of interventions (at a fixed per-hour rate).

        I agree with mentioning road type and other conditions, like weather. You want to know if the system is significantly worse in snow or fog.
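
        For what it's worth, here's a minimal sketch (all numbers invented) of the breakdown being proposed: the same intervention log normalized per mile and per hour, split by road type. Which normalization looks worse depends entirely on average speed, which is exactly why breaking it out by category matters.

        ```python
        # Toy illustration with made-up numbers: one intervention log,
        # normalized two ways. Road types differ in average speed, so
        # the per-mile and per-hour pictures need not agree.
        logs = {
            # road type: (miles driven, hours driven, interventions)
            "city":    (100.0,  5.0, 50),   # ~20 mph average
            "rural":   (300.0,  6.0, 15),   # ~50 mph average
            "highway": (650.0, 10.0,  5),   # ~65 mph average
        }

        for road, (miles, hours, n) in logs.items():
            print(f"{road:8s} {n / miles:.3f} per mile   {n / hours:4.1f} per hour")
        ```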

      • sandworm101 15 hours ago

        >> unless you want to be the biggest nuisance on the road and a total jerk to everyone around you.

        That is likely a selling point for many customers. Call it compensation for the lack of loud tailpipes on electric cars.

        • aaomidi 14 hours ago

          It’s…not. lol

      • carlgreene 16 hours ago

        I had a really negative view on FSD from just reading and seeing stuff online until I finally decided to rent a Model Y on Turo with FSD...I was absolutely blown away.

        I drove it from Houston to Amarillo (600 miles) and had to touch the wheel only a couple times. That includes pulling me off the freeway, into the freaking parking spot next to the Supercharger, and finally through my neighborhood to the front of my house.

        For the price I don't think the MY or M3 can be beat, and they will surely be very high on the list for my family's next vehicle.

        • buran77 15 hours ago

          > I was absolutely blown away.

          All my acquaintances who have Teslas and FSD insisted on the same. But every time I hitched a ride as a passenger, most of the time with the explicit goal of seeing this miracle in action, I mainly got a long string of "that shouldn't have happened, usually it's a lot better".

          I'll happily take the blame and assume I'm a jinx, but it's more likely that Tesla owners get really enthusiastic about their purchase and prop it up by recounting the good experiences to anyone who will listen, while committing the "oh God my life flashed before my eyes" moments to /dev/null. Nobody likes the discomfort of post-decision dissonance.

          • whiplash451 15 hours ago

            While people certainly appreciate anecdata on self-driving experiences, by the very nature of driving incidents/accidents (very rare events, highly unevenly distributed along a power law), anecdata will always be detrimental to the collective understanding of FSD.

            We really need to stop sharing personal stories on FSD. What we need is a public statistical description of the situation.

            • daveguy 15 hours ago

              Highway driving 600 miles is significantly different from driving in a city. You should test it around town first and hope for no regressions. Yes, once through the neighborhood was good, but that's probably also significantly less than 13 miles.

              • llamaimperative 16 hours ago

                [flagged]

                • carlgreene 16 hours ago

                  If you are going to talk about statistics then please cite the accident statistics of people using things like FSD vs. not...

                  From the horse's mouth it appears using things like FSD is considerably safer than not[0]. While an independent statistic would be nice, I couldn't find one. Would appreciate seeing your source

                  [0]: https://www.tesla.com/VehicleSafetyReport

                  • llamaimperative 16 hours ago

                    I used your metrics -- a few (let's say 3) interventions over 600 miles.

                    Tesla doesn't publish the relevant stats, probably because they're so outrageously good and strong and perfect that the public can't handle how amazing FSD works in reality.

                    Edit in response to your edit: You're showing me Autopilot data, which 1) is a different system and 2) is only usable on highways, where accidents per mile are far rarer to begin with, so even if it did nothing at all you'd expect to see a massive reduction in accident frequency while "using Autopilot." Given that Autopilot is mostly ADAS features, you'd also expect a large reduction. The better comparison point is "United States recent luxury model highway miles": distance sensors, lane assist, auto-braking, etc. all very obviously improve safety.

                    • Someone1234 16 hours ago

                      Tesla doesn't release raw data, so it's hard to read anything into their claims either way. I do suspect, though, that if the raw data were beyond reproach there would be a zip file of CSVs for third parties/academics to go over.

                      • carlgreene 16 hours ago

                        Also, intervention !== accident. Not sure where you got that conclusion.

                        • llamaimperative 16 hours ago

                          Were you intervening on a system that was working perfectly?

                          "Using FSD as full self driving" == no intervention possible. Would you not have crashed if you couldn't intervene?

                          • andrewstuart2 15 hours ago

                            I've used FSD as a subscription from time to time. I can't think of an intervention (and there have been many) that was to avoid an accident. They were all things like "that's closer to those cones than I'd like to be," "okay, apparently missing lane markers that have yet to be repainted is understandably hard," or "that was a trailer speed limit sign -- we're not towing so now we're going way below the posted limit."

                            • llamaimperative 15 hours ago

                              The first two examples, in most scenarios, would yield an accident if unrecovered. The last is also quite dangerous on highways (because driving way below posted limits causes accidents).

                            • ericd 15 hours ago

                              A lot of my interventions when trying it out were things like it going too close to curbs for my comfort, so more violations of my own defensive driving standards rather than potential accidents. But! It did have some near traffic violations that would have happened if I hadn’t intervened.

                              But the distance between my own driving standards and accident is large.

                              • llamaimperative 15 hours ago

                                I don't get it. What does "too close to X for my comfort" mean if not "I believed I was going to make contact without intervening?"

                              • Retric 15 hours ago

                                Not all failures relate to safety. Self driving needs to get somewhere specific reasonably quickly, and avoid accidents.

                                • pants2 15 hours ago

                                  There's a pretty big spectrum between perfect driving and a collision. FSD could drive on the wrong side of the road for miles with no accident because it's an empty road, or oncoming traffic swerved to avoid you.

                              • ra7 14 hours ago

                                That report is pretty much worthless and fails any sort of statistical rigor. It makes many apples-to-oranges comparisons and doesn't control for variables such as road type (highway/city), geography, time of day, age of cars, and safety features.

                                Tesla refuses to provide raw data like the other self driving companies do to avoid scrutiny.

                            • modeless 15 hours ago

                              This is true, however it has been improving quickly. Releases are coming once every couple of months. There is already a newer release than the ones they tested, and each release is noticeably better. There is no indication of a ceiling yet.

                              I do take issue with the claim that "its seeming infallibility in anyone's first five minutes of FSD operation breeds a sense of awe that unavoidably leads to dangerous complacency". Waymo claims this as well, citing it as the reason they never released a driver assistance feature and went for full autonomy instead. However, this is essentially speculation that has not been borne out in practice. There has been no epidemic of crashes from people abusing Autopilot or FSD. It's been on the roads for years. If "dangerous complacency" were a real problem, it would be blindingly obvious from the statistics by now. There have been just a few well-publicized cases, but statistically there is no evidence that this "dangerous complacency" problem is worse than normal driving.

                              • deely3 12 hours ago

                                > each release is noticeably better.

                                FSD beta was introduced four years ago. How many more updates will it take before FSD achieves actual full self-driving capability?

                                • modeless 12 hours ago

                                  This is the trillion dollar question. The pace of improvement has increased now that they are using an end-to-end neural net architecture. My personal guess is that they will be able to reach Waymo's current level within 5 years, possibly significantly sooner. But I still expect that a car computer upgrade will be required, and probably a camera upgrade too.

                                  I know Elon says "next year" every year, but put aside his delusional timelines and think about the monumental change that will occur in the world once a car can drive itself better than a human with ~$1000 worth of hardware. Does it really matter if it takes a few extra years? I can wait five more years for that!

                              • pavlov 15 hours ago

                                Like a fool I purchased the FSD feature on a new Tesla in March 2019. All this time later, it still does absolutely nothing in my country. It’s actively dangerous to use because it can’t even recognize speed limits and will happily drive at 120 km/h in a 100 km/h zone.

                                I’m going to get rid of the car soon. This feature cost 7,500 euros but its resale value is essentially zero because everyone knows it’s a complete joke.

                                Obviously my next car won’t be from this scam company. Worst purchase I ever made.

                                • Larrikin 15 hours ago

                                  If you sell the car can the software locked features actually be transferred to the next owner? It seems exactly the kind of thing that would be disabled and require the new owner to purchase.

                                  • ado__dev 15 hours ago

                                    They do transfer to the next owner if you sell it, unless you sell it/trade it in to Tesla. In that case they remove it and charge for it again (or sometimes to boost sales they'll include FSD in used cars)

                                • adamwong246 15 hours ago

                                  Why should I trust Tesla's AI with my life, much less everyone else's? They couldn't even get the Cybertruck's trim right! It's wild that we have not demanded greater governmental oversight of consumer AI products, but in time it will become inevitable.

                                  • ravenstine 15 hours ago

                                    The name being false advertising should be reason enough to never use it.

                                    • apelapan 15 hours ago

                                      It doesn't maintain constant altitude over sealevel! You can't give it a compass bearing and have it follow that in a perfectly straight line!

                                  • nkrebs13 15 hours ago

                                    FSD is getting so good so fast. The difference between 1 year ago and now is night and day. It's a godsend for road trips and it amazes me with each passing month's improvements.

                                    It's not perfect and people shouldn't expect that. But I don't understand how anyone experiences FSD and isn't amazed. It's not unsafe -- if anything, my interventions are because it's being _too_ safe/timid.

                                    Weather forecasting isn't perfect. But it's pretty good! And it's getting better! Just because weather forecasting isn't perfect doesn't mean I won't use it and it doesn't mean we should stop improving it.

                                    • misiti3780 15 hours ago

                                      There are a lot of smart people on here that are jealous of Musk's accomplishments and/or disagree with his politics.

                                      • trog 4 hours ago

                                        Is that it? Or is that we read articles like this one based on actual data from actual real world testing and evaluation, and recognise that the risk of failure is probably beyond what most of us are willing to tolerate?

                                      • alphabettsy 15 hours ago

                                        Running red lights and stop signs is unsafe. I’ve experienced it and there are many videos demonstrating it including very recently.

                                        It’s amazing. It’s better than adaptive cruise control, but it has a long way to go because it’s often enough unsafe.

                                        • thimabi 15 hours ago

                                          The problem is that, as it stands today, FSD requires constant human attention to catch its occasional mistakes. Humans are notoriously bad at that, and the promises of FSD make drivers even more oblivious to their surroundings, ultimately worsening this issue.

                                        • avalys 15 hours ago

                                          How does it respond when there is a police officer directing traffic?

                                          What if the police officer gives a verbal command?

                                          FSD, as Tesla markets it, is hype. They are nowhere close to a marketable solution for the use cases they advertise: robotaxis, the ability to step out of your car and have it park somewhere else by itself, etc.

                                          Yes, they will get it to 99% at some point - but 99% is not good enough for legal liability.

                                          FSD is an ambitious goal and I don’t criticize Musk for pursuing it. But I will criticize him for using it as a distraction from the fact that Tesla has a stale and/or uncompetitive product line and is rapidly losing their lead to both conventional OEMs and Chinese EV makers.

                                          • SmartJerry 12 hours ago

                                            I disagree that a "99%" level won't be good enough for legal liability. Let's say normal human drivers make an error once every 10 miles, but Tesla's FSD makes an error once every 100 miles. The obvious choice is FSD even though it is not 100% perfect; it is 10 times better than a human driver. They might even make it illegal to drive without FSD once it becomes widely available.

                                            In the end, all the speculation about how good it is or can be means absolutely zero, because the results will be determined by insurance companies, including Tesla, which has its own insurance arm. If FSD actually causes more crashes, insurance companies will start charging higher rates, fewer people will buy Teslas, and so on. But we aren't seeing that: with FSD already out there, the insurance premiums for Teslas are not skyrocketing to cover all the accidents.

                                            I sort of glossed over this, but Tesla also provides insurance, which guarantees at least one market participant in FSD. If they cannot make money insuring their own cars, FSD will fail; if they can run a profitable insurance company insuring them, FSD will succeed.

                                            • margalabargala 15 hours ago

                                              As is often the case with these things, it will keep being worked on until it's essentially a solved problem.

                                              On the front page right now is the 10-year-old xkcd comic about how determining if a photo is of a bird takes a team of researchers. This is now trivial.

                                              Cars driving themselves are doing better than they were 3 years ago. Three years from now they will be better still. I don't know how long it will take, and I don't know who will get there first, but I will bet you that the people who are currently in their 50s will never have to give up driving, because by the time they are too old to drive, self-driving will be good enough for them to use.

                                            • jphalimi 15 hours ago

                                              I am so, so tired of Tesla's claims that FSD is "multiple times safer than humans" when the data behind these claims comes mostly from people using FSD in exactly the safe environments that made them willing to engage it in the first place (mostly long, straight highways).

                                              Anyone trying FSD in a crowded city environment would shit their pants. Unprotected lefts are very often a mess, and interventions are legion. It is really a breath of fresh air to finally hear news outlets report on the actual state of the technology.

                                              • whiplash451 15 hours ago

                                                It's even worse than this: there is no evidence that FSD is safer on highways due to the extremely low number of bad events there.

                                                It would take literally hundreds of millions of miles driven on highways to get statistical significance.

                                                This is also why L3 is going to be much harder than people think.
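
                                                To put rough numbers on that, here's a back-of-the-envelope sketch; the rates below are illustrative assumptions, not measurements. Treating crashes as Poisson events, a difference between two rates r1 and r2 only becomes detectable at ~95% confidence after roughly M > z^2 * (r1 + r2) / (r1 - r2)^2 miles:

                                                ```python
                                                # Rough sample-size estimate for telling two crash rates
                                                # apart (rates below are assumptions for illustration).
                                                def miles_needed(r1, r2, z=1.96):
                                                    return z**2 * (r1 + r2) / (r1 - r2) ** 2

                                                r_fatal    = 1.5e-8  # ~1.5 fatal crashes per 100M miles (rough US figure)
                                                r_reported = 2.0e-6  # ~200 reported crashes per 100M miles (assumed)

                                                for label, r in [("fatal", r_fatal), ("reported", r_reported)]:
                                                    # suppose the system halves the rate
                                                    print(f"{label}: ~{miles_needed(r, r / 2):.1e} miles")
                                                ```

                                                Under those assumptions it's ~1.5e9 miles to detect a halving of fatal crashes and ~1.2e7 miles for reported crashes, which is why only coarse outcomes are measurable even at fleet scale.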

                                                • modeless 12 hours ago

                                                  Hundreds of millions of miles? That's no problem at all. FSD has already been driven over a billion miles. Basic Autopilot, 9 billion.

                                              • y-c-o-m-b 8 hours ago

                                                FSD?! They can't even fix the damn windshield wipers!

                                                The car ahead of me decides to clean their windshield and 3 droplets get on my windshield? Tesla: WIPER LUDICROUS SPEED ENGAGED!

                                                I just drove next to a semi and got blasted with a tsunami of water in a rain-storm? Tesla: ...

                                                • tzs 7 hours ago

                                                  I'm baffled by Tesla's approach to this.

                                                  There are two ways cars currently do automatic wipers.

                                                  One way uses a very clever, simple, cheap but very effective sensor. Technology Connections did an episode [1] a few months ago on these sensors. The electronics in it are just a couple infrared LEDs and infrared photodiodes.

                                                  The other way is to use a camera, such as the camera you already have for your driver assist functions, and try to use computer vision software to figure out how much rain there is.

                                                  Pretty much everyone except Tesla uses the first way. Tesla uses the second way.

                                                  What baffles me is that of the three options they might have pursued:

                                                  1. Use a rain sensor like the ones everyone else uses,

                                                  2. Use a camera and computer vision software which doesn't work as well, and

                                                  3. Do not have automatic wipers,

                                                  it seems to me that #2 is the worst option. Automatic wipers aren't really needed. Once you get used to where the controls are and how they work on a car without automatic wipers, adjusting your wipers for the current conditions becomes one of those things you do completely unconsciously.

                                                  So if my next car, like my current car, does not have automatic wipers it is not going to bother me.

                                                  But if it has them and they don't work well, that very likely will annoy me, at least until I get tired of having to manually intervene and turn auto mode off, or until manual intervention becomes something I do unconsciously and forget about. In either case I end up back in a situation equivalent to a car that doesn't have automatic wipers, but with a lower opinion of my car maker.

                                                  Thus this seems like a feature that the car company is better off not doing unless theirs is as good as what everyone else has, which currently means using the IR LED/photodiode sensor.

                                                  [1] https://www.youtube.com/watch?v=TLm7Q92xMjQ
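
                                                  Part of the sensor's appeal is how little logic it needs downstream. A hypothetical sketch (illustrative only, not any vendor's actual firmware): the IR beam is totally internally reflected off the outer face of the glass back into the photodiode, water droplets couple light out, and the fractional signal loss maps almost directly onto a wiper setting.

                                                  ```python
                                                  # Hypothetical IR rain-sensor control loop (illustrative only,
                                                  # not any vendor's actual firmware). Water on the glass leaks
                                                  # IR light out of the windshield, so a lower photodiode
                                                  # reading means more rain.
                                                  DRY_BASELINE = 1.00  # normalized reading on dry glass

                                                  def wiper_speed(reading, baseline=DRY_BASELINE):
                                                      """Map fractional signal loss to a wiper setting 0-3."""
                                                      loss = max(0.0, (baseline - reading) / baseline)
                                                      if loss < 0.05:
                                                          return 0  # dry: off
                                                      if loss < 0.15:
                                                          return 1  # light rain: intermittent
                                                      if loss < 0.35:
                                                          return 2  # steady rain: low
                                                      return 3      # downpour: high

                                                  for reading in (1.00, 0.93, 0.78, 0.55):
                                                      print(f"{reading:.2f} -> speed {wiper_speed(reading)}")
                                                  ```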

                                                • throwaway2016a 15 hours ago

                                                  Like others here, I think 13 miles may be generous for city driving but pretty reasonable for highway.

                                                  With that said, it works MUCH better for me than it did a couple years ago and I find most of the time I disengage it is not because it actually needed human intervention but because it wasn't aware of the social norms at certain intersections.

                                                  For example, I have an intersection near my house that requires you to inch out and essentially floor it first chance you get if you want any hope whatsoever of taking a left hand turn, but FSD will (understandably, I think) not do that.

                                                  • shellfishgene 15 hours ago

                                                    In Germany someone is suing Tesla over "phantom braking", and the judge ordered an independent court-appointed expert to check. After 600 km of driving, the car braked without reason, and the expert wrote in his report that the situation was dangerous enough that he had to stop any further testing. This is now on the official record and can be referred to in other lawsuits. We'll see what happens...

                                                    • seshagiric 15 hours ago

                                                      I use the monthly subscription every couple of months on my Model Y, and FSD has become quite good. For me the two recurring problems are that it does not follow traffic rules when merging, and that navigating to an exit in bumper-to-bumper highway traffic simply does not work. In the rest of the cases it's pretty good, or, to put it a better way, the best on the market right now.

                                                      • bigtones 16 hours ago

                                                        Waymo requires interventions about that often driving in San Francisco as well, in my experience over many trips. Its interventions are remote: the car calls back to home base, and an operator makes a determination about how to proceed. That happens about once every half hour when I travel by Waymo in SF.

                                                        • panarky 16 hours ago

                                                          I've taken hundreds of Waymo rides in SF, LA and Phoenix and the car only needed assistance one time when it got stuck in the middle of a protest.

                                                          • Workaccount2 15 hours ago

                                                            I'm wondering if OP is assuming an intervention is happening rather than actually knowing one is.

                                                            • misiti3780 15 hours ago

                                                              Or when there is an unmapped construction site, or when there is anything not mapped correctly.

                                                              • panarky 13 hours ago

                                                                In my experience Waymo cars handle all sorts of unmapped and unpredictable situations very well. New construction sites, merging lane closures, detours, accidents, road signs that have been defaced or knocked down, malfunctioning signal lights, jaywalkers, bike riders running red lights, delivery trucks blocking lanes, double-parked cars, even missing manhole covers, debris in the roadway, etc.

                                                              • dingnuts 15 hours ago

                                                                I just have to call out how absolutely absurd this statement is as an example of a lived West Coast USA stereotype.

                                                                Of course, the self-driving car produced by silicon valley meets its natural predator in the wild of California -- a protest! Hysterical. You can't make this stuff up. Absolutely fantastic example of life being better than fiction.

                                                              • xnx 16 hours ago

                                                                How do you know when a Waymo vehicle is receiving guidance from remote support?

                                                                • tivert 16 hours ago

                                                                  >> Their interventions are automatic when the car calls back to home base to make a determination as to what to do next and the operator makes a choice on how to proceed. Happens about once every half an hour travelling on Waymo in SF for me.

                                                                  > How do you know when a Waymo vehicle is receiving guidance from remote support?

                                                                  If it calls back for guidance like that, I'm sure there'd have to be a noticeable delay where the car stops and waits for a response. It's going to take at least 10-30s for a human to context switch into the car's situation and input the guidance, probably longer.

                                                                  To get a faster response from a human, the human would have to be continuously engaged with the car's driving task.

                                                                  • toast0 15 hours ago

                                                                    If the need for intervention is predictable, like a problem intersection that's difficult to avoid, they could conceivably assign an operator early, so they have time to engage and provide support immediately when required.

                                                                    It would be pretty neat if the Waymo itself could accurately predict its need for assistance in the near future as well. But I'm sure that requires pretty specific conditions.

                                                                  • bigtones 6 hours ago

                                                                    The car tells you verbally and shows you on the screen that it is calling back to base to ask for assistance "to get you moving again".

                                                                  • iwontberude 16 hours ago

                                                                    I took four trips the other day and couldn’t notice any interventions at all. So maybe they are just so sneaky that I can’t notice.

                                                                  • bdjsiqoocwk 16 hours ago

                                                                    "requires human intervention every 13 miles" is a horrible metric, because it makes it sound it's not so bad until you remember that the moments for intervention are unpredictable and are also when you're just about to die.

                                                                    • QuercusMax 16 hours ago

                                                                      MTBF of 13 miles, where "failure" could mean "fatal auto crash", sounds pretty terrible to me. That's 5 times per hour at 65 MPH.

                                                                      • enaaem 16 hours ago

                                                                        Experienced drivers often consider driving to be a relaxing activity. With FSD you practically become a driving instructor who has to be super alert all the time.

                                                                        • LASR 16 hours ago

                                                                          This is exactly how I feel about FSD in my Model 3.

                                                                          It’s like supervising a teenager learning to drive, ready to take over at anytime.

                                                                          It’s a lot less stressful to just drive myself.

                                                                          • pants2 15 hours ago

                                                                            I do find that the rule-based, classic CV "lane assist / adaptive cruise control" of other makes is preferable, because it's extremely predictable. Tesla's full neural model is less predictable.

                                                                            • QuercusMax 15 hours ago

                                                                              I didn't hate that stuff on my 2019 Nissan Leaf. It was kinda funny how it would bounce back and forth between the lane lines sometimes. I didn't trust it enough to ignore the road, just enough to be able to switch my music without worrying about crashing.

                                                                            • gessha 13 hours ago

                                                                              I’m being nitpicky here but if you’re not driving in an instructor’s car and have a second steering wheel and pedals, you can’t take over. You can only touch the steering wheel from the passenger seat and maybe the handbrake. Full control including pedal control is impossible.

                                                                              As of now, I’d trust a teenager I’ve personally taught over some ML system any time.

                                                                        • jjk166 16 hours ago

                                                                          I mean, during regular driving there are moments requiring intervention that occur at unpredictable times, and where you're about to die if you don't respond properly; they're just awash in a sea of lower-stakes interventions. My commute is about 3 miles, and this morning I had two people in front of me slam their brakes unexpectedly and another person swerve into my lane just ahead of me. So that's roughly one high-stakes human intervention per mile.

                                                                          I'm not saying that one intervention per 13 miles is good enough, but the metric does seem useful.

                                                                          • SketchySeaBeast 15 hours ago

                                                                            I think the difference is that you're already in "prepared to intervene" mode at that time, with hands on the wheel, foot on the pedal, already observing traffic.

                                                                        • oxqbldpxo 15 hours ago

                                                                          I enjoy driving

                                                                          • MisterTea 16 hours ago

                                                                            Lucky 13.

                                                                            • Workaccount2 15 hours ago

                                                                              It's incredible that Tesla is nearly a $1T corporation because it is about to announce a robotaxi. Meanwhile its actual car sales are shrinking quarter by quarter, and its CEO supports the presidential candidate who wants to do away with carbon credits (~40% of Tesla's net income).

                                                                              If you want any evidence that the market isn't rigged, just full of gullible idiots, Tesla is it.

                                                                              • misiti3780 15 hours ago

                                                                                It's pretty simple --

                                                                                If you think Tesla is leading and will solve the FSD problem first, the $1T market cap makes sense and is possibly undervalued. If you hate Elon Musk because you disagree with his politics and don't drive a Tesla with FSD, you make comments like this on HN.

                                                                                • archagon 13 hours ago

                                                                                  And here you are, making comments about people making comments about Musk on HN.

                                                                                  • misiti3780 13 hours ago

                                                                                    Is anything I said above untrue?

                                                                                    • archagon 13 hours ago

                                                                                      You claim that people hate Musk because of jealousy and/or politics. I don’t see any evidence of that. What I see is people being unwilling to entrust their lives to someone with a recent history of incredibly childish and questionable behavior. He shitposts on Twitter all day and runs his companies like a pig-headed 19th century baron. I don’t detect an iota of leadership in the man anymore: only ego.

                                                                                      Why would functional FSD emerge from a company with a toxic culture like Tesla’s? Any self-respecting engineer with advanced skills will jump ship at the first opportunity.

                                                                                      • misiti3780 12 hours ago

                                                                                        You're conceding that you're more preoccupied with his tweets (i.e., his politics) than his actual achievements, which perfectly supports my point. Thousands of highly skilled engineers continue working at his companies, and they aren’t "jumping ship." It’s not incompetent engineers that are behind innovations like FSD 12, Cybertruck, Falcon 9, Starship, Raptor 3, and so on. These are complex, cutting-edge technologies—stupid engineers don't create them.

                                                                                  • Workaccount2 15 hours ago

                                                                                    There are already non-tesla robotaxis in service that you can get in today. Tesla has been over promising and under delivering for over a decade, there is no evidence this time will be any different.

                                                                                    • misiti3780 12 hours ago

                                                                                      Sure, but they don't scale. Each of those cars costs $100-120K.

                                                                                      • trog 4 hours ago

                                                                                        > Sure, but they don't scale. Each of those cars costs $100-120K.

                                                                                        Why do you think they don't scale? Taxi licenses have historically cost significantly more than this. If the cost to run is much lower, with no human driver and cheaper energy costs, it seems like this will scale at least as well as taxis?