Do I understand it correctly? Crash data gets automatically transmitted to Tesla, and is marked for deletion immediately after transmission?
If that is actually designed like this, the only reason I could see for it would be so that Tesla has sole access to the data and can decide whether to use it or not. Which really should not work in court, but it seems it has so far.
And of course I'd expect an audit trail for the deletion of crash data on Tesla servers. But who knows whether there actually isn't one, or nobody looked into it at all.
>> Tesla has sole access to the data
All vehicle manufacturers have sole access to data. There isn't a standard for logging data, nor a standard for retrieving it. Some components log data that only the supplier has the means to read and interpret.
Mostly incorrect. At least for the US.
If your car has an EDR, what data it collects is legislated. There is not a standard interface for retrieving it, but the manufacturer is required to ensure that there is a commercially available tool for data retrieval that any third party can use.
https://www.ecfr.gov/current/title-49/subtitle-B/chapter-V/p...
There is a world of difference between "you need our special hardware and software to read the data" and "we deleted it lol".
Another reason is if there’s other kinds of data that gets uploaded to Tesla, and the code for uploading crash data reuses that code.
For that other kind of data, deleting it from the car the moment there's confirmation that it is now stored at Tesla can make perfect sense as a mechanism to prevent the car from running out of storage space.
Of course, if the car crashed, deleting the data isn't optimal, but the fact that it gets deleted may not be malice.
Data retention is legal's bread and butter. There's no chance such a decision is accidentally made by reusing code.
Anytime data is recorded legal is immediately asking about retention so they don't end up empty handed in front of a judge.
Every byte that car records and how it is managed will be documented in excruciating detail by legal.
> Data retention is legal's bread and butter.
As is deleting data. Also, for, say, training data for Tesla's software, I don't see legal requirements for keeping it around.
> There's no chance such a decision is accidently made by reusing code.
At Tesla? I know almost nothing about their software development practices, but from them, it wouldn't surprise me at all if this were accidental.
Agreed. Tesla axed their marketing department, why assume they have much of a legal department overseeing how the data uploads are managed?
> Anytime data is recorded legal is immediately asking about retention so they don't end up empty handed in front of a judge.
In my experience, they are setting automated 90-day deletion policies on email so they don't end up with surprises in discovery.
Deleting after a certain time makes sense, certainly. Deleting immediately seems dubious to me. Though the descriptions in the article are vague enough that we might be missing some big aspects.
But in the end we wouldn't be discussing this at all if Tesla had simply handed over the data from their servers. Whether they can't find it, it isn't actually there, or they deliberately removed it, this affects how I view this process.
Two copies are better than one. If you immediately erase the data, you better be sure the transmitted data is safe and secure. And obviously it wasn't.
I guess one charitable way to look at it is that after a crash, external people could get access to the car and its memory, which could potentially expose private data about the owner/driver. And besides private data, if data about the car condition was leaked to the public, it could be made to say anything depending on who presents it and how, so it's safer for the investigation if only appointed experts in the field have access to it.
This is not unlike what happens for flight data recorders after a crash. The raw data is not made public right away, if ever.
If Tesla securely stored this data and reliably turned it over to the authorities, I wouldn't argue much with this.
But the data was mostly unprotected on the devices, or it couldn't have been restored. And Tesla isn't exactly known for respecting the privacy of their customers, they have announced details about accidents publicly before.
And there is the potential conflict of interest, Tesla does have strong incentives to "lose" data that implicates Autopilot or FSD.
I would rather my cars not automatically rat me out to the authorities, personally.
I wouldn't want them to have selective memory in favor of juicing Elon's marketing scams either.
Your property isn't ratting you out. The software you license from Tesla is ratting you out.
Such a pity there is no way to get a car with a minimal electronic control unit. Funny how conspicuously unimplemented that functionality remains.
That's like worrying about external people having access to the driver's wallet in the case of a fatal crash. Sure, but it's more likely that Tesla is sketchy, considering their vested interest in controlling crash data reports.
It's probably a bit like "This call may be recorded for quality purposes." That's a disclaimer that's usually required by the authorities, to let you know that you're being recorded, but it lets them off the hook, if the recording would be inconvenient to them. If it supports their side, they 100% always have it, but if it supports the caller's side, then it seems they didn't actually record that call ...so sorry...
Tesla's fairly notorious for casual treatment of customer car data (which they have a lot of). There was an article, recently, about how in-car video recordings were being passed around the office.
I know that at least one porn actress recorded a scene in a self-driving Tesla. I'll bet that recording made the rounds "for quality purposes."
> "This call may be recorded for quality purposes."
It's a disclaimer, but it also grants permission for you to record.
Years back I bought a Model 3 infotainment unit on eBay to hack on - the amount of data contained on them is absolutely insane. After gaining access to the system I was able to get the VIN of the car and find the salvage auction for the car it came out of - it had been wrecked. I was then able to get all the location data that gets logged, showing a glimpse of the previous owner's life (house, work, stores they went to, etc.) as well as the final resting place of the car. The last GPS locations logged were at the end of a "T" intersection in North Carolina - Google Street View gave a nice look at the trees the car most likely hit :>
Neat! What's the hardware like, a Linux-ish computer with SD cards? Or SSD? Which filesystem?
HW-wise, the older units were Intel Atom based (the latest gen is AMD, I believe?) - the hardware is typical embedded stuff: CPU + eMMC + BT/WiFi MCU + cellular daughter card. The OS is Linux + Qt UI stuff. I would expect things have changed for newer HW revisions, but the previous gen did not utilize encryption (dm-crypt), so all data was unprotected at rest.
I suspect it's Windows, actually, and I'm pretty sure the UI is some form of C#.
They tried to recruit me for the UI. If I lived closer, I would have jumped on it. Not only was I bit of a Tesla fanboy at the time, I used to work across the street from their office and really liked that area. (Deer Creek Road in Palo Alto.)
It’s Linux and the UI is Qt
> a glimpse of the previous owners life...
...and potentially death?
I tried to dig up news articles in the area and could not find any reported fatalities - but yea, maybe?
> In the annotated video played for the jury, the vehicle detects a vehicle about 170 feet away. A subsequent frame shows it detecting a pedestrian about 116 feet away. As McGee hurtles closer and closer, the video shows the Tesla planning a path through Angulo’s truck, right where he and his girlfriend were standing behind signs and reflectors highlighting the end of the road.
So the Tesla detected the vehicle and the pedestrian, and then plans a path through them? Wow! How bad is this software?
AI is so unlike anything we've ever seen and it's going to revolutionise the world and it's literally gonna be Skynet, except it pathfinds like a Counter-Strike bot, just ignore that bit.
> Immediately after the wreck at 9:14 p.m. on April 25, 2019, the crucial data detailing how it unfolded was automatically uploaded to the company’s servers and stored in a vast central database, according to court documents. Tesla’s headquarters soon sent an automated message back to the car confirming that it had received the collision snapshot.
> Moments later, court records show, the data was just as automatically “unlinked” from the 2019 Tesla Model S at the scene, meaning the local copy was marked for deletion, a standard practice for Teslas in such incidents, according to court testimony.
Wow...just wow.
It is wild to me that people put so much trust in this company.
Even if Tesla hadn't squandered its EV lead and was instead positioned to be a robotics and AI superpower, is this really the corporate behavior you would want? This is some fucking Aperture Science level corporate malfeasance.
It’s pretty typical of corporations, the cult surrounding its leader notwithstanding. Not even just US corporations - the VW emissions scandal was huge, and today they are doing as well as ever. That was a big shakeup; the kind of stuff we are seeing from Tesla feels like business as usual.
VW emission scandal ended with actual judgement and two prison sentences.
Miles and miles different - they were not completely untouchable the way tesla and similar hot companies are.
No, it's not typical, because you don't see huge numbers of people defending VW's emissions fraud.
Nope - the VW episode was terrible, but they faced large fines and corrected course, and it's history. I'm still slightly squeamish about accepting them, but they've turned it around, and I think I read they have just overtaken Tesla in EV sales in Europe (a self-inflicted Musk wound, of course).
I see no course correction from Tesla. Just continued utter tripe from its CEO, team, and Musk-d-riders.
This is an on-going issue for them and, at this point, with no further change? I hope it drives them into the ground (Autopilot, natch).
You can actively criticize VW on the internet without an army of sycophants coming for you. The standard behavior of Tesla stans is that any problem with the vehicle is in fact your fault and only your fault because it would not be possible for Tesla to do something wrong. It is cult-like.
I just hate the corrupt laws mandating car dealerships.
I am trying to imagine a scenario under which that is defensible and does not raise various questions including compliance, legal, retention. Not to mention, who were the people who put that code into production knowing it would do that.
edit: My point is that it was not one lone actor, who would have made that change.
Assuming no malice, I'd guess it's for space saving on the car's internal memory. If the data was uploaded off of the car, there’s no point keeping it in the car.
I think your answer is the most logical to me as a developer: we often miss simple things, the PM overlooks it, and so it goes into production this way. I don't think it's malicious. Sometimes bugs just don't become obvious until things break. We have all, sooner or later, found an unintended consequence of code that had nothing technically wrong with it.
Dude we're at the point where cars are practically gathering data on the size of your big toe.
The performance ship sailed, like, 15 years ago. We're already storing about 10,000,000× more data than we need. And that's not even an exaggeration.
That's 100% wrong. In standard practice, collision files are to be "locked", prevented from local deletion.
>> That's 100% wrong. In standard practice, collision files are to be "locked", prevented from local deletion.
I worked a year in airbag control, and they recorded a bit of information if the bags were deployed - seatbelt buckle status was one thing. But I was under the impression there was no legal requirement for that. I'm sure it helps in court when someone tries to sue because they were injured by an airbag. The argument becomes not just "the bags comply with the law" but also "you weren't wearing your seatbelt". Regardless, I'm sure Tesla has a lot more data than that and there is likely no legal requirement to keep it - especially if it's been transferred reliably to the server.
I don't think it's wrong. Have you ever pushed code that was technically correct, only to find months later that you, your PM, their manager, their boss's boss, etc. all missed one edge case? You're telling me no software developer has ever done this?
You discover it the day a person dies and your relevant data is not there. Next time it's no longer a "missed edge case".
In a perfect world where developers are omnipresent and all knowing sure? This isn't a perfect world. Heck, how do you account for the developer who coded it leaving the company, and now that code has been untouched for half a decade if not more, because nothing is seemingly wrong with the code, what then? Who realizes it needs to be changed? Nobody. The number of obscure bugs I find in legacy code that stump even the most experienced maintainers never ends.
It's not an edge case; it's wanton criminal sabotage, destruction of evidence, and it deserves a prison sentence for anyone facilitating it at any level.
This is assuming malice out of the gate without any evidence, which is not what we do here on HN. If this is in fact maliciously done, please provide evidence.
Sounds like a pretty standard telemetry upload. You transmit it, keep your copy until you get acknowledgement that it was received so you can retry if it went wrong, then delete it when it succeeds.
It’s just worded to make this sound sketchy. I bet ten bucks “unlinked” just refers to the standard POSIX call for deleting a file.
The process of collecting and uploading the data probably confuses a lot of non-technical readers even if it worked as per standard industry practices.
The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted. There's no technical reason to dispose of data related to a crash when you hold so much data on all of the cars in general.
Crash data in particular should be considered sacred, especially given the severity in this case. Ideally it should be kept both on the local black box and on the servers. But anything that leads to it being treated as instantly disposable everywhere, or even just claiming it was deleted, can only be malice.
> The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted.
Exactly. The issue is deleting the data on the servers, not a completely mundane upload-then-delete procedure for phoning home. This should have been one sentence, but instead they make it read like a heist.
> The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted. There's no technical reason to dispose of data related to a crash when you hold so much data on all of the cars in general.
My money is on nobody built a tool to look up the data, so they have it, they just can't easily find it.
Sketchy is that then someone takes “affirmative action to delete” the data on the server as well.
Also this is not like some process crash dump where the computer keeps running after one process crashed.
This would be like a plane's black box uploading its data to the manufacturer, then deleting itself after a plane crash.
I’ll bet another ten bucks that this is a generic implementation for all of their telemetry, not something special cased for crashes.
Deleting the data on the server is totally sketchy, but that’s not what the quoted section is about.
Not handling an automobile crash as a special case is the weird part. Even the <$50 dashcams from Amazon have a feature to mark a recording as locked so the auto-delete logic does not touch the locked file. Some of them even have automatic collision detection which locks the file for you.
Claiming that detecting a collision and then not locking any of the data is normal is just insane.
That one's easy: nobody at Tesla cares about having this feature
That might be the case but the article seems to indicate the system knew the data was generated from an accident. So, removing to save space on the car should now be a secondary concern.
The problem with this is that it destroys any chain of evidence. Tesla "lost" this data, in fact. You would never want your "black box" in your car delete itself after uploading to some service because the service could go down, be hacked, or the provider could decide to withhold it, forcing you into a lengthy discovery / custody battle.
This data is yours. You were going the speed limit when the accident happened and everyone else claims you were speeding. It would take forever to clear your name or worse you could be convicted if the data was lost.
This is more of "you will own nothing" crap. And mainly so Tesla can cover its ass.
It is a car. A vehicle which can be involved in a fatal accident. It is not a website. There is no "oversight", nor is it "pretty standard" to do it like that: when you don't think about what your system is actually doing (and that is the most charitable explanation), YOU ARE STILL RESPONSIBLE AS IF YOU HAD DONE IT ON PURPOSE.
One of Tesla’s things is that their software is built by software people rather than by car people. This has advantages and disadvantages.
Maybe this is not appropriate for a car, but that doesn’t excuse the ridiculous breathless tone in the quoted text. It’s the worst purple prose making a boring system sound exciting and nefarious. They could have made your point without trying to make the unlink() call sound suspicious.
I'm a software person, but I still take the car-person approach when I know I'm building a car. You have a responsibility to understand the gravity of the enterprise you undertake and to take appropriate steps given that gravity. Ignorance shouldn't be a defense, and if you don't know what you don't know, then God help you.
There are software people who know what they're doing - some write flight software or medical equipment software. They know how to critically think about the processes of their systems in detail.
So either the problem is Tesla engineers are fucking stupid (doubtful) or this is a poor business/product design.
My money is on the latter.
> their software is built by software people rather than by car people
The rogue engineer defense worked so well for VW and Dieselgate.
The issue of missing crash data was raised repeatedly. Deleting or even just claiming it was deleted can only be a mistake the first time.
> One of Tesla’s things is that their software is built by software people rather than by car people. This has advantages and disadvantages.
So we just shrug because software boys gotta be software boys? That’s completely insane and a big reason why a lot of engineers roll their eyes about developers who want to be considered engineers.
Software engineers who work on projects that can kill people must act like the lives of other people depend on them doing their job seriously, because that is the case. Look at the aviation industry. Is it acceptable to have a bug in the avionics suite down planes at random and then delete the black boxes? It absolutely is not, and when anything like that happens shit gets serious (think 737 MAX).
The developers who designed the systems are responsible, and so are their managers who approved the changes, all the way to the top. This would not happen in a company with appropriate processes in place.
Well if it would be EU for GDPR you can assume contract was terminated because of force majeure and you are not allowed to keep customer data past contract. /s
The artifact in question was a temporary archive created for upload. I can't think of a scenario in which you would not unlink it.
You were right in your first statement, but your follow up is a bad assumption, I think everyone here will agree that in the case of a crash this data should be more easily available and not deleted.
Assuming it's not intentionally malicious, this is a really dumb bug that I could have also written. You zip up a bunch of data, and then you realize that if you don't delete things you've uploaded you will fill up all available storage. So what do you do? You auto-delete anything that successfully makes it to the back-end server and mark the bug fixed, not realizing that you overlooked crash data as something you might want to keep.
I could 100% see this being what is happening.
And then you delete the server copy?
They didn’t delete the server copy though. That’s what this article is about.
> Tesla later said in court that it had the data on its own servers all along
Obviously no. The behavior of Tesla in discovery of this case is ridiculous. But treating this technical detail as an element of conspiracy is also ridiculous.
If that was the only thing going wrong, yes. But when you have a pattern of conspiracy, deleting immediately on the client instead of having a ring buffer which ages out the oldest event, may be a malicious choice.
I haven't seen anything in the (characteristically terrible and vague) coverage of this case that suggests the Tesla deleted the EDR.
> I can't think of a scenario in which you would not unlink it.
Perhaps if there is some sort of crash.
Exactly. That's the last data I would ever delete from the car, if I was trying to preserve valuable data.
All of their actions point at intentionally wanting that data to disappear; they even suggested turning it on and updating it, which everyone who's ever tried to preserve important information on a computer knows is the exact opposite of what you should do.
Any competent engineer who puts more than 3 seconds of thought into the design of that system would conclude that crash data is critical evidence and as many steps as possible should be taken to ensure it's retained with additional fail safes.
I refuse to believe Tesla's engineers aren't at least competent, so this must have been done intentionally.
What if you were the guy who got a ticket that just said "implement telemetry upload via HTTP"?
Which of these is evidence of a conspiracy:
tar cf - | curl
TMPFILE=$(mktemp) ; tar cf $TMPFILE ; curl -d $TMPFILE ; rm $TMPFILE
That's reductive.
The requirements should have been clear that crash data isn't just "implement telemetry upload", a "collision snapshot" is quite clearly something that could be used as evidence in a potentially serious incident.
Unless your entire engineering process was geared towards collecting as much data that can help you, and as little data as can be used against you, you'd handle this like the crown jewels.
Also, to nit-pick, the article says the automated response only "marked" the local copy for deletion, which means it's not automatically deleted as in your reductive example. And the example doesn't even verify the upload succeeded (it should at least `&&` the final rm).
You left out the worse part:
> someone at Tesla probably took “affirmative action to delete” the copy of the data on the company’s central database, too
The 'wow' part is that they deleted data from server. The part you quoted sounds like nothing unusual to me.
You don't think it's unusual that the software is designed to delete crash data from the crashed car?
The question is whether this is code that's special for crashes, or code that runs the exact same way for all data uploads, regardless of whether there's a crash.
You're implying it's special for crashes, but we don't know that.
You have it backwards. The issue is that after the special condition of a crash, it still allows the data to be deleted. Sure, deleting normal data is fine, but it clearly detected a crash and still did not mark the file as do-not-delete, which is mind-boggling. Everyone knows that data from a detected crash is very important. Not having code to ensure its retention is the laziest way of doing things at best, or malevolent design at worst. And Tesla and its leadership have not earned "at best" as our default assumption.
The crash system uses this code, therefore they chose to do something that would delete the crash data after a crash.
Saying "hey, the upload_and_delete function is used in loads of places!" doesn't free you of the responsibility that you used that function in the crash handler.
Is this a crash handler, or is it their normal telemetry upload loop?
Yes, it's a crash handler that uploads a blackbox "collision snapshot" of the entire car's state leading up to a crash. It's very well documented that Tesla does this, including in the article.
If it's not special for crashes, that's criminally bad design in a safety-critical system.
You know, if for instance you weld a gas pipeline and an x-ray machine reveals a crack in your work, you can go to jail... but if you treat car software as an app-store item, that's totally fine??
stop defending ridiculously bad design and corporate practices.
Think of it as the scripts that run on CI/CD actions running unit tests. If a unit test fails, the test artifacts are uploaded to an artifact repository, and then, get this - the test runner instance is destroyed! But we don't think of that as unusual or nefarious.
No one dies when your unit test fails. Different stakes, different practices, what are all the Tesla apologists smoking here?
I don't think you can equate CI/CD unit tests and killing humans with 2 tons of metal.
And yet, that's what you get when your software org comes from that kind of devops culture. And here we are
That's because typically the test runner hasn't just crashed into another test runner at full highway speed
>> You don't think it's unusual that the software is designed to delete crash data from the crashed car?
After it confirmed upload to the server? What if it was a minor collision? The car may be back on the road the same day, or get repaired and on the road next week. How long should it retain data (that is not legally required to be logged) that has already been archived, and how big does the buffer need to be?
> What if it was a minor collision?
Then, I don’t know… Check if it was the case? Seriously, it’s unbelievable. It’s a company with a protocol to delete possibly incriminating evidence in a situation where it can be responsible for multiple deaths.
A very simple answer is "until the next time the car crashes", you just replace the previous crash data with the new data.
If the car requires that a certain amount of storage is always available to write crash data to, then it doesn't matter what's in that particular area of storage. That reserved storage is always going to be unavailable for other general use.
The top comment on the HN front-page story about this crash several weeks ago claimed the damages award was too high.
Maybe this thread will be different
After reading the article, I am never buying a Tesla.
Props to greenthehacker. May you sip Starbucks venti-size hot chocolates for many years to come.
Were you considering buying one before today? I'm curious as to what's different about this autopilot death compared to all the other autopilot deaths that have happened previously. Personally for me it was when the guy in Florida got decapitated when his car drove under a semitruck that made me never want to get in one again.
I wasn't opposed to buying a Tesla. In my situation, I don't have the ability to charge EVs conveniently, so I'm not in the market, so to speak.
Plus, I'm not interested at this time in the "autopilot" "AI" stuff; I believe drivers should be responsible all the time, until such time that full legal liability is put on the manufacturer.
Don't get me wrong... I would love to call my car to come pick me up at the airport!
Autopilot is opt-in. You can drive it like any other car and never use autopilot.
This is very true, but if you had to choose between two microwaves, one of which had a button that occasionally killed people and one which did not, which would you choose? Personally I would feel better buying a microwave that doesn't have the option to decapitate me, even if I would never press it.
all cars have a button that occasionally kills people, it’s called the accelerator pedal
I think you know that's a false equivalence, both because every control in a car has the possibility of killing you and also because every car has an accelerator pedal and I'm talking about an extra button.
So, Musk summoning the Luftwaffe like that didn’t dissuade you from buying one?
Tesla recanted its employee’s testimony “after discovering evidence inconsistent with his stated recollection of events,” it said.
That's a fancy way to say that he lied.
Volkswagen was caught cheating on its emissions data and the CEO got fired, then prosecuted. Why shouldn't that be the case here?
Firing the CEO is nominally up to the board of directors.
In Tesla's case, the board knows that the valuation of the company is wildly irrational, and they feel that the valuation is tied to the CEO.
You’d need a coalition of Democratic attorneys general to bring a case in the mould of Big Tobacco.
We'd need a third party if you'd actually want to fight american corporations. Unless you intended "small d" democratic
Good news, the CEO of this American corporation is making a third party… (the monkey paw curls)
The really weird thing about the diesel emissions scandal was that someone actually got in trouble for it. It is _rare_ for companies to be punished, particularly criminally, for that sort of thing.
Usually they'd get a DPA
None of this should be surprising to anyone who has given an ounce of effort to examining Elon's character.
Lies about capabilities, timelines, even things as frivolous as being rank one in a video game. He bought Twitter to scale his deception.
Don't worry, once Tesla figures out secure boot nobody will be able to call their bluff and they'll be free to 'lose' crash data with the same impunity the police loses their bodycam footage.
This should be modded up higher. Exactly. The only way hackers found this is because Tesla wasn't using secure boot or encrypted images. Every embedded developer knows about MCUboot; managers just don't want the overhead because it is complicated. Once embedded devs get the OK, all embedded firmware will basically be like a Signal chat with only the manufacturer holding the keys. Heck, even PSA-compliant hardware MUST be resistant to multi-bit glitch attacks. Bye bye, hackers.
The video is staggering, going super fast before an intersection, with no visibility, a blinking signal, and clear stop sign in sight. I hope FSD got better
not sure if you are saying otherwise but for those who might get confused this crash was with “Autopilot” not FSD, although both are definitely problematic
>> not sure if you are saying otherwise but for those who might get confused this crash was with “Autopilot” not FSD
And the distinction is what?
I'm not serious of course. There are huge swaths of the public whose eyes would glaze over if you tried to explain it, and that's my point.
I think the public can generally grasp the difference between lane assist and a Waymo/AV, but the naming is bad, agreed.
I'd like to see the law say that self-driving cars must collect data (video, sensor inputs, actuator outputs), and that it becomes the property of the authorities when an accident happens. No exceptions. The real question is how the law is written, for it should leave no doubt about what Tesla, or any other manufacturer, is required to do.
Probably all cars should have a black box, as both modern electronics and humans can do weird stuff.
good luck passing such a law in the US
Surely this is the behavior of a company that's confident in the safety of its products!
If Tesla can’t ensure safeguarding of this information, it’s a feature that will get them in big trouble.
So first the data wasn't there, and suddenly it is there. I think the only way to prevent this in the future is to litigate against the individuals who knowingly lie for a company.
Litigate the company, not the individual. The hiding of the data was almost certainly a result of company ethos and most likely involved multiple levels of people. The maintenance tech was probably the lowest paid of everyone involved.
I'm still convinced that it being called "full self driving" is misleading marketing and really needs to stop, since it isn't, according to Tesla
The marketing doesn't even matter. It either needs to be full self driving, or nothing at all. The "semi self-driving but you're still responsible when shit hits the fan" just doesn't work.
Humans are simply incapable of paying attention to a task for long periods if it doesn't involve some kind of interactive feedback. You can't ask someone to watch paint dry while simultaneously expecting them to have a < 0.5 sec reaction time to a sudden impulse three hours into the drying process.
I have a SAE level 2 car. Those features DO help!
Framing is crucial. For example, why was the Autonomous Emergency Braking configured to brake violently to a full stop? Let's consider two scenarios: in both cases we're not paying enough attention to the outside world and are about to strike a child on a bicycle, but the AEB policy varies.
1. AEB brakes violently to a full stop. We experience shock and dismay. What happened? Oh, a kid on a bike I didn't see. I nearly fucked up bad, good job AEB
2. AEB smoothly slows the vehicle to prevent striking the bicycle, we gradually become aware of the bike and believe we had always known it was there and our decision eliminated risk, why even bother with stupid computer systems?
Humans are really bad at accepting that they fucked up, if you give them an opportunity to re-frame their experience as "I'm great, nothing could have gone wrong" that's what they prefer, so, to deliver the effective safety improvements you need to be firm about what happened and why it worked out OK.
Same. Not having to worry about keeping the car between the lines allows me to keep my focus on the other cars around me more. Offloading the cognitive load of fine tuning allows more dedication to the bigger picture.
This makes no sense to me. Driving involves all senses, not just vision - if you're not feeling what the car is doing because you're not engaged with the steering wheel what good is it to see what's around you? I also don't understand how one has trouble staying between the lines with minimal cognitive input after more than a few months of driving.
Oh! And also, moving within the lane is sometimes important for getting a better look at what's up ahead or behind you or expressing car "body language" that allows others to know you're probably going to change lanes soon.
My car requires hands on the wheel to continue to operate. So I do feel it moving.
> I also don't understand how one has trouble staying between the lines with minimal cognitive input after more than a few months of driving.
Once you have something assist you with that, you'll notice how much "effort" you are actually putting towards it.
I don't have personal experience but friends with personal experience have sort of shifted my thinking on the topic. They'll note they do need to stay engaged but that it is genuinely useful on long drives in particular. The control handover is definitely an issue but so is manual driving in general. Their consensus is that the current state of the art is by no means perfect but it is improved and it's not like there aren't problems with existing manual driving even with some assistive systems.
I used to think this, but then I got a Model 3. I believe that FSD is presently better than most humans driving today even when they are theoretically “fully engaged in manual driving”.
FSD doesn’t lull humans into a false sense of security, humans do. FSD doesn’t let you use your phone while it’s on. This alone is an upgrade over most human beings, who think occasional quick phone usage while driving is fine (at least for themselves).
I believe that if you replaced all human drivers in the US with FSD as it exists today, fatalities would go down immediately.
Humans are not a gold standard, and the current median human driver is easy to outperform on safety.
If you live in a city, please send this article to your municipal and state electeds. Tesla is lobbying for the right to train and activate its Level 4 product, marketed as Level 5, in cities where Musk is deeply unpopular. There is massive political capital to be had in banning Tesla’s self-driving features on even the flimsiest grounds.
I would rather take a bullet than be a luddite who gets in the way of technological advancement on "the flimsiest of grounds."
> be a luddite who gets in the way of technological advancement on "the flimsiest of grounds”
Blocking a technology is Luddism. Blocking a company is politics.
Why do you think Musk put so much money into helping Trump win? Tesla was under multiple investigations for safety and unkept promises, and he knew that he would not have leverage to halt those under a Harris administration.
If that was his goal he would have minded his own business after the election, instead of spouting invective posts against Trump on X.
That was after Musk realized he had alienated his entire consumer base.
And he wants to bring them back by alienating Trump while doubling down on his rhetoric?
He has an ego and narcissism but he isn't dumb. He sees the problems but also cant admit hes wrong or anything.
> but he isn't dumb.
Musk’s assistant peeked back in and muttered that he had another meeting. “Do you have any final thoughts?” she asked.
“Yes, I want to say one thing,” the data scientist said. He took a deep breath and turned to Musk.
“I’m resigning today. I was feeling excited about the takeover, but I was really disappointed by your Paul Pelosi tweet. It’s really such obvious partisan misinformation and it makes me worry about you and what kind of friends you’re getting information from. It’s only really like the tenth percentile of the adult population who’d be gullible enough to fall for this.”
The color drained from Musk’s already pale face. He leaned forward in his chair. No one spoke to him like this. And no one, least of all someone who worked for him, would dare to question his intellect or his tweets. His darting eyes focused for a second directly on the data scientist.
“Fuck you!” Musk growled.
https://www.techdirt.com/2024/10/25/lies-damned-lies-and-elo...
That happened much earlier. The split with Trump happened after it finally sunk in that Republicans weren't actually interested in smaller government or cost savings, and that that was just a rhetorical weapon they deploy selectively to get elected.
he's not a very smart man
I mean, if he was rational, sure, that's probably what he should have done. But, y'know, he clearly _isn't_.
He was under some imaginary assumption that Trump cared about the national deficit because of his campaign speeches. Once he realized that Trump really didn't care two hoots about it and only planned to increase it even more, he had a belated case of buyer's remorse.
I'm absolutely not a fan of Trump, but this is a highly questionable assumption.
The much more likely hypothesis in my view is that he was helping Trump because of personal conviction (only in small parts motivated by naked self-interest).
You should expect rational billionaires to tend politically right out of pure self-interest and distorted perspective alone; because the universal thing that such parties reliably do when in power is cutting tax burden on the top end.
That’s insane. Do you remember DOGE or Elon taking his cronies into the same departments investigating him? Do you even remember?
What would Elon even be in court for? Being a politically incorrect dumbass on ex-twitter is not punishable by law.
Sending a bunch of scriptkiddies around and having them cut government funding and gut agencies is not really how you make evidence "vanish", how would that even work?
And, lastly, jumping in front of an audience at every opportunity and running your mouth is the absolute last thing anyone would ever do if the goal was to avoid prosecution. But it is perfectly in line with a person that has a very big ego and wants to achieve political goals.
Labor violations, taxes, National Highway traffic safety administration investigation Tesla.. are you willfully ignorant or a troll?
I'm not a troll.
I scrutinise beliefs and assumptions even if they are convenient, and you should, too.
I don't believe that Musk's main motivation to participate in the 2024 election was to avoid prosecution, because his actions are not really compatible with this. There is a much more plausible alternative hypothesis, which his actions are very compatible with: that he preferred the Republican platform (possibly no longer) out of personal conviction rather than to avoid prosecution.
> Labor violations, taxes, National Highway traffic safety administration investigation Tesla
Let me say it like this: Billionaires generally don't have to care about minor infractions like this at all. The whole system is set up to shield them from liability, and wealth is an excellent buffer against effective prosecution regardless of who is president. There have been a plethora of infinitely more serious infractions with zero real consequences for the CEOs involved, and this is not because they participated in past presidential election campaigns. See: the VW diesel emission fraud or, much worse, leaded gas in the last century (and what the associated industry did to keep that going).
Musk is on record saying to Tucker Carlson that “If [Trump] loses, I’m fucked.”
So this isn't so much of an assumption, as taking him at his word.
All the context I have for this is that he was grandstanding in front of a rightwing audience (after Trump was shot at, notably) and playing the "surely I would get unjustly prosecuted for my political incorrectness under the democrats".
What is your actual point? What would he stand in front of a judge for, right now, if Harris had won?
My actual point is that when someone tells you who they are, you should consider believing them.
You'd have to ask Musk what he feels so guilty about that he had to buy an election.
The Left was coming after Musk pretty hard before the election. I don’t know the context of the quote you pulled but it’s not hard to see how if Trump lost, there was going to be consequences for Musk.
He has committed a lot of fraud and was facing consequences for that. That has nothing to do with left or right.
Fraud has nothing to do with vandalizing Tesla dealerships last I checked.
We were talking about Tesla's fraud cases, not some vandalism cases last time I checked.
Actually we were talking about personal consequences to Musk.
You are right it doesn’t. That is (wrongly) done by people who are (rightly) mad at him for making american life harder and global life more dangerous, in a self serving attempt to evade the justice system.
Can you give an example of these many instances of fraud?
For instance he has made fraudulent statements regarding the current and near future capabilities of Tesla in an effort to inflate stock prices numerous times. He was in fact ordered by a judge to stop making such statements but he didn’t obey that.
He used to be quite charismatic, I believed him up until about 2017 or so. Then I figured he was just a bit greedy and maybe money got to his head but still a respectable innovator. However during 2020 or 2021 (I don’t exactly remember) he started to get quite unpleasant and making obviously short-term decisions, such as relying only on cameras for self driving because of chip shortages but dressing it up as an engineering decision.
We can start from the linked article?
Yeah, that's where I started, and I would recommend you do the same:
> U.S. District Judge Beth Bloom, who presided over the case, said in an order that she did not find “sufficient evidence” that Tesla’s failure to initially produce the data was intentional.
It does actually, because only one side is interested in finding or fighting fraud.
Currently yes, but it is not inherently so. The problem with the US regime is that it is compromised, corrupt and heading towards fascism.
The problem is not that the republican party used to be a conservative right party.
What I’m saying is this is not a sports competition where Musk is automatically an opponent of the Democratic party because he supported Trump. He supported Trump in order to improve his chances with the legal system because he knew Trump would be willing to be so corrupt.
Another world might be imagined in which the Democratic party was taken over in 2016 but that is not the world we live in.
Both are interested in finding and fighting fraud, but only from the other side. Leticia James charged Trump with a rack of felonies for putting false info on a loan application. The Trump DOJ charged Leticia James for doing exactly the same. Both sides claim the charges against them are politically motivated and the charges against the other side are completely legitimate.
> Both sides claim the charges against them are politically motivated and the charges against the other side are completely legitimate.
This is how conservatives keep people going 'both sides!' even though they manufacture whatever is required to be that way.
I’m less convinced we need to keep bringing this up in every single thread involving Tesla.
Every time this comes up, I am on the opposite side of this. It is clearly full self driving. It can stop at red lights, cross intersections, make turns, park, drive, change lanes, brake, and navigate on its own. There are various videos online where FSD managed to drive a route start to finish without a single human override. That's full self driving. It can also crash like humans "can", and that's why it needs supervision. In this sense, we as humans are also "full self driving", with a much (?) lower risk of crashing.
Like every time, let the downvotes rain. But if you downvote, it would be nice if you could tell me where I am wrong. It might change my view on things.
> It is clearly full self driving. It can stop at red lights, cross intersections, make turns, park, drive, change lanes, break and navigate on its own. That's full self driving
All this demonstrates is the term “full self driving” is meaningless.
Tesla has a SAE Level 3 [1] product they’re falsely marketing as Level 5; when this case occurred, they were misrepresenting a Level 2 system as Level 4 or 5.
If you want to see true self driving, take a Waymo. Tesla can’t do that. They’ve been lying that they can. That’s gotten people hurt and killed; Tesla should be liable for tens if not hundreds of billions for that liability.
The other confusion with self driving for me is, is the “self” the human or the car?
Self driving can totally means the human own-self driving.
Having SAE level is clearer.
If it's a meaningless term then it can't be misrepresenting to use it.
> If it's a meaningless term then it can't be misrepresenting to use it
It’s meaningless because Tesla redefines it at will. The misrepresentation causes the meaninglessness.
That's something different. The problem with the levels is that they only focus on the attention the human driver needs to give to the automation. In this sense my Kia EV6 is also Level 2/3, same as FSD. However, FSD can do so much more than my Kia EV6. That's a fact. Still the same level. Where did Tesla say FSD is SAE Level 5 approved? They would be responsible every time FSD is active during a crash. Tesla is full self driving with Level 2/3 supervision, and in my opinion this is not misleading.
Also, "All this demonstrates is the term “full self driving” is meaningless." proves my point that it is not misleading.
> FSD can do so much more than my Kia EV6. That's a fact. Still the same level
The levels are set at the lowest common denominator. A 1960s hot rod can navigate a straight road with no user input. That doesn’t mean you can trust it to do so.
> Where did Tesla say FSD is SAE Level 5 approved?
They didn’t say that. They said it could do what a Level 5 self-driving car can do.
“In 2016, the company posted a video of what appears to be a car equipped with Autopilot driving on its own.
‘The person in the driver's seat is only there for legal reasons,’ reads a caption that flashes at the beginning of the video. ‘He is not doing anything. The car is driving itself.’ (Six years later, a senior Tesla engineer conceded as part of a separate lawsuit that the video was staged and did not represent the true capabilities of the car.) [1]”
> Tesla is full self driving with Level 2/3 supervision and in my opinion this is not missleading
This is tautology. You’re defining FSD to mean whatever Tesla FSD can do.
[1] https://www.npr.org/2025/07/14/nx-s1-5462851/tesla-lawsuit-a...
How would you name a system which can do everything a Level 5 system can, but with Level 2/3 supervision? A name which a PR team would choose, without the misleading stuff, as you are saying.
> How would you name a system which can do everything a Level 5 system can, but with Level 2/3 supervision?
FSD cannot “do everything a Level 5 system can.” It can’t even match Waymo’s Level 4 capabilities, because it periodically requires human intervention.
But granting your premise, you’d say it’s a Level 2 or 3 system with some advanced capabilities. (Mercedes has a lane-keeping and -switching product. They’re not constantly losing court cases.)
>if you could tell me where I am wrong
It needs to have a crash rate equal to or ideally lower than a human driver.
Tesla does not release crash data (wonder why...), has a safety driver with a finger on the kill switch, and only lets select people take rides. Of course according to Elon always-honest-about-timelines Musk, this will all go away Soon(TM) and we will have 1M Robotaxis on the road by December 31st.
Completing a route without intervention doesn't mean much. It needs to complete thousands of routes without intervention.
Keep in mind that Waymos have selective intervention for when they get stuck. Teslas have active intervention to prevent them from mowing down pedestrians.
I see this brought up a lot, but I don't think it's really an issue. It's misleading in a very technical sense, but it's so misleading that nobody is misled. Just like nobody thinks the "Magic Eraser" is actually magic. I fundamentally just don't think anybody is out there actually believing this thing is L5 full self driving, especially after all the warnings it shows you and the disclaimers when you buy it.
The problem here isn't that people think they don't need to pay attention because their car can drive itself and then crash. The problem is that people who know full well that they need to focus on driving just don't because fundamentally the human brain isn't any good at paying attention 100% of the time in a situation where you can get away with not paying attention 99.9% of the time, and naming just can't solve this.
Everyone is mad at Tesla but they're literally the only company collecting this kind of crash metadata.
Other car manufacturers would never get in trouble for this because it's not even possible for them to do it in the first place!
People aren't mad that they collect the data, everyone does that, but that they immediately deleted it, then lied about it ever happening, in a matter of life and death.
I would deeply encourage you to re-assess whatever led you to make this comment, because you have fallen wildly off the mark here. Corporations are not your friend.
Everyone is mad because they killed people and lied about it.
Wrong. Almost all modern cars track location and tons of other data. Ford even has a screen that pops-up saying basically "hey you're opting into this FYI".
They should be saving every crash as a unit test to ensure it never happens again.
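A hedged sketch of what "every crash as a unit test" could look like: logged sensor frames replayed through the planning code, with the incident pinned as a permanent regression case. All names here (`plan_action`, `replay_log`) and the toy braking model are illustrative assumptions, not any real Tesla interface.

```python
def plan_action(frame):
    """Toy planner: brake if an obstacle is within the stopping distance.

    Assumes a fixed 6 m/s^2 deceleration budget, so stopping
    distance is roughly v^2 / (2 * a), plus a 5 m safety margin.
    """
    stopping_distance = frame["speed_mps"] ** 2 / (2 * 6.0)
    if frame["obstacle_distance_m"] < stopping_distance + 5.0:
        return "brake"
    return "cruise"

def replay_log(frames):
    """Replay logged sensor frames through the planner; return chosen actions."""
    return [plan_action(f) for f in frames]

# A logged crash becomes a fixed regression case: the planner must brake
# before the point where the original incident occurred.
crash_log = [
    {"speed_mps": 30.0, "obstacle_distance_m": 120.0},
    {"speed_mps": 30.0, "obstacle_distance_m": 80.0},
    {"speed_mps": 30.0, "obstacle_distance_m": 40.0},
]

actions = replay_log(crash_log)
assert "brake" in actions, "regression: planner never braked in crash scenario"
```

The design point is that the log, not the code, defines the test: as long as the recorded frames are preserved, every planner revision can be checked against every past incident. Which, of course, only works if the crash data isn't deleted.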
The description of the guy finding the data while at a Starbucks doesn't do justice to his setup shown in the photo. My dude has a seriously chaotic and awesome setup there.
I imagine he dumped the car data onto his laptop so that he could work on the problem in a cozier place than his messy bitcave
This will continue until people go to prison
Huge props to the hacker (@greentheonly) ... considering the cutbacks in journalism, perhaps we're entering a world where some of the most important investigative journalism will be done by hackers.
Unpaid, unrewarded excellence.
the dirtiest of doggery
Can you imagine aircraft makers pulling this sort of black-box autodelete! A red-handed catch!
No paywall link https://archive.is/s1psp
So, will tesla get nuked from orbit for what is obviously a serious, intentional and systemic discovery violation or is this just ok because it's a big corp?
With everything that is wrong with Tesla, I'll be the first to say that all Tesla cars need to be taken off of the roads, at least until all of their auto-driving features have been fully removed.
Of course they will say they don't have the key data.
Do we expect them to admit they were outright lying and wrong considering their leader is a pill popping Nazi salute making workaholic known to abuse his workers?
Lying to a court is usually pretty serious. Any sensible legal department will tell you never to do that, whatever your CEO says.
Unless you paid off the president they assume.
Did they have a falling out a few months ago?
You mean, the other allegations against this same person would not be judged as anything serious, and could even be condoned?
Usually.
But today you just have a private dinner with the president and he'll wave it away.