The Tesla Thread

At what cost though? I mean Tesla didn't go out of its way to make the braking of its car underperform and a firmware update isn't going to physically improve the brakes (obviously) so what is the consequence of what they've done? Less regen with more use of the physical brakes? More regen?

Who cares about regen in an emergency braking situation? A quick 3 seconds on Google will tell you that it is an issue with the ABS, and the videos that people have posted online confirm the car was locking up under emergency braking (thus increasing stopping distance). They changed the calibration algorithm on the ABS module and hey presto, it's fixed. There was nothing wrong with the physical components; it's all software. The update was pushed out over the air in a few days and all the cars were fixed, end of drama.

It's no different to how BMW managed to fix the crash protection in the i3 without changing the car. They just re-timed the airbags and it passed the test. The difference is that over 30 thousand cars had to go back to dealers for a minor software update.
https://www.greencarreports.com/new...-recall-announced-for-specific-safety-concern
 
I think Level 2 automation is inherently dangerous and should be heavily legislated. Tesla's system isn't dangerous just because of its failures or its name; it's dangerous because it encourages inattention while not being reliable enough to make inattention safe. All Level 2 systems will do this.

To be fair to Tesla they are pretty clear that you should be in control and alert at all times, and that the system is a "beta", not a finished product.

Calling it autopilot was a bit of an own goal though.
 
Wouldn't purchase a Tesla. They just don't have the same level of experience as other mainstream manufacturers.

Perfect example: my CFO's Model S (66 plate) literally has body panels falling off and the interior falling apart.

Tesla may look the part, but it will need a few more years to iron out the quality issues. Not built to last. :/

just my two worthless cents.
 
I think Level 2 automation is inherently dangerous and should be heavily legislated. Tesla's system isn't dangerous just because of its failures or its name; it's dangerous because it encourages inattention while not being reliable enough to make inattention safe. All Level 2 systems will do this.

I agree. Level 2 can (and does) throw you into dangerous situations with no notice. Combined with the ability to remove your hands from the wheel for short periods, the touchscreen media controls create an environment for inattention. What confuses me even more is that people seem to defend it to their death, saying it is amazing. Unfortunately, as some have found out, it can result in their death. I quickly came to the conclusion that it is like having an inexperienced learner who just about understands the major controls drive you from stop-start queues to 70mph on the motorway, and therefore it should be treated as such.
 
I expect the autonomous systems are statistically safer than the huge percentage of bad drivers on the road. Not saying it doesn't encourage inattention mind.
 
I expect the autonomous systems are statistically safer than the huge percentage of bad drivers on the road. Not saying it doesn't encourage inattention mind.

This does not appear to be the case, at least with Tesla. Accident rates with Autopilot engaged are a bit worse than the average rate per mile, and yet Autopilot is used almost entirely in conditions (mainly motorway driving) where accident rates are lower anyway.
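That mix effect can be sketched with some entirely made-up numbers (these are illustrative assumptions, not real Tesla or regulator figures): even a system that is worse than manual driving in like-for-like conditions can post a better raw per-mile rate, simply because it is used mostly on the safest roads.

```python
# Hypothetical accident rates (per million miles) by road type -- illustrative only.
baseline = {"motorway": 0.5, "urban": 2.0}

# Assumed mix of miles: the assist system runs almost entirely on motorways,
# while manual driving covers a normal mix of roads.
assist_miles = {"motorway": 0.95, "urban": 0.05}
manual_miles = {"motorway": 0.40, "urban": 0.60}

# Suppose the assist system is actually 20% WORSE than manual on every road type.
assist_rate = {road: rate * 1.2 for road, rate in baseline.items()}

def overall_rate(rates, mix):
    """Miles-weighted average accident rate across road types."""
    return sum(rates[road] * mix[road] for road in rates)

assist_overall = overall_rate(assist_rate, assist_miles)
manual_overall = overall_rate(baseline, manual_miles)

# The raw comparison flatters the assist system despite it being worse
# in like-for-like conditions on both road types.
print(f"assist overall: {assist_overall:.2f} per million miles")  # 0.69
print(f"manual overall: {manual_overall:.2f} per million miles")  # 1.40
```

So a headline claim of "fewer accidents per mile than manual driving" tells you nothing unless it is corrected for where and when the system is actually used.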
 
I expect the autonomous systems are statistically safer than the huge percentage of bad drivers on the road. Not saying it doesn't encourage inattention mind.

Autonomous systems would be, but we don't have any of them. We have these kinds of Level 2 "assist" systems where the driver is expected to pay attention and take over at any moment, yet is disengaged from the actual act of driving, allowing the mind to wander and not pay attention. When you're driving, you're actively engaged. When you're overseeing a semi-automated system, you're in a passive, waiting-for-something-to-happen mode. It's easy to lose concentration, and hard to switch from a passive to an engaged mode instantaneously.

The sooner we get past this halfway house of part automated to fully automated, the better it will be.
 
Autonomous systems would be, but we don't have any of them. We have these kinds of Level 2 "assist" systems where the driver is expected to pay attention and take over at any moment, yet is disengaged from the actual act of driving, allowing the mind to wander and not pay attention.

Yes, all of this. Lower level automation systems help you keep concentration (e.g. lane departure warnings), take over some simple tasks but leave you engaged (e.g. automatic choke or gears, cruise control), or intervene in an emergency to improve safety (e.g. brake assist) and higher level stuff allows you to disengage. Level 2 does neither. It's an ugly middle ground. I think some level 2 systems such as self-park are fine, but stuff like Autopilot is inherently unsafe.
 
Yes, all of this. Lower level automation systems help you keep concentration (e.g. lane departure warnings), take over some simple tasks but leave you engaged (e.g. automatic choke or gears, cruise control), or intervene in an emergency to improve safety (e.g. brake assist) and higher level stuff allows you to disengage. Level 2 does neither. It's an ugly middle ground. I think some level 2 systems such as self-park are fine, but stuff like Autopilot is inherently unsafe.

It's another example of undesirable emergent behaviour when a complex technological solution is married with human psychology. People don't behave how you expect them to, often with worse outcomes where they are involved in the system. That human in the loop often seems to be the last thing considered by the engineers building these complex systems.
 
Which part of Autopilot do you have a particular issue with? I assume it's auto steer/lane keeping and auto lane change. The rest of it is available on a wide range of other cars (traffic-aware cruise, auto park, etc.)

Do you have a problem with Nissan Pro Pilot, which is essentially the same as the old Autopilot?
 
Which part of Autopilot do you have a particular issue with? I assume it's auto steer/lane keeping and auto lane change. The rest of it is available on a wide range of other cars (traffic-aware cruise, auto park, etc.)

Yes, I think the system is dangerous in all cars, it's not a Tesla-specific problem. Tesla have merely made a lot more of a song and dance about theirs. The recent negative publicity is the flip side of the huge amounts of positive publicity they've got previously.

Do you have a problem with Nissan Pro Pilot, which is essentially the same as the old Autopilot?

I'm not familiar with this system specifically, but probably: yes.
 
Which part of Autopilot do you have a particular issue with?

I've given it some thought, and I think the biggest problem is Elon Musk... (I can feel the downvotes on Reddit already). The expectations are much higher than the reality. The expectation is that, at a minimum, it can handle simple scenarios like motorways. The reality is that it handles them well perhaps 99% of the time; but 99% is not 100%, and that remaining 1% can (and has) caused deaths. I think Level 2 systems do a disservice to the well-engineered solutions currently in the making from respectable businesses (Waymo), and realistically they should be used in very brief stints to keep concentration levels up, which inevitably decline when humans are put in an observation-only position.
 
Curious - do you have links for those stats?

I was basing it on a claim from Tesla, made before the two recent deaths, that Autopilot was a bit safer than manual driving. Since those deaths tripled the total, I assumed the rate would now be higher than manual driving. But, looking it up to find links for you, I find that there is simply no reliable data. Tesla's oft-repeated claims are essentially garbage, and no publicly available data gives enough information to make a reasonable estimate of the safety of the system.
 
Tesla's oft-repeated claims are essentially garbage, and no publicly available data gives enough information to make a reasonable estimate of the safety of the system.

Please don't make stuff up; it doesn't help the discussion.

If I remember correctly, Tesla said they were going to release the data shortly. Wait until it's available before forming an opinion.
 
Please don't make stuff up; it doesn't help the discussion.

I'm not making stuff up. They're essentially garbage. The data, even if applied correctly, doesn't support the claims made. They are being wrongly interpreted, over-interpreted, and incorrectly applied. The lack of publicly available data is just unfortunate as it prevents anyone else from carrying out an independent analysis.
 
I'm not making stuff up. They're essentially garbage. The data, even if applied correctly, doesn't support the claims made. They are being wrongly interpreted, over-interpreted, and incorrectly applied. The lack of publicly available data is just unfortunate as it prevents anyone else from carrying out an independent analysis.

How do you know it's garbage? You don't have the data, you don't know how it's being interpreted, you don't know how it's being applied, and you don't know it's being misused.

I don't disagree that the claims are under dispute and should be fact-checked. But all those articles do is try to poke holes in the claims, and it's speculation at this point because they don't have the underlying data set being used. There are also choices and assumptions you have to make when you're looking at big data sets; everyone has an opinion, but that doesn't make one way or another wrong. It does mean the same data set, in two different people's hands, can be presented in many different ways. It's like the whole Brexit campaign all over again.

Until the data is actually examined by a third party, that is all it is: speculation. You are trying to justify your opinion with facts that simply don't exist in the public domain; those articles even state that, and you have stated it yourself.

The only people who do know are Tesla, and until they release the data themselves or are forced to by someone else, we will never truly know.
 
How do you know it's garbage? You don't have the data, you don't know how it's being interpreted, you don't know how it's being applied, and you don't know it's being misused.

I know what Tesla are saying about the data. What they are saying is garbage, regardless of the limited data released. When the reasoning the interpretation rests on is wrong, the data doesn't matter. The three articles I linked explain all this.
 