Tesla is finished
Moderators: trieste, nk
Forum rules
Govern with a light touch. For reported insults and personal attacks, the first offender in a thread gets a 1-day ban for a first offense, 1 week for a second, and 1 month for repeat offenses. Non-escalating retorts in the same thread receive reduced penalties at the moderators' discretion.
-
- Forum Pillar
- Posts: 9381
- Joined: July 22, 2022, 17:25
- Nickname (optional): YWY (夜未央)
#3 Re: Tesla is finished
Holding, bottom-fishing, locking in profits: your money, your call.
Calling bull, pig, or bear: just amusing myself.
The stock market is unpredictable: court disaster and disaster comes.
The gambler's road twists and turns: doomed sooner or later.
-
- Renowned Commentator
- Posts: 4368
- Joined: August 22, 2022, 22:41
-
- Forum Pillar
- Posts: 9381
- Joined: July 22, 2022, 17:25
- Nickname (optional): YWY (夜未央)
#7 Re: Tesla is finished
Let me offer a headline of my own: NHTSA's three-year investigation into Tesla is finished, but at the same time it is opening a new investigation into how thorough Tesla's Autopilot fix really is.
My read: the earlier investigation has concluded ("finished"), and Tesla, via an over-the-air recall, strengthened its alerts to Autopilot drivers (reminding them to watch the road and be ready to take over). But NHTSA is not satisfied with the strength of that recall fix, so it has opened a new round of investigation.
A few more words: Tesla is only at L2 and needs driver supervision, and Tesla really does need to improve how it monitors drivers. In the end the data will decide: if large-scale data show Tesla's safety record is inadequate and crashes are frequent, I suspect even Musk would be too embarrassed to keep boasting.
Details at the link below (full text also pasted below). Please share.
https://techcrunch.com/2024/04/26/tesla ... l-crashes/
Tesla Autopilot investigation closed after feds find 13 fatal crashes related to misuse
The National Highway Traffic Safety Administration closed a long-standing investigation into Tesla’s Autopilot driver assistance system after reviewing hundreds of crashes involving its misuse, including 13 that were fatal and “many more involving serious injuries.”
At the same time, NHTSA is opening a new investigation to evaluate whether the Autopilot recall fix that Tesla implemented in December is effective enough.
NHTSA’s Office of Defects Investigation said in documents released Friday that it completed “an extensive body of work” which turned up evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.”
“This mismatch resulted in a critical safety gap between drivers’ expectations of [Autopilot’s] operating capabilities and the system’s true capabilities,” the agency wrote. “This gap led to foreseeable misuse and avoidable crashes.”
The closing of the initial probe, which began in 2021, marks an end of one of the most visible efforts by the government to scrutinize Tesla’s Autopilot software. Tesla is still feeling the pressure of multiple other inquiries, though.
The Department of Justice is also investigating the company’s claims about the technology, and the California Department of Motor Vehicles has accused Tesla of falsely advertising the capabilities of Autopilot and the more-advanced Full Self-Driving beta software. The company is also facing multiple lawsuits regarding Autopilot. Tesla, meanwhile, is now going “balls to the wall for autonomy,” according to CEO Elon Musk.
NHTSA said its investigation reviewed 956 reported crashes up until August 30, 2023. In roughly half (489) of those, the agency said either there “was insufficient data to make an assessment,” the other vehicle was at fault, Autopilot was found to not be in use or the crash was otherwise unrelated to the probe.
NHTSA said the remaining 467 crashes fell into three buckets. There were many (211) crashes where “the frontal plane of the Tesla struck another vehicle or obstacle with adequate time for an attentive driver to respond to avoid or mitigate the crash.” It said 145 crashes involved “roadway departures in low traction conditions such as wet roadways.” And it said 111 of the crashes involved “roadway departures where Autosteer was inadvertently disengaged by the driver’s inputs.”
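As a quick arithmetic check, the three buckets reconcile exactly with the totals the article quotes (a minimal sketch using only the figures above):

```python
# Figures as reported in the article.
total_reviewed = 956  # crashes NHTSA reviewed through August 30, 2023
excluded = 489        # insufficient data / other vehicle at fault / Autopilot not in use

# The three buckets for the remaining crashes.
buckets = {
    "frontal-plane strikes with time for an attentive driver to react": 211,
    "roadway departures in low-traction conditions": 145,
    "roadway departures after Autosteer was inadvertently disengaged": 111,
}

remaining = total_reviewed - excluded
print(remaining)              # 467
print(sum(buckets.values()))  # 467: the buckets account for every remaining crash
```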
These crashes “are often severe because neither the system nor the driver reacts appropriately, resulting in high-speed differential and high energy crash outcomes,” the agency wrote.
Tesla tells drivers they need to pay attention to the road and keep their hands on the wheel while using Autopilot, which it measures via a torque sensor and, in its newer cars, the in-cabin camera. But NHTSA, and other safety groups, have said that these warnings and checks do not go far enough. In December, NHTSA said these measures were “insufficient to prevent misuse.”
Tesla agreed to issue a recall via a software update that would theoretically increase driver monitoring. But that update did not really appear to change Autopilot much — a sentiment NHTSA seems to agree with.
Parts of that recall fix require the “owner to opt in,” and Tesla allows a driver to “readily reverse” some of the safeguards, according to NHTSA.
NHTSA spent nearly three years working on the investigation into Autopilot, and met or interacted with Tesla numerous times throughout the process. It performed many direct examinations of the crashes, and relied on the company to provide data about them as well.
But the agency criticized Tesla’s data in one of the supporting documents.
“Gaps in Tesla’s telematic data create uncertainty regarding the actual rate at which vehicles operating with Autopilot engaged are involved in crashes. Tesla is not aware of every crash involving Autopilot even for severe crashes because of gaps in telematic reporting,” NHTSA wrote. According to the agency, Tesla “largely receives data from crashes only with pyrotechnic deployment,” meaning when air bags, seat belt pre-tensioners or the pedestrian impact mitigation feature of the car’s hood are triggered.
NHTSA claims that limiting to this level means Tesla is only collecting data on around 18% of crashes that are reported to the police. As a result, NHTSA wrote that the probe uncovered crashes for which Autopilot was engaged that Tesla was not notified of via telematics.
-
- Forum Elite
- Posts: 6131
- Joined: August 28, 2023, 11:36
#9 Re: Tesla is finished
Down with the butterfly! Beat the butterfly dead! Crush the butterfly under a stone!
The butterfly flies, the birds give chase;
the butterfly dies, ten thousand fists pound it.
#13 Re: Tesla is finished
There is also the news about Tesla demoing FSD to German officials.
-
- Forum Pillar
- Posts: 9381
- Joined: July 22, 2022, 17:25
- Nickname (optional): YWY (夜未央)
#14 Re: Tesla is finished
-
- Forum Elite
- Posts: 7173
- Joined: September 11, 2022, 03:58
- Nickname (optional): papabear
-
- Notable Figure
- Posts: 67
- Joined: August 7, 2022, 03:23
- Nickname (optional): 有话直说
#16 Re: Tesla is finished
No matter how Tesla upgrades FSD, it remains L2. The current approach, however many layers you stack, is always missing one thing: the human soul, which in practice means a person's inner intent. In any unfamiliar terrain, in rain or snow, on mountain roads, that soul is what makes a person cautious and adaptable; it is the last line of defense for safety, and machines have none of it. Once human supervision is removed, fatal FSD wrecks will only become more frequent.
-
- Renowned Writer
- Posts: 326
- Joined: July 23, 2022, 22:13
-
- Trainee Commentator
- Posts: 1526
- Joined: February 22, 2023, 13:25
-
- Forum Commentator
- Posts: 2341
- Joined: March 4, 2024, 19:11
- Nickname (optional): 认清现实全面的事实
#20 Re: Tesla is finished
A vision-only approach has a limited ceiling.