Tesla refuses to be the scapegoat again

After Tesla released FSD Beta 10.5, the test version of its driver-assistance software, on November 21, observant testers noticed a new clause in the update agreement: Tesla now asks for permission to collect camera recordings from inside and outside the car, linked to the vehicle's VIN (Vehicle Identification Number).


This is the first time Tesla has collected image data tied to a vehicle's VIN, which means Tesla can now retrieve image data from a specific, identifiable vehicle.


This more detailed data collection may let Tesla improve FSD Beta's capabilities in a more targeted way. So far, the clause has drawn support from some Twitter users.


Upgrading to FSD Beta 10.5 requires agreeing to VIN-linked image collection


Tesla adjusted the update terms for two reasons: to improve FSD Beta's machine learning, and to be able to analyze the cause of any accident involving FSD Beta using the vehicle's own image data.


Owners downloading the latest FSD Beta now see an update notice that is largely the same as in previous upgrades, with one new clause added.


The clause states that if a crash or serious safety risk occurs while FSD Beta is engaged, the owner agrees to let Tesla collect image data from the cameras inside and outside the car, tagged with the vehicle's VIN.


A Vehicle Identification Number (VIN) is a string of 17 letters and digits, unique to each car, that encodes the manufacturer, engine type, chassis number, and other attributes.
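For reference, VINs follow ISO 3779: 17 alphanumeric characters with the letters I, O, and Q excluded, and (on North American vehicles) a check digit in position 9. Below is a minimal sketch of that validation in Python; the `is_valid_vin` helper is ours for illustration, not anything Tesla ships:

```python
# VIN format check per ISO 3779, plus the North American check-digit
# rule (9th character). The helper name is illustrative.
TRANSLITERATION = {
    "A": 1, "B": 2, "C": 3, "D": 4, "E": 5, "F": 6, "G": 7, "H": 8,
    "J": 1, "K": 2, "L": 3, "M": 4, "N": 5, "P": 7, "R": 9,
    "S": 2, "T": 3, "U": 4, "V": 5, "W": 6, "X": 7, "Y": 8, "Z": 9,
}
WEIGHTS = [8, 7, 6, 5, 4, 3, 2, 10, 0, 9, 8, 7, 6, 5, 4, 3, 2]

def is_valid_vin(vin: str) -> bool:
    vin = vin.upper()
    # 17 characters; I, O, Q are never used (too easily confused with 1 and 0).
    if len(vin) != 17 or any(c in "IOQ" for c in vin):
        return False
    total = 0
    for char, weight in zip(vin, WEIGHTS):
        if char.isdigit():
            value = int(char)
        elif char in TRANSLITERATION:
            value = TRANSLITERATION[char]
        else:
            return False
        total += value * weight
    remainder = total % 11
    check_digit = "X" if remainder == 10 else str(remainder)
    return vin[8] == check_digit  # position 9 holds the check digit

print(is_valid_vin("1M8GDM9AXKP042788"))  # True (a well-known test VIN)
```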

Tesla was already collecting camera data from vehicles, but that data was not tied to any specific vehicle or owner. Tesla calls this "shadow mode": while the driver drives manually, the autonomous driving system runs in real time, perceiving the environment and computing driving decisions without executing them. When the human driver's actions deviate from the system's judgment, Tesla records a window of data around the deviation and sends the video back to its data center for analysis and learning, ultimately improving its autonomous driving capability.
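To make the mechanism concrete, here is a rough sketch of how such a shadow-mode comparison loop could look. All names, thresholds, and buffer sizes below are our assumptions; Tesla has not published its implementation:

```python
from collections import deque

# Illustrative shadow-mode loop: while a human drives, compare the
# planner's proposed steering with the human's actual steering, and
# save the buffered camera context whenever they diverge.
BUFFER_SECONDS = 10          # assumed pre-event recording window
STEERING_THRESHOLD = 0.15    # assumed divergence threshold (normalized)
FPS = 30                     # assumed camera frame rate

frame_buffer = deque(maxlen=BUFFER_SECONDS * FPS)  # ring buffer of frames

def upload_for_training(clip):
    """Placeholder: send the clip to the data center for analysis."""
    ...

def on_frame(camera_frame, human_steering, planned_steering):
    frame_buffer.append(camera_frame)
    if abs(human_steering - planned_steering) > STEERING_THRESHOLD:
        # Human and planner disagree: snapshot the pre-event context.
        # (A real system would also capture a post-event window.)
        upload_for_training(list(frame_buffer))
```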


Tesla has previously faced questions over camera privacy. Consumer Reports noted that beyond using camera data for machine learning, Tesla could put the data to other uses, such as building a system that tracks driver behavior, or other commercial activities. In China, the sensing capabilities of Tesla's sensors have likewise drawn widespread attention over privacy and security concerns.


By adding a clause tied to safety risks and accidents, Tesla presumably hopes to obtain image evidence from the vehicles involved in FSD accidents.


Before this clause, Tesla could not link collected video to any specific vehicle without the owner's authorization. That is about to change.


Some Twitter users voiced support for the change: by reviewing the video data, Tesla can prevent owners from unfairly shifting blame for an accident onto FSD.


Tesla CEO Elon Musk agreed with these users in the replies.


Tesla is now pushing the new version to its internal testers and to owners in the FSD Beta program with a driving safety score of 98 or above.


A Model Y crashed, and the owner blamed FSD


As the beta software for Tesla's "Full Self-Driving" capability, FSD Beta is a special case: users who join the test must provide driving data to Tesla to help improve its autonomous driving capabilities, which gives Tesla some legitimacy in collecting vehicle-linked images. Moreover, by collecting specific data in specific scenarios (accidents or safety risks), Tesla can improve FSD in a more targeted way.

Media reports speculate that the immediate trigger for the change may be a recent incident involving FSD Beta.


On November 3, a Tesla owner in the United States reported that a malfunction in FSD Beta led to a crash, and filed an accident report on the NHTSA website.


According to the report, the owner was driving a Model Y with FSD Beta engaged when the vehicle entered the wrong lane and collided with another car.


The owner said FSD Beta was engaged at the time, and the system issued a warning midway through a left turn. He tried to turn the steering wheel to keep the vehicle out of the wrong lane, but FSD Beta did not disengage and retained control. After entering the wrong lane, his car collided with an oncoming vehicle, and the driver's side was severely damaged.


Many Tesla owners were skeptical of the report, arguing that it is unlikely the driver could not take over the steering wheel. When FSD Beta is engaged, manual input takes priority: pressing the brake or accelerator pedal, or turning the steering wheel, automatically disengages FSD and returns full control to the driver.
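As a sketch, the override rule the owners describe amounts to a simple priority check; the names and threshold here are illustrative, not Tesla's:

```python
from dataclasses import dataclass

@dataclass
class DriverInput:
    brake: bool = False
    accelerator: bool = False
    steering_torque: float = 0.0  # driver torque applied to the wheel

# Assumed threshold above which wheel torque counts as a takeover.
STEERING_TORQUE_LIMIT = 1.5

def should_disengage(inp: DriverInput) -> bool:
    """Any manual input wins: brake, accelerator, or a firm wheel turn."""
    return (
        inp.brake
        or inp.accelerator
        or abs(inp.steering_torque) > STEERING_TORQUE_LIMIT
    )

print(should_disengage(DriverInput(steering_torque=2.0)))  # True
```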


Tesla has not yet responded to this accident.


Still, by adding this authorization clause in the FSD Beta 10.5 update, Tesla is now collecting image data tied to the vehicle identification number. In future FSD-related accidents, Tesla or official investigators will be better able to determine whether FSD or the owner's own actions caused the crash.


Tesla keeps adjusting its Full Self-Driving test program


Tesla controls the FSD rollout tightly: a user's safety score must reach a certain threshold to receive the FSD Beta update. Elon Musk has also said that drivers who use FSD Beta irresponsibly will be removed from the test program after a warning.


Tesla's request for users' driving video data may serve to analyze what actually caused an accident and prevent owners from shifting blame onto FSD, or to feed machine learning and drive FSD's iteration.


Tesla continues to refine its Full Self-Driving test program, believing FSD Beta will steadily improve and advance the autonomous driving field as a whole.
