Tesla’s Autopilot system, with its “Full Self-Driving” option, is nothing like its name suggests: it’s neither a true autopilot nor a fully self-driving vehicle. Instead, it’s an advanced driver assistance system that can ease the driver’s workload on freeways and clearly marked city streets. However, it’s far from a perfect system, as shown in a new video from YouTube channel Beta Tech OG in which the host’s Model 3 nearly ran into an oncoming train.
In this video, the driver tests the Autopilot system, with the Full Self-Driving option, on the streets of Denver, Colorado. During the nearly 18-minute video, the Model 3 nearly hits several obstacles, leading the driver to give Tesla’s software a low rating. Two of the software’s errors were more glaring than the others, and one of them nearly ended with the Tesla being hit by one of Denver’s light rail trains as it attempted to turn left into the path of the train.
Before almost crashing into the train, the Model 3 was stopped at a red light with its left turn signal on. This signaled to the car that the driver wanted to turn left and, once the light turned green, the car was supposed to wait until the path was clear before turning. Instead of waiting, however, the Tesla either calculated that it had enough space and time to complete the turn before the train arrived, or it misidentified the train. What’s odd is that the Autopilot visualization on the infotainment screen, which shows the car’s recognized surroundings, did render the train: you can actually see it on the screen before the car starts turning. Why the Tesla decided to turn left in front of the train despite detecting it is unclear; the driver was forced to take over and steer the car out of its path. If the car thought it could complete the turn in time, its programming is too aggressive. Even the train operator honked, apparently agreeing that turning was the wrong decision in this situation.
Later in the video, near the end, the Model 3 attempted a routine left turn but took it too wide and nearly hit two pedestrians standing on the corner. Afterwards, the driver said he was “super disappointed with Tesla”, which is understandable given his expectations and the results on the road. In Tesla’s defense, however, there is a button to report any issues that arise while using Autopilot. The catch is that it takes only a single button press and the driver isn’t required to add any details about the situation, so it’s unclear how useful those reports actually are to Tesla.
There’s a much wider conversation to be had about the ethics of Tesla beta-testing its Autopilot Full Self-Driving option on public roads, using customers, not trained professionals, to test it. While Tesla is by no means the only company bringing such technology to market, other brands take a much more cautious approach to advanced driver assistance systems and don’t release software until it has been fully tested by professionals in controlled environments. Luckily no one was hurt this time, and the driver was able to steer his car away from the train and the pedestrians, but that doesn’t mean everyone will be able to react in time in the future.
Beyond those two incidents, there were several others. In fact, driving toward the oncoming train is just one of many serious cases in this video:
- the Tesla practically crashed into a barrier indicating that the road was blocked (7:06);
- the Tesla chose the wrong route and was visibly confused (see the display on the dashboard, 11:06);
- the Tesla tried to run a red light while cross traffic was moving (12:12);
- the Tesla stopped in the middle of an intersection for no reason (13:09);
- the Tesla chose the wrong lane for a left turn (13:25);
- the Tesla repeatedly turned the left turn signal on and off for no reason, in a place where a left turn wasn’t even allowed (15:02);
- the Tesla failed to turn left properly and almost hit pedestrians (17:13).
All this during a driving session that lasted maybe 30 minutes (the video is sped up at times). And these are only the most serious cases: overall, the Tesla drives very clumsily. It changes lanes too often for no reason, including at intersections, and behaves strangely at red lights, pulling very close to passing traffic.
Overall, the driver feels this is an incredibly poor performance.
One commenter wondered: “I just don’t understand why you would give control to a system that is clearly still in beta. The cost of failure is far too high to take the risk”.
“Don’t be a crash test dummy for Tesla”
Electric cars, which generally ship with driver assistance systems, are becoming more popular, and a large number of new players have entered the market over the last decade, some of them subsidiaries of large groups already established in combustion vehicles. This has increased competition, and companies now seem willing to use every means to promote their products. That may be what happened in early January with Dan O’Dowd, who published a “harsh critique” of Tesla’s Full Self-Driving (FSD) software in the form of an advertisement in the New York Times.
Dan is the co-founder and CEO of Green Hills Software, a private company that builds operating systems and programming tools for embedded systems and that, with the advent of the electric car, has also moved into developing advanced driver assistance systems (ADAS). Titled Don’t be a Tesla crash test dummy, Dan’s ad claims that in its current version, FSD would kill millions of people every day if it powered more cars.
Dan based his criticism on a study of videos posted online showing Tesla owners using the Full Self-Driving feature, which Tesla says is in beta and offers only limited functionality under driver supervision. According to his study, the videos show that FSD makes a “critical driving error” every eight minutes and, every 36 minutes, an “unintentional error” that “would likely cause a collision”. In the ad, Dan calls FSD the “worst commercial software” he’s ever seen and argues it’s really still in an alpha phase.
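To see how a claim of “millions of deaths per day” could follow from an error rate of one critical error every eight minutes, here is a back-of-envelope sketch. Every input except the eight-minute rate is an illustrative assumption chosen for this example (global car count, daily driving time, fatality fraction), not a figure from the ad or the study:

```python
# Back-of-envelope check of the scale implied by Dan O'Dowd's claim.
# Every number below is an illustrative assumption, except the 8-minute
# error rate, which is the figure cited in the ad.
CARS_WORLDWIDE = 1_000_000_000       # assumed global car count
MINUTES_DRIVEN_PER_DAY = 60          # assumed average daily driving time per car
MINUTES_PER_CRITICAL_ERROR = 8      # rate cited in the ad

errors_per_day = CARS_WORLDWIDE * MINUTES_DRIVEN_PER_DAY / MINUTES_PER_CRITICAL_ERROR

# Assume (again, purely for illustration) that only 1 in 1,000
# critical errors would end in a fatal crash:
deaths_per_day = errors_per_day / 1000

print(f"{errors_per_day:,.0f} critical errors/day")
print(f"{deaths_per_day:,.0f} hypothetical deaths/day")
```

Under these assumed inputs the arithmetic lands in the millions per day, which is the order of magnitude the ad asserts; the point of the sketch is only that the headline number is extrapolation from the eight-minute rate, and it stands or falls with that rate and the assumptions multiplied into it.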
Because of this, he thinks it should be tested by internal Tesla employees rather than Tesla owners. “The software that drives the self-driving cars on which millions of lives will depend has to be the best software,” he said. While a restricted version of FSD is available to anyone with a Tesla, owners can also apply to become beta testers of a more advanced version if they have a high enough driving safety score, as determined by their car’s software. Dan is in fact campaigning for a ban on Tesla’s FSD.
He said he placed the ad under the auspices of the Dawn Project, a pressure group campaigning for that ban, which describes itself as “dedicated to making computers truly safe for mankind.” According to some critics, Dan’s ad looks partly like a publicity stunt designed to draw attention to his own business: Green Hills Software said earlier that month that its technology was being used to develop driver assistance software for the all-electric BMW iX, a sporty SUV that BMW showed at CES 2022.
In response, Elon Musk, CEO of Tesla, attacked the software from Green Hills Software. Musk tweeted, “Green Hills software is garbage,” and endorsed a comment that “FSD critics still have a huge financial stake in a competing solution.” But for his part, Dan said the best sources of information about a product are its competitors. “They tear them to pieces, they find out what they’re doing right, they find out what they’re doing wrong. They know better than anyone, and they’ll tell you. The seller will never tell you these things,” he said.
Additionally, he alleged that the original version of Tesla’s Autopilot, a precursor to FSD, was built using Green Hills Software. “I pulled out of the project and said, ‘I don’t know if this is right, if this is what we should be doing here, it’s not going to work,’” Dan said. He had nothing further to add, and Tesla, which does not have a functioning press office, did not comment on Dan’s claims of a connection between Green Hills Software’s driver assistance technology and FSD.
Be that as it may, some online comments have noted that it is “utterly ridiculous” to use third-party YouTube videos, rather than direct testing of the technology, to extract “evidence” that FSD would kill millions of people every day if installed on every car in the world. Another red flag is that the advertisement never once uses the full name of the Full Self-Driving Beta software: it omits the “Beta” part, which may lead some readers to believe that FSD Beta is a finished product when it is still in development testing.
Source: video in text
What do you think?
What do you think of the commenter who argues that drivers shouldn’t hand control to FSD while it’s still in beta?
What do you think of Tesla’s approach of letting its customers test beta software, unlike competitors, whose still-in-beta software is tested only by trained professionals?