After five months and more than 1,600 kilometres driven with Tesla’s autonomous driving system, an American journalist offers a nuanced account: the experiment is revolutionary, but far from infallible. Between technological fascination and embarrassing situations, he delivers a clear-eyed assessment of the promise and limits of autonomous driving on our roads.
A long-term trial in the heart of Silicon Valley
When Alistair Barr, tech editor at Business Insider, took possession of his Tesla Model 3 Performance in December, he wasn’t expecting such an adventure. Thanks to a free trial offer, he discovered Tesla’s famous Full Self-Driving (FSD) software, presented as the quintessence of driver assistance. Passionate about cars and technological innovation, he decided to test the system in real-life conditions, on the roads of Silicon Valley and beyond, to gauge its strengths and weaknesses.
The first few trips come as a real surprise. FSD handles the majority of driving situations, making journeys in dense traffic far less stressful. The system manages stops and changes of direction, allowing the driver to arrive at their destination in better spirits. Barr also highlights the energy efficiency of the software, which consumes less battery power than a typical human driver would.
But this efficiency has a counterpart: scrupulous compliance with the highway code, sometimes to excess. For example, FSD comes to a complete stop at stop signs, where many human drivers would merely slow down. This behaviour, initially perceived as a nuisance, quickly becomes a reminder of bad human habits and a guarantee of safety.
Technical limits and risk situations
The magic lasts only until the system reaches its limits, however. Pothole management leaves something to be desired: the Tesla often drives straight through obstacles that a human driver would have steered around. More worryingly, FSD can freeze entirely when faced with unexpected situations, such as a vehicle parked in a narrow lane, forcing the driver to take back control.
On the motorway, the software also shows its weaknesses: in ‘Chill’ mode, it lingers in the slow lane and is slow to anticipate necessary lane changes, exposing the driver to last-minute manoeuvres far removed from the fluidity of an experienced driver. More serious incidents have occurred, such as during a test in San Francisco, where the Tesla ran a red light in FSD mode, or made a risky U-turn into the wrong lane in full view of the police. These episodes are a reminder that autonomy is not synonymous with infallibility.
After several months of use, the initial enthusiasm has given way to a degree of caution. The journalist confides that he is now reluctant to pay the 99-dollar monthly subscription for FSD, preferring to activate it occasionally on long journeys. The experiment has also produced awkward moments, particularly when the car performed a dangerous manoeuvre in front of a friend or police officers.
A gap between assistance and total autonomy
The analysis of Bryant Walker Smith, an expert in mobility law, sheds essential light on the issue: Tesla’s FSD remains a driver assistance system, not a truly autonomous technology. The driver must remain vigilant and ready to intervene at all times. The difference between current assistance and the promise of total autonomy is comparable, according to Smith, to the difference between climbing a cliff with a rope and climbing it solo: the risk is not the same.
The journalist’s verdict is clear: Tesla’s autonomous driving impresses with its advances, but its current limitations demand constant vigilance. The experience reveals both the potential of the technology and the need not to let one’s guard down. For the time being, the future of autonomous driving looks promising, but the road to total autonomy remains full of pitfalls.
