How Tesla’s Autopilot Feature Has Turned Deadly

The Tesla Autopilot feature has been involved in a growing number of fatal crashes, with drivers relying solely on the technology to operate the vehicle instead of actively paying attention to the road.

By Ryan Clancy

Electric car manufacturer Tesla is in hot water once again, and for once, it is not over something Elon Musk said. Tesla is at the forefront of automotive technology, with its cars breaking every electric vehicle sales record. But while the company leads the way in bringing new technology to the world, the world may not be ready for it: the Tesla Autopilot feature has been increasingly involved in fatal accidents.

Technology has taken leaps and bounds in the last few decades, with innovation coming faster and faster. Every week or month, some company seems to release a new technology for our daily lives. Did anyone think twenty years ago that we would have voice-activated lights? Thank you, Alexa.

While innovation is generally welcome, Tesla's self-driving mode, Autopilot, has come under fire again. It was involved in a fatal motorcycle accident this summer, bringing the number of such crashes to three. These accidents raise questions about Autopilot's safety and whether it should be allowed in vehicles at all.

The three fatal crashes occurred within 51 days of each other this summer, and all involved a driver using Tesla Autopilot. The release of details from the third crash has sparked new debate over whether these systems are fit for purpose.

Tesla Autopilot combines cruise control with lane-keeping technology, which keeps the vehicle within its lane. It is installed to make the driving experience hassle- and stress-free. While this sounds as though the car is essentially driving itself, it still requires a fully attentive driver behind the wheel.

And this is where the confusion lies. Research shows that a high percentage of drivers treat systems like Autopilot as true self-driving modes, occupying themselves with other activities like eating, drinking, or reading a book rather than concentrating on driving.

There are concerns that Autopilot's technology has a blind spot for motorcycles, and that it makes drivers lazy and complacent. If Tesla Autopilot cannot see motorbikes, the worry goes, will it be able to see pedestrians or children? Many feel the government should implement regulations requiring Tesla to ensure motorcycle detection, especially after three fatal accidents in less than two months.

Even though the car manufacturers that supply these systems, including Tesla and General Motors, have said the features do not make any vehicle self-driving, the systems remain a serious safety hazard for everyone on the road, especially at night when visibility is poor. Motorcycle safety advocates argue that the National Highway Traffic Safety Administration should test for motorcycle detection during its safety checks of all new vehicles.

Technology that makes life easier or completes a task with fewer errors is worth developing and bringing into our lives. Tesla Autopilot looks good on paper, but in real-world use it poses a significant safety hazard to everyone both inside and outside the vehicle. Is the world ready for computer-assisted driving, or has technology gone too far?