The first fatality of autonomous driving was Joshua Brown of Canton, Ohio, who was driving on a Florida road. The Tesla he was driving failed to recognise a trailer and crashed roof first into its underside, killing the driver (or should we say the passenger?) instantly. Passing beneath the trailer, the Tesla continued to drive, striking two separate fences and a pole before finally stopping. This makes us wonder whether we are actually ready for such technology.
2016 Tesla Model S 90D
The auto industry is betting big on self-driving cars, with major manufacturers planning to roll out the technology at various times in the near future. As of now, the most advanced among them is Tesla Motors, which has rolled out the technology, which it says is in beta mode, to thousands of cars. Tesla Motors has always advised that the driver must remain alert and ready to take control of the vehicle if necessary.
Tesla has been using a technology that learns as the Tesla cars on the road clock miles in real-world conditions. ‘Fleet Learning’, as Tesla CEO Elon Musk explained, is “Essentially the network of vehicles is going to be constantly learning. As we release the software and more people enable autopilot, the information of how to drive is uploaded to the network. So each car, each driver is effectively an expert trainer in how the autopilot should work.”
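To make the idea concrete, here is a minimal sketch of what fleet-style learning might look like in code: each car contributes a parameter update learned from its own driving, and the fleet-wide model is nudged toward the average of those contributions. All names and the averaging scheme here are illustrative assumptions, not Tesla’s actual implementation.

```python
# Hypothetical sketch of "fleet learning": each car uploads parameter
# updates learned from its own driving, and the shared model is nudged
# toward the average of all contributions. Names and the blending rule
# are illustrative, not Tesla's actual system.

def aggregate_fleet_updates(global_params, car_updates):
    """Average parameter updates from many cars into the shared model.

    global_params: list of floats (the current shared model)
    car_updates:   list of per-car parameter lists, same length each
    """
    if not car_updates:
        return global_params
    n = len(car_updates)
    # Per-parameter average across the fleet's reported updates.
    averaged = [sum(update[i] for update in car_updates) / n
                for i in range(len(global_params))]
    # Blend the fleet average into the shared model (equal weight here).
    return [(g + a) / 2 for g, a in zip(global_params, averaged)]

# Example: three cars report slightly different learned values.
shared = [0.0, 1.0]
updates = [[0.2, 1.0], [0.4, 1.2], [0.0, 0.8]]
print(aggregate_fleet_updates(shared, updates))  # → [0.1, 1.0]
```

The point of the sketch is simply that no single car needs to see every scenario: rare situations encountered by one driver feed back into the model every other car runs.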
But how far has the technology come? Is it really safe enough that people need not worry about their lives, or will these become coffins on wheels? Mobileye, the company that supplies certain parts of the Autopilot technology, said that the crash involved a vehicle turning laterally in front of the Tesla. Its current emergency braking system is not designed to react to such scenarios; that capability will only be available in 2018.
So the question is whether Tesla is at fault for giving customers half-baked technology, or whether Joshua was at fault for trusting this piece of technology with his life. Tesla has been very forthright in advising that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it. But by simply putting up a disclaimer, Tesla cannot wash its hands of the incident.
Going by Joshua’s YouTube account, where he uploaded videos about his Tesla, he was clearly impressed by the technology. In one video, titled “Autopilot Saves Model S”, he spoke highly of the software after it avoided a crash. Maybe, just maybe, he trusted a technology that is still evolving too much to even consider that it might sometimes fail.
So that brings us to the question of whether this technology is needed or not. Yes, it is a fabulous piece of technology that will evolve rapidly over the next few years. But do you pass on a failure-prone technology to customers? No. Fleet learning is a magnificent tool: as drivers all over the world encounter various driving scenarios, the autopilot learns from them rapidly, approaching the goal of foolproof technology as fast as possible.
This is a question of “what came first, the chicken or the egg?” Without clocking millions of miles in the real world, the technology will not improve; and without running in beta mode for a few years, it won’t be failsafe. As all transitions are tough, the move from manual to autonomous driving will be hard. The way out, as of now, is not to trust the technology fully and to be a responsible driver as manufacturers strive to bring us the latest and greatest of the 21st century.