A Tesla Model S crashed into a police car last Wednesday (August 26) while the driver was watching a movie on a mobile phone, according to authorities in Raleigh, North Carolina, USA.
The crash happened around midnight on a stretch of four-lane highway that passes through deserted countryside. A highway patrol officer was assisting a Nash County deputy at the scene of a roadside collision when the Tesla, operating in Autopilot mode, crashed into the patrol car.
The driver, identified by police as Devainder Goli, a doctor in Raleigh, was charged with violating state law by failing to move over for a highway emergency and with watching a movie while driving.
No one was injured in the crash, and the driver has not commented.
Tesla Model S after a collision with a police car.
“We had blocked off a single lane that night, and death was just a few steps behind us,” Nash County Sheriff Keith Stone recalled.
Mr. Stone added: “The crash shows that automation will never replace a driver’s attention. No texting, no phone calls; focus on what you’re doing, which is driving.”
This is not the first time a Tesla has caused an accident or hit an obstacle on the road while operating in Autopilot mode, the name the manufacturer gives its driver-assistance software, which is far from fully autonomous. Tesla’s Autopilot manual says the driver should stay alert and keep their hands on the steering wheel at all times, and the system issues warnings if the driver’s hands leave the wheel for more than a few seconds.
A video shows a Tesla in Autopilot mode crashing into an obstacle on the highway after failing to recognize it from a distance.
But public safety officials, industry experts, and various consumer groups have argued that the Autopilot name, and CEO Elon Musk’s statements about its capabilities, are misleading and may encourage unsafe behavior.
“People abuse Autopilot, not because they are new and don’t understand it,” Elon Musk has said on the issue. “First-time Autopilot users are extremely cautious with it. It’s not that they would treat it differently if it had a different name. With Autopilot, it’s because someone is overusing it, using it contrary to how we have said it should be used.”