Welcome To aBlackWeb

3 crashes, 3 deaths raise questions about Tesla's Autopilot

DOS_patos

Unverified Legion of Trill member
Three crashes involving Teslas that killed three people have increased scrutiny of the company’s Autopilot driving system just months before CEO Elon Musk has planned to put fully self-driving cars on the streets.

On Sunday, a Tesla Model S sedan left a freeway in Gardena, California, at a high speed, ran a red light and struck a Honda Civic, killing two people inside, police said.

On the same day, a Tesla Model 3 hit a parked firetruck on an Indiana freeway, killing a passenger in the Tesla.


And on Dec. 7, yet another Model 3 struck a police cruiser on a Connecticut highway, though no one was hurt.

The special crash investigation unit of the National Highway Traffic Safety Administration is looking into the California crash. The agency hasn't decided whether its special-crash unit will review the crash that occurred Sunday near Terre Haute, Indiana. In both cases, authorities have yet to determine whether Tesla's Autopilot system was being used.

NHTSA also is investigating the Connecticut crash, in which the driver told police that the car was operating on Autopilot, a Tesla system designed to keep a car in its lane and a safe distance from other vehicles. Autopilot also can change lanes on its own.

Tesla has said repeatedly that its Autopilot system is designed only to assist drivers, who must still pay attention and be ready to intervene at all times. The company contends that Teslas with Autopilot are safer than vehicles without it, but cautions that the system does not prevent all crashes.

Even so, experts and safety advocates say a string of Tesla crashes raises serious questions about whether drivers have become too reliant on Tesla's technology and whether the company does enough to ensure that drivers keep paying attention. Some critics have said it's past time for NHTSA to stop investigating and to take action, such as forcing Tesla to make sure drivers pay attention when the system is being used.

NHTSA has started investigations into 13 Tesla crashes dating to at least 2016 in which the agency believes Autopilot was operating. The agency has yet to issue any regulations, though it is studying how it should evaluate similar “advanced driver assist” systems.

“At some point, the question becomes: How much evidence is needed to determine that the way this technology is being used is unsafe?” said Jason Levine, executive director of the nonprofit Center for Auto Safety in Washington. “In this instance, hopefully these tragedies will not be in vain and will lead to something more than an investigation by NHTSA.”

Levine and others have called on the agency to require Tesla to limit the use of Autopilot to mainly four-lane divided highways without cross traffic. They also want Tesla to install a better system to monitor drivers to make sure they're paying attention all the time. Tesla's system requires drivers to place their hands on the steering wheel. But federal investigators have found that this system lets drivers zone out for too long.

Tesla plans to use the same cameras and radar sensors, though with a more powerful computer, in its fully self-driving vehicles. Critics question whether those cars will be able to drive themselves safely without putting other motorists in danger.

Doubts about Tesla's Autopilot system have long persisted. In September, the National Transportation Safety Board, which investigates transportation accidents, issued a report saying that a design flaw in Autopilot and driver inattention combined to cause a Tesla Model S to slam into a firetruck parked along a Los Angeles-area freeway in January 2018. The board determined that the driver was overly reliant on the system and that Autopilot's design let him disengage from driving for too long.

In addition to the deaths on Sunday night, three U.S. fatal crashes since 2016 — two in Florida and one in Silicon Valley — involved vehicles using Autopilot.

David Friedman, vice president of advocacy for Consumer Reports and a former acting NHTSA administrator, said the agency should have declared Autopilot defective and sought a recall after a 2016 crash in Florida that killed a driver. Neither Tesla's system nor the driver had braked before the car went underneath a semi-trailer that had turned in front of the car.

“We don't need any more people getting hurt for us to know that there is a problem and that Tesla and NHTSA have failed to address it,” Friedman said.

In addition to NHTSA, states can regulate autonomous vehicles, though many have decided they want to encourage testing.

In the 2016 crash, NHTSA closed its investigation without seeking a recall. Friedman, who was not at NHTSA at the time, said the agency determined that the problem didn't happen frequently. But he said that argument has since been debunked.

Friedman said it's foreseeable that some drivers will not pay attention to the road while using Autopilot, so the system is defective.

“The public is owed some explanation for the lack of action,” he said. “Simply saying they're continuing to investigate — that line has worn out its usefulness and its credibility.”

In a statement, NHTSA said it relies on data to make decisions, and if it finds any vehicle poses an unreasonable safety risk, “the agency will not hesitate to take action.” NHTSA also has said it doesn't want to stand in the way of technology given its life-saving potential.

Messages were left Thursday seeking comment from Tesla.

Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University, said it's likely that the Tesla in Sunday's California crash was operating on Autopilot, which has become confused in the past by lane lines. He speculated that the lane line was more visible for the exit ramp, so the car took the ramp because it looked like a freeway lane. He also suggested that the driver might not have been paying close attention.

“No normal human being would not slow down in an exit lane,” he said.

In April, Musk said he expected to start converting the company's electric cars to fully self-driving vehicles in 2020 to create a network of robotic taxis to compete against Uber and other ride-hailing services.

At the time, experts said the technology wasn't ready and that Tesla's camera and radar sensors weren't good enough for a self-driving system. Rajkumar and others say additional crashes have proved that to be true.

Many experts say they're not aware of fatal crashes involving similar driver-assist systems from General Motors, Mercedes and other automakers. GM monitors drivers with cameras and will shut down the driving system if they don't watch the road.

“Tesla is nowhere close to that standard,” Rajkumar said.

He predicted more deaths involving Teslas if NHTSA fails to take action.

“This is very unfortunate,” he said. “Just tragic.”
 
The fact that anyone is stupid enough to use any sort of "autopilot" is astounding.
The fact that people are stupid enough to expect autopilot to do 100% of the driving for them, so that they can just take a nap or take their eyes off the road, is even more astounding.
 
airplanes had autopilot waaayy before Tesla first put autopilot in their vehicles, and the pilot(s) still have to be aware and alert to take control of the situation.

Tesla just have to take that L and make the autopilot safer, cause even though it's these dumbass drivers' fault, it's also Elon's fault for selling it like it's fully self-driveable when it's not
 
here i come with my shit.

tesla has a lot of problems.

these fuckers don't even do the proper tests to be sure the cars are safe.

in 2018

in 2017
 
Tesla's "autopilot" is just adaptive radar cruise control other vehicles have had for years, just a lil more advanced. The reason they call it "autopilot" is simply for marketing and it worked. But like always, people have more money than brains and think the car drives itself and end up crashing. IDK how they havent been sued for misleading people with that name.

Also here's a fun fact: cruise control was invented in 1958, and they initially called it "auto-pilot" but decided it wasn't a good idea to name it that and changed the name.

Auto-pilot on planes and boats is different. All it does is hold a general course, not drive for you.
 
Yeah I wouldn’t use autopilot or anything that has the car do anything for me

The only thing I might use is to summon the car to me
 