OAKLAND, Calif. - A self-driving Tesla was put to the test by KTVU Friday to try to determine how a Model S caused a Thanksgiving Day eight-car pileup on the Bay Bridge.
The crash has raised questions about the technology and the driver's role in the incident. Several Tesla owners and experts suggest human error may be a factor, amid a federal investigation.
In an effort to recreate the circumstances of the Nov. 24, 2022, crash, the test was conducted in the same lane, at the same spot, with roughly the same amount of traffic.
Wilmer Awayan, co-founder of Tesla Owners East Bay, a regional group of thousands of Tesla enthusiasts, drove a Model 3 at KTVU's request for the test. His car is a different model but contains the same full self-driving software.
"I know the car will make a move quickly and you’ve got to be attentive all the time," he said. "This is where people I think get complacent and they forget that you need to be fully aware."
After engaging the full self-driving system, the Tesla traveled safely through the Yerba Buena Tunnel without incident. The test was considered a success.
"I think the system worked well through there," Awayan said. "Again, I don’t know what the other person was doing at the time of the crash or the status of the full self-driving."
Following a public records request, the California Highway Patrol released several surveillance videos Thursday that show the white Tesla in question changing lanes and braking on the Bay Bridge on Thanksgiving. Eight cars were involved, and several people reported minor injuries.
The CHP report states the driver told police he had been using the full self-driving feature before the Tesla's brakes activated and the car moved into the left lane, coming to a stop.
The CHP could not determine if the software was in operation, according to the report.
"The driver was not responsive," MotorTrend Testing Director Eric Tingwall said. "That’s the one thing that’s really obvious."
Tingwall said that despite the name, full self-driving requires an attentive driver who is ready to take over.
Unlike other automakers that use radar and lasers, Tesla depends entirely on its camera system to function.
Tesla drivers have reported issues with phantom braking, where the car suddenly brakes for no apparent reason, as well as cases where the Tesla senses a threat that isn't actually there.
"We are years if not decades away from that true self-driving ideal where somebody could sleep in the driver seat or not even be in the driver seat at all," Tingwall said.
Many of the issues are attributed to not only the full self-driving technology but also the autopilot features.
"It fails routinely in everyday, basic situations that human drivers can negotiate," said Tingwall.
Tesla did not respond to KTVU’s request for comment.
But those documented problems underscore the need to keep eyes on the road and hands on the wheel.
Tesla’s full self-driving mode will warn a driver to pay attention, but if the warnings are ignored, the system will shut down.
"It’s counting down to disable the autopilot," Awayan said. "See, I’m slowing down right now."
KTVU found after several warnings, the shutdown happened so quickly that it could pose a danger on the road.
"I’m taking over because we’re going too slow," Awayan said. "It’s not perfect. It cannot replace the driver right now. No way."