(door slamming) (feet plodding)
(traffic buzzing)
- [Narrator] February 27th, 2021,
police body cam footage
shows a traffic stop
on a highway in Montgomery County, Texas.
- [Police officer] Hey, how's it going?
- [Narrator] As this 2019 Tesla Model X,
driving in Autopilot mode,
strikes their vehicles at 54 miles per hour,
injuring five officers,
and hospitalizing the subject of the original traffic stop.
- [Paramedic] Again, roll up, roll up.
One, one, two, three. (patient moaning)
- [Narrator] The crash is one of 16
between Teslas and emergency vehicles
being investigated
by the National Highway Traffic Safety Administration
or NHTSA, to determine whether Tesla's Autopilot
has contributed to the accidents.
The Journal obtained exclusive dashcam footage
and partial data logs from the Tesla in the Texas crash.
We annotated the footage for clarity.
These materials show the car's Autopilot System
failed to recognize the stopped emergency vehicles in time,
and, though its Driver Monitoring System
appears to have worked as designed,
it was not enough to sideline the impaired driver
and prevent the collision.
We also obtained eight Crash Reports
included in the NHTSA investigation.
In at least six,
the incidents occurred
when emergency vehicle lights were flashing.
NHTSA declined to comment on an ongoing investigation,
and Tesla did not respond to a request for comment.
(dramatic music)
The driver in the Texas crash begins his 45-minute trip
just before 12:30 AM.
He is intoxicated, according to police reports.
The police investigation notes
several instances when he swerves in his lane.
About four minutes into his drive,
the logs show he sets the car on Autopilot
at a speed of 64 miles per hour.
Tesla's Autopilot is a system
that partially automates many driving tasks on highways,
including steering, braking, and lane changes.
Drivers using Autopilot are supposed to remain engaged
so they can take control of the car at any time.
Federal investigators have said, "Tesla's marketing,
including the name Autopilot, exaggerates its capabilities,
and encourages drivers to misuse the technology."
The Autopilot System for a 2019 Model X,
the model in the Texas crash,
judged whether the driver was alert
based on whether their hands were on the wheel.
If the system did not detect hands on the wheel,
the driver received an alert.
If the driver didn't respond,
Autopilot would disengage.
The driver in the Texas crash
receives one of these alerts less than two minutes
after engaging Autopilot,
according to the car's logs, and he complies.
He receives two more in the next minute, and complies.
In fact, he receives and complies with 150 of these alerts
over the course of this 45-minute drive.
By the design of Tesla's Driver Monitoring System,
the driver was paying enough attention
to operate the vehicle in Autopilot.
This year and model of Tesla
uses a combination of radar and camera technology
to recognize objects in all directions.
Autonomous vehicle experts say
the radar can easily recognize moving vehicles,
but that it has difficulty
distinguishing stationary obstacles,
leaving it mainly up to the cameras to detect them.
Around 15 minutes into the drive,
the logs indicate the technology is working.
It recognizes a vehicle about 120 yards ahead.
Then again, about 35 minutes into the drive,
the Tesla sees a vehicle as it merges 70 yards ahead
and tracks it as it drives off.
About 45 minutes in,
the Tesla approaches emergency vehicles
on the side of the road.
Autopilot is not designed to identify them as obstacles
because they're not in a lane,
but an attentive driver
would typically know to slow down or change lanes.
Around the same time,
the driver receives his 150th warning
to keep his hands on the wheel.
He complies, but it's not enough
to get him to respond to the stopped vehicles.
Seconds later, other police cars
are visibly blocking the lane,
but the logs show no sign that the Tesla sees them.
Experts in autonomous vehicle safety
who reviewed the crash footage
say there's a difference
between the way the car's camera sees an ordinary vehicle
and an emergency vehicle:
the police car's flashing lights created a hazy image
that the car's software likely did not recognize
as a stopped vehicle.
The logs indicate
that the car finally recognized something in its path
just 2.5 seconds and 37 yards before the crash.
Autopilot attempts to slow the car,
and ultimately, disengages,
but at 54 miles per hour, it is too late.
The five officers injured in the crash
are suing Tesla,
claiming that the Autopilot feature
was responsible for the accident.
An attorney for the officers declined to comment.
Tesla denies the lawsuit's allegations
and claims the fault lies with the driver.
The driver did not respond to attempts to contact him.
In 2021, after repeated recommendations
from federal investigators,
Tesla began using internal cameras
to monitor driver attentiveness,
but safety experts continue to find flaws in its design,
and drivers still find ways to fool the system.
The same year,
Tesla also issued a software update
that was designed to improve Autopilot's ability
to detect emergency vehicles.
At least one of the crashes NHTSA is investigating,
however, occurred after these updates.
The government has now expanded its investigation
beyond crashes with emergency vehicles
and is investigating the overall effectiveness
of Tesla's Autopilot System.
The agency could ask the company to recall the cars
if it finds the technology flawed.