Should Tesla be blamed when a driver doesn’t pay attention?

That’s a question that hangs over this piece from the Associated Press:

A design flaw in Tesla’s Autopilot semi-autonomous driving system and driver inattention combined to cause a Model S electric car to slam into a firetruck parked along a California freeway, a government investigation has found.

The National Transportation Safety Board determined that the driver was overly reliant on the system and that Autopilot’s design let him disengage from driving.

Elon Musk’s view on this is pretty clear: the feature is an assistance function, not a self-driving one (though he does promise it’ll be ‘full’ self-driving next year). I’ve used it, and he’s right — it only runs in limited circumstances, and if you keep your hands off the wheel, it’ll bleep at you and eventually disengage.

But here’s where the question of Tesla’s responsibility gets murkier. First, the firm insists on the misleading name Autopilot, and second, there is an argument, one I’ve heard from Google self-driving engineers in particular, that humans simply aren’t capable of snapping their brains back into action when they suddenly need to take over. We’re just not built that way.

Tesla will point to many instances where drivers credit Autopilot, or just Tesla’s safety functions in general, with saving their lives.

From the AP, here’s what the company says about the accident near Culver City:

“Since this incident occurred, we have made updates to our system including adjusting the time intervals between hands-on warnings and the conditions under which they’re activated,” the statement said, without giving further details.