Video of Tesla crash shows exactly why Autopilot isn't true self-driving


Listen up, Tesla fans.

Your favorite carmaker's Autopilot is a super-smart marriage of high-tech hardware and software that provides one of the best semi-autonomous driving experiences currently on the road.

That said: It is not a fully self-driving system and shouldn't be used as such.

A video posted to the Tesla Motors subreddit and spotted by Electrek shows exactly what can happen when a Tesla Model S driver puts too much trust in his car's Autopilot. It's a reminder that a human driver is always needed in a Tesla, even when the Autopilot system is engaged.

SEE ALSO: The Tesla Model 3 gets real in July

The accident first came to light when a redditor posted pictures of his thrashed Model S. He claimed his first-generation Autopilot never alerted him to take manual control of the car -- the system is supposed to cue drivers to take over when the car can't handle road conditions -- before it "misread the road" and sideswiped the highway barrier. He walked away from the crash with some minor bruises.

A few days later, the video of the alleged incident was posted to the subreddit. The footage, taken by the dashcam of a car directly behind the Model S, shows a poorly-marked construction zone playing havoc with the Autopilot system, running the car into the lane barrier. You can watch for yourself here.


Other redditors were quick to point out the original poster never mentioned a construction zone when he shared the pics, let alone the shoddy road markings that made the particular stretch of highway tricky for even attentive human drivers. Most commenters agreed on one thing: the driver wasn't paying enough attention to the road -- and was therefore using Autopilot the wrong way.

They're right. As cool as it is that you can flip on Autopilot on the highway and let the car handle most of the work, it's not safe to give it free rein, or even take your hands off the wheel.

Tesla's first-generation Autopilot system has Level 2 autonomy, which means the car can handle some of the driving responsibilities on its own, mostly braking and keeping the car in its lane. But a human still needs to be in the driver's seat keeping an eye on the road, ready to take the wheel when the machine can't handle the conditions.

Some groups, like the German government, have argued that "Autopilot" is a misnomer at best, and potentially fatal false advertising at worst. The system came under fire after the first self-driving fatality last year, when a distracted driver's Model S drove underneath a white tractor trailer. A National Highway Traffic Safety Administration (NHTSA) investigation, however, cleared Tesla of any liability, deeming Autopilot safe if used correctly.

Tesla has bigger plans for the system with Enhanced Autopilot. Updated sensor hardware now available in new models, along with incremental software updates that will be released throughout this year, will purportedly bring the system to Level 5 autonomy. That means full-on self-driving cars that can handle any and all road conditions just as well as a human driver can -- potentially better.

For now, though, full Enhanced Autopilot and Level 5 autonomy aren't here -- so until they are, remember that your "self-driving" car still needs your full attention.



UPDATE: July 22, 2021, 10:29 p.m. EDT This story has been updated to remove a video embedded from Vidme.
