Just when we thought they were safe, self-driving cars had to show the world they can be duped by road signs... again. It's a good thing self-driving cars are still in the experimental, soft-launch phase, because as amusing as this story is, an issue like this could cost lives.

Self-driving cars rely on a camera array (among many other tools) to decide what constitutes safe driving in a given scenario. The cameras check for people and animals, but also for buildings, puddles, and, of course, road signs. Speed limit information can come from GPS-based map data, much the way Google Maps displays speed limits in its app, but self-driving cars can also get this info by "reading" road signs.
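To make the stakes concrete, here is a minimal, purely hypothetical sketch of how a car might sanity-check a camera-read speed limit against map data before acting on it. The function name, threshold, and logic are illustrative assumptions, not Tesla's or Mobileye's actual code.

```python
from typing import Optional

# Hypothetical illustration only: not any manufacturer's actual logic.
# Shows why cross-checking a camera-read speed limit against map data matters.

def resolve_speed_limit(camera_mph: Optional[int], map_mph: Optional[int],
                        max_plausible_jump: int = 20) -> Optional[int]:
    """Pick a speed limit to trust, given a camera reading and map data."""
    if camera_mph is None:
        return map_mph                  # no sign read; trust the map
    if map_mph is None:
        return camera_mph               # no map data; trust the sign
    if abs(camera_mph - map_mph) > max_plausible_jump:
        return map_mph                  # implausible jump; distrust the sign
    return camera_mph                   # readings roughly agree

# A taped-over "35" misread as "85" disagrees with the map by 50 MPH,
# so this toy check would keep the limit at 35 instead of accelerating.
print(resolve_speed_limit(camera_mph=85, map_mph=35))  # prints 35
```

In a toy check like this, the camera reading only wins when it roughly agrees with what the map already expects, which is exactly the kind of redundancy a single fooled camera lacks.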

Related: Goldeneye 007 Mod Puts Tesla's Cybertruck Where It Really Belongs

This particular experiment involved McAfee (yes, of antivirus fame) tampering with a speed limit sign. The goal was to find out whether the Mobileye EyeQ3 camera installed on a 2016 Tesla Model S could correctly read a 35 MPH speed limit sign that had been subtly altered with black tape so it appeared to read 85 MPH. The modification was so slight that it's almost indiscernible without prior knowledge, and the EyeQ3 fell for it. With traffic-aware cruise control engaged (part of Tesla's Autopilot driver-assist suite), the Tesla accelerated well past 35 MPH until the driver manually stopped the car at around 50 MPH.

Self-Driving Cars Are Not THAT Bad

While it made for an amusing video and gave every cynic an "I told you so" moment, self-driving cars are not as unreliable as the footage might make them seem. The EyeQ3 camera McAfee tested is not in any of Tesla's newer vehicles; the company now uses a camera system it developed in-house. As for Mobileye, it has updated the EyeQ3's software and has since released upgraded models of the device as well. Its tech powers the AI-controlled braking systems in several types of self-driving vehicles, and the consensus is that they perform adequately.

It's understandable that people would be alarmed to see such expensive technology defeated by something as rudimentary as a piece of tape, but this isn't the first time a vehicle has been confused by signs. In 2017, researchers at several universities were able to trick a self-driving car's vision system into ignoring stop signs. They did so by arranging small black-and-white stickers in specific places on the sign. The configuration was based on knowledge of how the vehicle's algorithm detected stop signs, so it's not something a random person could have pulled off, but it still makes it easy to cast doubt on our self-driving future. Fortunately, these sorts of tests have pushed manufacturers to give self-driving vehicles far more robust traffic detection systems that go beyond what a camera lens alone can see. That's good news, because it'll take a lot to get people out of their nightmares and into these cars.
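For readers curious how a sticker attack can work in principle, below is a toy sketch of the general idea: a change confined to a small "sticker" region that drags a detector's confidence down. The linear "detector", the image, and the sticker mask are all invented for illustration; the researchers' real attack targeted neural networks on physical signs and was far more sophisticated.

```python
import numpy as np

# Toy illustration of a "sticker" attack on a linear stop-sign detector.
# Everything here (the model, the image, the mask) is made up for demonstration.

rng = np.random.default_rng(0)
image = rng.random(64)          # flattened 8x8 "photo" of a sign
w = rng.random(64) - 0.3        # detector weights: score = w @ image

mask = np.zeros(64)
mask[:8] = 1.0                  # only these pixels may change (the "sticker")

# For a linear model, the gradient of the score with respect to the image is
# just w, so nudging the masked pixels against the sign of w lowers the score.
epsilon = 0.9
stickered = np.clip(image - epsilon * mask * np.sign(w), 0.0, 1.0)

print(f"clean score:     {w @ image:.2f}")
print(f"stickered score: {w @ stickered:.2f}")
# If the perturbed score falls below whatever threshold the detector uses,
# the sign goes unrecognized even though most of the image is untouched.
```

The point of the sketch is that the attacker only needs to know which pixels the detector cares about; most of the sign can be left alone, which is why the real stickers looked so innocuous.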

Next: Elon Musk Unveils Tesla Cybertruck And It Looks Like It's From Cyberpunk 2077

Source: McAfee / YouTube