Experts Trick Tesla Autopilot Using a $300 Projector
Six months ago, Ben Nassi, a graduate student at Ben-Gurion University working under the supervision of Professor Yuval Elovici, carried out a series of successful spoofing attacks against the Mobileye 630 PRO driver-assistance system using budget drones and projectors. Since then, the researcher has refined the attack and recently managed to fool the Autopilot of a Tesla Model X, as he revealed last week at the Cybertech Israel conference.
The essence of the attack, described in the report Phantom of the ADAS: Phantom Attacks on Driver-Assistance Systems (PDF), is that a modern car with an autopilot or ADAS (Advanced Driver Assistance Systems) can be deceived into performing unwanted actions, such as braking or changing lanes, simply by projecting 2D images onto the road or surrounding objects, or by embedding certain “triggers” into video billboards.
The researcher’s report focuses on two technologies: the Mobileye 630 PRO (used in Honda, Mazda, and Renault vehicles) and the Autopilot of the Tesla Model X with HW 2.5 hardware.
How the Attack Works
These attacks are based on the fact that humans and AI recognize images differently. The images Nassi and his colleagues used to fool the Tesla would not confuse a human driver at all. In fact, some attacks were specifically designed to mislead the car while remaining almost invisible to people. The car only needs to “see” the fake image for a few milliseconds for the attack to work.
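To get a feel for why a split-second exposure is enough, consider the camera's frame rate: the phantom only has to overlap one frame's exposure window to end up in the perception pipeline. Below is a back-of-the-envelope sketch; the frame-rate and exposure values are hypothetical, since the article does not give the cameras' actual specs.

```python
# Rough timing bounds for a projected "phantom" to be captured by a
# camera-based ADAS. The fps and exposure values are hypothetical.

def phantom_duration_bounds_ms(fps: float, exposure_ms: float) -> tuple[float, float]:
    """Best case: the phantom coincides exactly with one frame's exposure,
    so it needs to last only as long as the exposure itself.
    Worst case: it appears just after an exposure begins, so it must
    persist through the rest of that frame interval plus one full
    exposure to be fully captured in the next frame."""
    frame_interval_ms = 1000.0 / fps
    return exposure_ms, frame_interval_ms + exposure_ms

# At 30 fps with a 5 ms exposure, a well-timed phantom needs only ~5 ms
# on screen, and ~38 ms guarantees capture regardless of timing -- far
# too brief for a human driver to perceive as anything but a flicker.
print(phantom_duration_bounds_ms(fps=30, exposure_ms=5))  # (5.0, 38.33...)
```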
Last year, experts from Tencent Keen Security Lab demonstrated a similar trick on Tesla's Autopilot by placing barely noticeable stickers on the road surface, which the car mistook for lane markings.
As mentioned above, the Ben-Gurion University team focused their research on 2D projections. For the experiments, they used inexpensive projectors costing about $300, available at any retail store. A projector can be carried by a person or mounted on a drone (this time, however, the researchers did not use drones because of drone regulations in the country where the tests were conducted).
Nassi identified several cases in which a vehicle's systems perceive 2D projections as real objects. For example, two-dimensional images of people and cars projected onto the road confuse the autopilot, causing it to slow down or stop the car entirely.
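To see why a flat image is so effective, it helps to sketch a naive perception-to-planning loop (purely illustrative, not Tesla's or Mobileye's actual logic): once the object detector emits a confident "person" detection, nothing downstream asks whether the pixels belong to a three-dimensional body.

```python
# Illustrative only: a naive detection-to-braking decision with no depth,
# texture, or context validation -- the gap that projection attacks exploit.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "person", "car"
    confidence: float  # detector score in [0, 1]
    in_ego_lane: bool  # whether the box overlaps the planned path

BRAKE_THRESHOLD = 0.7  # hypothetical confidence cutoff

def should_brake(detections: list[Detection]) -> bool:
    """A projected 2D pedestrian yields exactly the same Detection
    record as a real one, so it triggers the same braking decision."""
    return any(
        d.label in {"person", "car"}
        and d.in_ego_lane
        and d.confidence >= BRAKE_THRESHOLD
        for d in detections
    )

# A phantom projection scoring 0.9 as "person" brakes the car:
print(should_brake([Detection("person", 0.9, True)]))  # True
```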
Manipulating Lane Markings and Road Signs
The researchers also partially replicated the Tencent Keen Security Lab experiment by projecting new lane markings onto the road, which made the car think it needed to turn or change lanes. Such an attack could be dangerous, as it might cause a car to suddenly move into an adjacent or even oncoming lane.
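The reason this works is that camera-based lane detectors key on bright, high-contrast line segments, and a crisply projected line produces the same edges as painted paint. A minimal sketch of a textbook Hough-transform lane detector illustrates the point (this is the classic tutorial approach, not the vendors' actual pipeline):

```python
# A classic edge-plus-Hough lane detector. It has no notion of what a
# line is made of, so a projected line registers like a painted one.
import cv2
import numpy as np

def detect_lane_segments(bgr_frame: np.ndarray):
    gray = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)  # keeps high-contrast edges only
    # Probabilistic Hough transform returns segments as (x1, y1, x2, y2)
    return cv2.HoughLinesP(edges, 1, np.pi / 180, 50,
                           minLineLength=100, maxLineGap=10)
```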
But it’s not just about projecting 2D objects onto the road. Projections of road signs onto almost any surface—including walls and even trees—work just as well. In fact, autopilot systems cannot distinguish these fakes from real road signs.
Worse yet, projections aren't always necessary. The autopilot can also be fooled by embedding almost invisible 2D objects into video ads on roadside billboards. The researchers embedded an image of a road sign into a video playing on a billboard, causing the car to believe the speed limit was much higher than it actually was.
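Conceptually, the billboard trick amounts to splicing a sign into a handful of frames of an ad clip. A rough sketch of the idea using OpenCV follows; the file names, frame window, placement, and opacity are all hypothetical, and this is not the researchers' actual tooling:

```python
# Splice a phantom road sign into a few frames of a billboard video.
# All file names and parameters below are illustrative assumptions.
import cv2

sign = cv2.resize(cv2.imread("speed_limit_sign.png"), (100, 100))
cap = cv2.VideoCapture("billboard_ad.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)
size = (int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)),
        int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)))
out = cv2.VideoWriter("ad_with_phantom.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"), fps, size)

PHANTOM_FRAMES = range(120, 124)  # ~4 frames: a split second at 30 fps

i = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if i in PHANTOM_FRAMES:
        # Blend the sign into a corner at partial opacity: easy for a
        # detector trained on signs to pick up, easy for a human to miss.
        roi = frame[20:120, 20:120]
        frame[20:120, 20:120] = cv2.addWeighted(roi, 0.6, sign, 0.4, 0)
    out.write(frame)
    i += 1

cap.release()
out.release()
```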
Challenges and Risks
Although the experts have already informed engineers at Mobileye and Tesla about their findings, it’s unlikely that anything will change soon. The problem is that it’s still very difficult to teach software to distinguish between real road signs and convincing projections on, say, tree leaves. The same goes for telling real lane markings apart from fake projected lines. However, it may be possible to teach cars to distinguish projected people and vehicles from real ones (similar to how facial recognition systems tell photos apart from real faces).
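As a rough illustration of the kind of consistency check that could help with projected people and vehicles (a sketch under simplified pinhole-camera assumptions with hypothetical constants, not anything Mobileye or Tesla is known to ship): a real pedestrian's apparent height must agree with their estimated distance, while a flat image stretched across asphalt or a wall often will not.

```python
# Sanity-check a "pedestrian" detection against pinhole-camera geometry:
#   expected_pixel_height ~= focal_length_px * real_height_m / distance_m
# All constants here are illustrative assumptions.

REAL_PERSON_HEIGHT_M = 1.7   # assumed average pedestrian height
FOCAL_LENGTH_PX = 1000.0     # hypothetical camera focal length
TOLERANCE = 0.3              # accept up to 30% deviation

def physically_consistent(pixel_height: float, distance_m: float) -> bool:
    expected = FOCAL_LENGTH_PX * REAL_PERSON_HEIGHT_M / distance_m
    return abs(pixel_height - expected) / expected <= TOLERANCE

# With this camera, a pedestrian 20 m ahead should appear ~85 px tall.
print(physically_consistent(pixel_height=85, distance_m=20))   # True
# A projection smeared across the road may report a wildly different size:
print(physically_consistent(pixel_height=300, distance_m=20))  # False
```

The distance estimate would have to come from somewhere, such as radar, stereo vision, or monocular depth estimation, and that is precisely where a flat projection tends to betray itself.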
Of course, most automakers have repeatedly emphasized that ADAS and autopilot systems should only be used under human supervision, meaning the driver must always monitor the road and be ready to take control or hit the brakes. However, as we know, many people ignore this rule. Nassi and his colleagues warn that attacks using 2D projections do not require significant financial investment or technical expertise. In fact, any teenager with a cheap projector (and, if desired, a drone) could “prank” drivers on the nearest highway this way.