GhostStripe Attack Tricks Car Autopilots into Ignoring Road Signs
A team of researchers has developed an attack that disrupts self-driving cars by exploiting their camera-based computer vision systems, causing the vehicles to stop recognizing road signs correctly.
The attack, called GhostStripe, will be presented in detail at the ACM International Conference on Mobile Systems next month. For now, the researchers have published a report stating that GhostStripe is invisible to the human eye but can be extremely dangerous for drivers of vehicles such as Tesla and Baidu Apollo, because it interferes with the CMOS camera sensors these cars rely on.
How the GhostStripe Attack Works
The core of the attack is to illuminate road signs with LEDs in carefully timed light patterns so that the car’s software can no longer recognize them. The attack targets the rolling electronic shutter of CMOS camera sensors: the LEDs rapidly flash different colors onto the sign, so that, for example, the red of a “Stop” sign looks different on each scan line.
As a result, the camera captures an image whose scan lines do not match one another. That image is cropped and passed to the car’s classifier, typically a neural network. Because the image is full of mismatched lines, the classifier fails to recognize it as a road sign, and the car does not react to it.
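To make the mechanism concrete, here is a minimal sketch (not the researchers’ code) of how a rolling shutter turns a fast-flickering LED into color stripes; the image size, per-line readout time, and LED period are assumed values.

```python
import numpy as np

ROWS, COLS = 100, 100
ROW_READOUT_US = 30          # assumed time to read out one scan line
LED_PERIOD_US = 200          # assumed LED color-switch period (much shorter than a frame)

# The LED cycles through tints that distort the sign's red background
led_tints = np.array([[1.0, 0.2, 0.2],   # reddish
                      [0.2, 1.0, 0.2],   # greenish
                      [0.2, 0.2, 1.0]])  # bluish

sign = np.ones((ROWS, COLS, 3)) * [0.8, 0.1, 0.1]   # a plain red "sign"

captured = np.empty_like(sign)
for row in range(ROWS):
    t_us = row * ROW_READOUT_US                          # when this scan line is exposed
    tint = led_tints[(t_us // LED_PERIOD_US) % len(led_tints)]
    captured[row] = sign[row] * tint                     # each line sees a different tint

# 'captured' now contains horizontal color stripes: consecutive scan lines no longer
# agree on the sign's color, which is what confuses the downstream classifier.
print(np.round(captured[::40, 0], 2))                    # sample a few rows' first pixel
```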
“For a stable attack, it’s necessary to carefully control the LED flickering based on information about the victim camera’s operation, estimating the position and size of the road sign in the camera’s field of view in real time,” the experts write.
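A rough back-of-the-envelope calculation illustrates what that real-time control entails. The frame rate, resolution, and sign position below are assumptions rather than figures from the report; the point is that the sign occupies only a narrow slice of the frame’s readout time, and the LED switching has to land inside it.

```python
# Illustrative timing calculation only (assumed numbers, not from the paper)
FRAME_ROWS = 1080
FRAME_TIME_MS = 33.3                       # ~30 fps rolling-shutter readout
ROW_TIME_MS = FRAME_TIME_MS / FRAME_ROWS

sign_top_row, sign_bottom_row = 400, 520   # estimated from tracking the vehicle

window_start_ms = sign_top_row * ROW_TIME_MS
window_end_ms = sign_bottom_row * ROW_TIME_MS
rows_covered = sign_bottom_row - sign_top_row

print(f"Sign spans {rows_covered} scan lines, read out between "
      f"{window_start_ms:.1f} ms and {window_end_ms:.1f} ms after the frame starts; "
      f"the LED must switch colors several times within this "
      f"{window_end_ms - window_start_ms:.1f} ms window.")
```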
Two Versions of the Attack
The team developed two versions of their attack:
- GhostStripe1: This version does not require access to the target vehicle. It uses a monitoring system to track the car’s location in real time and dynamically adjusts the LED flickering so the sign cannot be read properly (a rough control-loop sketch appears below).
- GhostStripe2: This targeted version requires access to a specific vehicle, which could, for example, be gained covertly during maintenance. It involves installing a special converter on the camera’s power wire to detect when each frame is captured and thus control the timing more precisely.
“This way, the attack targets a specific victim vehicle and allows for better control over the road sign recognition results,” the researchers note.
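The following is only a rough sketch of the kind of control loop GhostStripe1 would need, reconstructed from the description above; every function in it is hypothetical.

```python
import time

def estimate_vehicle_pose():          # hypothetical: external tracking of the victim car
    ...

def predict_sign_in_frame(pose):      # hypothetical: project the sign into the camera view
    ...

def update_led_pattern(sign_region):  # hypothetical: re-time the LED flicker for that region
    ...

# In a real system this would run continuously; bounded here for illustration.
for _ in range(100):
    pose = estimate_vehicle_pose()
    sign_region = predict_sign_in_frame(pose)   # where and how large the sign appears
    update_led_pattern(sign_region)             # keep the stripes aligned with the sign
    time.sleep(0.01)                            # re-estimate ~100 times per second
```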
Testing and Effectiveness
The team tested their system on a real road, using a car equipped with a Leopard Imaging AR023ZWDR camera, the camera used in the reference hardware for Baidu Apollo. They tested GhostStripe against “Stop,” “Yield,” and “Speed Limit” signs. GhostStripe1 succeeded in 94% of cases, and GhostStripe2 in 97%.
The report also notes that bright ambient light reduces the effectiveness of the attack. “This degradation occurs because the attacking LED light is suppressed by the surrounding illumination,” the researchers say. This means attackers would need to consider the time and place when planning an attack.
Possible Defenses
According to the researchers, the simplest defense against such attacks is to replace the rolling-shutter CMOS camera with a global-shutter sensor that captures the entire image at once, or to randomize the order in which lines are scanned. Increasing the number of cameras can also reduce the attack’s success rate. Finally, GhostStripe-like patterns can be included in the AI training data so that the system learns to recognize signs despite the striping.
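That last idea amounts to data augmentation. Below is a minimal, hedged sketch of what it could look like; the striping function mirrors the rolling-shutter simulation above, and the dataset, stripe height, and tint range are assumptions rather than anything specified by the researchers.

```python
import numpy as np

def add_stripe_distortion(image, stripe_height=8, rng=None):
    """Tint horizontal bands of `image` (H x W x 3, floats in [0, 1]) with random colors."""
    rng = np.random.default_rng() if rng is None else rng
    out = image.copy()
    for top in range(0, image.shape[0], stripe_height):
        tint = rng.uniform(0.3, 1.0, size=3)          # random per-band color cast
        out[top:top + stripe_height] *= tint
    return np.clip(out, 0.0, 1.0)

# Usage: distort a random subset of training images before each epoch, e.g.
# augmented = [add_stripe_distortion(img) if np.random.rand() < 0.5 else img
#              for img in train_images]
```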