Airline pilots faced with hacked or spoofed safety systems tend to ignore them – but doing so could cost their airlines large sums of money, an infosec study has found.
An Oxford University research team put 30 Airbus A320-rated pilots in front of a desktop flight simulator before manipulating three safety systems: the Instrument Landing System (ILS), the Ground Proximity Warning System (GPWS) and the Traffic Collision Avoidance System (TCAS).
The team, who presented their paper at the NDSS infosec symposium, found that while their attacks against these systems "created significant control impact and disruption through missed approaches", all pilots in the study were able to cope and land their simulated aircraft safely.
Pilots in the study were exposed to false warnings from each of the systems to see what their reactions were. Most of them carried out missed approaches at first and tended to ignore or distrust the "hacked" system while going around to carry out a safe landing.
A go-around is expensive, with airlines racking up bills for extra landings, fuel and delay penalties.
Commenting on their findings, the researchers said in their paper: "Pilots are extensively trained to deal with the many faults which can emerge when flying an aircraft, and this was reflected in the results. However, the attacks generated situations which shared some features with faults but largely were different; they lacked indication of failure."
Whilst alarms force action, they are quickly turned off or ignored if considered spurious.
Lead researcher Matt Smith, explaining the reasoning behind the study, told The Register: "We know these attacks exist but we don't know what would happen if they occurred," adding that there is existing research demonstrating attacks against aeroplanes but little analysing their potential effects in this way.
Terrain ahead. Pull up!
Each of the 30 pilots in the study was put in front of a desktop simulation of an Airbus A330, which Smith explained was because there weren't any good enough representations of the A320 available for the X-Plane simulator used in the experiments. After a familiarisation flight, helped by the fact the A330 is very similar to its short-haul sister aircraft, the experiments began with three simulated flights onto runway 33 at the UK's Birmingham Airport.
For the GPWS phase, Smith's team simulated a false aural alarm, where the system plays the message "Terrain, pull up!" over the cockpit loudspeakers. Pilots are trained to react to the warning so they don't fly into the ground.
On the first approach, two-thirds of pilots went around; on the second approach, just over half of those who had gone around disabled GPWS before landing successfully. Those who went around largely did so between 20 and 30 seconds after the false alarm.
Traffic, traffic! Climb now!
Next was the TCAS attack. TCAS works by sensing the location of nearby aircraft fitted with TCAS gear and ordering pilots to climb or descend if algorithms calculate that the two aeroplanes will come too close for safety. Critically, TCAS can cause pilots to ignore air traffic control (ATC) instructions: pilots can bust an ATC-imposed altitude restriction (for example, "maintain 3,000ft") if their TCAS equipment orders them to do so.
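The certified TCAS II logic is far more involved than this, but the core idea behind it – issue an advisory when the projected time to closest approach (often called "tau") drops below a threshold – can be sketched roughly. The function names and threshold values below are illustrative assumptions, not the real TCAS algorithm or its actual sensitivity levels:

```python
# Rough sketch of TCAS-style conflict detection using the "tau" concept:
# the time until two converging aircraft reach their closest point of
# approach. Thresholds are illustrative, not the certified TCAS II values.

def tau_seconds(range_m: float, closure_rate_mps: float) -> float:
    """Time to closest approach if the two aircraft keep converging."""
    if closure_rate_mps <= 0:          # diverging or parallel: no threat
        return float("inf")
    return range_m / closure_rate_mps

def advisory(range_m: float, closure_rate_mps: float) -> str:
    """Classify a nearby aircraft as clear, TA (warning) or RA (order)."""
    t = tau_seconds(range_m, closure_rate_mps)
    if t < 25:          # illustrative RA threshold
        return "RA"     # resolution advisory: "climb now" / "descend now"
    if t < 40:          # illustrative TA threshold
        return "TA"     # traffic advisory: audio/visual warning only
    return "clear"

# 9 km apart, closing at 300 m/s -> 30 s to closest approach -> TA
print(advisory(9000, 300))
```

A spoofing attack of the kind in the study amounts to feeding false `range` and `closure_rate` inputs into logic like this, so the system issues an RA against traffic that does not exist.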
On the A320, TCAS has three pilot-selected modes: TA/RA, in which it gives a visual and audio warning before telling the pilot to "climb now" or "descend now"; TA only, which gives the audio warning without the RA (Resolution Advisory) part, meaning the system does not order pilots to climb or descend; and standby (off). Due to limitations of the simulator, Smith's team were not able to simulate the visual warning on the airliner's cockpit screens.
By triggering a false TCAS RA, the researchers looked to see what the pilots would do, with the experiment including a "descend" RA shortly after takeoff among other activations, which Smith said was not unheard of in some crowded airspace such as the departure routes from Heathrow.
All but one of the pilots obeyed the false TCAS orders at first. On average, pilots "complied with over four RAs before reducing sensitivity", something Smith's team said "shows that there is no straightforward response."
Most of the pilots switched from TA/RA to TA only after false activations, with some turning it off altogether over worries about the "additional workload" and distraction caused by false alarms. Two also diverted their flights back to the origin airport.
Glideslope. Pull up!
For the ILS scenario, Smith's research team moved the position of the glideslope, the radio beam that guides aeroplanes down to safe landings. An ILS system consists of a glideslope, an angled beam that controls how far along the runway the aircraft touches down, and a localiser, which tells it where the middle of the runway is. All experiments were carried out in simulated good weather so pilots could use other visual references to double-check the ILS.
Four of the 30 pilots in the study chose to continue with their landing anyway despite the simulated glideslope having been moved to a point several thousand metres down the runway. A landing too far along the runway would risk the airliner running off the far end into the grass, potentially causing the runway to be closed.
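The effect of displacing the glideslope comes down to simple trigonometry. Assuming a standard 3° glideslope angle and round-number distances (both assumptions for illustration – the paper's exact parameters aren't reproduced here), moving the beam's origin down the runway lifts the whole approach path and shifts the touchdown point by the same distance:

```python
# Sketch of how a displaced glideslope origin changes the approach.
# The 3-degree angle and the 300 m / 2,000 m distances are illustrative
# assumptions, not figures from the study.
import math

GLIDESLOPE_DEG = 3.0  # standard ILS glideslope angle (assumed)

def height_on_slope_m(dist_before_touchdown_m: float) -> float:
    """Aircraft height on a 3-degree glideslope at a given distance
    before the touchdown point."""
    return dist_before_touchdown_m * math.tan(math.radians(GLIDESLOPE_DEG))

# Normal aim point roughly 300 m past the threshold; a spoofed beam
# moved 2,000 m further along puts the aircraft much higher as it
# crosses the threshold - and the touchdown 2,000 m further down the runway.
normal_height = height_on_slope_m(300)          # ~16 m over the threshold
spoofed_height = height_on_slope_m(300 + 2000)  # ~120 m over the threshold
print(round(normal_height, 1), round(spoofed_height, 1))
```

Crossing the threshold that high with that little runway left ahead is exactly the cue most pilots used to reject the approach.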
Of the rest, 30 per cent fell back on the aeroplane's internal GPS to carry out an area navigation (RNAV) approach, using onboard systems to calculate a glideslope and localiser path without needing the external radio beams. A fifth of the pilots went for a visual approach – landing by looking out of the window and flying accordingly – while a quarter used the localiser beam but judged the touchdown point visually. Two pilots asked for a Surveillance Radar Approach, where ATC does all the hard work of lining the aeroplane up with the runway by watching the radar screen and giving the pilot course corrections. This depends solely on the airport's own radar and radio equipment being available.
The pilots in the study ranged from captains with more than two decades' flying experience to newly qualified first officers with two or fewer years in their logbooks, giving a reasonably wide cross-section of aviation experience.
Smith mused to El Reg: "If industry engaged with penetration testing on these systems and tried to fully map out what the attacks might be, what they presented to the pilots as, they should at least be able to give a list of situations that might come about as a result of an attack." He added that this could be used to develop situation-specific checklists, much as pilots already have standardised checklist responses for instrument failures.
The full study is on the NDSS website for free download. ®