Cyberlaw experts: Take back control. No, we're not talking about Brexit. It's Automated Lane Keeping Systems

They're not self-driving cars, did you know that?

Comment The UK government said in April that "the first types of self-driving cars could be on UK roads this year", but this is not entirely accurate.

Firstly, the announcement refers not to self-driving vehicles but to vehicles fitted with automated lane-keeping systems (ALKS); secondly, we already have similar technology driving on our roads. For example, Tesla's Autopilot and Nissan's ProPilot can drive in a single lane, but under the current law drivers must keep their hands on the wheel and their eyes on the road.

The government's announcement seemingly signals an intention to allow drivers to take their eyes off the road, with the driving assistance system responsible while it is engaged. The change is proposed to be restricted to motorways and to speeds of up to 37mph (about 60km/h). No doubt manufacturers will be releasing new versions of vehicles fitted with ALKS in UK showrooms before the end of the year.

Are we ready for this?

This announcement would suggest that industry and regulators have solved all the issues relating to how drivers interact with advanced driving assistance systems. The truth is we still have much to learn, prompting Thatcham Research and the Association of British Insurers to urge the government to revise its current plans. Their reluctance to support the move is because ALKS falls far short of true "self-driving" and requires intense monitoring by the driver, who must be ready to take back control at a moment's notice.

It's important to note that drivers will remain responsible if they fail to take back control of the vehicle in a timely manner, or if they misuse the ALKS. This type of system is "conditional" automation: it depends upon a driver who monitors the automated system, understands the limits of the technology, and understands their legal responsibilities.

Problems will arise if drivers have unrealistic expectations about the system's capability and their own responsibilities, or are tempted to push the technology to its limits.

Cars fitted with driver assistance systems like ALKS are marketed as a luxury, a feature which "takes the burden out of driving."

In the US, some states have passed laws allowing vehicles with self-driving features. There have been many tragic examples where drivers have misunderstood or wilfully misused the capabilities of the driving assistance systems in their vehicles, resulting in accidents and fatalities.

Training and sleepwalking into Ts&Cs

The introduction of ALKS, where the driver is able to divert their attention from the road, represents a fundamental shift in driver behaviour and responsibility. How drivers will learn these new responsibilities is not self-evident. How can we all be satisfied that drivers of vehicles fitted with ALKS know how and when to take back control of their vehicle?

The UK government is not currently considering mandatory training for operating ALKS, nor will the operation of self-driving vehicles require a special licence.

Instructions and information about such responsibilities will be delivered to the driver via an in-vehicle Human-Machine Interface (HMI).

However, we have no way of knowing whether drivers will pay attention to HMI-delivered instructions and information. Nor do we know whether they will appreciate that, in handing over the driving task to the vehicle, they take on what is potentially a far more substantial responsibility: to react immediately when called upon to take back control, in circumstances where they have allowed themselves to become distracted.

Many studies in the development of self-driving cars show that human brains struggle to take over the driving task at short notice. We also know that, when confronted with instructions and information via a digital interface, users of technology tend to skim over what is being communicated.

On a daily basis we receive a barrage of digital information and legal terms via social media, mobile devices, and web services. We are predisposed to select "I accept", assuming that whatever was communicated could not be detrimental or else it would not be legal. In that assumption we are often incorrect: legal terms presented to the user may well operate to their detriment.

Rights, responsibilities and liabilities

In the case of vehicles which allow a shared system of responsibility between vehicle and driver, such terms may place the driver at risk of criminal sanctions or civil liabilities (such as negligence), or may involve granting access to the driver's personal data.

More importantly, however, the driver must understand the limits of the vehicle they are driving. With all the hype surrounding "self-driving" vehicles and their capabilities, it is not difficult to imagine a scenario in which a driver misunderstands the limits of the technology. It is this type of misunderstanding that could tragically lead to the deaths of drivers who ignore warnings and instructions from vehicle systems with self-driving features.

Further, extensive research is required to determine how well drivers absorb information delivered by HMI. It is likely, however, that drivers will require additional education and training to use conditional automation safely.

There is evidence that simulator and virtual reality systems may play a useful part in preparing drivers to operate automated driving systems, though questions remain as to whether such low-fidelity training environments are as effective as experience in an automated vehicle in a real-world environment.

Transport minister Rachel Maclean said that ALKS may improve road safety by reducing "human error", but we have work to do to ensure drivers understand their personal responsibility when operating a vehicle fitted with such advanced driving assistance systems, and that their trust in the system is appropriately calibrated to its capability.

There is not enough evidence that drivers understand the limitations of ALKS, and it is drivers engaging with the system in a manner for which it was not intended that is likely to lead to problems. ®

Dr Jo-Ann Pattinson is a postdoctoral research fellow at the University of Leeds, where she is researching legal issues associated with autonomous vehicles. Dr Subhajit Basu FRSA is an associate professor in Information Technology Law at Leeds.
