Cyberlaw experts: Take back control. No, we're not talking about Brexit. It's Automated Lane Keeping Systems

They're not self-driving cars, did you know that?

Comment The UK government said in April that "the first types of self-driving cars could be on UK roads this year", but this is not entirely accurate.

Firstly, the announcement refers not to self-driving vehicles but to vehicles fitted with automated lane-keeping systems (ALKS); secondly, we already have similar technology driving on our roads. For example, Tesla's Autopilot and Nissan's ProPilot can drive in a single lane; however, under the current law, drivers must keep their hands on the wheel and their eyes on the road.

The government's announcement seemingly signals an intention to allow drivers to take their eyes off the road, with the driving assistance system bearing responsibility while it is engaged. The change would be restricted to motorways and to speeds of up to 37mph (roughly 60km/h). No doubt manufacturers will be releasing new versions of vehicles fitted with ALKS in UK showrooms before the end of the year.

Are we ready for this?

This announcement would suggest that industry and regulators have solved all the issues relating to how drivers interact with advanced driving assistance systems. In truth, we still have much to learn, which has prompted Thatcham Research and the Association of British Insurers to urge the government to revise its current plans. Their reluctance stems from the fact that ALKS falls far short of true "self-driving" and requires the driver to monitor the system intently, ready to take back control at a moment's notice.

It's important to note that drivers will remain responsible if they fail to take back control of the vehicle in a timely manner, or if they misuse the ALKS. This type of system is "conditional" automation: it depends on a driver who understands the limits of the technology and their legal responsibilities, and who monitors the automated system accordingly.

Problems will arise if drivers have unrealistic expectations about the system's capabilities and their own responsibilities, or are tempted to push the technology to its limits.

Cars fitted with driver assistance systems like ALKS are marketed as a luxury, a feature which "takes the burden out of driving."

In the US, some states have passed laws allowing vehicles with self-driving features. There have been many tragic examples where drivers have misunderstood or wilfully misused the capabilities of driving assistance systems in their vehicles, resulting in accidents and fatalities.

Training and sleepwalking into Ts&Cs

The introduction of ALKS, where the driver is able to divert their attention from the road, represents a fundamental shift in driver behaviour and responsibility. How drivers will learn these new responsibilities is not self-evident. How can we all be satisfied that drivers of vehicles fitted with ALKS know how and when to take back control of their vehicle?

The UK government is not currently considering mandatory training for operating ALKS, nor will the operation of self-driving vehicles require a special licence.

Instructions and information about such responsibilities will be delivered to the driver via an in-vehicle Human-Machine Interface (HMI).

However, we have no way of knowing whether drivers will pay attention to HMI-delivered instructions and information. Nor do we know whether they will appreciate that, in handing the driving task over to the vehicle, they take on a potentially far more substantial responsibility: to react immediately when called upon to take back control, in circumstances where they have allowed themselves to become distracted.

Many studies in the development of self-driving cars show that human brains struggle to take over the driving task at short notice. We also know that when confronted with instructions and information via a digital interface, users of technology tend to skim over what is being communicated.

On a daily basis we receive a barrage of digital information and legal terms via social media, mobile devices, and web services. We are predisposed to select "I accept", assuming that whatever was communicated could not be detrimental or else it would not be legal. That assumption is often incorrect: legal terms presented to the user may well operate to their detriment.

Rights, responsibilities and liabilities

In the case of vehicles that share responsibility between vehicle system and driver, such terms may expose the driver to criminal sanctions or civil liabilities (such as negligence), or may involve granting access to the driver's personal data.

More importantly, however, the driver must understand the limits of the vehicle they are driving. With all the hype surrounding "self-driving" vehicles and their capabilities, it is not difficult to imagine a driver misunderstanding the limits of the technology. It is this type of misunderstanding that could tragically lead to the deaths of drivers who ignore warnings and instructions from vehicle systems with self-driving features.

Further, extensive research is required to determine how well drivers absorb information delivered by HMI. It is likely, however, that drivers will require additional education and training to use conditional automation safely.

There is evidence that simulator and virtual reality systems may play a useful part in preparing drivers to operate automated driving systems, though questions remain as to whether such low-fidelity training environments are as effective as real-world experience in an automated vehicle.

Transport minister Rachel Maclean said that ALKS may improve road safety by reducing "human error". However, we have work to do to ensure drivers understand their personal responsibility when operating a vehicle fitted with such advanced driving assistance systems, and that their trust in the system is appropriately calibrated to its capability.

There is not enough evidence that drivers understand the limitations of ALKS, and it is drivers engaging with the system in a manner for which it was not intended that is likely to lead to problems. ®

Dr Jo-Ann Pattinson is a postdoctoral research fellow at the University of Leeds, where she is researching legal issues associated with autonomous vehicles. Dr Subhajit Basu FRSA is an associate professor in Information Technology Law at Leeds.
