Brain-plug weapons could provide war crime immunity

Lawyer spots future brainhat-slaughter atrocity loophole

Comment An American law student has published an analysis of international law regarding war crimes that might be committed using future brain-interface-controlled weapon systems.

Stephen White, studying at Cornell Law School, had his paper Brave New World: Neurowarfare and the Limits of International Humanitarian Law published (pdf) in the current issue of the Cornell International Law Journal. The paper has been picked up here and there on the tech net. In it, White makes particular reference to the various "Brain-Machine Interface" ploys being pursued by DARPA, the Pentagon mad-science outfit that loves a long shot.

An example of such kit is the famous mind-probe hat, intended to monitor a soldier's brainwaves as he eyes the situation around him and throw up a threat marker on his visor before his conscious mind has even realised there is danger there.

DARPA reckon this could sometimes provide a useful speed advantage, as White notes:

One of the justifications for employing a brain-machine interface is that the human brain can perform image calculations in parallel and can thus recognize items, such as targets, and classify them in 200 milliseconds, a rate orders of magnitude faster than computers can perform such operations. In fact, the image processing occurs faster than the subject can become conscious of what he or she sees...

White looks forward to a day when such a system, rather than simply flagging up a subconsciously-spotted danger and perhaps training a weapon on it, actually opens fire without further ado. That would be fine if the soldier's brain had correctly spotted a legitimate target, but obviously less so if instead a noncombatant got smoked. That would be a war crime. But would the soldier be guilty? All he or she did was think a bad thought, White argues, and:

Anglo-American criminal law has refused to criminalize someone solely for his or her bad thoughts... This fairness concern reflects the awareness... that punishing bad thoughts might have perverse social consequences... criminal law has refused to stigmatize those who contemplate bad deeds but do not actually perform them.

A layman might argue that in fact certain areas of bad thought are often punished by the criminal law; but we'll skip over that. White says that to bust someone for a war crime you need to show that he or she consciously chose to commit it, and presumably he knows what he's on about.

In summary, a brain-interface guided weapon could circumvent the pilot’s normal volitional processing signals and rely solely on the recognition activity, thereby making it impossible for courts to determine whether a volitional act occurred before weapon targeting... a prosecutor could never definitively prove anything more than the most attenuated guilt for misdirected attacks on protected persons.

According to bonce-boffins cited by White, the conscious mind - especially in situations such as combat, where a lot of subconscious instincts are in play - tends to operate mainly by vetoing actions, rather than by thinking of them itself. The bloodthirsty subconscious tends to work on the "kill 'em all and let God sort them out" principle, but the more civilised part of the mind can suppress these impulses if it wants to. Under this theory, a human being doesn't so much exercise free will as "free won't".
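To make the "free won't" idea concrete: under this theory the subconscious proposes the shot and the conscious mind merely gets a window in which to veto it. A toy sketch (entirely hypothetical; the function and parameter names are ours, not DARPA's or White's) shows where the legal loophole sits - a brain-interface weapon with firing authority simply sets the veto window to zero:

```python
def weapon_decision(threat_detected: bool, veto_window_ms: int, conscious_veto) -> str:
    """Toy model of the 'free won't' theory: the subconscious proposes
    the shot; the conscious mind can only suppress it within a window."""
    if not threat_detected:
        return "hold"
    # Conventional path: give the conscious veto a chance to fire first.
    if veto_window_ms > 0 and conscious_veto():
        return "hold"   # the civilised part of the mind suppressed the impulse
    # Brain-interface path (veto_window_ms == 0): fire on recognition alone,
    # before any volitional act occurs -- White's accountability gap.
    return "fire"
```

With a normal veto window the conscious "hold" wins; with the window short-circuited, the same brain activity ends in a shot no court can pin to a conscious choice.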

So what's to be done? It would be silly to try and prohibit brain-shortout weapons altogether, says White. He reckons that if people had prohibited the smartbomb and the target-seeking weapon, for instance, we'd still be stuck with horrible messy cluster bombs.

Such a prohibition... might create the unintended consequence of hindering research into weapon systems that may prove more accurate than existing weapons...

International humanitarian law, therefore... should create incentives to produce maximally predictable and accurate weapons and to clarify the lines of authority in wartime in order to make criminal accountability easier to determine.

But then White suddenly executes a neck-snapping volte-face, and starts arguing for wholesale technology suppression.

[This] would likely require prosecution of high-level civilian developers... Many high-level weapon designers have escaped prosecution for design of indiscriminate or disproportionate military systems... For instance, after World War II, the Nazi engineer Wernher von Braun evaded war crimes charges because the United States sought his expertise in designing rockets that were critical for military dominance in the Cold War.

Putting engineers on notice of their potential liability may create incentives for them to create less indiscriminate and disproportionate weapons. A view of command responsibility would also create de facto liability for those most responsible for sanctioning the use of such weapons.

Frankly, White seems to have gone off the rails altogether here. So the US should have executed Von Braun because his weapons were used to randomly bombard London? They'd morally have been bound to round up and shoot every boffin at the Manhattan Project, too - the A-bombs produced by Oppenheimer and his crowd were vastly more indiscriminate and deadly than the V-2s.

In any case, the successors to Von Braun's V-2s did allow the West not to lose the Cold War - which many would say was a good thing in itself, well worth amnestying him. Prosecutors can let people off in exchange for testimony, after all - why not for crucial help in preserving the very legal system they represent?

Furthermore, in the end the ballistic rockets turned into ICBMs accurate enough to take out individual hardened silos, in the process spawning various technologies including integrated circuits and then GPS. GPS is a major part of the precision weaponry that White approves of.

Getting back to brain-machine interfaces, there aren't that many military applications where milliseconds count so much that you might short-circuit the human brain - even if you really could. Mostly it just wouldn't make sense.

For instance, quicksilver brain-directed weapons could conceivably be handy for close-quarters gunfighting one day. But if the system is going from subconscious assessment to shooting without further ado, it must be aiming the gun itself as well as firing it - at this point the thing is essentially an automated weapons turret on a robot. Why not put the operator and his wired-up brain off the battlefield via remote link, then? At which point the need to be really quick so as to keep him safe has vanished, so you may as well just give him a normal, consciously operated firing switch.

Etc, etc.

The brain-machine legal point is mildly interesting, but realistically brain jumpwire systems are probably never going to be a big deal; not ones with firing authority, anyway. The fact that DARPA is looking at an idea doesn't mean that it's likely to come into service - quite the reverse, actually. It could be that the international war-crimes judiciary has rather more serious issues to worry about.

As for "Putting engineers on notice of their potential liability" - oh dear. So, we'll hang rocket engineers in case they make ICBMs? Why not lock up Sir Frank Whittle, inventor of the jet engine, while we're at it? Jets have allowed far more indiscriminate ordnance to be dropped since 1945 than ever was before. Of course, that all means no space programme, no airliners, no silicon chips, no computers. No fertilisers; you might invent nerve gas by accident, so just try to eat less. Actually, moving on back in history, no technology at all - inventing the stone axe would be a crime under potential-liability rules of this sort.

Alternatively, how about putting lawyers on notice of their countervailing liability in cases of stifling progress and so condemning the human race to prolonged and unnecessary death and suffering? ®
