Will there ever be a real 'Lie Detector'?

Polygraph Pollyannas


Column Lie detectors figure prominently in the sauciest dramas, like espionage and murder, but they deeply polarize opinion. They pit pro-polygraph groups like the CIA, the Department of Energy, and police forces against America's National Academy of Sciences, much of the FBI, and now the US Congressional Research Service. The agencies in favor of lie detectors keep their supporting data secret or obfuscated. The critics have marshaled much better arguments.

The critics have gathered countless earnest references on the site antipolygraph.org, including an amusing 1941 screed on "How to Beat the Lie Detector" and an elegant essay in Science magazine. My favorite: a letter by the convicted CIA double-agent Aldrich Ames - written from prison! - with the authority of someone who kept his traitorous career intact by beating polygraphs time and time again: "Like most junk science that just won't die... because of the usefulness or profit their practitioners enjoy, the polygraph stays with us," he wrote.

So it's clear the old lie detector technology is bunk, pure and simple. Will there ever be a new technology which does in fact detect lies? No, and here's why.

Cheating the system

First, the problem of hiding your lies: the "false negative" problem. Ordinary polygraphs measure simple things like breathing, blood pressure, and skin electricity; presumably, when you lie you get tense, and glitches in those measurements give you away. The problem is that those signals are different for everyone. Since there is no universal "lying blood pressure" or "lying skin resistance," the test must be calibrated for you individually: someone needs to determine your own personal *difference* between "lying" and "truth-telling" measurements.

That's the rub: a diligent or practiced liar can beat the system by ensuring that there isn't such a difference. For example, he might try suppressing the "lying" indicators, by learning to calm himself or breathe naturally during a lie. Or he might boost the baseline indicators, when he knows he is being calibrated for "truth-telling": by making a sharp inhalation, surreptitiously poking himself, or even deliberately lying or blurting out an uncomfortable truth.
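The logic of that countermeasure can be sketched in a toy model (the threshold scheme and all numbers here are hypothetical, purely for illustration): a detector that flags a lie whenever a reading exceeds the subject's calibrated baseline by some margin is blinded if the subject inflates that baseline during calibration.

```python
# Toy model (hypothetical numbers): a polygraph-style test flags a "lie"
# when a physiological reading exceeds the subject's calibrated baseline
# by a fixed margin. Inflating the baseline during calibration defeats it.

def flags_lie(reading, baseline, margin=10.0):
    """Flag a response as deceptive if it exceeds baseline + margin."""
    return reading > baseline + margin

# Honest calibration: baseline reading 70, lying response 85 -> flagged.
assert flags_lie(85.0, baseline=70.0)

# Countermeasure: the subject tenses up during the "truth-telling"
# calibration questions, pushing the recorded baseline to 82. The very
# same lying response of 85 now looks perfectly normal.
assert not flags_lie(85.0, baseline=82.0)
```

The same arithmetic applies whatever the physiological signal is: shrink the lying/truth-telling difference below the margin and the detector has nothing to measure.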

These particular tricks have been known to beat polygraphs for decades, but the principles still apply to any kind of physiological measurements, because human biology varies so strongly. Your reporter knows this variability firsthand, having once worked on a fancy but doomed technology to measure blood pressure in sick people.

Basically, anyone determined to beat a polygraph (or presumably any other kind of lie detector) can game the system by practicing ways to screw up the testing methodology. But that merely makes the test less useful at catching bad guys. What about good guys?

False positives

Such tests can ruin an innocent person's life when a "false positive" comes up. Part of the problem is that tests which measure stress, like polygraphs, also tend to *induce* stress, since the consequences of failure can be as drastic as unemployment or prison. But the biggest problem is sheer numbers: you have to sacrifice a lot of innocent people to catch every bad guy.

Even if you take at face value the self-interested American Polygraph Association's own reliability numbers - they quote 92 per cent accuracy, without explaining its exact meaning - that would wrongly "fail" roughly 80 people out of every thousand tested. Even 99 per cent accuracy would produce far more wrongly-ruined careers than rightfully-caught evildoers, at least for rare crimes like terrorism, murder, and spying. Sadly, no one seems to have hard numbers on just how bad the false-positive problem is... or if they do, they're secret.
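The base-rate arithmetic behind that claim is easy to check with a sketch (the population and spy counts below are invented for illustration, and the test is assumed equally accurate on guilty and innocent subjects):

```python
# Base-rate sketch (hypothetical numbers): even a 99%-accurate screening
# test for a rare crime produces far more false positives than true ones.

def screening_outcomes(population, guilty, accuracy):
    """Return (true_positives, false_positives), assuming the test is
    equally accurate on guilty and innocent subjects."""
    innocent = population - guilty
    true_pos = guilty * accuracy            # bad guys correctly flagged
    false_pos = innocent * (1.0 - accuracy) # innocents wrongly "failed"
    return true_pos, false_pos

# Screen 10,000 employees to find, say, 10 actual spies at 99% accuracy:
tp, fp = screening_outcomes(10_000, 10, 0.99)
# About 9.9 spies caught versus about 99.9 innocents wrongly flagged -
# roughly ten ruined careers per spy, and worse as the crime gets rarer.
```

At the Association's quoted 92 per cent, the same population yields around 800 false positives, which is why rare-crime screening is so punishing for the innocent.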

But perhaps a new, brain-related technology will solve these problems?

One contender is "Brain Fingerprinting", which claims to use brain waves to measure the familiarity of information: was the subject exposed to this information before, regardless of its emotional salience? Tiny electrical signals on the scalp (brain waves) evidently reverberate in a slightly different pattern when you see a familiar image rather than an unfamiliar one. This offers two improvements on polygraphs: the signal comes straight from the brain, rather than from secondary physiological markers, and it claims to measure neutral familiarity - "knowledge" of an experience - rather than the stress of lying about it (although the information to be tested must be suddenly flashed on a screen for the technique to work).

The inventor and chief promoter, Lawrence Farwell, has sound academic credentials, a handful of refereed publications, a US Senator's testimonial, and has helped reverse a murder conviction. But it will take a far more ambitious research program than his to confirm that his method measures "evidence stored in the brain". Measuring whether this works is at least as hard (and as important) as measuring whether a heart drug works, and that kind of research program costs hundreds of millions of dollars.

Mind magnet

An even sexier technology is brain imaging. In particular, "functional magnetic resonance imaging" - with the subject enclosed by a giant liquid-helium-cooled magnet - is a method for showing not just what your brain looks like, but which parts work harder (leading to colorful brain pictures with glowing red spots indicating processes like "concentration" or "arousal").

One boffin, Dr. Scott Faro of Philadelphia, has found a handful of brain regions which seem to glow a bit more when a volunteer is lying than when truth-telling. His claims are both more scientific and more circumspect.

"We have just begun to understand the potential of fMRI in studying deceptive behavior," he says.

That caution is encouraging, and not just because a single interrogation costs thousands of dollars in a gadget costing millions. Like any biometric method, lie detection has an uphill fight to bring its false-positive rate low enough to justify its expense and consequences.

But it's also much harder and riskier than other biometrics: you can't verify or refute a lie detection as easily as a retinal scan, and you can't measure how well people might game it or react under stress. And of course, countless careers can be permanently ruined by your mistakes.

The present brand of lie detection still hasn't proved itself scientifically in seventy years of trying, so it should be shelved before it derails even more careers or mistakenly vets even more spies. The new methods may be better, but we should test them as carefully as we do drugs before we give them an equivalent chance to do serious damage. ®

Bill Softky has worked on dozens of science and technology projects, from the deep paradoxes of nerve cells to automatically debugging Windows source code. He hopes someday to reverse-engineer the software architecture of mammalian learning, and meanwhile works as Chief Algorithmist at an internet advertising startup.



Biting the hand that feeds IT © 1998–2022