Are brains analog, or digital?

Hell breaks loose after Cornell claim


A new study conducted at Cornell University suggests that we think in analog, not digital. It's a bold claim which, if true, threatens to make thirty years of linguistics and neuroscience metaphors look very silly indeed.

Michael Spivey, a psycholinguist and associate professor of psychology at Cornell, claims that the mind "should be thought of more as working the way biological organisms do: as a dynamic continuum, cascading through shades of grey."

And that's a good description of the difference between the way analog and digital computers work. And the professor wants you to know this, although he says it in a funny sort of psychobabble.

For example, Spivey eschews the word "brain" in his research, making the heroic leap to the word "mind" to explain his discoveries. I'm not telling you about your brain, he's saying, but about your mind. Yes, yours.

Some may think this very presumptuous, but in fact this isn't uncommon in the jargon employed by fringe areas of enquiry today. For example, the Anglo-American "philosophy" camp has an exciting project, "The Philosophy of the Mind". Revelations such as "thermostats have souls" are considered a major breakthrough, apparently. Members of this camp earnestly discuss whether the mind is more like a worn-out, replaceable drive belt, or whether it's more like a DLL in a Windows-like operating system. But groups like this are plentiful on the internet, a place where you'll find people who believe vapor trails are impregnated with mind-controlling dye. In each case (as opposed to Confucian, Hindu, or Buddhist thinking), the mind is a fairly oily, ugly, and eminently disposable component.

Back to the research. The researchers proudly observe that "even partial linguistic input can start [our emphasis] the dynamic competition between simultaneously active representations." Quite what they mean by the rather open-ended term "active representations" is never explained.

Cornell's PR department must have been puzzled too, because they hedged their bets with a release headed: "New Cornell study suggests that mental processing is continuous, not like a computer."

But, wait a second. Not like which computer? The one that crashed a minute ago leaving only a couple of ghostly dialog boxes on the screen, one of which said something like, "Do You Wish To Cancel Your Changes?" with the options "Yes" "No" and "Cancel"? If the mind is a computer, this isn't the sort of confusion any mind would want to find itself in. Maybe Spivey means some other kind of computer? Or some other kind of mind.

We'll return to that question in a moment, but we immediately see that Professor Spivey's approach is beginning to raise one or two more urgent questions.

The study examined the behaviour of only 42 students. That number might be significant - it might not. The test itself required participants to click on pictures on a computer screen corresponding to a word that was read out. When similar-sounding words such as "candy" and "candle" were presented as the choices, the students were slower to click on the correct object, and "their mouse trajectories were more curved".
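For the curious, the "curvature" in mouse-tracking studies of this sort is usually something along the lines of the sketch below: the biggest detour the cursor takes from the straight line between where it starts and where it ends up. This is our own illustrative reconstruction, assuming a maximum-deviation measure; the function name and the metric are our assumptions, not the paper's published analysis code.

```python
import numpy as np

def max_deviation(trajectory):
    """Illustrative mouse-tracking metric (our assumption, not the study's code):
    the maximum perpendicular distance of a cursor path from the straight line
    joining its first and last points."""
    pts = np.asarray(trajectory, dtype=float)  # shape (n_samples, 2): x, y pairs
    start, end = pts[0], pts[-1]
    line = end - start
    line_len = np.linalg.norm(line)
    if line_len == 0:
        return 0.0
    # Perpendicular distance of each sample from the start->end line (2D cross product).
    rel = pts - start
    cross = np.abs(rel[:, 0] * line[1] - rel[:, 1] * line[0])
    return float(np.max(cross / line_len))

# A straight dash to the target vs. a path that bows toward the wrong picture.
straight = [(0, 0), (1, 1), (2, 2), (3, 3)]
curved = [(0, 0), (1, 2), (2, 3), (3, 3)]
print(max_deviation(straight), max_deviation(curved))  # 0.0 vs. roughly 0.71
```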

"When there was ambiguity, the participants briefly didn't know which picture was correct and so for several dozen milliseconds, they were in multiple states at once," he says. "They sort of partially heard the word both ways, and their resolution of the ambiguity was gradual rather than discrete;" from which he concludes, "it's a dynamical system."

(Or maybe the eager-to-please students were just trying too hard. There's no indication of the make-up of the group).

But Spivey is on a mission. The following phrase betrays his presence amongst us as an Emergent Person:

In the "dynamical system" approach he endorses, he tells us, "perception and cognition are mathematically described as a continuous trajectory through a high-dimensional mental space; the neural activation patterns flow back and forth to produce nonlinear, self-organized, emergent properties - like a biological organism."
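To make the jargon a little more concrete, here is a minimal sketch of the sort of "dynamical competition" picture being invoked: two candidate words (say, candy and candle) start out roughly equally active, each is nudged up by incoming evidence and pushed down by its rival, and the system drifts continuously toward one interpretation rather than flipping a discrete switch. The equations and parameters here are our own toy illustration, not Spivey's actual model.

```python
import numpy as np

# Toy "dynamical competition" between two candidate words (our illustration,
# not the Cornell model): each unit is driven up by its evidence, suppressed
# by the other unit, and relaxes continuously toward a settled state.
def compete(evidence=(0.55, 0.45), inhibition=0.8, dt=0.05, steps=60):
    act = np.array([0.5, 0.5])  # start in an ambiguous, "both at once" state
    history = [act.copy()]
    for _ in range(steps):
        drive = np.array(evidence) - inhibition * act[::-1]  # evidence minus rival's suppression
        act = np.clip(act + dt * (drive - act), 0.0, 1.0)    # leaky, continuous update
        history.append(act.copy())
    return np.array(history)

traj = compete()
print(traj[0], traj[len(traj) // 2], traj[-1])  # gradual separation, no sudden jump
```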

In other words, he's saying that if you look at big things - and you can give them a vague or opaque name, such as "systems" - there are smaller things swimming around inside that may explain their behaviour. But anyone who's ever looked at a goldfish bowl knows this, and it's hardly the basis for a new science. Spivey already has a firm metaphor for how the mind should work, and is looking for evidence - or anything, really - that fits this metaphor.

By which point we were completely baffled by what Spivey was really trying to say.

So we turned to a real scientist for an explanation.

