Software engineers – the ultimate brain scientists?

Part I: Everything you know about AI is probably wrong


Guest Opinion Bill Softky is a scientist at the Redwood Neuroscience Institute, founded by Jeff Hawkins. He has worked as a software architect, visualization designer, and educator.

Can software engineers hope to create a digital brain? Not before understanding how the brain works, and that's one of the biggest mysteries left in science. Brains are hugely intricate circuits of billions of elements. We each have one very close by, but can't open it up: it's the ultimate Black Box.

The most famous engineering brain models are "Neural Networks" and "Parallel Distributed Processing." Unfortunately, both have failed as engineering models and as brain models, because they rest on unwarranted assumptions about what a brain should look like.

The trouble is, real problems such as robotic motion and planning, audio or visual segmentation, and real-time speech recognition are not yet well enough understood to justify any particular circuit design, much less a "neural" one. (A neuron is the brain's computational building block, its "transistor".) So the "neurons" and "networks" of those models are idealized fantasies, designed to support mathematically elegant theories rather than to explain real brains.
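
To make "idealized" concrete, here is the textbook neuron those models are built from, sketched in Python: a weighted sum of inputs pushed through a smooth squashing function. The weights, bias, and sigmoid here are the theory's assumptions, not measured biology.

    import math

    def idealized_neuron(inputs, weights, bias):
        # The "neuron" of Neural Networks and PDP models: a weighted sum
        # of inputs, squashed by a smooth nonlinearity. Real neurons are
        # not known to be anywhere near this simple.
        activation = sum(x * w for x, w in zip(inputs, weights)) + bias
        return 1.0 / (1.0 + math.exp(-activation))  # sigmoid output in (0, 1)

    # A real neuron has thousands of inputs; three suffice to show the idea.
    print(idealized_neuron([0.5, 1.0, -0.3], [0.8, -0.2, 0.5], bias=0.1))

Every element of this circuit is chosen for mathematical convenience, which is precisely the complaint above.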

There is an abundance of research on brains: which areas light up when you solve certain problems, which chemicals are found inside and outside neurons, which drugs change one's moods. It all amounts to thousands of research papers from thousands of researchers.

But there remain two huge mysteries: what the brain's neural circuit really is, and what it does.

Neurons

Here's what we already know about brain circuitry.

We know that neurons are little tree-shaped cells with tiny branches gathering electrochemical input, and a relatively long output cable which connects to other neurons. Neurons are packed cheek-to-jowl in the brain: imagine a rainforest in which the trees and vines are so close their trunks all touch, and their branches are completely intertwined. A big spaghetti mess, with each neuron sending to and receiving from thousands of others.

It's not a total hash; like a rainforest, that neural tangle has several distinct layers. And fortunately, those layers look pretty much the same everywhere in the main part of the brain, so there is hope that whatever one layered circuit does with its inputs (say, visual signals), other layers elsewhere may be doing something similar with their inputs.

OK, so we know what a brain circuit looks like, in the same sense that we know what a CPU layout looks like. But for several reasons, we don't know how it works.

First, we don't know how a single neuron works. Sure, we know that in general a neuron produces more output pulses when it gets more inputs. But we don't know crucial details: depending on dozens of (mostly unknown) electrochemical properties, that neuron might be adding up its inputs, or multiplying them, or responding to averages, or responding to sudden changes. Or it might compute one function at some times, or on some of its branches, and other functions on other branches or at other times. For now, we can't measure brain neurons well enough to do more than guess at their input/output behavior.
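
To see how different those guesses really are, here are four candidate input/output rules, sketched in Python. All of them produce "more output for more input" in some regime, yet they compute entirely different functions; the rules and numbers are illustrative, not measurements.

    def summing(xs):
        # Candidate 1: add the inputs up linearly.
        return sum(xs)

    def multiplying(xs):
        # Candidate 2: let the inputs multiply each other.
        product = 1.0
        for x in xs:
            product *= x
        return product

    def averaging(xs, window=3):
        # Candidate 3: respond to a running average of recent input.
        recent = xs[-window:]
        return sum(recent) / len(recent)

    def change_detecting(xs):
        # Candidate 4: respond to sudden changes, ignoring steady levels.
        return xs[-1] - xs[-2] if len(xs) >= 2 else 0.0

    inputs = [0.2, 0.2, 0.2, 0.9]  # a steady signal, then a sudden jump
    for rule in (summing, multiplying, averaging, change_detecting):
        print(rule.__name__, round(rule(inputs), 4))

From outside the cell, all four rules can look alike, which is why guessing is the best we can do for now.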

Second, we can't tell how the neurons are connected. Sure, neurons are connected to neighboring neurons. But that isn't very helpful: it's like saying that chips in a computer are connected to neighboring chips, which doesn't explain the specific circuitry. The best biologists can do is trace connections between handfuls of neurons at a time in a dead brain, and if they're lucky, record the simultaneous outputs from a handful of neurons in a live brain. But all the interesting neural circuits contain thousands to millions of neurons, so measuring just a few is hopelessly inadequate, like trying to understand a CPU by measuring the connections between, or the voltages on, a few random transistors.
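
A back-of-the-envelope calculation makes "hopelessly inadequate" vivid. The counts below are round-number assumptions (a million-neuron circuit, thousands of synapses per neuron, a ten-neuron recording), not figures from any experiment:

    # How much of a circuit does a heroic recording actually sample?
    circuit_neurons = 1_000_000      # an "interesting" circuit (assumed)
    synapses_per_neuron = 5_000      # order-of-magnitude guess
    recorded_neurons = 10            # a lucky simultaneous recording

    total_connections = circuit_neurons * synapses_per_neuron
    # At best we observe the connections among the recorded neurons themselves.
    observed = recorded_neurons * (recorded_neurons - 1)

    print(f"connections in circuit: {total_connections:,}")
    print(f"fraction observed:      {observed / total_connections:.1e}")
    # About 2e-08 of the wiring: a few random probes on a five-billion-wire chip.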

Third, we don't understand neurons' electrical code. We do know that neurons communicate by brief pulses, and that the pulses from any one neuron occur at unpredictable times. But is that unpredictability random noise, like the crackle of static, or a richly encoded signal, like the crackle of a modem? Must the recipient neurons average over many input pulses, or does each separate pulse carry some precise timing?
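
The two readings of the same irregular spike train can be put side by side. A Python sketch, with a Poisson-like pulse generator standing in for a real neuron (an assumption, since nobody knows the true statistics):

    import random

    random.seed(0)  # reproducible example

    # An irregular spike train: pulse times with random-looking gaps,
    # roughly 20 pulses per second.
    spike_times, t = [], 0.0
    for _ in range(50):
        t += random.expovariate(20.0)
        spike_times.append(t)

    # Reading 1 (static): only the average pulse rate carries signal,
    # so a recipient must smooth over many pulses.
    print(f"mean rate: {len(spike_times) / spike_times[-1]:.1f} pulses/sec")

    # Reading 2 (modem): each inter-pulse gap could carry information.
    gaps_ms = [(b - a) * 1000 for a, b in zip(spike_times, spike_times[1:])]
    print("first gaps (ms):", [round(g, 1) for g in gaps_ms[:5]])
    # If downstream neurons resolve gaps to the millisecond, the very same
    # train carries several bits per pulse instead of one slow average.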

Finally, we don't know how brains learn. We're pretty sure that learning mostly involves changes in the connections between neurons, and that those connections form and strengthen based on local voltages and chemicals. But it's devilishly hard even to record from two interconnected neurons, much less watch a connection change while knowing or controlling everything that affects it. And what about the factors which create brand-new connections, or kill off old ones? Those circuit changes are even more drastic, yet nearly impossible to measure.
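
The best-known guess at such a local rule is Hebbian plasticity ("cells that fire together, wire together"). Here is a minimal sketch in Python; the learning rate and decay term are illustrative stand-ins for chemistry nobody can yet measure:

    def hebbian_update(weight, pre, post, rate=0.05, decay=0.01):
        # Strengthen the connection when the neurons on both sides are
        # active together; let it slowly weaken otherwise. The decay is
        # a crude stand-in for connections shrinking or dying off.
        return weight + rate * pre * post - decay * weight

    w = 0.5
    for pre, post in [(1, 1), (1, 1), (1, 0), (0, 1), (1, 1)]:
        w = hebbian_update(w, pre, post)
        print(f"pre={pre} post={post} -> weight={w:.3f}")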

So here's what we don't know about brain circuitry: we don't know what a single neuron does, what code neurons use, how they are connected, or how their connections change with learning. Without such knowledge, we can't reverse-engineer brains to deduce their function from their structure.

