ARM chip OG Steve Furber: Turing missed the mark on human intelligence

Ten mice and a million cores are going to prove it


Your neural network can recognise cats, huh? That's cute

Yes, about those "spikes"...

"Biological neurons in the brain communicate principally by emitting pure electrical impulses for which 'spikes' is the shorthand. 'Action potentials' is the posh phrase.

"Basically all a neuron does is go 'ping' every so often and there is no information, as far as we can tell, in the shape. The information is entirely in when the thing happens, the timing, and of course you get a stream of pings in the rate at which it happens.

"Continuous networks, second-generation networks, use outputs which range continuously and capture the rate at which that neuron is firing. But the rate completely ignores the timing of an individual spike.

"Depending on its role in the brain, that timing may or may not be significant. It's clear you can't completely ignore it.

"Of course, the machine-learning community isn't interested in understanding the brain: they want machine-learning systems for work and there they've made huge strides. But I think they are going to have to keep going back to biology."

Furber gives the example of Google's much-publicised triumph when its network, having been shown ten million pictures of cats, became "good" at recognising cats.

"I claim that you could take a two-year-old human, show them one cat, and they'll recognise cats for the rest of their life.

"Of course the Google network starts with a completely random pattern; the human two-year-old starts with two years of experience of the world and 3D spaces and how shapes fit into this. So the cat is just a new instance of a general shape that it's got some way of assessing.

"So the training of current machine-learning systems is probably more expensive than it needs to be – if only we could understand how biology did this one-shot learning."

What could have been will never be

Using ARM chips to model a human brain seems a far cry from their 1980s origins when Furber designed them for Acorn's educational desktop computers. But he reveals that Acorn could have owned the handheld and smartphone market right from the start. If only, if only...

"We had a concept at Acorn that we played with but never really put resource into: doing an educational machine that was more like what we now know as an iPad – a tablet machine. Of course, in the 1980s, it would have been an inch thick and weighed three kilos."

Heh, like the Apple Newton, then? ARM was originally set up as a joint venture with Apple specifically to develop chips for the Newton.

"No, it was more of a tablet size, whereas the Newton was a portable PDA format. Of course, the Newton was in some sense a first go at an iPhone but without the comms. And the iPod was a specialised product.

"But I think [Apple] realised if you took the iPod concept, put it together with the Newton functionality, and a standard bit of smartphone comms interface, then you get something interesting."

Returning from this digression into computing history's ships-in-the-night, Furber notes that the Human Brain Project now has his team working closely with research groups across Europe.

TU Dresden is their strongest current collaborator on the hardware design side. On the software side, CNRS in France and TU Munich in Germany are strong partners, plus what Furber calls "a whole diaspora that goes out and tails off slowly".

The involvement of original collaborator Andrew Brown is no longer as direct as it once was. Brown now runs his own SpiNNaker spin-off project called POETS (Partially-Ordered Event-Triggered Systems), a programme for developing the system into other applications, while Furber's team remains focused on the brain model.

The SpiNNaker chip design has also attracted interest from organisations wanting to use the chips in embedded neurorobotic systems. Lacking the resources to keep lending them out to all and sundry, the project has started selling them commercially.

Half a million cores at your – yes, your – fingertips

That said, anyone – including Register readers – is welcome to access SpiNNaker's half-million-core brain model in Manchester for free and run programs on it online.

To do this, join the Human Brain Project community to get the logins and access the neural-model platforms, write a page about what you're planning to do, and then you can submit jobs.

Jobs are currently limited to neural network models described in PyNN, the Python Neural Networks language.
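For a flavour of what such a job might look like, here is a minimal, hypothetical PyNN script – it assumes the SpiNNaker backend is installed and importable as pyNN.spiNNaker, as in the project's sPyNNaker software, and is not an official Human Brain Project example – that wires Poisson spike sources to a population of leaky integrate-and-fire neurons and records their spikes:

    # Minimal, illustrative PyNN sketch (not an official HBP/SpiNNaker example).
    # Assumes the SpiNNaker backend is importable as pyNN.spiNNaker; other PyNN
    # backends (e.g. pyNN.nest) expose the same API, so the script is portable.
    import pyNN.spiNNaker as sim

    sim.setup(timestep=1.0)  # 1 ms simulation timestep

    # 100 Poisson spike sources driving 100 leaky integrate-and-fire neurons
    inputs  = sim.Population(100, sim.SpikeSourcePoisson(rate=10.0), label="inputs")
    neurons = sim.Population(100, sim.IF_curr_exp(), label="neurons")

    sim.Projection(inputs, neurons, sim.OneToOneConnector(),
                   synapse_type=sim.StaticSynapse(weight=0.5, delay=1.0))

    neurons.record("spikes")
    sim.run(1000.0)                           # one second of biological time

    spike_data = neurons.get_data("spikes")   # Neo Block holding the recorded spike trains
    sim.end()

Because PyNN is simulator-independent, the same model description can be tried on other backends before being submitted to the half-million-core machine in Manchester.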

"It's working much more reliably at the hardware level and the software is much more robust than it was," says Furber. "A user is less likely to hit software gremlins or hardware bugs.

"The hardware bugs aren't a major problem for people doing small models. But if people take my encouragement to do big models, the hardware's really got to be seriously robust otherwise they'll spend all their time fighting with hardware bugs.

"We ran 1,200 jobs over the last year, all tiny. We need people to think big – and scale up." ®
