ARM chip OG Steve Furber: Turing missed the mark on human intelligence

Ten mice and a million cores are going to prove it

"Brains are massively parallel. We each have just under 100 billion neurons inside our heads, all running at the same time. And they are hugely connected, with 1015 synapses connecting the neurons together. The way forward in computing is parallelism. There is no other option."

Professor Steve Furber, one of the designers of the original Acorn RISC Machine (better known as the ARM chip) in the mid-1980s, is giving The Register an update on his work within the European Union's Human Brain Project.

It's a rare opportunity to collar the legend in person to expound some of his views on the march of artificial intelligence and why it still struggles – quite badly – to match the human mind for performance and learning.

The last time we reported on Furber's work under the HBP umbrella at the University of Manchester was back in 2011. At the time, he had recently published a joint paper with Andrew Brown at the University of Southampton, proposing a research project to assemble a million ARM processors into a single neural network machine mimicking the human brain. Or at least 1 per cent of it.

The SpiNNaker (Spiking Neural Network Architecture) project was just getting started, having packed an initial 100,000 ARM cores into a rack and literally opened up a mini-brain for business across the internet.

So how far has his team come over the last six years?

"The big machine is at half a million cores," he reveals. "We have populated three more racks but they're not wired in yet. Otherwise, another two to go."

The core counts aren't arbitrary; they scale with how many chips you can cram onto each board. The SpiNNaker chip looks like a conventional parallel machine except that the packet-switching router sits in the middle, simplifying the connections between the components to no more than three square millimetres of silicon.

Put four chips on a board and you get 72 ARM cores, which equates to the brain of a pond snail. Put 48 chips on it and you get 864 cores, equivalent to the brain of a small insect.

Now once you hook these together with some creative wiring, it's possible to combine up to 20,000 cores in a box to approximate a frog brain. Cram 100,000 cores into a full 19-inch rack and – hurrah! – you have one mouse brain.
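The arithmetic behind those figures is simple enough to sketch. All the numbers below come straight from the article; the helper function and constant names are our own, for illustration only:

```python
# Illustrative scaling arithmetic for SpiNNaker core counts, using the
# figures quoted above: boards of 4 or 48 chips, ~100,000 cores per
# 19-inch rack (one mouse brain), and a million-core target.

CORES_PER_CHIP = 18  # implied by "4 chips -> 72 cores"

def cores(chips: int) -> int:
    """Total ARM cores for a given number of SpiNNaker chips."""
    return chips * CORES_PER_CHIP

print(cores(4))    # 72  -- pond-snail scale
print(cores(48))   # 864 -- small-insect scale

# Roughly how many 48-chip boards fill a one-mouse-brain rack?
RACK_CORES = 100_000
boards_per_rack = RACK_CORES // cores(48)
print(boards_per_rack)

# The million-core, 1-per-cent-of-a-human target is ten such racks.
print(1_000_000 // RACK_CORES)  # 10
```

Note the derived boards-per-rack figure is only what the quoted round numbers imply, not a statement about the real machine's packaging.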

I'm starting to see the difficulty here

To reach 1 per cent of the capacity of a human brain, you need approximately ten mouse brains – hence the million-core target.

Furber chose ARM chips pretty much for the same reasons that they are preferred by smartphone manufacturers: they are small and power efficient. The latter is fundamental to his theory of how brains might work, a conceit worth entertaining given that no one else knows for certain either.

"In his paper of 1950 [the one outlining his 'imitation game'] Alan Turing reckoned all you need for a computer to have the potential of artificial intelligence is enough memory – about a gigabyte, in fact. He thought human intelligence was basically logic and algorithms. And computers are very good at that.

"It turns out human intelligence is not about that. We're still not quite sure what it is about. However, we do know the brain is formidably power efficient."

Furber gives the example of exascale computing: to model a human brain, a theoretical exascale supercomputer would have to execute 10^18 – that is, a million million million – operations per second. This would dictate an energy requirement of 20 to 30 megawatts.

"Your brain runs on 20 watts," he notes, drily.

"So what happens if we try to make the processes very efficient rather than very fast? Then we could achieve performance through using lots of them. Neurons are scalable: there aren't many in a worm, lots in a human."

Focusing on efficiency over raw speed doesn't fall in line with most current AI research. So what does Furber think about other neural network projects, now that neuromorphic development is very much in fashion?

"Ah, there's an interesting story. We've been building SpiNNaker for ten years as a neural model platform focused on biological modelling. In parallel with that, there's been this huge development in machine learning.

"That has ultimately led to hardware devices that are quite different from ours because they've come from a second-generation continuous network view of the world. The hardware they're building is largely integer matrix multipliers, 8-bit multipliers, whereas biology is always sparse: you just don't get dense matrix connections in biology.

"And so we built a hardware-software system that has good support for sparse connectivity. We're very focused on spiking networks whereas machine learning almost completely ignores spikes."
