Video games were the 'Eureka!' moment that helped boffins simulate neural activity on a single commodity GPU

'Procedural generation' might offer a way forward for projects queueing up to use a supercomputer

Researchers at the UK's University of Sussex have developed a way to boost neural activity simulations without reaching for expensive and scarce supercomputing resources.

Taking inspiration from the games industry, research fellow James Knight and professor of informatics Thomas Nowotny have shown how a single GPU can be used to model a macaque visual cortex with 4 million neurons and 24 billion synapses, a feat only previously possible using a supercomputer.

The challenge in modelling brain activity is not just in the neurons, but in the synapses that connect these biological processing nodes, Knight told The Register.

"Synapses tend to outnumber neurons by a factor of 1,000 or even 10,000. So, if you have a model that's even relatively large, you have to have a lot of memory to store the synapses. This is why, typically, most people simulate models of the scale in our paper using supercomputers. It's not the actual processing requirements; it's because they need to distribute the model across a distributed system to get enough memory."
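A back-of-envelope calculation shows why the synapses, not the neurons, are the bottleneck. Using the model sizes quoted in the article (4 million neurons, 24 billion synapses) with assumed, illustrative byte counts per element:

```python
# Rough memory estimate for the macaque visual cortex model described
# in the article: 4 million neurons, 24 billion synapses. The per-element
# byte counts are illustrative assumptions, not the paper's exact layout.

NEURONS = 4_000_000
SYNAPSES = 24_000_000_000

BYTES_PER_NEURON = 32    # assumed: a few state variables per neuron
BYTES_PER_SYNAPSE = 8    # assumed: 4-byte target index + 4-byte weight

neuron_mem_gb = NEURONS * BYTES_PER_NEURON / 1e9
synapse_mem_gb = SYNAPSES * BYTES_PER_SYNAPSE / 1e9

print(f"Neuron state:  {neuron_mem_gb:.3f} GB")
print(f"Synapse state: {synapse_mem_gb:.0f} GB")
print(f"Synapses per neuron: {SYNAPSES // NEURONS}")
```

Even under these conservative assumptions, neuron state fits comfortably in a fraction of a gigabyte while stored synapses would need well over a hundred gigabytes, which is why such models are usually sharded across a supercomputer's distributed memory.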

Supercomputers are expensive and there is often a long queue of researchers waiting to run their models in these environments, putting limits on more widespread computational neuroscience.

Before going into academia, Knight was a games developer, having worked as a software engineer for Ideaworks Games Studio adapting Call of Duty for mobile platforms. As such, he thought the common technique of procedural content generation in games development might help address the memory problem in representing synapses. "From my background, I knew procedural content is a classic way of saving memory in your game," he said.

Knight started out by simply setting the model up on GPUs, something that can take a while with CPUs. But the work led to a light-bulb moment.

"We realised that you can do this on the fly, so whenever those [synapse] connections are needed, you can regenerate them on the [16GB] GPU," Knight said. "It saves a vast amount of memory. This model would take 10 times more memory than it currently does if you did it the traditional way, and it wouldn't fit on a GPU."

The model neurons are held on the GPU, but because they are "spiking neurons" – more closely related to biological neurons than their ML counterparts – they only transmit data via synapses when they have reached a certain level of activity, at which point the GPU procedurally generates the necessary synaptic connections on the fly rather than reading them from memory.
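The idea can be sketched in a few lines: store only a deterministic seed per neuron and regenerate that neuron's outgoing connections whenever it spikes. The function and parameter names below are illustrative assumptions, not the researchers' actual GeNN implementation:

```python
import numpy as np

# Sketch of "procedural connectivity": rather than storing billions of
# synapses, keep only a per-neuron RNG seed and regenerate each neuron's
# outgoing connections on demand when that neuron spikes.

N_NEURONS = 1_000
FAN_OUT = 100        # outgoing synapses per neuron (toy scale)
THRESHOLD = 1.0      # spiking threshold for the toy neurons

def outgoing_synapses(pre_idx, base_seed=42):
    """Regenerate neuron pre_idx's targets and weights from a fixed seed.

    Seeding the RNG deterministically means the same connections come
    back every call -- nothing is stored between calls.
    """
    rng = np.random.default_rng(base_seed + pre_idx)
    targets = rng.integers(0, N_NEURONS, size=FAN_OUT)
    weights = rng.normal(0.0, 0.1, size=FAN_OUT)
    return targets, weights

# Toy update step: only neurons whose membrane potential crosses the
# threshold have their synapses regenerated and their output propagated.
voltage = np.random.default_rng(0).uniform(0.0, 1.2, size=N_NEURONS)
inputs = np.zeros(N_NEURONS)
for pre in np.flatnonzero(voltage >= THRESHOLD):
    targets, weights = outgoing_synapses(pre)
    np.add.at(inputs, targets, weights)  # accumulate synaptic input
```

Because spiking neurons are silent most of the time, only a small fraction of connections need regenerating each timestep, which is what makes the trade of compute for memory favourable on a GPU.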

"This is particularly well suited to GPU architectures," Knight said.

A similar approach was used by Russian mathematician and neuroscientist Eugene Izhikevich for simulating a large cortical model on a CPU cluster in 2005. But as the work was never written up in detail, it was difficult to know how he achieved the result, and the approach had not been applied to modern hardware, Knight said.

While "procedural connectivity" - as the researchers call it - vastly reduces the memory requirements of large-scale brain simulations, the GPU code generation strategies do not scale well with a large number of neurons. To address this second problem, the team developed a "kernel merging" code generator, as described in the researchers' paper in Nature Computational Science.
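The gist of kernel merging is that a model with many neuron populations would naively need one GPU kernel launch per population per timestep; grouping populations that share the same update rule into a single batched kernel keeps the launch count flat as models grow. The class and function names below are a hypothetical illustration, not the paper's code generator:

```python
from collections import defaultdict

# Sketch of "kernel merging": group populations by neuron model so that
# all populations of one type can be updated by a single merged kernel,
# instead of one kernel launch per population.

class Population:
    def __init__(self, name, model_type, size):
        self.name = name
        self.model_type = model_type
        self.size = size

def merged_kernels(populations):
    """Group populations by neuron model; each group shares one kernel."""
    groups = defaultdict(list)
    for pop in populations:
        groups[pop.model_type].append(pop)
    return groups

# 32 cortical areas, each with an excitatory and an inhibitory population.
pops = ([Population(f"area{i}_exc", "LIF", 10_000) for i in range(32)] +
        [Population(f"area{i}_inh", "adaptive_LIF", 2_500) for i in range(32)])

kernels = merged_kernels(pops)
# 64 populations collapse into 2 merged kernel launches per timestep.
```

Per-population parameters still differ, so a real merged kernel would index into per-population parameter arrays; the sketch only shows the grouping step.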

The neurological model used to demonstrate the power of this approach is the macaque visual cortex developed by the Jülich Supercomputing Centre SimLab. In the name of open science, it is available on GitHub.

A lot of attention in neuroscience has focused on grand projects, such as the Human Brain Project coordinated by the École Polytechnique Fédérale de Lausanne and largely funded by the European Union. It has a troubled history and critics have leapt on its lack of results.

Knight showed that researchers can effectively model neural activity on a commodity GPU workstation – in this case an Nvidia Titan RTX – available for a few thousand pounds. The hope is that the development will allow more researchers to build and test a greater number of large models, improving our understanding of how brains work.

"There's a huge lack of really large-scale models," he said. "Some brain activity patterns only emerge when you have a suitably large model. Our hope is that [our work] will allow a wider range of computational neuroscience researchers to start experimenting with large brain models. The main people working on it right now are those with the expertise and the access to the supercomputers." ®

