All those new '5G standards'? Here's the science they rely on
Radio professor tells us how wireless will get faster in the real world
The 5G arms race has commenced, but beneath the duelling “my 5G is faster than your 5G” demos, there's serious work going on – and whatever the future of 5G, that work will change the future of mobility one way or the other.
With that in mind, The Register spoke to Professor Eryk Dutkiewicz of Macquarie University. In May, Macquarie (with Professor Dutkiewicz at the helm) was tapped as one of a number of universities, all but three of them in America, to form the backbone of Intel's 5G development efforts.
Macquarie's brief – to work on how cognitive radio plays into the embryonic 5G future – puts the research group near the centre of many of the big questions surrounding the future of mobile comms: from the spectrum crunch to antenna technology, from the silicon to the batteries.
The accepted wisdom the world over is that there's a two-pronged attack on the spectrum available for mobile communications: the number of users is exploding, and new technologies devote more radio spectrum to each user to increase throughput.
There are two dominant ways to re-use a slice of radio spectrum, both in the spatial domain, and both of interest to Dutkiewicz's group: one is to make cells smaller (so that a given transmission frequency of, say, 2.3 GHz can be re-deployed nearby without interference); the other is to use MIMO (multiple-input, multiple-output) antenna technologies.
However, as Dutkiewicz explained, neither of these is as simple as it seems.
It's obvious that if your base station has a coverage footprint of 30km or so (common in the days of analogue mobile technologies and still common on highways), you need a significant amount of separation between masts before a frequency can be reused by another base station without interference.
If your base station's range is only in the hundreds of metres – or tens of metres in a world of nanocells and “hetnets” – a given transmission frequency could easily find itself in use in dozens or hundreds of base stations in relatively near proximity.
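The pay-off from shrinking cells can be sketched with back-of-envelope arithmetic: co-channel reuse scales with the inverse square of cell radius. The sketch below is an illustrative model only (idealised hexagonal cells, a classic cluster size of 7 assumed, not any carrier's real plan):

```python
import math

def reuse_count(area_km2, cell_radius_km, reuse_factor=7):
    """Rough count of co-channel cells in a region: total cells
    divided by the cluster size (classic hexagonal reuse)."""
    cell_area = 3 * math.sqrt(3) / 2 * cell_radius_km ** 2  # hexagonal cell area
    total_cells = area_km2 / cell_area
    return total_cells / reuse_factor

# Shrinking cells from a 30 km macrocell to a 100 m nanocell
# multiplies reuse of the same frequency by (30 / 0.1)^2 = 90,000x.
macro = reuse_count(10_000, 30)    # a handful of co-channel sites
nano = reuse_count(10_000, 0.1)   # tens of thousands of co-channel sites
```

The quadratic scaling is why operators pursue small cells despite the management headaches Dutkiewicz describes below.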
If that can be done without the next mobile standard requiring a new air interface, carriers will be very happy indeed: having gone through the cycle of AMPS-to-digital, then GSM to CDMA, then CDMA to OFDM, operators would be very happy if “they don't have to throw out what they have right now”.
While operators might feel constrained not to say so publicly, the idea of deploying their 4G networks today and throwing them away tomorrow is, Dutkiewicz says, something they're afraid of.
However, the proliferation of base stations creates “very complex management issues and interference issues that need to be solved”, he said.
MIMO, on the other hand, creates its own – and completely different – set of issues.
The value of MIMO is twofold. The earliest research focussed on pure spatial multiplexing: signals that follow different paths between transmitter and receiver (and therefore arrive with different travel times) can be told apart, meaning different data can be encoded onto each spatial path.
In this use-case the amount of multiplexing is limited mostly by the number of antennas at the receiver, since the receiver is a more constrained physical environment than the transmitter.
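That receiver-side constraint shows up directly in the textbook capacity model: the number of parallel streams is capped by min(transmit antennas, receive antennas). The snippet below is an idealised upper bound (rich scattering, equal SNR per stream assumed), not a field measurement:

```python
import math

def mimo_capacity_bps_hz(n_tx, n_rx, snr_db):
    """Idealised spatial-multiplexing capacity: min(n_tx, n_rx)
    parallel streams, transmit power split evenly across them.
    A textbook upper bound, not a real-world throughput figure."""
    streams = min(n_tx, n_rx)
    snr = 10 ** (snr_db / 10)
    return streams * math.log2(1 + snr / streams)

siso = mimo_capacity_bps_hz(1, 1, 20)  # single-antenna baseline
mimo = mimo_capacity_bps_hz(4, 2, 20)  # the handset's 2 antennas cap the gain
```

Note that a 4x2 link performs no better than 2x2 in this model: extra base-station antennas cannot add streams the handset lacks the antennas to separate.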
The other “trick” MIMO offers is beam-forming – using the way that signals cancel each other or reinforce each other to shape different beams towards different clients of the same base station.
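The cancel-or-reinforce mechanics can be illustrated with the array factor of a uniform linear antenna array: phase-shift each element so signals add up in the steered direction and partially cancel elsewhere. This is an idealised model (isotropic elements, half-wavelength spacing assumed), not a base-station implementation:

```python
import cmath
import math

def array_gain(n_elements, spacing_wavelengths, steer_deg, look_deg):
    """Normalised array factor of a uniform linear array steered
    toward steer_deg, evaluated in the direction look_deg.
    1.0 means all elements reinforce perfectly."""
    k = 2 * math.pi  # phase per wavelength of path difference
    phase_delta = k * spacing_wavelengths * (
        math.sin(math.radians(look_deg)) - math.sin(math.radians(steer_deg))
    )
    total = sum(cmath.exp(1j * phase_delta * n) for n in range(n_elements))
    return abs(total) / n_elements

on_beam = array_gain(8, 0.5, steer_deg=30, look_deg=30)    # → 1.0
off_beam = array_gain(8, 0.5, steer_deg=30, look_deg=-20)  # largely cancelled
```

The same arithmetic run with a second steering angle yields a second beam from the same elements, which is how one base station serves different clients in different directions on the same frequency.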
Both MIMO applications, Dutkiewicz explains, bump into the same basic problem: you can only optimise spectrum you have the right to use.
“The problem is that now we're looking at the spectrum map, and we're running out. Spectrum's a limited resource and it is not possible to just say 'give me some more', because there isn't.
“And of course governments and military people have grabbed chunks for themselves, and it's very expensive. Some of it is very precious because the signal propagation is much nicer for some frequencies, like from around 800 MHz to the gigahertz range.”