Tomorrow's wireless world will be fatter, faster, and creepier

Video didn't kill the radio star – it made it stronger than ever


Feature With demand for airborne bandwidth at an all-time high, thanks to ever-flashier streaming and our new Zoomified lifestyles, researchers, standards bodies, and equipment makers are keen to push on and create more capable radio systems.

What these will look like, and what they'll do, depends on new technology. The big move promised in the next 10 years is towards the terahertz spectrum, in particular the area between 100GHz and 1THz – although the more adventurous are looking even higher. Generically known as Extra High Frequency bands (EHF), they are only just coming into their own.

Until recently, commercial interest in wireless largely halted in the millimetre wavebands – 30 to 100GHz, or wavelengths between 10mm and 3mm. This was for multiple reasons: electronic components capable of operating here were expensive and exotic, many design techniques and materials familiar to wireless engineers stopped working, and the basic physics wasn't appealing.

A truism of RF design is that the higher the frequency, the greater the path loss – the reduction in signal strength purely as a function of the distance it travels. In air, absorption by things like water molecules causes particular problems at various points between 100GHz and 1THz, but even without those the trend is grim.
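Just how grim is easy to quantify with the textbook free-space path loss formula; a minimal Python sketch (the distance and frequencies here are arbitrary illustrations):

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Every tenfold increase in frequency costs another 20 dB over the same path
for f_ghz in (2.4, 24, 240):
    print(f"{f_ghz:>5} GHz over 100 m: {fspl_db(100, f_ghz * 1e9):.1f} dB")
```

Real links also suffer the molecular absorption peaks mentioned above, which this formula ignores entirely.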

Recent developments in materials and production technology have upended these assumptions. Some of that is basic semiconductor advances – you can now buy a 75GHz transistor for 30 cents, unthinkable 20 years ago.


But the real kicker is in what happens to the antenna. Path loss calculations assume a single omnidirectional antenna, but as frequency goes up, antenna size goes down – so you can cram ever more antennas into the same area, each adding gain of its own, more than compensating for path loss. Experimenters at New York University's NYU Wireless research unit under Professor Ted Rappaport say it is possible to build both urban cells and long-distance links in a range of EHF bands where path loss factors are not significantly worse than those already in use at lower frequencies for 5G.
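The antenna arithmetic behind that claim is easy to sketch: hold the physical aperture fixed, and the number of half-wavelength-spaced elements that fit – and with it the ideal array gain – climbs with the square of frequency, the same rate at which free-space path loss grows. A back-of-the-envelope Python sketch (the 10cm-square aperture is an arbitrary assumption):

```python
import math

def elements_in_aperture(area_m2: float, freq_hz: float) -> int:
    """How many half-wavelength-spaced elements fit in a fixed antenna area."""
    half_wavelength = 299_792_458.0 / freq_hz / 2
    return int(area_m2 / half_wavelength ** 2)

def array_gain_db(n_elements: int) -> float:
    """Idealized array gain grows linearly with element count."""
    return 10 * math.log10(n_elements)

# A fixed 10 cm x 10 cm aperture: element count, and hence gain, climbs
# with the square of frequency - matching the growth in path loss
for f in (6e9, 60e9, 140e9):
    n = elements_in_aperture(0.01, f)
    print(f"{f / 1e9:>5.0f} GHz: {n:>5} elements, {array_gain_db(n):.1f} dB of gain")
```

With such arrays at both ends of a link, the combined gain can outpace the extra path loss – which is the basis of the NYU Wireless argument.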

The significance of this is that systems built here have access to enormous amounts of bandwidth. The starting gun for commercial exploitation of the new EHF bands was fired by the American regulator, the Federal Communications Commission, in March 2019, when it ratified its "Spectrum Horizons" proposal to open up frequencies between 95GHz and 3THz. As well as a liberal experimental licensing regime for researchers and industry, it made 21.2GHz of bandwidth available for unlicensed use – at 116-123GHz, 174.8-182GHz, 185-190GHz, and 244-246GHz – on much the same terms as the existing unlicensed 60GHz band. That compares favourably with the 60GHz 802.11ad standard, which has 4-12GHz of bandwidth depending on region.

There are significant caveats. Although these frequencies aren't used much on the ground, they are used by earth-sensing satellites to monitor atmospheric, oceanic, and environmental conditions. That has led the UK regulator Ofcom, for example, to mirror the FCC's experimental licences but not, as yet, the unlicensed bands. Proponents of terrestrial use of the spectrum say studies show that properly designed antennas that don't radiate upwards can avoid this interference, and the FCC has added conditions that all access points in the new bands be non-weatherproof and unable to run on batteries, to enforce indoor use where floors and roofs can block signals. If the satellite interference concerns are allayed, global harmonisation of the new bands is highly likely, as this is new ground for all countries.

The new bands have implications for many areas, but Wi-Fi provides some of the clearest examples.

What's in the works for Wi-Fi?

The IEEE 802.11 Wi-Fi standards body has some 10 task groups working on new and enhanced Wi-Fi standards, alongside others doing the endless political and technical business of liaising with 5G standards bodies about interoperability.

The Wi-Fi working groups cover areas as prosaic as how to wake a sleeping wireless client remotely, better privacy, and using light instead of wireless – a capability the original 802.11 had in 1997, which nobody wanted then and probably won't want now. There are more useful future extensions.

802.11be, Extremely High Throughput or EHT, will probably get the Wi-Fi 7 badge, with final approval due in 2024. It incorporates the newly approved 6GHz band, and is intended to support 320MHz-wide channels and better aggregation of multiple channels, multiple bands, and multiple access points. It'll support up to 16 spatial streams through enhanced Multiple-Input Multiple-Output (MIMO) antenna array management, as well as very low latency for real-time use. Expect practical speeds around two or three times better than 802.11ax, mostly due to the 6GHz band, which unlike 2.4 and 5GHz won't be slowed in real life by older clients running slower protocols. The multi-band aggregation will also ease rapid adoption of the new 100GHz-and-up unlicensed bands, if and when they become available.

802.11az, due for final approval in 2023, is an extension to Wi-Fi that adds positioning by Fine Time Measurement (FTM). This finds the distance to an access point from the time taken for messages to travel between it and the client. Although not the first such standard, its predecessors saw virtually no adoption because of poor performance – typically a 75 per cent chance of being within four metres of the true position. Iterative improvements, such as combining signals from multiple access points and using higher bandwidth channels – more samples per second, more accuracy – have produced figures like a 90 per cent chance of being within a metre, which means you can probably find your lost phone, or the office robot can be sure it's reached your desk. Experiments at NYU Wireless at 140GHz promise accuracies to within a few centimetres.
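At its heart, FTM is simple time-of-flight arithmetic: the initiator timestamps its outgoing frame and the returning response, the responder reports when it received the frame and when it replied, and subtracting that turnaround leaves twice the flight time. A minimal sketch with invented timestamps:

```python
C = 299_792_458.0  # speed of light, m/s

def ftm_distance_m(t1: float, t2: float, t3: float, t4: float) -> float:
    """FTM-style ranging: the initiator records t1 (frame sent) and t4
    (response received); the responder reports t2 (frame received) and
    t3 (response sent), so its turnaround time can be subtracted."""
    round_trip = (t4 - t1) - (t3 - t2)
    return C * round_trip / 2

# 66.7 ns of round-trip flight time works out to roughly 10 metres
print(ftm_distance_m(0.0, 33.35e-9, 433.35e-9, 466.7e-9))
```

The "higher bandwidth, more accuracy" point falls out of this: timing a signal edge more sharply needs more bandwidth, and every nanosecond of timing error is about 15cm of range error on the round trip.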

802.11bf started its work in October 2020. It's devoted to Wi-Fi Sensing, where analysis of existing signals at a location reveals a radar-like picture of where physical objects are and how they're moving. Potential resolution depends on the bandwidth of the signal being picked up: a common-today 100MHz signal can sense things down to around a metre, while a 10GHz-wide signal, plausible in five years' time, gets a spatial resolution of about a centimetre. The applications the working group envisages are security, asset tracking, telemedicine for the elderly, gesture control, gaming, and so on, with the higher resolution modes able to spot things like your rate of respiration, finger position, and body type, potentially around corners or through walls – the same antenna physics that makes EHF a contender for real-life long-distance links means the old assumption that it'll be effectively shielded by walls no longer holds.
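Those resolution figures follow from the standard radar rule of thumb – range resolution is the speed of light divided by twice the bandwidth – which a few lines of Python confirm:

```python
C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    """Radar-style range resolution: c / (2 * B)."""
    return C / (2 * bandwidth_hz)

print(range_resolution_m(100e6))  # a 100 MHz signal resolves about 1.5 m
print(range_resolution_m(10e9))   # a 10 GHz signal resolves about 1.5 cm
```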

Others are advancing the idea in other directions, with researchers at the University of Washington 3D-printing conductive plastic parts designed to be easily sensed in different positions at a high resolution. This can integrate very robust control panels, content sensors or other complex mechanisms cheaply into objects, fully integrated into a Wi-Fi network without them needing any power of their own.

Quick, scatter

Those last two examples are part of a more general technique called backscatter. Originally conceived during Second World War radar research, backscatter involves sensing a signal from an object, where that signal is either triggered by, or a direct modification of, a signal received at the object. Typically, all the energy in the returned signal is supplied by the original signal. RFID security tags for retail use can be thought of as using backscatter. It's not a new idea.

However, the recent massive increase in radio networks around the globe, the equally striking advances in low-power circuits, and the planet's obsession with the Internet of Things have combined to drive major growth in backscatter research and deployment. In general, data speed and distance are traded off against each other depending on available power, which in backscatter systems is very limited – but potentially available for as long as needed, provided the host signal keeps arriving.

Researchers from China and the UK have identified a number of existing and proposed backscatter technologies, all of which are developing rapidly.

Ambient backscatter relies on signals not intended for its use, and piggy-backs on them. Typically, an ambient backscatter device either reflects or absorbs an incident signal, thus imposing the 1s and 0s of data transmission.

The reader detects this reflected signal and decodes the changes in intensity. As this always happens in the presence of the much stronger incident signal, the reader has to recover the modulation by correlation – a slow process of searching for weak patterns in strong noise. Other techniques include harvesting the energy from the incident signal and using that to boost the transmission.
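A toy simulation gives the flavour of the reader's task. Every number below – the reflection strength, the samples per bit – is invented, and the reflected component is hugely exaggerated so the demonstration converges quickly; real ambient backscatter fights a far weaker echo:

```python
import random

random.seed(1)

def read_backscatter(bits, samples_per_bit=500):
    """Simulate a tag toggling reflect/absorb on a random ambient carrier,
    then recover the bits by averaging received power per bit window."""
    rx = []
    for b in bits:
        for _ in range(samples_per_bit):
            carrier = random.gauss(0.0, 1.0)  # ambient signal, not ours
            # Reflecting adds an (exaggerated) copy of the carrier for a 1
            rx.append(carrier * (1.5 if b else 1.0))
    # Reader side: average power per bit window, threshold at the midpoint
    powers = []
    for i in range(len(bits)):
        window = rx[i * samples_per_bit:(i + 1) * samples_per_bit]
        powers.append(sum(s * s for s in window) / samples_per_bit)
    threshold = (max(powers) + min(powers)) / 2
    return [1 if p > threshold else 0 for p in powers]

message = [1, 0, 1, 1, 0, 0, 1, 0]
print(read_backscatter(message) == message)
```

Shrink the reflected component towards realistic levels and the averaging windows must grow enormously – which is why ambient backscatter data rates are so low.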

Ranges of up to 20 metres and speeds of 20Kbps have been achieved by devices with no powered transmitter of their own. Variants based on radio technologies such as LoRa, which are already designed to be particularly sensitive and robust, can stretch this even further, to nearly 500 metres.

Full-duplex backscatter uses similar ideas, but the reader is built into whatever's creating the incident signal in the first place – typically a Wi-Fi access point.

Because this knows all the details of the signal it's transmitting, it can electronically cancel that out from the returned signal, leaving just the tiny changes the backscatter device has imposed. This is far more efficient and thus capable of longer ranges or higher data speeds, again from completely passive devices that are not powering their own transmissions.
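Conceptually, the cancellation step is just a subtraction, because the access point generated the very signal it is removing. An idealized sketch – noise, channel distortion, and imperfect self-knowledge, the hard parts of real systems, are all ignored, and the 1 per cent reflection scale is an arbitrary assumption:

```python
import random

random.seed(0)

# The access point's own transmission, known to it exactly
tx = [random.gauss(0.0, 1.0) for _ in range(8)]
tag_bits = [1, 0, 1, 1, 0, 1, 0, 0]

# What arrives back: the strong direct signal plus a faint tag reflection
rx = [t + 0.01 * t * b for t, b in zip(tx, tag_bits)]

# Subtract the known transmit signal; only the tag's contribution remains
residual = [r - t for r, t in zip(rx, tx)]
decoded = [1 if abs(res) > 1e-9 else 0 for res in residual]
print(decoded == tag_bits)
```

Because the self-interference is removed rather than buried in noise, far less of the tag's reflected energy is wasted – hence the better range and data rates.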

The most futuristic technique is Large Intelligent Surface (LIS)-aided Backscatter. This can be thought of as an array of radio mirrors, potentially as large as the entire side of a building or an internal wall, made out of metamaterials.

Metamaterials are nanoengineered compounds with unusual physical characteristics, in this case designed to be able to change the phase of an incoming signal and reflect it with great efficiency. The array is under software control, which enables a large variety of behaviours – very high bandwidth, for example, or beam-forming to focus the reflected energy and achieve a very efficient link.

Although the current iterations do need power for operation, they don't put any of it into the radio signal, promising startling efficiency and flexibility.

But what about 6G?

6G is more controversial within the wireless industry than many will publicly admit. The UK industry body Cambridge Wireless found that many of its members find the very concept of 6G unwelcome, given that previous "generations" were often far more marketing-led than engineering-led, with substantial improvements happening within each generation. In particular, 5G has major improvements built into its roadmap: big steps forward in lower latency, higher speeds, more edge processing, a wide variety of use models, and the ability to absorb new bands and technologies.

Also, most of the future technologies detailed above are agnostic about the framework they find themselves in. Many 5G models – lots and lots of very small, feature-rich, intelligent cells – replicate the way 802.11 is going, often with virtually identical ideas. The two industries, the cellular networks and the local area specialists, maintain an often artificial divide based on business model and that age-old mistrust between telecoms companies and the upstart IT crowd.

The shape of future wireless is clear. No longer just a way to push data around, it will sense its environment, manage itself, create bandwidth on demand from the resources available, and extend connectivity deep into the fabric of everyday life. It will be fabulously fast, and fabulously efficient. What it will be called doesn't matter in the slightest. ®

