Feature With demand for airborne bandwidth at an all-time high, thanks to ever-flashier streaming and our new Zoomified lifestyles, researchers, standards bodies, and equipment makers are keen to push on and create more capable radio systems.
What these will look like, and what they’ll do, depends on new technology. The big move promised in the next 10 years is towards the terahertz spectrum, in particular the area between 100GHz and 1THz – although the more adventurous are looking even higher. Generically known as Extremely High Frequency (EHF) bands, they are only just coming into their own.
Until recently, commercial interest in wireless largely halted at the millimetre wavebands – 30 to 100GHz, or wavelengths between 10mm and 3mm. This was for multiple reasons: electronic components capable of operating here were expensive and exotic, many design techniques and materials familiar to wireless engineers stopped working, and the basic physics wasn’t appealing.
A truism of RF design is that the higher the frequency, the greater the path loss – the reduction in signal strength purely as a function of the distance it travels. In air, absorption by things like water molecules causes particular problems at various points between 100GHz and 1THz, but even without those the trend is grim.
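That truism falls out of the free-space path loss formula, which grows by 20dB for every tenfold increase in frequency (assuming a simple isotropic antenna at each end). A back-of-the-envelope sketch:

```python
import math

def fspl_db(freq_hz: float, dist_m: float) -> float:
    """Friis free-space path loss, in dB, between two isotropic antennas."""
    c = 299_792_458.0  # speed of light, m/s
    wavelength = c / freq_hz
    return 20 * math.log10(4 * math.pi * dist_m / wavelength)

# Loss over a 100m link climbs 20dB per decade of frequency:
for f in (2.4e9, 30e9, 300e9):  # Wi-Fi, bottom of mmWave, terahertz-band
    print(f"{f/1e9:>5.1f}GHz: {fspl_db(f, 100):.1f}dB")
```

At 300GHz the same 100-metre hop loses about 42dB more than at 2.4GHz – a factor of more than 10,000 in power – before any atmospheric absorption is counted.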
Recent developments in materials and production technology have upended these assumptions. Some of that is basic semiconductor advances – you can now buy a 75GHz transistor for 30 cents, unthinkable 20 years ago.
But the real kicker is in what happens to the antenna. Path loss calculations are made assuming a single omnidirectional antenna, but as frequency goes up antenna size goes down – so you can cram ever more antennas into the same area, each with gain of its own, more than compensating for path loss. Experimenters at New York University’s NYU Wireless research unit under Professor Ted Rappaport say that it is possible to build both urban cells and long-distance links in a range of EHF bands where path loss factors are not significantly worse than those already in use at lower frequencies for 5G.
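The compensation works because, for a fixed physical aperture, halving the wavelength lets you pack four times as many half-wavelength-spaced elements into it, and an ideal array’s beamforming gain scales with the element count. A rough sketch (the 10cm aperture is illustrative):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def elements_in_aperture(freq_hz: float, side_m: float) -> int:
    """Antenna elements at half-wavelength spacing filling a square aperture."""
    per_side = int(side_m / ((C / freq_hz) / 2))
    return per_side * per_side

def array_gain_db(n_elements: int) -> float:
    """Idealised beamforming gain of an N-element phased array."""
    return 10 * math.log10(n_elements)

# A fixed 10cm x 10cm patch holds 100x more elements at 10x the frequency,
# for ~20dB of extra gain -- the same 20dB the extra path loss takes away.
for f in (30e9, 300e9):
    n = elements_in_aperture(f, 0.10)
    print(f"{f/1e9:.0f}GHz: {n} elements, {array_gain_db(n):.1f}dB of gain")
```

The 20dB of extra array gain per decade of frequency exactly offsets the 20dB of extra free-space loss, which is why the NYU Wireless figures come out comparable to lower bands.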
The significance of this is that systems built here have access to enormous amounts of bandwidth. The starting gun for commercial exploitation of the new EHF bands was fired by the American regulator, the Federal Communications Commission, in March 2019, when it ratified its “Spectrum Horizons” proposal to open up frequencies between 95GHz and 3THz. As well as a liberal experimental licensing regime for researchers and industry, it made 21.2GHz of bandwidth available for unlicensed use on much the same terms as the existing 60GHz unlicensed spectrum, at 116-123GHz, 174.8-182GHz, 185-190GHz, and 244-246GHz. This compares to the existing 60GHz 802.11ad standard, which has 4-12GHz of bandwidth depending on region.
There are significant caveats. Although these frequencies aren’t used much on the ground, they are used by earth-sensing satellites to monitor atmospheric, oceanic, and environmental conditions, which has led to UK regulator Ofcom, for example, mirroring the FCC’s experimental licences but not, as yet, the unlicensed bands. Proponents of terrestrial use of the spectrum say that studies show that properly designed antennas that don’t radiate upwards can avoid this interference, and the FCC has added conditions that all access points in the new bands be made non-weatherproof and unable to run on batteries, to enforce indoor use where floors and roofs can block signals. If the satellite interference concerns are allayed, global harmonisation of the new bands is highly likely, as this is new ground for all countries.
The new bands have implications for many areas, but Wi-Fi has many good examples.
What’s in the works for Wi-Fi?
The IEEE 802.11 Wi-Fi standards group has some 10 groups working on new and enhanced Wi-Fi standards, alongside others doing the endless political and technical business of liaising with 5G standards bodies about interoperability.
The Wi-Fi working groups cover areas as prosaic as how to wake a sleeping wireless client remotely, better privacy, and using light instead of wireless – a capability the original 802.11 had in 1997, which nobody wanted then and probably won’t want now. There are more useful future extensions.
802.11be, Extremely High Throughput or EHT, will probably get the Wi-Fi 7 badge, with final approval due in 2024. It incorporates the newly approved 6GHz band. It is intended to support 320MHz bandwidth channels and better aggregation of multiple channels, multiple bands, and multiple access points. It’ll support up to 15 spatial streams through enhanced Multiple-Input, Multiple-Output (MIMO) antenna array management, as well as very low latency for real-time use. Expect practical speeds around two or three times better than 802.11ax, mostly due to the 6GHz band addition, which unlike 2.4 and 5GHz won’t be affected in real life by older clients running slower protocols. Also, the multi-band aggregation will ease the rapid adoption of the new unlicensed bands at 100GHz and up, if and when they become available.
802.11az, due for final approval in 2023, is an extension to Wi-Fi that adds positioning by FTM, Fine Time Measurement. This finds the distance to an access point from the time taken for messages to travel between it and the client. Although not the first such standard, there’s been virtually no adoption of its predecessors because of poor performance – typically a 75 per cent chance of the measured position being within four metres of the true one. Iterative improvements through techniques such as combining signals from multiple access points and using higher bandwidth channels – more samples per second, more accuracy – have produced figures like a 90 per cent chance of being within a metre, which means you can probably find your lost phone or the office robot can be sure it’s reached your desk. Experiments at NYU Wireless at 140GHz promise accuracies to within a few centimetres.
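The underlying arithmetic is simple time-of-flight: the client’s turnaround delay is subtracted from the round trip, and what remains is airtime in both directions. A minimal sketch, with hypothetical timestamps:

```python
C = 299_792_458.0  # speed of light, m/s

def ftm_distance_m(t1: float, t2: float, t3: float, t4: float) -> float:
    """FTM exchange: AP transmits at t1, client receives at t2,
    client replies at t3, AP receives the reply at t4 (all seconds).
    Subtracting the client's turnaround delay leaves pure airtime."""
    round_trip_airtime = (t4 - t1) - (t3 - t2)
    return C * round_trip_airtime / 2

# Hypothetical timestamps: a client 10 metres away, 100 microsecond turnaround
t1 = 0.0
t2 = t1 + 10 / C     # one-way flight time
t3 = t2 + 100e-6     # client processing delay
t4 = t3 + 10 / C
print(f"{ftm_distance_m(t1, t2, t3, t4):.2f} m")  # prints 10.00 m
```

The catch, and the reason early attempts performed poorly, is that at these scales a timestamp error of a few nanoseconds is a metre of position error.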
802.11bf started its work in October 2020. It’s devoted to Wi-Fi Sensing, where analysis of existing signals at a location reveals a radar-like picture of where physical objects are and how they’re moving. Potential resolution depends on the bandwidth of the signal being picked up, with a common-today 100MHz signal able to sense things down to around a metre, and a five-years’-time 10GHz-wide signal getting a spatial resolution of about a centimetre. The applications the working group envisage are security, asset tracking, telemedicine for the elderly, gesture control, gaming, and so on, with the higher resolution modes being able to spot things like your rate of respiration, finger position, and body type, potentially around corners or through walls – the same antenna physics that make EHF a contender for real-life long-distance links mean the old assumption that it’ll be effectively shielded by walls no longer holds.
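The bandwidth-to-resolution relationship is the classic radar one – range resolution is roughly the speed of light divided by twice the bandwidth – which lines up with the working group’s figures. A quick check:

```python
C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    """Smallest resolvable difference in range for a given signal bandwidth."""
    return C / (2 * bandwidth_hz)

print(f"{range_resolution_m(100e6):.2f} m")        # ~1.5m: today's 100MHz channels
print(f"{range_resolution_m(10e9) * 100:.1f} cm")  # ~1.5cm: a 10GHz-wide signal
```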
Others are advancing the idea in other directions, with researchers at the University of Washington 3D-printing conductive plastic parts designed to be easily sensed in different positions at high resolution. This can build very robust control panels, content sensors, or other complex mechanisms cheaply into objects, fully integrated into a Wi-Fi network without them needing any power of their own.
Quick, scatter
Those last two examples are part of a more general technique called backscatter. Originally conceived during Second World War radar research, backscatter involves sensing a signal from an object that is either triggered by, or a direct modification of, a signal received at that object. Typically all the energy in the returned signal is supplied by the original signal. RFID security tags for retail use can be thought of as using backscatter. It’s not a new idea.
However, the recent massive increase in radio networks around the globe, the equally striking advances in low-power circuits, and the planet’s obsession with the Internet of Things have combined to push a major growth in backscatter research and deployment. In general, data speed and distance are traded off against each other depending on available power, which in backscatter systems is very limited – but potentially available for as long as needed provided the host signal keeps arriving.
Researchers from China and the UK have identified a number of existing and proposed backscatter technologies, all of which are developing rapidly.
Ambient backscatter relies on signals not intended for its use, and piggy-backs on them. Typically, an ambient backscatter device either reflects or absorbs an incident signal, thus imposing the 1s and 0s of data transmission.
The reader detects this reflected signal and decodes the changes of intensity. As this is always in the presence of the much stronger incident signal, the reader has to build up the modulation by correlation, which is a slow process of searching for weak patterns in strong noise. Other techniques include harvesting the energy from the incident signal and using that to boost the transmission.
Ranges of up to 20 metres and speeds of 20Kbps have been achieved for devices with no powered transmitter of their own. Variants based on radio technologies such as LoRa, which are already designed to be particularly sensitive and robust, can stretch this even further, to nearly 500 metres.
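The reflect-or-absorb scheme can be illustrated with a toy simulation – not a radio implementation; the carrier, reflection gain, and noise level here are purely illustrative. The reader averages received power over each bit period and thresholds it, digging the tag’s tiny contribution out of the much stronger ambient signal:

```python
import math
import random

random.seed(0)

def ambient_backscatter(bits, samples_per_bit=200, reflect_gain=0.2):
    """A tag with no transmitter reflects (bit 1) or absorbs (bit 0) an
    ambient carrier, slightly raising the power the reader receives."""
    rx = []
    for i, b in enumerate(bits):
        for n in range(samples_per_bit):
            s = math.sin(2 * math.pi * 0.01 * (i * samples_per_bit + n))
            echo = reflect_gain * s if b else 0.0
            rx.append(s + echo + random.gauss(0, 0.05))  # receiver noise
    return rx

def decode(rx, n_bits, samples_per_bit=200):
    """Average received power over each bit period, then threshold:
    reflecting raises the energy a little, absorbing leaves it alone."""
    powers = []
    for i in range(n_bits):
        chunk = rx[i * samples_per_bit:(i + 1) * samples_per_bit]
        powers.append(sum(x * x for x in chunk) / samples_per_bit)
    threshold = sum(powers) / n_bits
    return [1 if p > threshold else 0 for p in powers]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
print(decode(ambient_backscatter(bits), len(bits)))  # recovers the bit pattern
```

Real systems face a far smaller reflected fraction than this sketch’s 20 per cent, which is why decoding is slow: many samples must be averaged per bit before the pattern emerges from the noise.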
Full-duplex backscatter uses similar ideas, but the reader is built into whatever’s creating the incident signal in the first place – typically a Wi-Fi access point.
Because this knows all the details of the signal it’s transmitting, it can electronically cancel that out from the returned signal, leaving just the tiny changes the backscatter device has imposed. This is far more efficient and thus capable of longer ranges or higher data speeds, again from completely passive devices that are not powering their own transmissions.
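The cancellation step is easy to sketch: because the reader generated the incident signal, it can subtract a perfect copy and threshold whatever energy is left. A toy, noise-free model (signal shape and reflection gain are illustrative):

```python
import math

def transmit(bits, samples_per_bit=100, reflect_gain=0.01):
    """AP sends a known carrier; a passive tag reflects it (bit 1) or
    absorbs it (bit 0). The AP hears its own signal plus the faint echo."""
    tx, rx = [], []
    for i, b in enumerate(bits):
        for n in range(samples_per_bit):
            s = math.sin(2 * math.pi * 0.01 * (i * samples_per_bit + n))
            echo = reflect_gain * s if b else 0.0
            tx.append(s)
            rx.append(s + echo)
    return tx, rx

def cancel_and_decode(tx, rx, n_bits, samples_per_bit=100):
    """Subtract the known transmitted signal, then threshold the residual
    energy per bit (assumes the tag reflects at least once)."""
    residual = [r - t for t, r in zip(tx, rx)]
    energies = []
    for i in range(n_bits):
        chunk = residual[i * samples_per_bit:(i + 1) * samples_per_bit]
        energies.append(sum(x * x for x in chunk))
    threshold = max(energies) / 2
    return [1 if e > threshold else 0 for e in energies]

bits = [1, 0, 0, 1, 1, 0]
tx, rx = transmit(bits)
print(cancel_and_decode(tx, rx, len(bits)))  # recovers [1, 0, 0, 1, 1, 0]
```

With the self-interference removed, even a one per cent echo stands out cleanly – no slow correlation search required, which is where the efficiency gain comes from.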
The most futuristic technique is Large Intelligent Surface (LIS)-aided Backscatter. This can be thought of as an array of radio mirrors, potentially as large as the entire side of a building or an internal wall, made out of metamaterials.
Metamaterials are nanoengineered compounds with unusual physical characteristics, in this case designed to be able to change the phase of an incoming signal and reflect it with great efficiency. The array is under software control, which enables a large variety of behaviours – very high bandwidth, for example, or beam-forming to focus the reflected energy and achieve a very efficient link.
Although the current iterations do need power for operation, they don’t put any of it into the radio signal, promising startling efficiency and flexibility.
But what about 6G?
6G is more controversial within the wireless industry than many will publicly admit. The UK industry body Cambridge Wireless found that many of its members consider the very prospect of 6G unwelcome, given that previous “generations” were often far more marketing-led than engineering-led, with substantial improvements happening within each generation. In particular, 5G has major improvements built into its roadmap, with big steps forward to come in lower latency, higher speeds, more edge processing, a wide variety of use models, and the ability to absorb new bands and technologies.
Also, most of the future technologies detailed above are agnostic about the framework they find themselves in. Many of the 5G models of lots and lots of very small feature-rich cells with intelligence replicate the way 802.11 is going, often with virtually identical ideas. The two industries, the cellular networks and the local area specialists, often maintain an artificial divide based on business model and that age-old mistrust between telecoms companies and the upstart IT crowd.
The shape of future wireless is clear. No longer just a way to push data around, it will sense its environment, manage itself, create bandwidth on demand from the resources available, and extend connectivity deep into the fabric of everyday life. It will be fabulously fast, and fabulously efficient. What it will be called doesn’t matter in the slightest. ®