Historically, the driving force for speed and capacity in public mobile networks has been the downlink. The average user downloads much more data than they upload. Email, given its one-to-many nature, will always be downlink-heavy. In video, now the major load on public networks, most of us watch far more than we originate.
Until 5G, almost all networks used a technology known as Frequency Division Duplex (FDD), in which equal chunks of radio spectrum are allocated to the uplink and downlink. In FDD networks with downlink-heavy traffic, some of the uplink spectrum is therefore wasted. With 5G came the widespread adoption of Time Division Duplex (TDD), in which a single block of spectrum is time-shared between uplink and downlink. This offers some flexibility in the allocation of resources between the two.
Unfortunately, there are some limitations to that flexibility. The path loss between nearby base stations can be very small, as they may both be in elevated locations within sight of each other. Base stations on different but adjacent channels are therefore likely to interfere with each other if one is transmitting while the other is receiving. For this reason, Ofcom effectively mandates that all public 5G networks in the UK use a 3:1 ratio of timeslots between downlink and uplink and be synchronised in time. In practice, a similar approach is adopted globally.
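As a rough illustration of what the slot ratio means for capacity, the sketch below models a TDD frame as a simple repeating slot pattern (a simplification of mine, not the exact 3GPP frame structure) and computes the share of airtime each direction receives:

```python
# Illustrative sketch only: a TDD frame reduced to a repeating pattern of
# 'D' (downlink) and 'U' (uplink) slots, ignoring special/guard slots.

def capacity_share(pattern: str) -> dict:
    """Return the fraction of airtime given to each direction."""
    n = len(pattern)
    return {
        "downlink": pattern.count("D") / n,
        "uplink": pattern.count("U") / n,
    }

# The 3:1 downlink:uplink split effectively mandated for UK public 5G:
print(capacity_share("DDDU"))  # downlink gets 3/4 of airtime, uplink 1/4

# An uplink-biased alternative a private network might prefer:
print(capacity_share("DUUU"))  # uplink share rises from 1/4 to 3/4
```

Comparing the two patterns shows where the "factor of 3" uplink gain discussed later comes from: the uplink share rises from 25% to 75% of airtime.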
At first sight, this makes much more efficient use of the spectrum when it matches downlink-dominated traffic demand. Ironically, a majority of our trials in the DCMS 5G programme seem to have use cases that are actually uplink-heavy, typically involving some form of video capture. These applications range from live filming of sports in the Live + Wild and Connected Cowes projects, through monitoring of industrial processes, ensuring the safety of passengers on public transport, finding parking spaces and monitoring road traffic in our wide range of testbeds in the West Midlands (WM5G), to scanning of crops in the 5G RuralDorset and MONeH projects. The exceptions are those projects involving AR and VR - such as the 5GEM and 5G Encode projects - which also have heavy downlink loads. This suggests that traffic in private 5G networks may be less asymmetric than in previous networks and may even swing the other way, with uplink dominating. Time will tell whether this will also be the case in public networks.
So why not just change the timeslot configuration to match the application? In principle this is possible in a private network operating under an Ofcom Shared Access Licence. At the moment there is plenty of spectrum available in the 3.8-4.2 GHz band, and private networks are scattered across the country. They mostly operate over shorter ranges and at lower power than public networks, and will often be providing mainly indoor coverage in private buildings. In most cases the chances of interference between networks are small, and nearby users can probably be well separated in frequency. At the top end of the band, the frequency separation from public networks will minimise the potential for mutual interference.
A potential problem lies in the small print. While Ofcom allows users to move away from synchronisation and the standardised 3:1 downlink:uplink ratio, there is a clause in the Shared Access Licence Guidance Document (p.26) which reads: “We’re not planning on imposing a synchronisation requirement in the 3.8-4.2 GHz band. However, we reserve the rights (sic) to mandate synchronisation at a later date if this turns out to be necessary to ensure spectrum is being used efficiently.” Ofcom also reserves the right to re-allocate networks to a different frequency.
If a private network chooses to switch to a symmetrical or uplink-biased timeslot configuration (both of which are supported by 3GPP specifications), it is more likely to be a victim of interference than a cause. This is because the dominant source of interference is base station to base station rather than mobile to mobile. If the private network is using more timeslots for the uplink, it may be listening while a nearby public network is transmitting, and thus suffer interference. However, as long as frames are aligned and its remaining downlink slots fall within the public network's downlink slots, it will only be transmitting when the public network is also transmitting, and so poses no threat to the public network. Locking to a common timing reference may therefore matter more than using identical uplink and downlink slot allocations.
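The victim-versus-aggressor argument can be sketched by comparing two frame-aligned slot patterns slot by slot (the patterns below are illustrative choices of mine, not mandated configurations). The private network risks interference wherever it listens while the public network transmits, but poses no threat where it only transmits alongside the public network:

```python
# Hypothetical frame-aligned slot patterns, one character per slot:
# 'D' = transmitting (downlink), 'U' = receiving (uplink), base station view.
PUBLIC = "DDDU"   # the standard 3:1 downlink:uplink split
PRIVATE = "DUUU"  # an uplink-biased pattern a private network might choose

# Slots where the private base station is receiving while the public one
# transmits: the private network is a potential interference victim here.
victim_slots = [i for i, (pub, prv) in enumerate(zip(PUBLIC, PRIVATE))
                if pub == "D" and prv == "U"]

# Slots where the private base station transmits while the public one is
# receiving: here the private network would be the aggressor.
aggressor_slots = [i for i, (pub, prv) in enumerate(zip(PUBLIC, PRIVATE))
                   if pub == "U" and prv == "D"]

print(victim_slots)     # slots 1 and 2: exposed to public-network downlink
print(aggressor_slots)  # empty: private downlink slots lie within public ones
```

Because the private network's single downlink slot coincides with a public downlink slot, the aggressor list is empty, matching the argument above that such a network is a likely victim but not a cause of interference.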
Should a private network adopt a different timeslot configuration if it is designed to serve an uplink-heavy use case? While there is a modest risk that performance may be degraded in the future, the benefit is potentially massive, with an increase in uplink capacity of at least a factor of 3 available for the taking. A number of our DCMS 5G testbeds have investigated the use of different timeslot configurations, with encouraging results. Most equipment seems capable of being reconfigured; however, some vendors are continuing to concentrate on the standard downlink-heavy split in their product plans. This may be an issue when choosing a vendor for a private network. So far, terminals seem to support different splits, but time will tell whether this is universally the case and whether the increased heat dissipation from more frequent transmission may be an issue.
As long as private networks remain well dispersed, the risk of interference is small, but users may be unwilling to invest in a network unless they can be certain it will continue to deliver the performance they require, without risk of later degradation. This suggests that applications for a spectrum licence should include a request for a particular uplink/downlink timeslot balance, as well as a given location and bandwidth. An automated allocation system should then be able to assign a suitable channel or channels to support the requested configuration in the chosen location.
Some work will be needed before Ofcom can use such a sophisticated method. Ofcom’s Plan of Work 2020/21 includes the statement “We will look at implementing an automated authorisation approach for access to the shared bands to ensure that the shared spectrum is used effectively and efficiently”, and the licence conditions already require users to be prepared to change frequency to enable Ofcom to maximise use of the spectrum. It would be desirable for this work to happen as soon as possible and to take into account different timeslot configurations, alongside any other issues identified.

A lot of work was done on adjacent channel interference in 3GPP in the early days of 4G, when TDD operation was first being taken seriously. This, and subsequent work on 5G, should help to determine the spacing in geography and frequency needed between systems with different timeslot configurations.
In conclusion, a more flexible approach to the allocation of uplink/downlink timeslots offers the potential to improve the uplink capacity of a private 5G network by at least a factor of 3 for use cases where uplink traffic dominates. This is needed to deliver the absolute data rates that some key video-based applications demand. This approach clearly brings additional risks of interference, but these could be mitigated by appropriate frequency allocations. Ofcom’s planned work to automate the relevant frequency planning and licensing process offers the opportunity to tackle this challenge and enable a licensing arrangement that fits the evolving scenarios such 5G networks will cater for in the future.