The arrival of fixed and ever faster mobile broadband prompts some to ask whether television should move entirely online. The answer will depend on what the questioner means by “television”.
Like 4G, and to a lesser extent 3G before it, 5G can and will be used to access video content. But “television” is not just a content-distribution technology. Television plays a unique societal role as it brings countries together to share national events, news and live sport. In an increasingly globalised world, television reflects and helps to preserve national and regional identities and even local languages. In most countries, TV helps to underpin the performing arts sector.
In the UK, the BBC, ITV, C4, Channel 5 and the Welsh-language channel S4C are designated as public-service broadcasters, or PSBs, and have been granted access to spectrum in return for delivering against a range of public-service obligations defined by parliament and monitored by Ofcom. For the cost of a TV licence, any viewer can watch any channel on any television.
PSBs have obligations to serve remote rural communities as well as densely populated urban areas. According to the corporation’s annual report, BBC television reaches 98.5 per cent of UK households via the Freeview digital terrestrial transmission (DTT) infrastructure. Ofcom’s 2020 Connected Nations report reveals that 13 per cent of households don’t have fixed internet access, and 190,000 premises can’t access either fixed or mobile broadband. The 4G coverage of mobile operators ranges from 79 per cent to 85 per cent of UK geography. Through the shared rural network, the government and all four mobile network operators will jointly invest over £1bn to increase 4G mobile coverage throughout the UK. With funding now available, the government and the operators remain confident that combined coverage will be delivered to 95 per cent of UK geography by the end of 2025, with areas around the UK starting to see improvements to 4G coverage long before completion.
It’s not as if television has stood still since the first demonstration of it by John Logie Baird. Until 1964, the BBC and the regional ITV companies occupied 174 MHz of VHF spectrum to deliver two black and white 405-line channels nationally. BBC2 was launched onto a new 625-line, UHF broadcast system in 1964. Under the channel’s then controller, David Attenborough, BBC2 pioneered colour television from July 1967. In its early days, UHF transmissions between 471 and 853 MHz had a noticeably smaller coverage footprint than the older VHF service (41-215 MHz). The shortfall was progressively fixed with additional transmitters.
The 384 MHz of UHF spectrum supported 44 transmission channels that enabled four broadcast channels, BBC1, BBC2, ITV and Channel 4, to achieve national coverage. When Channel 5 launched in 1997 using a handful of previously unallocated transmission channels, frequency planning and re-use complexities considerably limited the newcomer’s coverage.
Digital switchover, which was completed in 2012, replaced the previous analogue system with more efficient digital transmission and released spectrum for the “European digital dividend” band 20, which was re-assigned to 4G LTE.
Even after the most recent spectrum clearance programme, which has released 120 MHz of the 700 MHz band for the imminent Ofcom auction, the UK DTT infrastructure transmits a mix of 30 standard- and high-definition PSB channels nationally and another 70 commercial channels to around 90 per cent of households in just 216 MHz of spectrum.
The UK’s terrestrial television technical specification, the “D-Book”, is maintained by Digital TV Group, or DTG. It draws heavily on international standards, such as DVB-T and DVB-T2, for high-definition and so enables UK consumers to benefit from international economies of scale in screen manufacture. It’s a sophisticated specification. Broadcast channels are multiplexed, which allows bit rates for each channel to be adjusted automatically to deliver the best possible image quality across the multiplex. Throughput per broadcast channel can vary from 3Mbps to 17Mbps: streams with greater image detail are dynamically given greater bandwidth than simpler, less detailed, images.
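The statistical-multiplexing idea can be sketched in a few lines of Python. This is an illustrative model only, not the DVB mechanism itself: the 3-17 Mbps per-channel limits come from the figures above, while the 40 Mbps multiplex capacity and the per-channel demand figures are assumed for the example.

```python
def allocate_mux(demands_mbps, capacity_mbps, floor=3.0, ceiling=17.0):
    """Share a fixed multiplex capacity among channels in proportion to
    the bit rate each encoder asks for, clamped to per-channel limits.

    Illustrative sketch only: a real DVB statistical multiplexer
    negotiates with the encoders frame by frame.
    """
    total = sum(demands_mbps)
    scaled = [capacity_mbps * d / total for d in demands_mbps]
    return [min(max(rate, floor), ceiling) for rate in scaled]

# Five channels sharing an assumed 40 Mbps multiplex: a detailed sports
# stream asks for more bits than a static studio shot.
demands = [12.0, 4.0, 8.0, 4.0, 6.0]
allocation = allocate_mux(demands, 40.0)
```

In this toy allocation, the most detailed stream ends up with the largest share while every channel stays inside its floor and ceiling.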
The D-Book now includes broadband connectivity to enable broadcasters’ catch-up services to be integrated into the large-screen TV experience. More than 50 per cent of UK households now have a smart TV, while in two thirds of households, at least one TV is connected to the internet.
The global audience that is accessible online has attracted new entrants into the content distribution marketplace. Streaming services tend to focus on drama: good stories have universal appeal. News, sport and live events in which interests, tastes and rights are more fragmented have largely been avoided.
Traditional broadcasters have responded by making their content available via online catch-up services to those viewers who want greater control over what they watch and when. BBC iPlayer registered 4.8 billion programme requests in 2019-20.
Business models depend on the aggregation and refreshing of content libraries that will attract and retain long-term subscribers. There is clear evidence that subscribers churn when they believe they have seen everything currently of interest. Ongoing success for both traditional broadcasters and new entrants remains dependent on securing a steady flow of quality, engaging content. As Joel Espelien from US media advisory firm TDG noted in January, “content matters. Delivery technology doesn’t”.
Interest in the concept of broadcasting to mobiles has been building since the mid-2000s, when a plethora of mobile broadcast standards that included DVB-H, DMB, DAB-IP, ISDB-T and MediaFLO were proposed. But all these standards struggled with the same issue. Each was premised on building a dedicated mobile broadcast network that transmitted to compatible handsets. For smartphone manufacturers, whose business models rest on offering common products to multiple markets, diverse candidate technologies and no harmonised spectrum did not make an attractive opportunity. Without scale, handsets compatible with any of the available options would be expensive.
It was also never clear who should invest in the mobile broadcast network. In the broadcast world, the transmission network is a “neutral host” and viewers can use any TV set to watch their channel of choice. Building a parallel duplicate network just to reach mobile users, most of whom wouldn’t have a compatible handset, was a problematic business case.
Was there an opportunity for mobile operators? The need for premium-priced compatible handsets would place an operator at a disadvantage in their core business of providing mobile communications. Also, operators had no experience of curating and sourcing content and there was little consumer appetite for an incremental pay-TV subscription either.
Acknowledgement of the weakness of the business case coincided with the launch of the iPhone in 2007 and its app store: the smartphone has radically shifted mobile-phone usage. It was also at about this time that 3GPP, the body that develops and maintains global mobile technology standards, was looking to improve on the lacklustre mobile broadband capabilities of 3G.
Release-9 of 3GPP’s LTE (4G) standards, published in 2010, introduced a new feature: eMBMS – evolved multimedia broadcast multicast services, or “LTE-Broadcast”. This feature would enable multiple users in the same cell to access the same livestream, rather than the network having to serve multiple identical unicast streams. With LTE-Broadcast, operators could serve larger audiences more efficiently and reduce their carbon footprint.
Further enhancements have been introduced in subsequent releases: for example, MooD, which is MBMS operation on demand and which dynamically switches from unicast to multicast as the number of users who consume the same stream in a cell reaches a defined threshold.
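The MooD logic can be shown with a toy model: below a viewer threshold each user gets their own unicast stream, at or above it the cell switches to a single multicast transmission. The threshold value here is an assumption for illustration, not a figure from the standard.

```python
def delivery_mode(viewers_in_cell, threshold=3):
    """Decide unicast vs multicast for one stream in one cell.

    Toy model of the MooD idea; the threshold of 3 is assumed.
    """
    return "multicast" if viewers_in_cell >= threshold else "unicast"

def radio_streams_needed(viewers_in_cell, threshold=3):
    """Simultaneous streams the cell must carry for this content."""
    return 1 if viewers_in_cell >= threshold else viewers_in_cell
```

The efficiency gain is plain: fifty viewers of the same live match cost the cell one multicast stream instead of fifty unicast ones.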
To date, LTE-Broadcast has had limited take-up: only a handful of commercial services have launched, and there are none in Europe. The technology has been constrained because operators were reluctant to deploy unless compatible smartphones were readily available, while handset manufacturers were reluctant to implement the feature if services weren’t being offered.
Eased access to processing power has been the catalyst for new virtual and augmented reality experiences. At the (virtual) 2021 Consumer Electronics Show, Steve Koenig, Vice-President of Research at the US Consumer Technology Association, forecast that VR/AR spending over five years would grow at a compound annual rate of 54 per cent. However, he cautioned that many consumers were uneasy about being “cut-off” and that there was more interest in wearables, which could be worn most of the time, than in VR headsets.
3DTV provides a recent example of the importance of consumer sentiment. Great 3DTV experiences, either active shutter or polarised, were promoted in the early 2010s. However, incremental production and delivery costs, combined with the expense of compatible screens and accessory glasses, meant the technology didn’t progress from novelty to scale. Those services that were launched were withdrawn by the middle of the decade.
The media industry was one of the vertical sectors whose needs were taken into consideration during the development of the 5G standards. High throughput and low latency are of obvious benefit for video delivery.
A decade ago, content creators were innately resistant to the concept of the cloud. As security concerns have been addressed, more and more editing and production processes have been virtualised and transformed into cloud services. The trend has been accelerated by the pandemic as there has been a massive shift to remote production. This suggests big opportunities for 5G’s combination of mobile-edge computing, high throughput and low-latency connectivity.
Network slicing will appeal for use cases that require guaranteed quality of service. Rather than going to the expense of booking a staffed satellite-link van, a TV news crew could purchase a slice to carry a live interview back to the broadcast centre.
The arrival of 5G has also led to the introduction of more flexible shared and local spectrum licensing regimes, which will facilitate new use cases such as the deployment of localised private networks for use by closed user-groups. And then there’s feMBMS, the 5G update to MBMS and LTE Broadcast.
A study by the European Broadcast Union last year investigated the applicability of 5G to the delivery of linear broadcast and on-demand non-linear services by PSBs and commercial broadcasters. It concluded that, technically, 5G Broadcast could fulfil many broadcaster requirements but that structural barriers had to be overcome.
Could 5G Broadcast replace DTT? It’s a massive question. The answer would need to be based on a root-and-branch review of UK broadcasting.
Society would need to decide whether the concept of PSBs or the licence fee should be retained. Some sort of neutral-host infrastructure would have to be deployed: economics would dictate that the system should be accessible to any mobile user, irrespective of the network that they were subscribed to.
But what spectrum would it use? If there were a universal service obligation, the 700 MHz spectrum that has just been relinquished by DTT would be better suited than the 3.6-4.2 GHz band currently earmarked for 5G: the much more limited propagation characteristics of the higher frequency would make universal coverage prohibitively expensive.
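The propagation argument can be quantified with the standard free-space path loss formula. Real coverage planning uses terrain and clutter models rather than free space, and the 10 km path length below is an arbitrary example, but even this idealised calculation shows why the higher band would need far more transmitter sites.

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB (standard Friis formula,
    distance in km and frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# The same 10 km path at 700 MHz and at 3,600 MHz
loss_700 = fspl_db(10, 700)
loss_3600 = fspl_db(10, 3600)
extra_loss = loss_3600 - loss_700  # roughly 14 dB more loss at 3.6 GHz
```

An extra 14 dB of path loss, before any building penetration losses, translates directly into smaller cells and a far more expensive universal-coverage network.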
A body would need to be created to identify suitable broadcasters and allocate the finite capacity of the system. Consumers would need compatible handsets and would have to upgrade or adapt all their existing screens. It seems a lot of disruption for little obvious benefit to society.
One initiative that could have a significant impact on the speed of 5G roll-out, and its applicability to the media industry, is OpenRAN, or the open radio access network. At present, operators are largely restricted to using proprietary hardware from a single vendor across large areas of their footprint. Mergers and acquisitions, and more recently security concerns, have effectively reduced the mobile infrastructure supply sector to a duopoly.
OpenRAN is founded on virtualisation, in which network functions delivered by dedicated hardware are replaced by software applications that run on commercial off-the-shelf hardware.
The OpenRAN initiative aims to break the stranglehold of the incumbent vendors by creating tighter specifications for key interfaces between different functions within the radio access network. This will enable new vendors to enter the market with subsystems rather than having to supply a full end-to-end system.
OpenRAN is expected to reduce operators’ capital and operational costs, especially in rural areas, which are less densely populated. New sources of OpenRAN-compliant 5G radio systems could also accelerate the deployment of private and local-area 5G networks that could find numerous media use cases. Virtualisation would also give operators the ability to run base-station applications that were tailored to specific user needs.
The 5G Create projects supported by DCMS are exploring real-life applications of 5G technologies that are relevant to important sectors of the economy. Five of the nine new projects that were announced on 13 January are directly relevant to broadcast, media and entertainment. Three of the projects are applications that build on the principles of OpenRAN.
DTG, custodian of the D-Book, leads the 5G Vista project, which will explore how feMBMS might be used to enhance stadium experiences by providing fans with access to a variety of content channels, replays and match stats. This localised use case for 5G Broadcast will explore the business case for stadium owners to offer in-venue services to all mobile users, irrespective of their networks, through a neutral-host deployment. Low latency will be important to ensure that broadcast streams are in-sync with the action in the stadium. Edge computing may be deployed for AR or VR capabilities. The project will also look at integration of official broadcast feeds into in-stadium services.
An end-to-end system demo is planned in July, when live camera feeds will be broadcast to commercial feMBMS-capable devices. Pandemic permitting, a large-scale live demonstration will be held at a major sporting event in February 2022. The project will provide evidence to device manufacturers of 5G Broadcast’s appeal to consumers.
Leeds-based aql is a wholesale telecommunications provider that runs data centres and a carrier-grade national network, which has hundreds of points of interconnect to other mobile and fixed networks. An Ofcom-licensed mobile operator, aql is supporting three trials that are exploring the use of 5G to transfer footage from a camera to a production hub or live broadcast. All three projects use OpenRAN principles.
Five projects are using aql technology: Eden Universe, Live and Wild, Mobile Access North Yorkshire, Connected Cowes and Factory of the Future. 5G cell sites that will be deployed by its project partners will connect to the aql core network, which currently has a non-standalone architecture. All the projects make use of Ofcom’s relatively new shared spectrum licensing regime.
The Eden Universe project will use 5G and 360-degree video to enhance the on-site visitor experience at Eden Project Cornwall and open up the world’s largest captive tropical rain forest to virtual visitors. In the first instance, a single macro 5G base station will be deployed. Supplementary small cells may be added if any not-spots emerge once the macro cell is transmitting. The Superfast Cornwall broadband network will connect the cell site to aql’s core network, with content produced on-site reaching virtual audiences worldwide.
With its very large sensor, each 360-degree camera will generate upwards of 100 Mbps; as the cameras will be live-streaming for long periods, they will provide a stern test for the throughput and resilience of 5G. Project partner Meta Camera is working on an even higher-resolution 13K model to enable viewers to zoom in ever closer to reveal more and more detail.
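The arithmetic behind that stern test is straightforward; the eight-hour streaming day below is an assumed figure for illustration, not a project specification.

```python
def stream_volume_gb(bitrate_mbps, hours):
    """Data volume of a continuous stream, in gigabytes:
    megabits per second x seconds, / 8 bits per byte, / 1,000 MB per GB."""
    megabits = bitrate_mbps * hours * 3600
    return megabits / 8 / 1000

# One 100 Mbps 360-degree camera streaming for an assumed 8-hour day
daily_gb = stream_volume_gb(100, 8)  # 360 GB per camera per day
```

A single camera therefore pushes hundreds of gigabytes through the uplink every day, which is why sustained throughput matters as much as peak speed.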
The low-latency attributes of 5G will be important: live streams and AR graphical overlays will be cloud-rendered in real time.
The 5G infrastructure will also be used to provide real-time data on energy and water management across the site. This will help to sustain the unique ecosystem and to improve the visitor experience and the attraction’s environmental performance.
Connected Cowes aims to exploit 5G to expand the appeal of sailing by creating an immersive yacht-racing experience; 360-degree video will be livestreamed over 5G from yachts that compete in the annual week-long regatta, which was founded in 1826. With daily races that take place across 42 boat classes and 20 square miles of water, Cowes Week typically attracts 1,000 yachts, around 8,000 competitors and 100,000 or more spectators.
A cross-section of boats in each race will be equipped with a 360-degree video camera and 5G-connected router in a waterproof casing. A Wi-Fi 6 bubble that the router will create around the boat will connect the camera to the 5G network. The feeds from all the boats will be streamed live via YouTube and through CowesLive.co.uk. Spectators ashore will be able to scroll around a 360-degree view from a boat of their choice.
There has been enthusiastic support for the initiative. Local sailing clubs have come forward to provide three of the five cell sites that have been secured to cover the Solent.
A climbing adventure on the sea cliffs of Anglesey in North Wales, scheduled for the end of April, will be the first of a series of challenging Live and Wild filming scenarios that will test the utility of 5G to documentary film makers.
The technological hub for the outside broadcast will be a Land Rover that will be specially equipped with a telescopic mast for the 5G base station. Backhaul will be provided by a 5 GHz or 60 GHz line-of-sight, point-to-point radio link. The receiving end for the backhaul will be a local wireless internet service provider, which will transfer the video content to the aql core network.
LiveU-bonded cellular field units will be used to connect broadcast cameras and action cams to the 5G network. Smartphone cameras will also be used during shooting. Resilience and image quality will be monitored closely as high-end video is livestreamed over 5G. The team will experiment with mixing shots taken on location and remotely. It will evaluate the time taken to transfer raw batch footage from location to the post-production hub.
Leaky feeders will be used for a future, literally ground-breaking, assignment when the production team will provide live coverage of caving from the Yorkshire Dales. A couple of arduous endurance events in mountainous terrain are currently being scoped for future Live and Wild treatment. Production company Candour TV will produce a long-form documentary and “how we did it” video clips to share their 5G experiences with the wider UK TV production community.
The Green Planet augmented-reality project will use 5G to create an AR app that complements an upcoming BBC One series. The Green Planet will be presented by Sir David Attenborough and will kick off the BBC’s centenary celebrations next year. The app will exploit 5G throughput, low latency and edge computing to enable gatherings at designated locations to enjoy holographic video and other enhanced interactive experiences.
5G Edge XR is probing mobile-edge computing and the ultra-low latency capabilities of 5G in eight extended reality trials. Three of the use cases are directly relevant to sports broadcasters. A graphics processing unit edge node, which will be made up of 12 high-performance GPUs, will be the technical heart of the project.
While extended reality can be consumed and manipulated on smartphones and tablets, headsets deliver the most immersive experiences, but they are expensive. To democratise extended reality, this project aims to undertake as much processing as possible at the network edge rather than in the processor of a headset. The look, feel and weight of headset devices will be key to consumer acceptance.
The team has selected a lightweight headset from Nreal, which can render a video stream and provide simultaneous localisation and mapping (SLAM) data that enables a server to work out what the user is looking at.
One use case will complement conventional TV coverage of boxing with augmented elements. By surrounding the ring with up to 32 volumetric video cameras, a viewer who wears smart glasses while watching the bout on TV could also be presented with a 3D rendering of the boxers on their coffee table. If a viewer were to walk around the table, their view of the 3D action would be adjusted in real time to their specific perspective. Other information could also be displayed virtually around the main TV.
5G’s low latency is critical to volumetric video. As a viewer moves, the edge reacts to the changing SLAM data and renders a new view. Successive received images must be in sync with the brain’s perception of its viewpoint, otherwise the viewer may experience nausea.
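The sync requirement can be framed as a motion-to-photon latency budget. All the stage timings below are assumed, illustrative figures rather than project measurements; the point is that the two radio legs must be only a few milliseconds each for the total to stay near the often-cited ~20 ms comfort limit for head-tracked rendering.

```python
def motion_to_photon_ms(stages):
    """Total latency of an edge-rendered XR pipeline, in milliseconds."""
    return sum(stages.values())

# Assumed, illustrative stage timings: headset SLAM update, 5G uplink,
# render on the GPU edge node, 5G downlink, display scan-out.
budget = {
    "slam_capture": 4,
    "uplink": 4,
    "edge_render": 5,
    "downlink": 4,
    "display": 3,
}
total_ms = motion_to_photon_ms(budget)  # 20 ms in this sketch
```

On a 4G network, where radio round trips alone can run to tens of milliseconds, a budget like this is unachievable; 5G's low-latency radio is what brings cloud rendering within reach.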
The project will also look at 3D circuit maps of Moto GP races and other live race streams to augment broadcast coverage of these sports. For rights owners and broadcasters, anything that increases audience share in a TV sports market that is dominated by soccer is commercially attractive.
Later in the year the team will develop an extended reality app to create an immersive football experience with spatial audio, in order to understand how edge-rendering compares with BT Sport’s existing 8K 360-degree offering.
The business case for mobile-edge computing will depend on offering a portfolio of use cases that serve customers around the clock, not just on match days. One educational trial will involve school pupils being taught dance by a live volumetric video representation of an expert instructor at a remote location. Other trials will look at the presentation of interactive virtual 3D models in construction, retail and medical environments.
High throughput and low latency are also qualities that will be key to the 5G Festival. This project, which culminates in a multi-venue arts and music festival in early 2022, aims to enable a guest appearance by an artist at, say, the Metropolis recording studio in west London with an act on stage at the Brighton Dome. Live video of the remote artists will be displayed on screens that will be incorporated into the live stage set. By wearing AR glasses, the artists at both locations will be able to see how they are appearing together, even though they are 45 miles apart. Professional live audio and high-quality video are bandwidth-hungry: low latency will be key to ensuring that the remote artists remain in sync with each other.
A VR experience will be available to online audiences. With appropriate headsets, audience members will be able to experience 360-degree video and object-based audio. Smartphone or tablet users will be able to scroll around to change their points of view. The 5G Festival will include spaces where the immersive VR and audio can be fully experienced on site.
The number of disciplines involved in the project highlights the diversity of the performing arts eco-system. Trials of a new blockchain-based ticketing system will contribute insights for new business models that appropriately reward all stakeholders.
High throughput, low latency, edge computing and the convenience of wireless over physical connections all point to a bright future for 5G in the broadcast, media and entertainment sectors. These DCMS-supported hands-on trials will provide valuable insights to help stakeholders to identify where they can make use of 5G to facilitate creativity, reach new audiences, or offer new content.