Though mobile operators have been deploying 5G wireless networks for the last few years, most of these have been non-standalone (NSA) networks leveraging existing 4G LTE infrastructure. 5G NSA deployments combine a mix of 4G LTE base stations and supporting core infrastructure with 5G radios to enable 5G services. While NSA has enabled many telecom companies to deploy 5G quickly, this approach has not lived up to the promise of delivering “hyper-connectivity speeds” to consumers. At the same time, the higher bandwidth and faster speeds of 5G require improved timing precision, accuracy, and reliability.
Only recently have these operators begun to lay the groundwork for true 5G standalone (SA) networks, which require more than a simple radio or firmware upgrade. All components of a 5G SA network, including base stations, core, and radios, are built to the 3GPP 5G specification and are optimized to deliver increased download speeds and ultra-low–latency services.
In 2020, for example, T-Mobile launched the world’s first nationwide SA network and went on to achieve a record speed of nearly 5 Gbits/s on its commercial 5G SA network in late 2021. And Verizon plans to leverage CBRS spectrum to deploy its own 5G SA core network this year.
This competitive rush to deploy the “next-gen solution” isn’t new to the industry. Mobile networks have introduced a new generation roughly every 10 years since the ’80s. Each new generation supports faster speeds and more advanced functionality compared with the prior — and this is no different for fifth-generation networks. As 5G is intended to allow for real-time interactivity and enable advanced AI capabilities across many more connected devices, it needs much higher bandwidth and drastically faster speeds.
The challenge for operators is that all of this imposes much more stringent performance requirements on the wireless network and the timing solutions that support it. In fact, the timing requirements for 5G networks are exceptionally stringent even compared with other high-performance environments like data centers. The bandwidth and data rates of true 5G demand improved timing precision, accuracy, and reliability compared with legacy networks.
5G vs. 4G
Let’s start by quickly reviewing the most common network architectures for 5G versus 4G.
While 4G relies on a baseband unit (BBU) to provide connectivity between radio units (RUs) and the evolved packet core, 5G may split this functionality between two new, separate units: the centralized unit (CU) and the distributed unit (DU). Figure 1 shows the architecture differences from 4G LTE networks (top) to 5G SA networks (bottom).
Figure 1: Architecture differences between 4G LTE networks (top) and 5G SA networks (bottom) (Source: Skyworks Solutions Inc.)
One of the significant differences between 4G and 5G is that 5G uses time-division duplexing (TDD) instead of frequency-division duplexing for data transmission. This means that the entire network needs to be synchronized, not just in frequency but also in time. And this applies across the complete architecture, from the 5G core all the way to the RUs at the edge. Hence, 5G SA networks necessitate much higher-performance frequency- and phase-synchronization solutions for the entire network.
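To see why frequency synchronization alone is not enough for TDD, consider how even a small constant frequency error accumulates into phase (time) error. The sketch below is purely illustrative (the function name and numbers are ours, not from any standard); it relies only on the fact that a fractional frequency error of 1 part per billion corresponds to 1 ns of phase drift per second.

```python
def phase_drift_ns(freq_error_ppb, seconds):
    """Phase drift accumulated by a clock with a constant
    fractional frequency error, given in parts per billion.

    1 ppb of frequency error == 1 ns of phase drift per second.
    """
    return freq_error_ppb * seconds

# A clock that is frequency-accurate to 10 ppb still drifts
# 36,000 ns (36 us) of phase over a single hour.
print(phase_drift_ns(10, 3600))  # 36000
```

In other words, a network that is perfectly frequency-locked can still walk out of time alignment, which is why TDD-based 5G needs explicit phase synchronization.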
Unsurprisingly, these performance requirements can’t be met by NSA networks, which are constrained by the performance of their underlying 4G LTE hardware. The fronthaul network timing requirements for 5G SA deployments are particularly rigorous. In these applications, many RUs can be connected to one DU, and the time-alignment requirement between the radios is even more demanding. These stringent specifications are required to enable advanced radio features like carrier aggregation and distributed MIMO.
The strict time alignment requirements of 5G fronthaul networks played a pivotal role in the creation of the O-RAN Alliance. The aim of this organization is to create standards-compliant telecom equipment that can be mixed and matched for the DU, RU, and the fronthaul network. The idea is that any O-RAN–compliant hardware will work together with any other O-RAN–compliant hardware for an operator’s fronthaul network needs, including connectivity and synchronization between the equipment.
All of this leads to the core question: How did operators solve this timing and synchronization challenge? The answer lies in the IEEE 1588 standard.
The PTP of it all
In prior networks where performance needs were less demanding, synchronization and timing relied heavily on GPS technology. Today, however, GPS isn’t as viable a solution for the latest 5G deployments. Not only does GPS have well-known security issues, but the hardware is relatively expensive and not very reliable in dense urban environments. This last point is particularly problematic for mobile operators looking to launch 5G small-cell networks in stadiums, concert venues, airports, and other high-density locations.
Thankfully, the telecom industry came together to develop a replacement for GPS: a method to synchronize all the different equipment across the wireless infrastructure network using IEEE 1588v2, also known as Precision Time Protocol (PTP). IEEE 1588v2 is a standard for synchronizing timing through the data packet layer itself, enabling timing systems with clocks of different precisions, resolutions, and stabilities to synchronize to a single grandmaster clock.
PTP is based on a two-way exchange of timing messages that not only distributes timing from the grandmaster clock but also estimates and accounts for path delay. Path delay and packet delay variation are two of the reasons that 5G timing environments are so complex.
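The arithmetic behind that two-way exchange is compact. As a minimal sketch (the function name and timestamp values are illustrative, not taken from any particular PTP stack), the four timestamps of one Sync/Delay_Req exchange yield both the slave clock’s offset and the mean path delay, under the standard’s assumption that the path is symmetric:

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Compute slave clock offset and mean path delay from one
    PTP two-way exchange (all times in nanoseconds).

    t1: Sync message sent by the grandmaster (master time)
    t2: Sync message received by the slave (slave time)
    t3: Delay_Req sent by the slave (slave time)
    t4: Delay_Req received by the grandmaster (master time)
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2          # slave minus master
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, mean_path_delay

# Illustrative numbers: slave clock runs 500 ns ahead, link delay 1,000 ns.
t1 = 0
t2 = t1 + 1_000 + 500    # arrival as read on the slave clock
t3 = t2 + 10_000         # slave responds 10 us later
t4 = t3 + 1_000 - 500    # arrival as read on the master clock
offset, delay = ptp_offset_and_delay(t1, t2, t3, t4)
print(offset, delay)     # 500.0 1000.0
```

Repeating this exchange continuously lets the slave steer its clock onto the grandmaster despite an unknown (but symmetric) network delay.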
Telecom networks often need to account for transmission delays caused by different hardware, varying fiber lengths, network congestion, and asynchronous network traffic. Any source of asymmetrical delay can introduce errors into the time-synchronization distribution. All of these effects need to be considered when architecting an IEEE 1588–compliant network.
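The impact of asymmetry can be made concrete with the same four-timestamp arithmetic. In this illustrative sketch (the function and numbers are ours), the offset estimate is exact when the forward and reverse path delays match, and picks up an error equal to half the asymmetry when they do not:

```python
def estimated_offset(true_offset, d_forward, d_reverse):
    """Offset PTP would estimate from a single two-way exchange,
    given the true clock offset and the one-way path delays
    (all values in nanoseconds)."""
    t1 = 0
    t2 = t1 + d_forward + true_offset   # Sync arrival, slave time
    t3 = t2 + 10_000                    # slave responds 10 us later
    t4 = t3 + d_reverse - true_offset   # Delay_Req arrival, master time
    return ((t2 - t1) - (t4 - t3)) / 2

# Symmetric path: the estimate recovers the true 500 ns offset exactly.
print(estimated_offset(500, 1_000, 1_000))   # 500.0
# 200 ns of extra forward delay shows up as a 100 ns error
# (half the asymmetry) in the offset estimate.
print(estimated_offset(500, 1_200, 1_000))   # 600.0
```

This is why uncompensated asymmetry, from mismatched fiber lengths for example, translates directly into a fixed time error at the slave.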
Ultimately, IEEE 1588v2 was purpose-built to meet the complex timing needs of 5G networks. IEEE 1588 can provide system-wide synchronization accuracy in the sub-microsecond range across the entire network while reducing reliance on GPS. 5G SA networks support multiple timing-synchronization deployment scenarios. These include greenfield deployments, where every node in the network supports IEEE 1588 synchronization, and overlay deployments, where IEEE 1588 packet synchronization information is passed through existing networks to the network edge, where it synchronizes the edge network elements with the system’s grandmaster clock. This latter scenario is often referred to as a Partial Timing Support (PTS) network. GPS can optionally be used to assist IEEE 1588 and provide higher time accuracy at the network edge, an arrangement aptly named Assisted Partial Timing Support (APTS). This flexibility, and the ability to work in conjunction with GPS, enables operators to deploy 5G reliably across a range of scenarios.
An operator perspective
The standardization driven by IEEE 1588 and the O-RAN Alliance is a game-changer for the mobile industry. These technologies work in concert to simplify the deployment of 5G networks, and they give mobile service providers more choice. Ultimately, however, mobile operators and deployers are looking for network equipment and software solutions that enable them to improve performance and reduce costs.
While there are various ways to do this, two primary hardware considerations are power consumption and board footprint. Lower power draw from board-level components and more free PCB space mean more devices can be added to improve performance and support additional functionality. For these reasons, vendors and manufacturers favor newer timing solutions that allow for more precise synchronization while improving overall power efficiency and shrinking the bill-of-materials footprint. There are also new IEEE 1588 software solutions available that provide superior time-error performance in PTS and APTS networks, de-risking the design and deployment of 5G overlay networks.
Carriers can opt to deploy traditional RRH + BBU architectures, leveraging proven, end-to-end solutions from large telco equipment vendors with extensive historical knowledge and technical expertise. Alternatively, carriers can transition to a lower-cost O-RAN deployment model and leverage radio, DU, and CU solutions from a mix of suppliers. Either way, new timing-synchronization solutions are available that are standards-compliant and support the full range of deployment scenarios a mobile network operator may be considering.