Application Notes
This application note discusses practical strategies for overcoming RF and microwave interference challenges in the field using real-time spectrum analysis (RTSA). Learn about the different types of interference encountered in commercial and aerospace and defense (A&D) wireless communication networks. Uncover the drawbacks of traditional interference analysis and get an in-depth introduction to RTSA. Understand why RTSA is valuable for troubleshooting interference in today's networks with their bursty and elusive signals.
Introduction
The growing number of wireless technologies in communication networks brings one inherent challenge: interference. Regardless of the type of network, the noise level in the system always limits performance. Noise can be generated internally or externally.
How well interference is managed determines the quality of service. For example, managing the uplink noise level of an LTE network dramatically improves its performance, proper channel assignment and reuse in an enterprise wireless local area network (LAN) assures the planned connection speed, and an optimized antenna location and pattern in a satellite earth station contribute to reliable communication under all weather conditions.
A real-time spectrum analysis (RTSA) capability is necessary for detecting elusive, bursty signals and troubleshooting network issues in the field. In this paper, we will look at interference in various networks, discuss RTSA technology and its key performance indicators, and explore applications in radar, electronic warfare (EW), and interference troubleshooting in communication networks.
Review of RF and MW Interference Issues
Wireless interference challenges
In commercial digital wireless networks, the key challenge is to provide as much capacity as possible within the available spectrum. This design goal drives much tighter frequency reuse and wider channel deployments. Because cell sites are located close together and base stations transmit simultaneously, the noise level on the downlink (the direction from the base station to the mobile) is much higher. The higher noise level at the mobile antenna triggers the mobile to increase its output power, which in turn raises the noise level on the uplink (the direction from the mobile to the base station) at the base station antenna. The higher noise level at the base transceiver station (BTS) antenna reduces cell site capacity. These are examples of internal network interference.
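To get a feel for how quickly load degrades the uplink, the short sketch below evaluates the classic noise-rise formula NR = -10 log10(1 - η), where η is the uplink loading factor. This formula comes from CDMA/UMTS-style link budgeting rather than from this note, and the loading values in the example are illustrative assumptions, but it captures the feedback loop described above: as more mobiles raise their power, the noise floor at the BTS climbs steeply.

```python
import math

def uplink_noise_rise_db(loading: float) -> float:
    """Uplink noise rise over the thermal noise floor, in dB.

    loading: fraction of pole capacity in use (0 <= loading < 1).
    Returns NR = -10 * log10(1 - loading), the first-order
    CDMA/UMTS-style budgeting formula, used here only to illustrate
    how quickly uplink noise grows with cell load.
    """
    if not 0 <= loading < 1:
        raise ValueError("loading must be in [0, 1)")
    return -10.0 * math.log10(1.0 - loading)

# Illustrative operating points (assumed, not measured):
for eta in (0.25, 0.50, 0.75, 0.90):
    print(f"loading {eta:.0%}: noise rise {uplink_noise_rise_db(eta):.1f} dB")
```

At 50% loading the noise rise is about 3 dB, but at 90% it reaches 10 dB, which is why a small amount of added interference near full load can sharply reduce cell site capacity.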
In addition to internal interference, external interference is more prevalent now due to tight frequency guard bands between network operators, poor network planning and optimization, and illegal use of the spectrum.
Interference issues in LTE networks
The LTE network is noise limited. It has a frequency reuse of one, which means every cell site uses the same frequency channel. For an LTE network to work properly, it must have a sophisticated and efficient interference management scheme.
On the downlink, LTE base stations rely on channel quality indicator (CQI) reports from the mobile to estimate the interference in the coverage area. CQI is a measure of the signal-to-interference ratio on the downlink channel or on specific resource blocks. It is a key input the base station uses to schedule bandwidth, and it determines the throughput delivered to the mobile. The interference is an aggregation of noise generated inside the cell site and interference from external transmitters. External interference on the downlink drives the CQI lower and prompts retransmission of data, which in turn decreases network speed. Downlink interference is one of the most challenging situations to deal with because there is no direct feedback from the base station to indicate that interference is present.
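To make the CQI mechanism concrete, here is a minimal Python sketch that maps a downlink SINR measurement to a CQI index using a simple threshold table. The CQI index range (0 to 15) follows 3GPP TS 36.213, but the SINR thresholds below are assumed values for illustration only; the actual SINR-to-CQI mapping is left to each UE implementation.

```python
# Hypothetical SINR-to-CQI mapping. The CQI index (0-15) is defined in
# 3GPP TS 36.213, but the SINR thresholds below are assumed values;
# the real mapping is UE-implementation specific.
SINR_THRESHOLDS_DB = [-6.7, -4.7, -2.3, 0.2, 2.4, 4.3, 5.9,
                      8.1, 10.3, 11.7, 14.1, 16.3, 18.7, 21.0, 22.7]

def sinr_to_cqi(sinr_db: float) -> int:
    """Return the highest CQI whose (assumed) SINR threshold is met.

    CQI 0 means 'out of range'; CQI 15 is the best reportable quality.
    """
    cqi = 0
    for threshold in SINR_THRESHOLDS_DB:
        if sinr_db >= threshold:
            cqi += 1
        else:
            break
    return cqi

# External interference lowers SINR, which lowers the reported CQI,
# which in turn lowers the throughput the scheduler can deliver:
for sinr in (20.0, 10.0, 0.0, -5.0):
    print(f"SINR {sinr:5.1f} dB -> CQI {sinr_to_cqi(sinr)}")
```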
Precise power control plays a critical role in LTE interference management because the serving cell and its neighbor cells share the same frequency channel. The network needs to minimize interference at the cell edge while providing enough power to edge users for good service quality. An LTE base station provides the full spectrum at lower power in the center of the cell. At the edge of the cell, it allocates fewer resource blocks (subcarriers) but delivers more power to them (Figure 1). This approach improves overall cell throughput and minimizes interference.
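The center/edge trade-off illustrated in Figure 1 can be sketched as a toy scheduler: cell-center users may be scheduled across the full carrier at lower per-resource-block power, while cell-edge users are confined to a reserved subband at boosted power. All resource-block counts, subband boundaries, and power levels below are illustrative assumptions, not values from any real base station.

```python
from dataclasses import dataclass

@dataclass
class Allocation:
    resource_blocks: range   # resource blocks granted to the user
    power_dbm_per_rb: float  # transmit power per resource block

TOTAL_RBS = 50                 # e.g. a 10 MHz LTE carrier (illustrative)
EDGE_SUBBAND = range(40, 50)   # RBs reserved for cell-edge users (assumed)
CENTER_POWER = 12.0            # dBm per RB for center users (assumed)
EDGE_POWER = 18.0              # dBm per RB, boosted for edge users (assumed)

def allocate(user_at_edge: bool) -> Allocation:
    """Toy fractional-frequency-reuse style allocation.

    Center users may be scheduled anywhere across the full carrier at
    lower per-RB power; edge users are confined to a reserved subband
    but get more power per RB. Neighbor cells would reserve different
    edge subbands so their boosted edge transmissions do not collide.
    """
    if user_at_edge:
        return Allocation(EDGE_SUBBAND, EDGE_POWER)
    return Allocation(range(0, TOTAL_RBS), CENTER_POWER)

print(allocate(user_at_edge=False))  # full band, lower power
print(allocate(user_at_edge=True))   # fewer RBs, higher power
```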