Base Station Antenna Tilt and Path Loss

Path loss is the difference between the transmit and receive powers of a wireless communication link. In a free-space Line of Sight (LOS) channel the path loss is defined as:

L=20*log10(4*pi*d/lambda)

where ‘d’ is the transmit-receive separation and ‘lambda’ is the wavelength. The antenna gains can also be included in the link budget calculation to find the end-to-end path loss (cable and connector losses may be factored in as well). Antenna gains are usually specified along a horizontal plane and a vertical plane passing through the center of the antenna. The antenna gain at any angle in 3D can then be approximated from the gains in these two planes.
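As a quick illustration, here is a minimal Python sketch of the free-space path loss expression above; the 1 km distance in the example is arbitrary and the 1805 MHz frequency is the one used later in this post.

```python
import numpy as np

def free_space_path_loss(d_m, f_hz):
    """Free-space path loss in dB: L = 20*log10(4*pi*d/lambda)."""
    c = 3e8            # speed of light, m/s
    lam = c / f_hz     # wavelength, m
    return 20 * np.log10(4 * np.pi * d_m / lam)

# Example: 1 km link at 1805 MHz
print(free_space_path_loss(1000.0, 1805e6))   # ~97.6 dB
```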

Although 3D antenna gain patterns are quite complex quantities, simplified models are usually used in simulations. For example, the popular Kathrein 742215 antenna has the following gain models [1] along the horizontal and vertical planes:

Gh(phi)=-min(12*(phi/HPBWh)^2, FBRh)+Gm

Gv(theta)=max(-12*((theta-theta_tilt)/HPBWv)^2, SLLv)

where

Gm = 18 dBi
HPBWh = 65 degrees
HPBWv = 6.2 degrees
SLLv = -18 dB

and FBRh is the horizontal front-to-back ratio in dB.
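A minimal Python sketch of these two pattern models follows. The function names are mine, and since the text above does not list a value for the front-to-back ratio FBRh, it is left as a function argument here.

```python
import numpy as np

GM     = 18.0    # boresight gain, dBi
HPBW_H = 65.0    # horizontal half-power beamwidth, degrees
HPBW_V = 6.2     # vertical half-power beamwidth, degrees
SLL_V  = -18.0   # vertical side-lobe floor, dB

def gain_h(phi_deg, fbr_h):
    """Horizontal pattern Gh(phi) = -min(12*(phi/HPBWh)^2, FBRh) + Gm, in dBi.
    fbr_h is the front-to-back ratio in dB (value not given in the text)."""
    return -np.minimum(12.0 * (phi_deg / HPBW_H) ** 2, fbr_h) + GM

def gain_v(theta_deg, tilt_deg):
    """Vertical pattern Gv(theta) = max(-12*((theta - theta_tilt)/HPBWv)^2, SLLv), in dB."""
    return np.maximum(-12.0 * ((theta_deg - tilt_deg) / HPBW_V) ** 2, SLL_V)
```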

We are particularly interested in the gain in the vertical plane and the effect of base station antenna tilt on the path loss. We assume that the mobile station antenna has uniform gain in all directions. The path loss can then be calculated as:

L=20*log10(4*pi*d/lambda)-Gv(theta)-Gh(phi)

where the antenna gains are subtracted from the free-space loss and we have assumed that Gh(phi)=0 for all phi (a reasonable simplification, since moving along the line of sight does not change phi, so Gh(phi) would only add a constant offset). Using the above expression, the path loss in free space is calculated for a frequency of 1805 MHz, a base station antenna height of 30 m and an antenna tilt of 5 degrees.
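The sketch below repeats this calculation in Python. A flat-earth geometry is assumed, with the elevation angle taken as theta = arctan(h/d) (the depression angle toward the mobile) and the slant range used for the distance in the free-space term; Gh is taken as 0 dB as stated above. These geometric choices are mine, since the post does not spell them out.

```python
import numpy as np

HPBW_V = 6.2      # vertical half-power beamwidth, degrees
SLL_V  = -18.0    # vertical side-lobe floor, dB

def gain_v(theta_deg, tilt_deg):
    """Simplified vertical pattern, dB relative to boresight."""
    return np.maximum(-12.0 * ((theta_deg - tilt_deg) / HPBW_V) ** 2, SLL_V)

f_hz = 1805e6     # carrier frequency, Hz
h_bs = 30.0       # base station antenna height, m
tilt = 5.0        # antenna downtilt, degrees
lam  = 3e8 / f_hz

d     = np.linspace(10.0, 1000.0, 500)      # horizontal distance, m
theta = np.degrees(np.arctan2(h_bs, d))     # depression angle seen from the antenna
fspl  = 20 * np.log10(4 * np.pi * np.sqrt(d**2 + h_bs**2) / lam)
L     = fspl - gain_v(theta, tilt)          # effective path loss (Gh assumed 0 dB)

# Print the path loss at a few distances
for dist, loss in zip(d[::100], L[::100]):
    print(f"d = {dist:6.1f} m  ->  L = {loss:6.1f} dB")
```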

Figure: Effect of Antenna Tilt on Path Loss

A sudden decrease in path loss is observed at the distances where the antenna main beam is directed. If the antenna tilt is increased, this behavior occurs at smaller distances. Since the side-lobe level is fixed at -18 dB, there is an abrupt change in behavior at around 100 m; with a more realistic antenna model the decrease in path loss around this critical distance would be more gradual.
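To see how the tilt moves the main-beam region, here is a quick sketch (Python, same flat-earth geometry and 30 m antenna height as above, names are mine) that computes the ground distance at which the boresight lands and the distance inside which the simplified vertical pattern sits on its -18 dB floor, for a few tilt values.

```python
import numpy as np

h_bs   = 30.0     # base station antenna height, m
HPBW_V = 6.2      # vertical half-power beamwidth, degrees
SLL_V  = -18.0    # vertical side-lobe floor, dB

# Off-boresight angle at which the main-lobe parabola reaches the
# side-lobe floor: 12*(x/HPBWv)^2 = |SLLv|  ->  x ~ 7.6 degrees
edge = HPBW_V * np.sqrt(abs(SLL_V) / 12.0)

for tilt in (5.0, 7.0, 10.0):
    d_boresight = h_bs / np.tan(np.radians(tilt))         # main beam hits ground here
    d_floor     = h_bs / np.tan(np.radians(tilt + edge))  # floor region closer than this
    print(f"tilt {tilt:4.1f} deg: boresight at ~{d_boresight:5.0f} m, "
          f"side-lobe floor closer than ~{d_floor:4.0f} m")
```

Increasing the tilt shrinks both distances, which is why the drop in path loss moves closer to the base station for larger tilt angles.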

[1] Fredrik Gunnarsson, Martin N. Johansson, Anders Furuskär, Magnus Lundevall, Arne Simonsson, Claes Tidestav, Mats Blomgren, “Downtilted Base Station Antennas – A Simulation Model Proposal and Impact on HSPA and LTE Performance,” Ericsson Research, Ericsson AB, Sweden. Presented at VTC 2008.

Author: Yasir Ahmed (aka John)

More than 20 years of experience in various organizations in Pakistan, the USA, and Europe. Worked as a Research Assistant within the Mobile and Portable Radio Group (MPRG) of Virginia Tech and was one of the first researchers to propose Space Time Block Codes for eight transmit antennas. The collaboration with MPRG continued even after graduating with an MSEE degree and has resulted in 12 research publications and a book on Wireless Communications. Worked for Qualcomm USA as an Engineer with the key role of performance and conformance testing of UMTS modems. Qualcomm is the inventor of CDMA technology and owns patents critical to the 4G and 5G standards.
