Does the length of cable between the signal boosters and antennas decrease signal strength?


Yes, cable length does decrease signal strength. As a rule of thumb, every 100 ft. of cable reduces signal strength by about 3-4 dB on the 800 MHz Cellular band and about 7 dB on the 1800/1900 MHz PCS band; the higher the frequency, the more the signal attenuates. The Verizon and AT&T 700 MHz LTE bands lose roughly 3-4 dB per 100 ft., the same as the Cellular band, while the T-Mobile AWS 4G band loses about 7 dB per 100 ft., similar to the PCS band.
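
If you want to estimate this yourself, here is a minimal Python sketch of the rule of thumb above. The band names and per-100-ft. loss figures come straight from this answer; the function and variable names are just illustrative.

LOSS_PER_100FT_DB = {
    "700 LTE": 3.5,        # Verizon/AT&T 700 MHz LTE: ~3-4 dB per 100 ft.
    "800 Cellular": 3.5,   # 800 MHz Cellular: ~3-4 dB per 100 ft.
    "1800/1900 PCS": 7.0,  # PCS: ~7 dB per 100 ft.
    "AWS 4G": 7.0,         # T-Mobile AWS: similar to PCS
}

def cable_loss_db(band: str, length_ft: float) -> float:
    """Approximate signal loss in dB for a cable run of length_ft feet."""
    return LOSS_PER_100FT_DB[band] * (length_ft / 100.0)

# Example: a 150 ft. run on the PCS band loses roughly 10.5 dB.
print(cable_loss_db("1800/1900 PCS", 150))  # -> 10.5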


If a long run of cable is needed between a cell amplifier and an antenna, lower-powered in-line boosters are sometimes added along the cable to offset the loss and extend the signal.
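
As a rough illustration of the idea, the in-line booster's gain simply offsets the loss that builds up along the run. The gain figure below is hypothetical, not a SureCall specification:

cable_loss_per_100ft_db = 7.0   # e.g. the PCS band, from the figures above
run_length_ft = 200.0
inline_booster_gain_db = 12.0   # hypothetical in-line booster gain

total_loss_db = cable_loss_per_100ft_db * run_length_ft / 100.0  # 14.0 dB
net_change_db = inline_booster_gain_db - total_loss_db           # -2.0 dB

print(f"Without booster: -{total_loss_db:.1f} dB; with booster: {net_change_db:+.1f} dB")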


Attenuation of CM400 cable (dB of loss per 100 ft.):

30 MHz = 0.7 dB

50 MHz = 0.9 dB

150 MHz = 1.5 dB

220 MHz = 1.9 dB

450 MHz = 2.7 dB

900 MHz = 3.9 dB

1500 MHz (1.5 GHz) = 5.1 dB

2400 MHz (2.4 GHz) = 6.65 dB

5800 MHz (5.8 GHz) = 10.8 dB
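
Because coax attenuation rises smoothly with frequency, you can estimate the CM400's loss at in-between frequencies by interpolating this table on a log-log scale. Here is a minimal Python sketch of that approach (not a SureCall or Times Microwave tool; only the table values come from this page):

import math

CM400_LOSS = [  # (frequency in MHz, dB of loss per 100 ft.) from the table above
    (30, 0.7), (50, 0.9), (150, 1.5), (220, 1.9), (450, 2.7),
    (900, 3.9), (1500, 5.1), (2400, 6.65), (5800, 10.8),
]

def cm400_loss_per_100ft(freq_mhz: float) -> float:
    """Estimate dB loss per 100 ft. at freq_mhz via log-log interpolation."""
    pts = CM400_LOSS
    if not pts[0][0] <= freq_mhz <= pts[-1][0]:
        raise ValueError("frequency outside the table's 30-5800 MHz range")
    for (f1, a1), (f2, a2) in zip(pts, pts[1:]):
        if f1 <= freq_mhz <= f2:
            t = (math.log(freq_mhz) - math.log(f1)) / (math.log(f2) - math.log(f1))
            return math.exp(math.log(a1) + t * (math.log(a2) - math.log(a1)))

# Example: loss per 100 ft. at 1900 MHz (PCS) and 850 MHz (Cellular).
print(round(cm400_loss_per_100ft(1900), 2))  # ~5.83 dB
print(round(cm400_loss_per_100ft(850), 2))   # ~3.78 dB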


So, for example, 100 ft. of cable from antenna to amplifier would decrease the 1800/1900 MHz PCS frequencies by approximately 7-8 dB and the 800 MHz Cellular frequency by 3-4 dB. The 700 MHz LTE frequencies behave like the Cellular band, losing roughly 3-4 dB.


The Times Microwave site has a nifty little online cable attenuation (dB) calculator.



