[Antennas] Signal loss (path loss)

A10382 a10382 at snet.net
Mon Aug 27 14:56:45 EDT 2007


For those of you who like to calculate radio propagation:
The electromagnetic field strength of an RF signal decreases the farther 
you get from the transmitting antenna.
The formula, which is simple, escapes me at the moment.
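For reference, the standard free-space form (assuming a clear, 
unobstructed path and isotropic antennas at both ends) is:

  FSPL (dB) = 36.6 + 20 log10(f in MHz) + 20 log10(d in miles)

which is just 20 log10(4*pi*d/lambda) with the unit conversions folded 
into the constant.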
We usually think (and calculate) in terms of miles (or even hundreds and 
thousands of miles).
Think of the path loss of a signal for the following distance:
 7.9 BILLION MILES!
The spacecraft Voyager was launched back in 1977 to research the cosmos. 
It is still sending back usable data that is received by NASA - and it 
is now about 7.9 BILLION miles from Earth.
Even if the transmitter were operating at 100W (in fact Voyager's 
transmitter runs at roughly 20W, and the craft is not solar powered - it 
runs on radioisotope thermoelectric generators, precisely because solar 
energy also decreases the further you get from the Sun), the signal 
strength that finally reaches the antenna here on Earth must be tiny - 
to say the least! The gain (focus) at both ends has to be tremendous and 
the receiver extremely sensitive. The frequency used must be one with a 
very low noise floor - one not widely used by natural sources of radio 
noise.
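Just to put a number on it, here is a quick back-of-the-envelope sketch 
in Python (the 8.4 GHz X-band downlink frequency is my assumption - the 
distance is the one quoted above):

  import math

  def fspl_db(distance_m, freq_hz):
      """Free-space path loss in dB: 20*log10(4*pi*d/lambda)."""
      c = 299792458.0                  # speed of light, m/s
      wavelength = c / freq_hz
      return 20 * math.log10(4 * math.pi * distance_m / wavelength)

  # Assumed figures: X-band downlink near 8.4 GHz, at the quoted
  # 7.9 billion miles.
  d = 7.9e9 * 1609.344                 # miles -> metres
  print("Path loss: %.0f dB" % fspl_db(d, 8.4e9))   # about 313 dB

A loss of roughly 313 dB means a 20W (13 dBW) transmitter arrives at 
about -300 dBW between isotropic antennas - it is only the enormous dish 
gains at both ends that pull the signal back out of the noise.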
If anyone has the path loss formulae handy, they would be appreciated.
============

