Could one of you experts help me with the following please:
 A short vertical monopole antenna over perfect ground has a gain relative 
to isotropic of 4.8 dB.
A half-wave dipole in free space has a gain relative to isotropic of 2.15 dB.
Therefore, a monopole should have a gain of 2.65 dB over a dipole.
So the theory goes...
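As a quick check on that subtraction (a minimal sketch; the variable names are mine): dB figures referenced to the same isotropic radiator can simply be subtracted, because they are logarithms of power ratios.

```python
import math

monopole_dbi = 4.8   # short vertical monopole over perfect ground
dipole_dbi = 2.15    # half-wave dipole in free space

# Subtracting dB figures divides the underlying linear power ratios:
# (monopole/iso) / (dipole/iso) = monopole/dipole.
advantage_db = monopole_dbi - dipole_dbi
print(round(advantage_db, 2))  # 2.65 dB over a dipole

# Same figure computed the long way, via linear power ratios:
ratio = 10 ** (monopole_dbi / 10) / 10 ** (dipole_dbi / 10)
print(round(10 * math.log10(ratio), 2))  # 2.65 again
```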
 But look at the qualifier on the short vertical gain - it has to be 
operating over "perfect ground". No amateur has "perfect ground"; at least 
not that I am aware of. I haven't heard of anyone laying out 36 radials 550 
metres long under his antenna (not even G3KEV... yet!)
 So nearly all the energy that goes into the ground is dissipated and does 
not return to the feedpoint.  Therefore it cannot reinforce the radiation 
pattern.  In that case, does the theoretical gain still hold?
 Gain is only obtained from directivity.  Directivity can be calculated from 
physical considerations, but the equation relating gain to directivity 
is  G = e*D (applied to linear power ratios, not dB), where G = power gain, 
D = directivity, and e = radiation efficiency, i.e. radiated power/total 
input power.  The "gains" quoted above are actually theoretical directivity 
figures, but they assume that e = 1, that is, that there are no ground 
losses (as the definition states), and that accordingly gain is the same as 
directivity.
 Not so in an average amateur situation, where e = 1/1000 (1 W radiated for 
1000 W input).  Working in linear terms, D = 4.8 dBi is a power ratio of 
about 3, so G = 0.001*3 = 0.003, which is about -25 dBi.  In other words, 
the average amateur LF antenna is not just no better than isotropic - it is 
some 25 dB worse.
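The arithmetic above can be sketched as follows (the function name is mine, not from any antenna library); the key point is that G = e*D must be worked in linear power ratios, converting out of and back into decibels:

```python
import math

def gain_dbi(directivity_dbi: float, efficiency: float) -> float:
    """Gain in dBi from directivity (dBi) and radiation efficiency.

    G = e * D applies to linear power ratios, so convert the
    directivity out of dB, multiply by e, then convert back.
    """
    d_linear = 10 ** (directivity_dbi / 10)   # 4.8 dBi -> ratio of ~3.02
    g_linear = efficiency * d_linear          # G = e * D
    return 10 * math.log10(g_linear)

# Short monopole over perfect ground: e = 1, so gain = directivity.
print(gain_dbi(4.8, 1.0))    # 4.8 dBi

# Average amateur LF case: 1 W radiated for 1000 W in (e = 0.001).
print(gain_dbi(4.8, 0.001))  # -25.2 dBi, far below isotropic
```

Equivalently in dB: 10*log10(0.001) = -30 dB of loss, and -30 + 4.8 = -25.2 dBi.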
Or should I be ignoring earth losses and only counting copper losses?
Walter G3JKV.