I am currently updating some of the antenna material in the RSGB Radio
Communications Handbook.
I have a question about SWR meters.
In my early days of amateur radio I didn't have an SWR meter. I obtained
some idea of antenna/feeder mismatch by using a multiple of a
half-wavelength of coax and measuring the antenna feed impedance with the
transmitter's Pi-network output capacitor (suitably calibrated).
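
By way of illustration, a few lines of Python showing why an electrical
half-wavelength of line repeats the load impedance at the near end (the
50 ohm line and the 120 + j30 ohm load are just made-up example values):

    import math

    def z_in(z0, zl, length_wavelengths):
        # Input impedance of a lossless line of the given electrical length:
        # Zin = Z0 (ZL + jZ0 tan(bl)) / (Z0 + jZL tan(bl))
        t = math.tan(2 * math.pi * length_wavelengths)
        return z0 * (zl + 1j * z0 * t) / (z0 + 1j * zl * t)

    print(z_in(50, 120 + 30j, 0.50))  # half wave: ~(120+30j), load repeated
    print(z_in(50, 120 + 30j, 0.25))  # quarter wave: transformed instead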
At about the same time I was a radar technician in the RAF. When a
magnetron was replaced on 3 cm equipment (H2S) the SWR had to be measured
using an SWR meter. This comprised a small red neon tube, the sides of
which were calibrated, like a thermometer, in SWR. As the tube was moved
along a slot in the waveguide the maximum and minimum voltages could be
seen, giving the VSWR directly.
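
That slotted-line measurement is nothing more than a ratio; a short
Python sketch of the arithmetic (the probe readings are invented):

    def vswr_from_probe(v_max, v_min):
        # The probe traces the standing-wave envelope; VSWR = Vmax / Vmin
        return v_max / v_min

    def gamma_from_vswr(s):
        # Equivalent reflection coefficient magnitude
        return (s - 1) / (s + 1)

    s = vswr_from_probe(1.5, 0.5)     # example readings -> VSWR 3:1
    print(s, gamma_from_vswr(s))      # 3.0, 0.5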
The coax SWR meter can only detect voltage, current and phase. If you
replace the antenna with a resistor of twice the instrument's design
impedance (100 ohms on a 50 ohm meter) it will register an SWR of 2:1
even though there is no transmission line at all. The only conclusion I
can come to from this is that the standard SWR meter measures voltage,
current and phase, from which the SWR is implied rather than observed
directly.
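
In other words the bridge reports the SWR that its load would produce on
a matched line, whether or not any line is present. A few lines of Python
to make the point (assuming a 50 ohm instrument):

    def implied_swr(zl, z0=50):
        # Reflection coefficient the load would produce on a Z0 line,
        # then the SWR corresponding to it
        gamma = abs((zl - z0) / (zl + z0))
        return (1 + gamma) / (1 - gamma)

    print(implied_swr(100))   # 2.0 -- twice Z0 reads 2:1
    print(implied_swr(25))    # 2.0 -- half Z0 also reads 2:1
    print(implied_swr(200))   # 4.0 -- four times Z0 reads 4:1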
So of all the radially scaled parameters to be found on the cursor of a
Smith Chart, why did someone settle on SWR? Was a true SWR meter used in
the days when radio engineers first fed transmitters via long lengths of
open-wire feeder, so that SWR became the standard measure of
antenna/feeder mismatch?
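
All of those radial-scale quantities are of course interchangeable; one
last Python fragment converting VSWR to a couple of them (again just an
illustration):

    import math

    def radial_scales(vswr):
        gamma = (vswr - 1) / (vswr + 1)            # |reflection coefficient|
        return_loss_db = -20 * math.log10(gamma)   # return loss, dB
        mismatch_loss_db = -10 * math.log10(1 - gamma ** 2)
        return gamma, return_loss_db, mismatch_loss_db

    print(radial_scales(2.0))   # approx (0.33, 9.5 dB, 0.51 dB)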
Regards,
Peter, G3LDO
e-mail <[email protected]>
Web <http://web.ukonline.co.uk/g3ldo>