Which raises a question I have often wondered about.
When I started in ham radio, all the coax was post-war surplus 75 ohm. In my youth I assumed it was to match a free-space dipole! When I came back to radio after a thirty-year absence in 1990, everything was 50 ohm. Who decided to change to 50 ohm, and when and why?
Dear Bryan, LF Group,
We had this discussion before on the LF group. In summary: because of the Zo = 138 log10(b/a) relationship for characteristic impedance (b and a being the outer and inner conductor diameters), coax cables with Zo much below 25 ohm are not practical, since the spacing required between the conductors becomes too small, while much above 100 ohm the centre conductor has to be too thin. A 75 ohm air-spaced copper coax theoretically has lower loss than any other impedance, while 50 ohm has higher power-handling ability. If the 75 ohm air line is filled with polythene insulation, Zo drops by the square root of the dielectric constant and comes out near 50 ohm, which then gives the lowest loss for a solid cable, assuming the conductor losses dominate, as they usually do. So practically, coax cables always end up somewhere in this fairly narrow range; whether it is 50, 60 or 75 ohms is mostly down to preference.
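
For anyone who wants to see the numbers fall out, here is a minimal Python sketch of the argument above. The impedance formula is the one quoted in the post; the loss expression is the standard textbook result that, for a fixed outer diameter, conductor loss scales as (1 + b/a)/ln(b/a), and the polythene permittivity of 2.26 is my assumption - none of these numerical details beyond the 138 log10(b/a) formula appear in the original posts.

import math

ER_POLYTHENE = 2.26  # assumed relative permittivity of solid polythene

def z0(ratio, er=1.0):
    """Characteristic impedance: Zo = (138 / sqrt(er)) * log10(b/a) ohms."""
    return 138.0 / math.sqrt(er) * math.log10(ratio)

def conductor_loss(ratio):
    """Relative conductor loss at fixed outer diameter: (1 + b/a) / ln(b/a)."""
    return (1.0 + ratio) / math.log(ratio)

# Crude scan for the diameter ratio b/a that minimises conductor loss.
ratios = [1.5 + i * 0.001 for i in range(10000)]  # b/a from 1.5 to 11.5
best = min(ratios, key=conductor_loss)

print(f"loss-optimal b/a ratio : {best:.2f}")                      # ~3.59
print(f"Zo, air dielectric     : {z0(best):.1f} ohm")              # ~76.7
print(f"Zo, polythene-filled   : {z0(best, ER_POLYTHENE):.1f} ohm")  # ~51.0

Running this gives a loss optimum near 77 ohm in air, dropping to about 51 ohm once the line is filled with polythene - which is exactly why solid 50 ohm cable ends up as the low-loss choice described above.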
Cheers, Jim Moritz
73 de M0BMU