A recent discussion with G3PLX made something click in my mind about the
widespread adoption of Wolf for LF communications: is it really the
right way to go ?
A heavily coded, relatively wideband data signal such as this exhibits a
pronounced threshold effect, such that copy is either good or fails
completely. Coding can do an excellent job against interference such as
burst errors, but is not so effective against a signal that is simply too
weak to start with.
Consider this situation :
An uncoded non-differential BPSK waveform at 10 bits/s is used. From
standard graphs of error rate vs. signal-to-noise ratio, in the 10 Hz
bandwidth ideally needed for this signalling waveform, good copy would
require about 7dB S/N, assuming 1 error per 1000 bits transmitted. If
S/N fell to 4dB, the raw bit error rate would rise to 1 per 100 (10^-2),
and at 2dB S/N it rises to 1/30. At 1dB S/N it may be possible to
achieve an error rate in the region of 1/10, but the curves in text
books rarely go back this far, because for all practical purposes the
link is unusable at this level !
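As a rough check on these figures, here is a short Python sketch of the
textbook error rate for ideal coherent BPSK, BER = 0.5*erfc(sqrt(Eb/N0)),
taking the S/N in the matched 10 Hz bandwidth at 10 bits/s as equal to
Eb/N0. This is an idealised curve, not a measurement of any real LF link:

```python
import math

def bpsk_ber(snr_db):
    """Ideal coherent BPSK bit error rate, with S/N (dB) taken as Eb/N0."""
    ebn0 = 10 ** (snr_db / 10)              # dB -> linear power ratio
    return 0.5 * math.erfc(math.sqrt(ebn0))

# The S/N values discussed in the text
for snr in (7, 4, 2, 1):
    print(f"S/N {snr} dB -> raw BER about 1 in {1 / bpsk_ber(snr):.0f}")
```

The printed values land near the figures quoted above - a little under
1/1000 at 7dB, and roughly 1/30 at 2dB - so the quoted numbers are at
least in the right region for an ideal demodulator.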
A heavily coded scheme such as Wolf can take a moderately poor error
rate and make it better by repeats of the data and error correction.
But this presupposes a certain BER to start with. Bear in mind that
random noise will give an error rate of 0.5, since a bit chosen at
random is as likely to be correct as not. I have not even tried to work
out what raw error rate (at 10b/s) is needed for Wolf to work
"reasonably well", but assume Wolf can cope perfectly with 1 bit in 6
being in error - just a gut feeling based on it being a rate 1/6 code,
but it does not matter too much what figure is chosen here. A 1/6 error
rate corresponds to a S/N of around 0dB (plus / minus a dB or so).
Each character in Wolf, which corresponds to 5.33 bits of data,
effectively takes 64 bits to send - 960 bits, taking 96 seconds for a
full 15 character message.
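The arithmetic above can be written out as a trivial sketch, using the
64 channel bits per character and 10 bits/s rate stated in the text:

```python
# WOLF message timing as described above: 15 characters of 64 channel
# bits each, sent at 10 bits/s.
chars = 15
channel_bits_per_char = 64
bit_rate = 10.0                               # bits/s

total_bits = chars * channel_bits_per_char    # 960 channel bits
duration_s = total_bits / bit_rate            # 96.0 seconds
print(total_bits, "bits,", duration_s, "seconds")
```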
Now consider a lower rate signalling waveform - say 1b/s. Good copy at
an error rate of 1/1000 again occurs at a S/N of 7dB, but this time in a
bandwidth of 1Hz, with a 10dB lower noise level. This level of signal
would correspond to -3dB S/N in the original 10Hz bandwidth.
Even without coding of the 1b/s signal we have gained something like 3dB
signalling performance, and furthermore, keeping with 5.33 bits per
character, we can now send the complete message in 80 seconds.
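To put the comparison side by side, here is a sketch under the same
assumptions as the text (5.33 = 16/3 information bits per character,
good copy at 7dB S/N in the matched bandwidth):

```python
import math

chars = 15
info_bits = chars * 16 / 3             # 5.33 bits/character -> 80 bits
uncoded_time = info_bits / 1.0         # sent uncoded at 1 bit/s
wolf_time = chars * 64 / 10.0          # 64 channel bits/char at 10 bits/s

# Narrowing from 10 Hz to 1 Hz drops the noise power by 10*log10(10) = 10 dB,
# so 7 dB S/N in 1 Hz is equivalent to -3 dB in the original 10 Hz bandwidth.
snr_in_10hz = 7 - 10 * math.log10(10 / 1)

print(f"uncoded 1 bit/s: {uncoded_time:.0f} s, Wolf: {wolf_time:.0f} s")
print(f"equivalent S/N in the 10 Hz bandwidth: {snr_in_10hz:.0f} dB")
```

Both the shorter message time (80 s vs 96 s) and the roughly 3dB margin
against Wolf's assumed ~0dB threshold fall out of this simple arithmetic.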
This pre-supposes the RF link can support the narrower modulation, ie
the link is stable. What results there are for 137kHz across the
Atlantic do seem to support this view; no spreading of signals has been
observed over bandwidths much more than 1/10 Hz. It might be argued
that a higher base modulation rate allows for faster and easier lock up
of demodulators, but the requirement for very high stability seems to be
necessary for Wolf anyway. Lower modulation rates are easier to
synchronise to UTC (even by hand if need be !) which is fortunate, as
they would otherwise lead to long lock up times.
To summarise :
Heavy coding of a waveform that considerably reduces the data throughput
suffers from an abrupt threshold as S/N ratio falls. An uncoded
waveform at a lower data rate then has a higher chance of working.
This subject is so fundamental that it feels as if I am missing
something: everyone now seems to rely too heavily on coding. For
amateur purposes, where modest error rates (like 1/1000) are perfectly
acceptable, are we doing the right thing just following the herd ?
Peter, G3PLX, went through similar arguments when designing PSK31 and
ended up with the narrowest signals ever to have been heard on HF.