Hi Andy and all,
A recent discussion with G3PLX made something click in my mind about the
widespread adoption of WOLF for LF communications: is it really the
right way to go?
WOLF is far from an ideal system, and presently performs worse than many
simpler systems under some real-world conditions. But I believe that its
principles are sound, and that, with a great deal of evolution, it will get there.
A heavily coded relatively wideband data signal such as this exhibits a
pronounced threshold effect, such that copy is either good or fails
completely. Coding can do an excellent job against interference such as
burst errors, but it is not so effective against a signal that is simply
too weak to start with.
It is true, and can even be mathematically proven, that all coding systems
exhibit an S/N threshold, below which the output BER is worse than an
uncoded system would deliver. But, with a decent coding system, the
threshold occurs at a point where, say, only 20% of the received characters
are correct. It is rarely the case that such copy can be considered useful,
and the coded system can deliver nearly perfect copy when the input S/N is
only about 1 dB above this threshold. This is why nearly all modern digital
RF communication systems use one or more forms of forward error correction.
Consider this situation:
An uncoded, non-differential BPSK waveform at 10 bits/s is used. From
standard graphs of error rate vs. signal-to-noise ratio, good copy in
the 10 Hz bandwidth ideally needed for this signalling waveform would
require about 7 dB S/N, assuming 1 error per 1000 bits transmitted.
If the S/N fell to 4 dB, the raw bit error rate would worsen to 1 per
100 (10^-2), and at 2 dB S/N it worsens to about 1/30. At 1 dB S/N it may
be possible to achieve an error rate in the region of 1/10, but the curves
in textbooks rarely go back this far, because for all practical purposes
the link is unusable at this level!
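As a sanity check on the figures above, here is a short Python sketch. It assumes ideal coherent BPSK in Gaussian noise, where the raw BER is 0.5 * erfc(sqrt(S/N)) and the S/N in the 10 Hz bandwidth equals Eb/N0 at 10 bits/s; the quoted error rates come out in roughly the right places.

```python
import math

def bpsk_ber(snr_db):
    """Raw bit error rate of ideal coherent BPSK at the given S/N in dB."""
    snr = 10 ** (snr_db / 10)
    return 0.5 * math.erfc(math.sqrt(snr))

for snr_db in (7, 4, 2, 1):
    ber = bpsk_ber(snr_db)
    print(f"{snr_db:2d} dB S/N -> raw BER = {ber:.4f} (about 1 in {1/ber:.0f})")
```

The exact numbers depend on which detector and which textbook curve you read, but the trend (roughly 1/1000 at 7 dB down to somewhere near 1/10-1/20 at 1 dB) matches the argument.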
A heavily coded scheme such as WOLF can take a moderately poor error
rate and make it better through repeats of the data and error correction.
But this presupposes a certain BER to start with. Bear in mind that
random noise will give an error rate of 0.5, since a bit chosen at
random is as likely to be correct as not. I have not even tried to work
out what raw error rate (at 10 bits/s) is needed for WOLF to work
"reasonably well", but assume WOLF can cope perfectly with 1 bit in 6
being in error - just a gut feeling based on it being a rate 1/6 code,
but it does not matter too much what figure is chosen here. An error
rate of 1/6 corresponds to an S/N of around 0 dB (plus or minus a dB or so).
Most modern systems use "soft decisions", which means that the detector
output levels are measured, rather than being treated as "0" or "1".
A bit corrupted by noise is likely to be near the decision threshold,
while one far above or below the threshold is more likely to be correct.
The decoder uses this information to determine the most likely transmitted
message. This is not new to the amateur world; both PSK31 and Africa
use this technique. Perfect copy is possible even when 40% or more of
the received bits would have been "wrong", had hard decisions been used.
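A toy Monte Carlo illustration of the point (my own sketch, not WOLF itself): one bit is sent five times through Gaussian noise, then decoded both by hard-decision majority vote and by summing the raw soft detector outputs. The soft decoder makes noticeably fewer word errors even though many individual raw bits are "wrong".

```python
import random

random.seed(1)
N_TRIALS = 20000
REPEATS = 5
NOISE_SIGMA = 1.5   # strong noise: each raw bit is often on the wrong side

hard_errors = soft_errors = raw_wrong = 0
for _ in range(N_TRIALS):
    # Transmit "+1" five times; each copy picks up independent noise.
    rx = [1.0 + random.gauss(0, NOISE_SIGMA) for _ in range(REPEATS)]
    raw_wrong += sum(r < 0 for r in rx)
    # Hard decisions: slice each sample to 0/1 first, then majority vote.
    if sum(r > 0 for r in rx) <= REPEATS // 2:
        hard_errors += 1
    # Soft decisions: add the raw levels, decide on the sign of the sum.
    if sum(rx) <= 0:
        soft_errors += 1

print("raw bit error rate:      ", raw_wrong / (N_TRIALS * REPEATS))
print("hard-decision word errors:", hard_errors / N_TRIALS)
print("soft-decision word errors:", soft_errors / N_TRIALS)
```

With these (arbitrary) numbers roughly a quarter of the raw bits are in error, yet the soft decoder still beats the majority vote by a useful margin.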
Now, each character in WOLF, which corresponds to 5.33 bits of data,
effectively takes 64 bits to send - 960 bits, taking 96 seconds, for a
full 15-character message.
Now consider a lower-rate signalling waveform - say 1 bit/s. Good
copy at an error rate of 1/1000 again occurs at an S/N of 7 dB, but this
time in a bandwidth of 1 Hz - a 10 dB lower noise level. This level of
signal would correspond to -3 dB S/N in the original 10 Hz bandwidth.
Even without coding of the 1 bit/s signal we have gained something like
3 dB in signalling performance, and furthermore, keeping to 5.33 bits per
character, we can now send the complete message in 80 seconds.
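The arithmetic in the last two paragraphs can be checked in a few lines (assuming only that noise power is proportional to bandwidth):

```python
import math

# WOLF framing as quoted above: 15 characters, 80 data bits, 64 tx bits/char.
chars = 15
data_bits = 80
tx_bits = chars * 64                 # 960 bits on air

print(round(data_bits / chars, 2), "bits per character")   # 5.33
print(tx_bits / 10, "seconds on air at 10 bit/s")          # 96 s
print(data_bits / 1, "seconds uncoded at 1 bit/s")         # 80 s

# Noise power scales with bandwidth, so 1 Hz collects
noise_db = 10 * math.log10(10 / 1)   # 10 dB less noise than 10 Hz,
print(noise_db, "dB less noise in 1 Hz")
# and 7 dB S/N in 1 Hz corresponds to 7 - 10 = -3 dB in 10 Hz.
```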
Let's ignore coding for a moment and consider which is better, sending
an 80 bit message once at 1 bps, or sending it 10 times at 10 bps.
Assume both systems transmit the same power, and of course, they both
transmit for 80 seconds.
Under textbook conditions of perfect stability and timing, and no QRM
(the only impairment is Gaussian noise), the performance of both systems
is identical, if soft-decision coherent detection is used. This should
not be too surprising, because in both systems each bit is being sent for
the same total time, and hence with the same total energy.
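A one-line calculation shows why, under these idealised assumptions: coherently summing ten noisy copies multiplies the signal amplitude by ten but the noise amplitude by only sqrt(10), which is exactly what a ten-times-longer integration at 1 bps gives you.

```python
signal, sigma, copies = 1.0, 1.0, 10

# 10 bps, message sent 10 times: amplitudes add coherently,
# independent noise adds as power.
snr_fast = (copies * signal) ** 2 / (sigma * copies ** 0.5) ** 2

# 1 bps: one bit integrated 10x longer is the same coherent sum of
# 10 independent noise samples around a bit with 10x the energy.
snr_slow = (copies * signal) ** 2 / (copies * sigma ** 2)

print(snr_fast, snr_slow)   # both 10x the single-sample S/N
```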
In the real world things are much different. For example, if you can
find enough interference-free spectrum for the 1 bps signal, but not
for 10 bps, then the 10 bps system might not work at all. To some extent,
on 136 kHz, this is the situation with WOLF, because e.g. LORAN QRM will
always be present. However, in most cases the QRM is completely
predictable, and could be removed with a suitable dynamic filter. I have
been working on such a system, but have had difficulty finding sufficient
time to devote to it.
But there are two major advantages to the faster system. First, if a
few seconds of the transmission are unusable, e.g. because of QSB or
static crashes, the 10 bps system suffers only a minor reduction in S/N.
On the other hand, the slower system will lose a few data bits, and
copy will be incorrect.
Second, it gives the opportunity to use a low-rate code. One way of looking
at this is that, instead of just repeating the message, we send it in
many different forms, so that messages which are similar in one form are
very different in the others, and are unlikely to be mistaken.
For example, suppose you are trying to read some digits over a noisy phone
link. If you are speaking English, a 'one' might be confused with a 'nine',
because the sounds are quite similar, while it's unlikely that 'two' would
be confused with 'three'. In German, however, 'zwei' and 'drei' are similar
but 'eins' and 'neun' are quite different. So saying the number in English
and then in German (assuming that the receiver expects this), will be
more robust than simple repetition in either language. You can think of
WOLF as sending the message in six different languages.
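The "different languages" idea can be made concrete with a toy example (my own sketch, not WOLF's actual code): encode two messages that differ in a single bit, once by simply sending them twice, and once with a standard rate-1/2 convolutional code (generators 7 and 5 octal, constraint length 3). The coded forms end up much further apart in Hamming distance, so they are much harder to mistake for one another.

```python
def conv_encode(bits):
    """Rate-1/2 convolutional encoder, generators 111 and 101 (binary)."""
    state = [0, 0]
    out = []
    for b in list(bits) + [0, 0]:        # two flush bits empty the register
        out.append(b ^ state[0] ^ state[1])  # generator 111
        out.append(b ^ state[1])             # generator 101
        state = [b, state[0]]
    return out

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

m1 = [0, 0, 0, 0]
m2 = [0, 1, 0, 0]                        # differs from m1 in one bit

rep_dist = hamming(m1 * 2, m2 * 2)       # send-it-twice "repetition code"
conv_dist = hamming(conv_encode(m1), conv_encode(m2))
print("repetition distance:   ", rep_dist)    # 2
print("convolutional distance:", conv_dist)   # 5 (the code's free distance)
```

Repetition only doubles the original one-bit difference, while the convolutional code spreads that one input bit across several very different output bits - the "English plus German" effect.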
This presupposes that the RF link can support the narrower modulation,
i.e. that the link is stable. Such results as there are for 137 kHz across
the Atlantic do seem to support this view: no spreading of signals has been
observed over bandwidths of much more than 1/10 Hz. It might be argued
that a higher base modulation rate allows faster and easier lock-up
of demodulators, but the requirement for very high stability seems to be
necessary for WOLF anyway.
If you want to coherently integrate over the entire transmission, the
stability requirement is the same, regardless of the modulation rate used.
Lower modulation rates are easier to
synchronise to UTC (even by hand if need be!), which is fortunate, as
they would otherwise lead to long lock-up times.
Indeed true. But, to get a decent transmission rate, e.g. fast enough
to complete a QSO in an hour, you need an m-ary scheme such as Steve
Olney's AFK. Unfortunately, most amateurs will need special (but simple)
hardware to send it. And, with such hardware, including accurate
synchronization to UTC is not difficult. So I'm not sure there is a
big advantage to hand sync.
To summarise:
Heavy coding of a waveform that considerably reduces the data throughput
suffers from an abrupt threshold as the S/N ratio falls. An uncoded
waveform at a lower data rate then has a higher chance of working.
IMO, not true. Most of the coding gain (all but about 1 dB) can be had
with a rate of 1/2. So, if the LORAN filter never succeeds, or if folks
decide that WOLF simply hogs too much spectrum, a system at, say, 2 bps
with a rate-1/2 code may be the answer. It should be almost 5 dB better
than an uncoded 1 bps system. If you insist on 1 bps, then we should take
160 seconds to send the message, and still use rate-1/2 coding.
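The airtime arithmetic behind these alternatives, using the same 80-bit message as earlier:

```python
data_bits = 80               # same 15-character message as before
coded_bits = data_bits * 2   # a rate-1/2 code doubles the bits on air

print(coded_bits / 2, "s at 2 bps")   # 80 s, same airtime as uncoded 1 bps
print(coded_bits / 1, "s at 1 bps")   # 160 s, as suggested above
```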
This subject is so fundamental that I feel I must be missing
something; everyone now seems to rely too heavily on coding. For
amateur purposes, where modest error rates (like 1/1000) are perfectly
acceptable, are we doing the right thing in just following the herd?
Peter, G3PLX, went through similar arguments when designing PSK31 and
ended up with the narrowest signals ever to have been heard on HF.
PSK31 is indeed impressive. But I think that Peter balanced social
issues with technical ones when he made his design choices. It is
well documented that the amount of data which can be sent with a
given energy improves as you 'waste' bandwidth. The LF community
will need to decide on an appropriate tradeoff.