GPS pulse timing (was: Re: LF: Idiot's guide to receiving Ebnaut?)

To: [email protected]
Subject: GPS pulse timing (was: Re: LF: Idiot's guide to receiving Ebnaut?)
From: Wolfgang Büscher <[email protected]>
Date: Sun, 30 Oct 2016 18:22:14 +0100
In-reply-to: <[email protected]>
References: <[email protected]> <[email protected]> <[email protected]> <[email protected]> <[email protected]> <[email protected]> <[email protected]> <[email protected]> <[email protected]> <[email protected]> <[email protected]> <[email protected]> <[email protected]>
Reply-to: [email protected]
Sender: [email protected]
User-agent: Mozilla/5.0 (Windows NT 6.3; WOW64; rv:45.0) Gecko/20100101 Thunderbird/45.4.0
Hello Paul,

Thanks for the details. I guess the pulse jitter cannot go down much further, unless the GPS receiver can "pull" its own oscillator in frequency, which is definitely not the case with the Garmin (but maybe with the Trimble). The Garmin receiver can only shift the output pulses in multiples of its internal CPU clock, which IIRC was 40 MHz, thus a 25 ns "step width" (easily seen with an oscilloscope triggered by a good GPSDO).
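As a quick sanity check of that step width (a trivial sketch; the 40 MHz figure is the from-memory value above):

```python
cpu_clock_hz = 40e6              # Garmin's internal CPU clock (IIRC, per above)
step_ns = 1e9 / cpu_clock_hz     # pulse-shift granularity in nanoseconds
print(step_ns)  # 25.0
```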


you wrote:
Wolf wrote:
> measured the standard deviation in the GPS sync pulse
> timing. Used an E-MU 0202 at 192 kSamples/second.

Ran a quick trial here.

With the Trimble 10 µs pulse directly into an E-MU 0202 at 192 k/sec I
get 100 ns std dev of the raw intervals.  The timing system here
uses smoothed intervals, which reduces the jitter to below 30 ns.

The same setup at 48 k/sec is useless; the raw std dev is around
6 µs.  Watching the pulse on vtscope, the waveform changes
significantly from one pulse to the next.  This is so at both
sample rates but seems worse at 48k.  It looks like the waveshape
depends on just when the pulse arrives with respect to the
soundcard's internal A/D sampling.
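A minimal sketch of the effect of such interval smoothing (the moving-average form, window length, and independent-jitter model are assumptions; the actual scheme in the timing system isn't described):

```python
import random
import statistics

def smooth(intervals, n=16):
    """Moving average over the last n raw 1 PPS intervals."""
    out = []
    for i in range(len(intervals)):
        w = intervals[max(0, i - n + 1):i + 1]
        out.append(sum(w) / len(w))
    return out

random.seed(1)
# Simulated raw intervals: 1 s nominal plus 100 ns rms jitter
# (independent per interval - a simplification).
raw = [1.0 + random.gauss(0.0, 100e-9) for _ in range(2000)]
smoothed = smooth(raw)

print(statistics.stdev(raw))            # ~100 ns
print(statistics.stdev(smoothed[50:]))  # roughly 100 ns / sqrt(16) = 25 ns
```

With a 16-interval average the jitter drops by about sqrt(16) = 4, which matches the quoted 100 ns raw / below-30 ns smoothed figures.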

Yes, but that's not a bad waveform - it *must* look like that if the card uses a good delta-sigma converter. And that is exactly what my interpolation algorithm relies on. By interpolating those ugly waveforms and running them through a windowed-sinc filter (with the cutoff frequency at half the original sampling rate), we get a *constant* waveform again, which only shifts along the timescale depending on the relative pulse time (within one ADC sample). The peak of the reconstructed pulse at f_sample_interpolated = 4 * 192 kHz is always identical, even if the waveforms *before* the interpolation look very different. The principle would completely fail with a simple (non-delta-sigma) ADC like the ones built into many microcontrollers.

Consider it this way: the "frontend" of the ADC is a one-bit A/D converter running at a sample rate of dozens of megahertz, and the output at the end of the decimator chain is perfectly predictable, even if the waveform looks strange and seems to vary. The interpolator / windowed-sinc lowpass in the software delivers a signal at a higher sampling rate than the soundcard provides, but the information about the precise GPS pulse time "is still in there". The achievable resolution only depends on the number of delta-sigma stages and the ADC's "oversampling" factor. Example: 24 stages (for up to 24-bit resolution) times 8-fold oversampling: 24 * 8 * 192 kHz = 36.864 MHz for the "one-bit" input stage, which is a crystal frequency seen in many soundcards. This should be good enough for a time resolution of 27 nanoseconds (if there were no noise and other imperfections) -> std dev around 14 ns. Add the jitter from the GPS itself, say another 25 ns for the GPS18LVC, and - I guess - even less for 'better' receivers.
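To illustrate the reconstruction step (a minimal sketch, not the actual software: the 4x factor and the windowed-sinc cutoff follow the description above, but the Gaussian pulse shape, tap count, and Hamming window are assumptions):

```python
import numpy as np

FS = 192_000          # soundcard sample rate
L = 4                 # interpolation factor, matching the 4 * 192 kHz above

def interpolate(x, taps=101):
    """Zero-stuff by L, then apply a windowed-sinc lowpass with cutoff at
    half the original sample rate.  h is zero at multiples of L, so the
    original samples pass through unchanged."""
    up = np.zeros(len(x) * L)
    up[::L] = x
    n = np.arange(taps) - (taps - 1) // 2
    h = np.sinc(n / L) * np.hamming(taps)
    return np.convolve(up, h, mode="same")

t = np.arange(200) / FS

def sampled_pulse(delay_s):
    """An assumed Gaussian pulse arriving delay_s after sample 100."""
    return np.exp(-(((t - 100 / FS - delay_s) * FS) ** 2) / 2)

# Arrival times differing by a fraction of one ADC sample: the *sampled*
# waveforms look different, but the interpolated peak lands at the
# expected sub-sample position with nearly constant amplitude.
for frac in (0.0, 0.3, 0.7):
    y = interpolate(sampled_pulse(frac / FS))
    print(frac, np.argmax(y), round(float(y.max()), 3))
```

The peak index shifts with the sub-sample arrival time while the peak amplitude stays nearly constant - which is the "constant waveform, only shifted" behaviour described above. It works because the delta-sigma card delivers a properly bandlimited version of the pulse; with a simple instantaneously-sampling ADC and no analog bandlimiting, the sampled shape would not be reconstructible this way.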
With the Trimble GPS, you already arrive there, which is remarkable !

All the best,
  Wolf .


