Well Alberto, this morning I have egg on my face, as we
English-speakers sometimes say :-)
Late last night (when I was half asleep) I posted a small .jpg
showing two cropped screen captures, the top using 16 bits of ADC data,
the bottom using 12 bits. The image seemed blurry compared to what was
actually on the SL screens, perhaps due to the lossy .jpg compression.

Today I made a clearer picture: I ran two SL analyses, one from the
original 16-bit recorded data, the other covering the same time
interval but chopped to 12-bit ADC values. Last night's comparison
used one real-time screen capture and one re-processed capture - the
time spans were not identical, and I tried (apparently unsuccessfully)
to get the SL brightness and contrast sliders for the re-processed
screen to match those that were in effect on the real-time screen.
This time I made both captures by analyzing the recorded data sets
(16-bit and 12-bit ADC values, otherwise identical files).
after the 16-bit file was finished (before taking that screen
capture), then made very sure the settings were left untouched while
I processed the 12-bit file. Today I used .BMP screen captures to
avoid the .jpg smearing, then made a composite image and converted it
to .GIF format (lossless compression) so it would go through the
reflector. Now the two images look exactly like they appeared on the
SL screens. Guess what? Although there must be subtle differences
because the cropped .BMP files have different CRC's, I cannot see any
differences with my old eyes but then again I am not an image
afficionado :-) The attached file 16_vs_12.gif has 16-bit ADC
samples on top, 12-bit ADC samples on bottom. No significant visual
difference!
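(For the curious, a quick check like the following - file names are
placeholders - is enough to confirm that the two crops really are
different files at the byte level even though they look the same:

    import zlib

    def crc32_of(path):
        with open(path, "rb") as f:
            return zlib.crc32(f.read()) & 0xFFFFFFFF

    for name in ("crop_16bit.bmp", "crop_12bit.bmp"):
        print(f"{name}: CRC32 = {crc32_of(name):08X}")

)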
I guess my question has been answered. Thanks again for the discussion.
73,
Bill VE2IQ