1. The Measurement of Digital TV Signals
2. Time required to take a Bit Error Ratio measurement
Currently, when analogue signals are measured with a spectrum analyser or signal level meter, the peak level is measured in dBmV or dBµV. This measurement is usually taken from the vision carrier. Diagram 1 shows the spectrum of a typical analogue vision carrier: trace 1 shows 0% modulation (white) and trace 2 shows 70% modulation (black). (N.B. the sync pulse is 100%.) Since the spectrum analyser is a peak-reading instrument, in this case with a bandwidth of 300 kHz, it reads a narrow peak around the vision carrier, including the sync pulse. Changing the depth of modulation does not affect the peak reading.
However, if the same signal is measured using a power meter, a very different result is obtained, as shown in diagram 2. Over time, a typical picture with varying video content averages out to approximately 50% luminance.
Why is it that the spectrum analyser indicated a constant level and the power meter a level varying with picture content? The answer lies in the bandwidth of the test instrument and the nature of the detector used.
The power meter has a bandwidth very much wider than the signal being measured, and therefore collects all the energy in the television signal, unlike the narrowband signal level meter or spectrum analyser, which responds only to a limited part of the bandwidth.
The spectrum analyser has a narrow bandwidth and a peak detector, which indicates the peak voltage level irrespective of the depth of the video modulation. The power meter has an average reading detector.
A broadband signal can therefore be defined as one whose bandwidth is wider than the bandwidth of the instrument being used to measure it.
If a broadband signal is applied to a spectrum analyser, the indicated level depends on the bandwidth of the spectrum analyser's intermediate frequency (IF) filter (called the resolution bandwidth).
Diagram 3 shows a broadband signal: white noise from a 1–1000 MHz noise generator, measured at spectrum analyser IF bandwidths of 1 MHz, 100 kHz, 10 kHz and 1 kHz. The vertical axis is 10 dB per division and the frequency span is 15 MHz.
By studying the diagram it can be seen that reducing the bandwidth of the spectrum analyser by a factor of 10 reduces the level indicated on the spectrum analyser by an average of 10 dB.
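This 10 dB per decade relationship follows directly from the fact that the power of a flat, noise-like signal captured by the IF filter is proportional to the filter's bandwidth. A minimal sketch (the 1 MHz reference bandwidth is taken from the measurements above):

```python
import math

def level_change_db(bw_new_hz: float, bw_ref_hz: float) -> float:
    """Change in indicated level of a broadband (noise-like) signal when the
    resolution bandwidth is changed from bw_ref_hz to bw_new_hz."""
    return 10 * math.log10(bw_new_hz / bw_ref_hz)

# Each tenfold reduction in IF bandwidth lowers the indicated level by 10 dB.
for bw in (1e6, 100e3, 10e3, 1e3):
    print(f"{bw / 1e3:6.0f} kHz RBW: {level_change_db(bw, 1e6):+6.1f} dB")
```

Note that a CW carrier, being narrower than any of these filters, would show no such change.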
The results shown in diagram 3 are relevant to digital signal measurement. Unlike an analogue television signal, a digital signal is not made up of a vision carrier (and sound carrier) but of a stream of digital "bits" of constant amplitude. Because of the varying phase and time at which the bit stream is generated, there is effectively not a single modulated carrier but a signal that appears like a block of broadband noise.
Diagram 4 shows a simple experiment using an analogue simulator. One signal generator is connected to an oscilloscope while the other five are disconnected. The oscilloscope shows the resulting sine wave. The second input of the oscilloscope is fed from a simple diode detector, and the second trace shows the DC level at the output of the detector diode.
In diagram 5 all six signal generators have been switched on. Trace 1 shows the result of feeding six signals of different frequency and phase into the oscilloscope: although the signals are all fed in at the same amplitude, the resultant trace is not of constant amplitude or frequency, and with only six signals the result is what appears to be noise. The DC level out of the detector, shown on trace 2, has increased by a factor of about three, and because of the fast-changing amplitude of the signal the detector output voltage is varying.
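The experiment in diagrams 4 and 5 can be sketched numerically. The frequencies and phases below are arbitrary illustrative choices, not those used in the original simulator; the point is that six equal-amplitude, uncorrelated carriers add on a power basis, so the RMS level grows roughly as the square root of six while the instantaneous waveform fluctuates like noise:

```python
import math
import random

fs = 20_000                               # sample rate, Hz (illustrative)
duration = 5.0                            # seconds
n = int(fs * duration)
freqs = [101, 211, 307, 401, 503, 601]    # six non-harmonic frequencies, Hz
random.seed(1)
phases = [random.uniform(0, 2 * math.pi) for _ in freqs]

def rms(samples):
    """Root-mean-square value of a sample sequence."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# One generator on, then all six summed.
one = [math.sin(2 * math.pi * freqs[0] * t / fs + phases[0]) for t in range(n)]
total = [sum(math.sin(2 * math.pi * f * t / fs + p)
             for f, p in zip(freqs, phases))
         for t in range(n)]

# Uncorrelated carriers add on a power basis: RMS grows as sqrt(6) ≈ 2.45,
# even though the peak envelope varies wildly from moment to moment.
print(rms(total) / rms(one))
```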
Diagram 6: COFDM and Analogue Signals Measured with Different IF Bandwidths
Diagram 6 shows the COFDM digital test transmission from Croydon on channel 34; on the left can be seen the NICAM sound and the FM sound carriers of channel 33.
This diagram shows a series of plots measured on a spectrum analyser set to 1 MHz, 100 kHz, 10 kHz and 1 kHz IF bandwidth.
Varying the bandwidth of the spectrum analyser does not change the indicated level of the FM sound carrier. There is a small increase at the highest bandwidth due to the power level in the NICAM carrier affecting the level reading.
However the digital TV signal level increases as the bandwidth of the analyser is increased. The NICAM signal is also a digital signal and behaves in the same way as the digital TV signal.
Clearly if technicians are sent out onto a network to measure digital signals with instruments of different bandwidths then they will come back with different results!
It is very important to understand why a digital signal gives this result on a narrow band signal level meter or analyser.
Thinking back to the simulation in diagrams 4 and 5: if there were, for example, six carriers in an 8 MHz TV channel, these would add in a random way to produce a noise-like signal. A narrow bandwidth instrument that restricts the measurement to only two of the carriers registers only part of the total energy available, so its detector sums these to a lower level and it gives a lower reading. It should also be observed that the design of the detector itself, and in particular the time constant of the components following it, will also affect the result.
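For equal-power carriers, the shortfall of such a narrowband reading follows from the fraction of carriers captured. A sketch of the arithmetic, using the two-of-six example above:

```python
import math

def indicated_drop_db(carriers_seen: int, carriers_total: int) -> float:
    """Level shortfall when an instrument captures only a subset of the
    equal-power carriers making up a digital channel."""
    return 10 * math.log10(carriers_seen / carriers_total)

# Seeing 2 of 6 carriers reads roughly 4.8 dB below the true channel power.
print(indicated_drop_db(2, 6))
```

A real COFDM channel contains thousands of carriers, but the same proportionality applies.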
This can be illustrated by measuring a digital signal on different types of instrument:

Swires TVA 97 Spectrum Analyser in analogue mode (reference)
Swires IM96 Installer Meter
Swires SA97 Spectrum Analyser
Swires SA87 Spectrum Analyser
Swires SL93 Signal Level Meter
Unaohm EP 598
Unaohm EP741 Professional Signal Level Meter
All the above instruments either have a fixed bandwidth of approximately 250 kHz or, where the bandwidth was variable (as for the spectrum analysers), it was set to 250 kHz.
The above results show that the bandwidth of the instruments alone does not determine the level reading. The shape of the IF filter and the design of the detector also affect the result.
If existing analogue instruments are to be used for measuring digital signals, they must be individually calibrated, and where different IF bandwidths are available, the bandwidth at which they must be used for digital measurements must be clearly stated on the calibration.
The new generation of test instruments claims to be "Digital Ready". Only by reading the small print carefully can it be seen whether this is so, or just a label.
The digital TV signal must be measured taking into account the bandwidth of the digital signal, and an automatic computation must be made to produce a reading of the true power in the digital envelope. For this reason, in the author's view, the opportunity should be taken to express the levels of digital signals in decibel milliwatts (dBmW).
The signals are bandwidth dependent and measuring voltage without clearly stating the bandwidth of the measuring instrument will lead to confusion.
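A sketch of the computation involved, under simplifying assumptions: the channel bandwidth (8 MHz), resolution bandwidth (300 kHz) and reading (−20 dBmV) are illustrative values only, the RBW is treated as an ideal rectangular noise bandwidth, and a 75 Ω system impedance is assumed for the dBmV-to-dBmW conversion. A real instrument must also correct for the IF filter shape and detector, as noted above.

```python
import math

def channel_power_dbmv(reading_dbmv: float, channel_bw_hz: float,
                       rbw_hz: float) -> float:
    """Approximate true channel power of a flat, noise-like digital signal
    from a narrowband spectrum-analyser reading (ideal rectangular RBW)."""
    return reading_dbmv + 10 * math.log10(channel_bw_hz / rbw_hz)

def dbmv_to_dbm(level_dbmv: float, impedance_ohms: float = 75.0) -> float:
    """Convert a dBmV level to dBmW for the given system impedance."""
    v_rms = 1e-3 * 10 ** (level_dbmv / 20)          # volts
    p_mw = (v_rms ** 2 / impedance_ohms) * 1e3      # milliwatts
    return 10 * math.log10(p_mw)

# Illustrative: -20 dBmV read in 300 kHz on an 8 MHz digital channel.
p = channel_power_dbmv(-20.0, 8e6, 300e3)   # ≈ -5.7 dBmV true channel power
print(p, dbmv_to_dbm(p))
```

In a 75 Ω system, 0 dBmV corresponds to about −48.75 dBmW, which is why quoting a voltage without its measurement bandwidth is so easily misread.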
Eventually all signals will become digital and if this opportunity to move to power measurements is missed an incorrect measuring system will be left forever.
To set up a mixed analogue and digital system it will be necessary to measure the true power of the analogue signal and the true power of the digital signal at the headend and make these power levels match. Any offset to optimise the balance between digital signals and analogue signals can then be added.
Never mind the quality, feel the width!
Knowing the quality of our digital signals is very important.
Diagram 7 shows the theoretical bit error rate before and after the set-top box's forward error correction system.
It is well known that a digital signal does not degrade gracefully: at one moment it can be almost perfect, and then it fails totally without warning. This point of system failure has been accurately determined as the second half of the World Cup final! Scientifically, it occurs when the uncorrected Bit Error Rate (BER) rises above 1 × 10⁻⁴.
To remedy this, field instruments could each be equipped with a QAM demodulator to indicate bit error rate. However, the demodulator would have to be of very high quality, with very low phase noise, if it is to be of value as a measuring tool, and adding a demodulator adds to the cost, power consumption and weight of the instrument.
If measurements were also to be made of terrestrial COFDM and satellite QPSK digital signals, separate demodulators would also have to be added for each of the standards.
As an alternative to BER, Signal to Noise Ratio (SNR) measurements can be made. By far the greatest cause of degradation of the BER of a digital signal passing through a network is noise.
Measuring the SNR has the advantage of showing the quality of the cable network rather than the quality of the digital signal itself. Should the digital signal contain degradation due to a problem in the headend, this would show in a BER measurement taken down the network even though the network itself is operating perfectly.
A caution should be inserted at this point. Many analogue signal level meters have a carrier-to-noise or signal-to-noise button. When this automated measurement is used, it applies a correction for the bandwidth of the TV signal. This correction is not required for digital signals: a noise-like signal is being compared with noise, so the bandwidth effectively cancels out. A wrong reading will result if an instrument with a built-in signal-to-noise measurement system designed for analogue TV use is employed for digital signal-to-noise measurements.
BER is not easy to relate to the margins required on a cable system; the numbers are cumbersome, and 1.0 × 10⁻⁹ does not intuitively indicate a network margin.
Even the best quality demodulators will only enable measurements down to 1.0 × 10⁻¹⁰. This represents a signal to noise ratio of only 29 dB, and at a 25 dB SNR the decoder may fail (a margin of only 4 dB). So whether a network has an SNR of 29 dB or 40 dB, the bit error rate reading will show 1.0 × 10⁻¹⁰.
The time required to make significant BER measurements
The number of bit errors per second can be readily computed:

errors per second = data rate × bit error rate

At a bit error rate of 1.0 × 10⁻⁴, the generally agreed quasi-error-free point (just before it all falls over):

38.15 × 10⁶ × 1.0 × 10⁻⁴ = 3,815 errors per second

(N.B. 38.15 × 10⁶ bits per second has been taken as a typical QAM data rate.)

However, at a bit error rate of 1.0 × 10⁻¹⁰:

38.15 × 10⁶ × 1.0 × 10⁻¹⁰ ≈ 0.0038 errors per second, or about 13 errors per hour.
To obtain a statistically significant count, at least 50 errors are needed. With a BER of 1.0 × 10⁻⁹ there are about 2.3 errors per minute, so roughly 20 minutes are needed to collect significant data, or 3.8 hours at a BER of 1.0 × 10⁻¹⁰.
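The measurement times above can be sketched directly from the error-rate arithmetic, using the text's 38.15 Mbit/s data rate and 50-error threshold:

```python
DATA_RATE = 38.15e6    # bits per second, typical QAM rate from the text
TARGET_ERRORS = 50     # error count needed for statistical significance

def time_for_significance_s(ber: float) -> float:
    """Seconds needed to accumulate TARGET_ERRORS errors at the given BER."""
    errors_per_second = DATA_RATE * ber
    return TARGET_ERRORS / errors_per_second

for ber in (1e-4, 1e-9, 1e-10):
    t = time_for_significance_s(ber)
    print(f"BER {ber:.0e}: {t:10.1f} s ({t / 3600:6.2f} h)")
```

At 10⁻⁴ the wait is negligible; at 10⁻⁹ it is about 22 minutes; at 10⁻¹⁰ it is nearly four hours.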
Diagram 8 shows that the difference in signal to noise ratio between the maximum error rate of 1.0 × 10⁻⁴ and 1.0 × 10⁻⁹ is only 4.2 dB. So all that is known from the 20 minute wait for a BER reading is that the noise margin on the system is better than 4.2 dB. No matter how good the system, the margin is not known.
Clearly, although BER is a valuable method in the lab or at the headend where equipment can be set up for several hours if necessary, it does not tell us enough about the system margin down the network to enable preventative maintenance.
A great deal of research has shown that noise is by far the primary cause of degradation of BER, and that the relationship between SNR and BER is reliable.
The author therefore feels that measuring the signal to noise ratio is the quickest and most meaningful way of knowing the margins available on a cable network.
In the latest instrument from Swires Research a new measurement called Digital Quality Margin (DQM) has been introduced. DQM is the noise margin, expressed in dB, before the system is likely to fail. This is believed to be a meaningful measurement: a system with a DQM of 8 dB shows immediately that the signal to noise margin before system failure is 8 dB.
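The idea reduces to a simple subtraction. The failure threshold below is an illustrative assumption taken from the 25 dB decoder-failure figure quoted earlier; the source does not specify how the Swires instrument derives its threshold internally:

```python
FAILURE_SNR_DB = 25.0   # assumed decoder-failure point, from the earlier example

def dqm_db(measured_snr_db: float) -> float:
    """Digital Quality Margin: headroom in dB above the likely failure point."""
    return measured_snr_db - FAILURE_SNR_DB

# A network measuring 33 dB SNR has 8 dB of margin before likely failure.
print(dqm_db(33.0))
```

Unlike a BER figure, this number reads directly as headroom, which is the quantity a maintenance engineer actually needs.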
The author would like to see the Digital Quality Margin measurement adopted by other manufacturers; however, it is, in the end, down to the systems engineer to evaluate its usefulness.
1. Bandwidth
The bandwidth of the measuring instrument must be known when measuring digital signals. Instruments specifically designed to measure digital signals compensate for bandwidth and should measure the true power of the digital signal.
2. Intermediate Frequency (IF) Filter
Instruments with the same bandwidth can give different readings when measuring digital signals because of the different shapes of their intermediate frequency filters.
3. Bit Error Rate (BER)
The bit error rate takes a long time to measure accurately and has a very limited operating window.
4. Signal To Noise Ratio (SNR)
Network measurements of signal to noise ratio give a good indication of the quality of the cable network.
5. Digital Quality Margin (DQM)
A new measurement based on Signal to Noise Ratio has been proposed for use as an alternative to Bit Error Rate.