Many people are surprised that OTA TV (Over The Air Television) is still a thing. I am here to say that there are lots of TV stations still broadcasting; OTA is alive and well, especially around big cities. To wit: I noticed this older TV antenna on the roof of a transmitter building in Lodi, NJ. Being curious, I connected an ATSC 1.0 TV to the antenna lead in the kitchen. One scan captured 62 TV channels and sub-channels OTA in the NYC market.
That site is 10 miles northwest of the Empire State Building.
I also noted that the satellite dishes on site have had Terrestrial Interference (TI) filters on the LNBs for many years. Recently, 5G filters were installed as well. Thus, I added a 5G/LTE filter made by Channel Master (part number CM-3201) to the TV antenna splitter. A rescan captured 79 channels. Interesting.
I began ordering TV receiver filters and testing them with my network analyzer. There are many different units made by different manufacturers. The smaller, cheaper units do not have as good performance as the larger, more expensive ones. Go figure.
Here are a few sweeps of various filters:
There is also an FM band-stop (Channel Master CM-3202), which is effective for blocking out 87 to 113 MHz.
Sometimes I get questions from non-technical readers, so for the uninitiated: these sweeps show return loss. The higher the line on the right-hand graph, the less signal will get through the filter. A flat line at 0 dB means that little or no signal is getting through on those frequencies.
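As a quick numeric sketch of what those dB figures mean (the function names here are just illustrative), return loss converts to power fractions like this:

```python
def reflected_fraction(return_loss_db):
    """Fraction of incident power reflected back for a given return loss (dB)."""
    return 10 ** (-return_loss_db / 10)

def through_fraction(return_loss_db):
    """Upper bound on the power that passes, assuming a lossless filter."""
    return 1 - reflected_fraction(return_loss_db)

# 0 dB return loss: total reflection, nothing gets through.
# 20 dB return loss: only 1% reflected, nearly everything passes.
```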
These filters are helpful, especially with inexpensive consumer-grade TV receivers. If you live near an FM transmitter site, then an FM band-stop filter may help, especially with the low and high-band VHF stations. If you live anywhere near a cell site (and most of us do) then a 5G/LTE filter will likely help.
Recently, I pried open my wallet and plunked down the sum of $150.00 for one of these little devices. Now, to be certain, this is not a replacement for a real VNA, especially at a high power broadcast site. However, it can be used for basic troubleshooting and I have had a good deal of fun fooling around with it.
First, a few quick specifications:
Type: SSA-2N NanoVNA V2.2
Frequency range: 50 kHz to 3 GHz
Power output: -50 to +10 dBm
Measurement points: 201 (or 1024 with software and computer)
Measurement types: S11, S12 and S21
Screen Size: 4 inch touch screen
Traces: up to 4
Battery: 3000mAh Lithium Ion
Software OS (VNA-QT): Win 7, Win 10, Linux, MacOS
The unit I purchased came with a small carrying case, calibration loads and test jumpers. The software is downloadable and is easily configured.
What I really like about it is the internal battery and the touch screen.
So what can it be used for?
Test a coaxial cable
Measure the length of a coaxial cable
Figure out what frequency an antenna is designed for
Tune a 1/4 wave stub to make a notch filter
Measure the characteristics of a crystal/holder
Measure a capacitor
Measure an inductor
Tune a parallel resonant LC circuit to make a notch filter
Tune a filter can
Test a high pass, low pass or band pass filter
Sweep an antenna (Simple AM, FM, RPU, STL, WiFi)
Check isocouplers for proper circuit functioning
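As an illustration of the 1/4 wave stub item above, the cut length falls out of simple arithmetic once you know the velocity factor of the coax. A sketch, assuming a VF of 0.66 (typical for solid-dielectric cable; check your coax's datasheet):

```python
C = 299_792_458  # speed of light in m/s

def quarter_wave_stub_m(freq_hz, velocity_factor=0.66):
    """Approximate physical length in meters of a 1/4 wave coax stub
    that notches the given frequency."""
    wavelength = (C * velocity_factor) / freq_hz
    return wavelength / 4

# A notch at 98.1 MHz works out to roughly half a meter of coax;
# cut slightly long and trim while watching the sweep on the VNA.
length = quarter_wave_stub_m(98.1e6)
```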
Pretty much anything you need to know about RF antennas, filters and transmission lines can be learned with a VNA. One thing to keep in mind: the number of measurement points is limited, especially in stand-alone mode. Thus, the smaller the frequency span, the better the measurement resolution will be.
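The resolution arithmetic is simple enough to sketch:

```python
def sweep_step_hz(start_hz, stop_hz, points=201):
    """Frequency step between adjacent measurement points in a linear sweep."""
    return (stop_hz - start_hz) / (points - 1)

# The full 50 kHz to 3 GHz span at 201 points gives ~15 MHz per point,
# far too coarse to see a narrow filter notch.
# A 10 MHz span around the frequency of interest gives 50 kHz per point.
coarse = sweep_step_hz(50e3, 3e9)
fine = sweep_step_hz(95e6, 105e6)
```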
While this is a very inexpensive device designed mainly for Amateur Radio, it can be useful to diagnose antenna and transmission line problems. Would I depend on it to make precise measurements? No, especially not for things required by the FCC, like base impedance measurements on an AM tower or channel filter measurements for a TV station. Would it work at a high RF transmitter site with multiple AM/FM/TV transmitters? No, and chances are you might burn out the front end. Those types of things are best done with professional equipment that has much better accuracy and resolution.
It is a pretty good little tool for basic troubleshooting. One can look at the individual components of an AM ATU for example, or measure the input impedance to see if there has been a shift (should normally be 50 ohms). It is small enough that it can be included in a basic tool kit. It is self powered. Not bad at all for the price.
With the approval from the FCC for all digital broadcasting on the Standard Broadcast (AKA AM, Medium Wave, Medium Frequency) band, it might be interesting to dissect Xperi’s HD Radio MA3 (HDMA3) standard a little bit. It might also be interesting to compare that to DRM30 which has been in use in many other places around the world for several years now.
First, I will dispense with the givens: HD Radio sounds better than its analog counterpart. I have also listened to DRM via HF, and that too sounds better than its analog counterpart. Of interest here is whether or not either digital modulation scheme improves reception reliability and coverage area. Medium Wave is distinct from other frequency bands in that it can cover vast areas, a capability that has been dismissed in recent years as unneeded due to reduced maintenance schedules and the cost of keeping directional antenna systems in tolerance (thus increasing skywave interference).
Secondly, after reading several studies of HDMA3 and DRM30, I will concede that both systems perform better (Annex E, Ref 2; Section III, para C, Ref 6) than their analog counterparts in a mixed digital/analog RF environment. Both systems have features which can be used to improve reception during nighttime operation. Skywave exists, whether or not people want it. If it is not desired as a reception mode, it still has to be dealt with from an interference perspective.
The two main complaints against Medium Wave broadcasting are perceived reduced audio quality (compared to FM) and interference. The interference comes in two flavors: electrical impulse noise and broadcast interference (co-channel and adjacent channel AM stations). Both are problematic, and to some extent both can be mitigated by an all-digital transmission. However, if the interference becomes too great, the program will simply stop, as the data loss becomes too severe to reconstruct the audio.
Of further interest here are the technical aspects of both systems and whether or not one would be superior to the other for Medium Wave broadcasting. I found this comment on a previous post to be particularly interesting:
DRM and HD both use OFDM, but the parameters are quite different, eg. the length of cyclic prefix which determines the performance in sky/ground wave interference are different by a factor of 9 (0.3ms vs 2.66ms). That is why DRM is much robust than HD.
First of all, is this a true statement? Secondly, does the cyclic prefix make a difference in sky wave to ground wave interference? Which system might work better in a broadcast service where there are 4,560 stations transmitting (as of 9/2020) and creating interference with each other? Finally, could the implementation of either system make a worthwhile difference in the quality and reliability of Medium Wave broadcasting in the US?
To answer these questions, I decided to begin with the technical descriptions found in the definitive documents: NRSC-5-D 1021s Rev. G (Ref 1) for HDMA3 and ETSI ES 201 980 V4.1.1 (Ref 2) for DRM30.
There are many similarities between the two systems; both use COFDM modulation schemes, both have various bandwidths and data rates available, both use similar audio codecs, and both have some type of FEC (Forward Error Correction) system. I prepared a chart of these characteristics:
Both systems have 10 and 20 kHz channels available. This could be one feature used to mitigate adjacent channel interference, especially at night. In the US, physical spacing of transmitter sites helps prevent adjacent channel interference during the day. However, at night, half of the 20 kHz wide analog channel is in somebody else's space and vice versa. Switching to 10 kHz mode at night would prevent that from happening and likely make the digital signal more robust.
DRM30 has additional advantages; multiple operating modes, protection classes and CODECs are available. Another advantage is the number of studies performed on it in varying environments: The Madrid Study (Ref 3), The All India Radio Study (Ref 5), Project Mayflower (Ref 4), and others.
Let's answer those questions:
Are HDMA3 and DRM30 different? Yes. As the commenter stated, both use COFDM; however, there are major differences in carrier spacing, symbol rate, and FEC. DRM30 has been designed and tested on HF, where phasing issues from multipath reception are common. There are many configurable parameters built into the system to deal with those problems. My calculations of the cyclic prefix length came out differently than those stated (I may have done it wrong); however, they are indeed different.
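For what it is worth, here is a back-of-envelope check of the commenter's numbers, using parameter values as I read them from the two specs: NRSC-5 AM subcarrier spacing of 181.7 Hz with a cyclic prefix ratio of 7/128, and DRM30 robustness mode A with a 24 ms useful symbol and a guard interval of Tu/9. These inputs should be verified against the documents themselves.

```python
# Cyclic prefix durations from assumed spec parameters (verify against
# NRSC-5 and ETSI ES 201 980 before relying on these numbers).
hd_subcarrier_spacing_hz = 181.7
hd_cp_ratio = 7 / 128
hd_cp_ms = hd_cp_ratio / hd_subcarrier_spacing_hz * 1000  # roughly 0.3 ms

drm_tu_ms = 24.0              # useful symbol duration, robustness mode A
drm_cp_ms = drm_tu_ms / 9     # roughly 2.67 ms

ratio = drm_cp_ms / hd_cp_ms  # close to the factor of 9 the commenter claims
```

If those inputs are right, the disparity the commenter describes is real, even if the exact figures vary slightly by how one reads the specs.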
Does the cyclic prefix length make a difference in ground/sky wave interference? This is more difficult to answer. I would postulate that all of the configurable parameters built into DRM30 make it more robust. The various operating modes help mitigate phasing issues and the various protection modes help mitigate multipath reception issues. The only way to know for certain is to do a side-by-side test.
Which system would work better in high broadcast interference environments? Again, it is difficult to tell without a side-by-side study. There have been numerous studies done on both systems: Madrid (Ref 3), Project Mayflower (Ref 4), All India (Ref 5), WWFD (Ref 6), etc. To determine this conclusively, one would have to operate HDMA3 on a station for a week, then DRM30 for a week on the same antenna system, with the same environmental conditions. Extensive measurements and listening tests would need to be performed during those tests.
Is it worth it? Possibly. The big issue is the availability of receivers for both systems. Currently, only HD Radio receivers come as stock items in US automobiles. There are current and planned chipsets that have all of the digital radio formats built in (HD Radio, DRM+, DRM30, DAB/DAB+). If consumers want the service, manufacturers will make the receivers. It would take a lot of effort to get this information in front of people and offer some type of programming that was highly desirable and available only on the radio. That is a big stretch.
Objectively comparing those two systems, I can see that both systems have advantages and disadvantages. There are some common items required for both systems; a reasonably well maintained transmitter plant, a newer solid state transmitter, and an antenna system with enough bandwidth so as not to distort the digital signal.
There are more receivers available for HD Radio, especially in cars. HD Radio MA3 is less configurable and therefore less likely to be misconfigured. There has been a lot of ink spilled in recent years about the declining number of radio engineers and the increased workload they are facing. Are there enough people with sufficient technical skills to implement and maintain even a basic all-digital system? A topic for another post.
DRM30 is more flexible. Operating modes, protection modes and CODECs can be adjusted according to the goals of station owners. There has been more testing done with all-digital transmission of DRM30 on Medium Wave.
Are there enough reasons to allow a test of all digital Medium Wave DRM30 in the US?
Why not allow both systems and let the Software Defined Receiver decide?
Ref 1: HD Radio Air Interface Design Description Layer 1 AM, Rev. G, December 14, 2016
Ref 2: Digital Radio Mondiale (DRM) System Specification, ETSI ES 201 980 V4.1.1, January 2014
Ref 3: Digital Radio Mondiale (DRM) Multi-Channel Simulcast, Urban and Indoor Reception in the Medium Wave Band, Document 6A/73-E, September 19, 2008
Ref 4: Project Mayflower, The DRM Trial Final Report, BBC, April 2009
Ref 5: Results of DRM Trials in New Delhi: Simulcast Medium Wave, Tropical Band, NVIS and 26 MHz Local Broadcasting, Document 6D/10-E, March 28, 2008
Ref 6: All-Digital AM Broadcasting; Revitalization of the AM Radio Service, FCC Fact Sheet, MB Docket Nos. 19-311 and 13-249, October 19, 2019
The internet is being relied upon for many different functions. One thing that I am seeing more of is STL via the public network. There are many ways to accomplish this: Comrex BRIC-Link units, Barix units, or simply a streaming computer.
We often can take for granted the infrastructure that keeps our connection to the public network running. Cable modems are very common as either primary or backup devices at transmitter sites, homes, offices, etc. The basic cable modem uses some type of DOCSIS (Data Over Cable Service Interface Specification) modulation scheme. This system breaks up the bandwidth on the coaxial cable into 6 MHz channels for downstream and upstream transmission. Generally, downstream transmission is 16 channels of 256-QAM signals. Upstream is 4 channels of QPSK or up to 64-QAM signals. Depending on your traffic shaping plan with the cable company, this will allow up to 608 Mbps down and 108 Mbps up. Those speeds also can change due to network congestion, which is the bane of coaxial cable based internet service.
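The throughput figures above are just channel bonding arithmetic. A sketch, assuming roughly 38 Mbps usable per 256-QAM downstream channel and 27 Mbps per upstream channel, which are ballpark DOCSIS 3.0 numbers:

```python
def bonded_mbps(channels, per_channel_mbps):
    """Aggregate throughput of a bonded DOCSIS channel group."""
    return channels * per_channel_mbps

down = bonded_mbps(16, 38)  # 16 downstream channels -> 608 Mbps
up = bonded_mbps(4, 27)     # 4 upstream channels -> 108 Mbps
```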
The internet should now be considered a public utility, especially after the COVID-19 emergency, distance learning, telecommuting, and all the other changes we are experiencing. I know in the past, ISPs were reluctant to accept that role, as there are many responsibilities. That being said, when the public network goes down, many things grind to a halt.
Sometimes the problem is at the cable office or further upstream. Loss of a backbone switch, trunk fiber, or DOCSIS equipment will cause widespread outages which are beyond anything a field engineer can deal with.
Then there are the times when it is still working, but not working right. In that situation, there are several possible issues that could be creating a problem, and a little information can go a long way toward returning to normal operation. One thing that can be done with most newer cable modems is to log into the modem itself and look at the signal strength on the downstream channels. Most cable modems will use 192.168.100.1 as their management IP address. The user name and password should be on the bottom of the modem; I also found mine by Googling the modem manufacturer and model number.
Navigate around until you find a screen that looks like this:
There is a lot of helpful information to look at. The first thing is the Pwr (dBmV) level. DOCSIS 3 modems are looking for -7 dBmV to +7 dBmV as the recommended signal level. They can deal with -8 to -10 dBmV / +8 to +10 dBmV as acceptable. -11 to -15 dBmV / +11 to + 15 dBmV is maximum and greater than -15/+15 dBmV is out of tolerance.
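Those ranges can be captured in a small helper (a hypothetical function that simply restates the thresholds above; the sign of the reading does not matter):

```python
def classify_downstream_power(dbmv):
    """Bucket a downstream channel power reading (dBmV) per the
    DOCSIS 3 ranges described above."""
    level = abs(dbmv)
    if level <= 7:
        return "recommended"
    if level <= 10:
        return "acceptable"
    if level <= 15:
        return "maximum"
    return "out of tolerance"

# e.g. a channel reading -9 dBmV falls in the "acceptable" band
```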
The next column to look at is the SNR (Signal to Noise Ratio). DOCSIS 3 needs greater than 30 dB, and preferably 33 dB or greater.
The last two columns are the codeword errors. This is a Forward Error Correction (FEC) system which verifies the received data and attempts to correct any corrupted bits. The lower the codeword error number, the better the data throughput. Codeword errors are often due to RF impairments and can be a strong indicator of cable or connector issues. Another possible cause is improper signal strength, which can be either too high or too low.
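A hypothetical helper for reading those two columns (the counter names are assumptions based on typical modem status pages, not any particular modem's firmware):

```python
def uncorrectable_pct(unerrored, correctable, uncorrectable):
    """Percentage of received codewords the FEC could not repair.
    Uncorrectable codewords mean lost data and retransmissions."""
    total = unerrored + correctable + uncorrectable
    if total == 0:
        return 0.0
    return 100.0 * uncorrectable / total

# A healthy channel should show a vanishingly small percentage here.
```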
Upstream data is transmitted on 4 channels.
The only statistic that is useful on the upstream channels is the Pwr, which should be between 40 and 50 dBmV.
I have found that a few simple parts and tools can sometimes restore a faltering cable connection. First, I have several attenuator pads: 3 dB, 6 dB and 10 dB, with type F connectors. These have actually cured an issue where the downstream signal was too hot, causing codeword errors. Next, some good Ideal weatherproof crimp-on F connectors for RG-6 coax and a good crimp tool should also be in the tool kit. I have had to replace mouse-chewed RG-6 from the outside cable drop into the transmitter building. Fortunately, there was some spare RG-6 in the transmitter room.
If these attempts do not fix the issue, then of course, be prepared to waste a day waiting for the cable company to show up.