Had one of these units go bad recently. These are the OEM power supplies for the BE FM C series solid state transmitters.
This series of transmitters has been extremely reliable over the years. Rarely have I encountered an issue, other than a cabinet fan going bad, that has caused an off air incident.
They seem to be a fairly standard medium-voltage, high-current power supply. I think these run at 48 volts and can put out a maximum of 42 amps.
In a clever BE design feature, all of the supplies are paralleled onto one DC bus which feeds all of the RF modules. The current from all the supplies is balanced with a single-wire current-sharing circuit. This means that the loss of any one supply does not cause a complete shutdown of an RF module, which in turn would unbalance the RF output combiner and dissipate a lot of wasted power in the various reject loads. Rather, if a power supply is lost, the overall DC current to each RF module is reduced. The transmitter power may go down, depending on the TPO setting, but it does not dump a bunch of heat into the room.
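A toy model makes the benefit of the paralleled bus clear: losing a supply reduces available headroom on the shared bus rather than killing a module outright. The 48 V and 42 A figures are from above; the four-supply count is just an example, not a specific transmitter configuration.

```python
# Toy model of N paralleled supplies on one DC bus with single-wire
# current sharing. Losing a supply lowers the bus's available power;
# no individual RF module loses its feed.
V_BUS = 48.0              # nominal bus voltage, volts
I_MAX_PER_SUPPLY = 42.0   # maximum output current per supply, amps

def bus_capacity(n_supplies: int, n_failed: int = 0) -> float:
    """Available DC power in watts with n_failed supplies offline."""
    good = n_supplies - n_failed
    return good * V_BUS * I_MAX_PER_SUPPLY

print(bus_capacity(4))      # all four supplies healthy: 8064 W
print(bus_capacity(4, 1))   # one supply lost: 6048 W still available
```

With one supply lost, the controller can back the TPO down to fit the remaining capacity instead of dumping combiner imbalance power into the reject loads.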
This power supply has been repaired and returned to the client at a significant reduction in cost and time. It takes some degree of knowledge and fortitude to dig into the guts of a high-current switching power supply. After all, anybody (or almost anybody) can be a module-swap guy, although some people can't even do that right. Many broadcast engineers these days are running around in circles trying to get everything done that their employer demands of them. Not the best environment for learning and growing.
With the approval from the FCC for all digital broadcasting on the Standard Broadcast (AKA AM, Medium Wave, Medium Frequency) band, it might be interesting to dissect Xperi’s HD Radio MA3 (HDMA3) standard a little bit. It might also be interesting to compare that to DRM30 which has been in use in many other places around the world for several years now.
First, I will dispense with the givens: HD Radio sounds better than its analog counterpart. I have also listened to DRM via HF, and that too sounds better than its analog counterpart. Of interest here is whether either digital modulation scheme improves reception reliability and coverage area. Medium Wave is distinct from other frequency bands in that it can cover vast areas, something that has been dismissed in recent years as unneeded due to reduced maintenance schedules and the cost of keeping directional antenna systems in tolerance (thus increasing skywave interference).
Secondly, after reading several studies of HDMA3 and DRM30, I will concede that both systems perform better (Annex E, Ref 2; Section III para C, Ref 6) than their analog counterparts in a mixed digital/analog RF environment. Both systems have features which can be used to improve reception during nighttime operation. Skywave exists, whether or not people want it. If it is not desired as a reception mode, it still has to be dealt with from an interference perspective.
The two main complaints against Medium Wave broadcasting are perceived reduced audio quality (compared to FM) and interference. The interference comes in two flavors: electrical impulse noise and broadcast interference (co-channel and adjacent-channel AM stations). Both are problematic, and to some extent both can be mitigated by an all-digital transmission. However, if the interference becomes too high, the program will simply stop as the data loss becomes too great to reconstruct the audio.
Of further interest here are the technical aspects of both systems and whether one would be superior to the other for Medium Wave broadcasting. I found this comment on a previous post to be particularly interesting:
DRM and HD both use OFDM, but the parameters are quite different, eg. the length of cyclic prefix which determines the performance in sky/ground wave interference are different by a factor of 9 (0.3ms vs 2.66ms). That is why DRM is much robust than HD.
First of all, is this a true statement? Secondly, does the cyclic prefix make a difference in skywave-to-groundwave interference? Which system might work better in a broadcast service where there are 4,560 stations transmitting (as of 9/2020) and creating interference with each other? Finally, could the implementation of either system make a worthwhile difference in the quality and reliability of Medium Wave broadcasting in the US?
To answer these questions, I decided to begin with the technical descriptions found in the definitive documents: NRSC-5-D, 1021s Rev G (Ref 1) for HDMA3 and ETSI ES 201 980 V4.1.1 (Ref 2) for DRM30.
There are many similarities between the two systems: both use COFDM modulation schemes, both have various bandwidths and data rates available, both use audio codecs that are similar, and both have some type of FEC (Forward Error Correction) system. I prepared a chart of these characteristics:
Both systems have 10 and 20 kHz channels available. This could be one feature used to mitigate adjacent-channel interference, especially at night. In the US, physical spacing of transmitter sites helps prevent adjacent-channel interference during the day. However, at night, half of the 20 kHz wide analog channel is in somebody else's space and vice versa. Switching to 10 kHz mode at night would prevent that from happening and likely make the digital signal more robust.
DRM30 has additional advantages: multiple operating modes, protection classes, and CODECs are available. Another advantage is the number of studies performed on it in varying environments: the Madrid study (Ref 3), the All India Radio study (Ref 5), Project Mayflower (Ref 4), and others.
Let's answer those questions:
Are HDMA3 and DRM30 different? Yes. As the commenter stated, both use COFDM; however, there are major differences in carrier spacing, symbol rate, and FEC. DRM30 was designed and tested on HF, where phasing issues from multipath reception are common. There are many configurable parameters built into the system to deal with those problems. My calculations of the cyclic prefix length came out differently than those stated (I may have done it wrong); however, they are indeed different.
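For what it's worth, here is one way to run the numbers. This is a sketch using my reading of the nominal parameters (HD Radio's 7/128 cyclic prefix ratio and 181.7 Hz subcarrier spacing from the NRSC-5 Layer 1 AM description; DRM30 Robustness Mode A's 24 ms useful symbol with a Tu/9 guard from ETSI ES 201 980); if those parameters are off, the results will be too.

```python
# Rough cyclic prefix comparison: HD Radio MA3 vs DRM30 Robustness Mode A.
ALPHA = 7 / 128    # HD Radio cyclic prefix ratio (my reading of NRSC-5)
DF_HD = 181.7      # HD Radio OFDM subcarrier spacing, Hz

cp_hd = ALPHA / DF_HD          # HD guard time in seconds
print(f"HD Radio guard interval: {cp_hd * 1e3:.2f} ms")   # ~0.30 ms

T_U = 24e-3                    # DRM Mode A useful symbol duration, s
T_G = T_U / 9                  # DRM Mode A guard interval = Tu/9
print(f"DRM30 Mode A guard interval: {T_G * 1e3:.2f} ms") # ~2.67 ms
print(f"Ratio: {T_G / cp_hd:.1f}x")                        # roughly 9x
```

By this arithmetic the commenter's 0.3 ms vs 2.66 ms figures, and the roughly 9:1 ratio, look about right.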
Does the cyclic prefix length make a difference in ground/sky wave interference? This is more difficult to answer. I would postulate that all of the configurable parameters built into DRM30 make it more robust. The various operating modes help mitigate phasing issues and the various protection modes help mitigate multipath reception issues. The only way to know for certain is to do a side-by-side test.
Which system would work better in high broadcast interference environments? Again, it is difficult to tell without a side-by-side study. There have been numerous studies done on both systems: Madrid (Ref 3), Project Mayflower (Ref 4), All India (Ref 5), WWFD (Ref 6), etc. To determine this conclusively, one would have to operate HDMA3 on a station for a week, then DRM30 for a week on the same antenna system, under the same environmental conditions. Extensive measurements and listening tests would need to be performed during those tests.
Is it worth it? Possibly. The big issue is the availability of receivers for both systems. Currently, only HD Radio receivers come as stock items in US automobiles. There are current and planned chipsets that have all of the digital radio formats built in (HD Radio, DRM+, DRM30, DAB/DAB+). If consumers want the service, manufacturers will make the receivers. It would take a lot of effort to get this information in front of people and to offer some type of programming that was highly desirable and available only on the radio. That is a big stretch.
Objectively comparing the two systems, I can see that both have advantages and disadvantages. There are some common items required for both: a reasonably well-maintained transmitter plant, a newer solid-state transmitter, and an antenna system with enough bandwidth so as not to distort the digital signal.
There are more receivers available for HD Radio, especially in cars. HD Radio MA3 is less configurable and therefore less likely to be misconfigured. There has been a lot of ink spilled in recent years about the declining number of radio engineers and the increased workload they are facing. Are there enough people with sufficient technical skills to implement and maintain even a basic all-digital system? A topic for another post.
DRM30 is more flexible. Operating modes, protection modes, and CODECs can be adjusted according to the goals of station owners. There has been more testing done with all-digital transmission of DRM30 on Medium Wave.
Are there enough reasons to allow a test of all digital Medium Wave DRM30 in the US?
Why not allow both systems and let the Software Defined Receiver decide?
Ref 1: HD Radio Air Interface Design Description Layer 1 AM, Rev. G, December 14, 2016
Ref 2: Digital Radio Mondiale (DRM) System Specification, ETSI ES 201 980 V4.1.1, January 2014
Ref 3: Digital Radio Mondiale (DRM) Multi-Channel Simulcast, Urban and Indoor Reception in the Medium Wave Band, Document 6A/73-E, September 19, 2008
Ref 4: Project Mayflower, The DRM Trial Final Report, BBC, April 2009
Ref 5: Results of DRM Trials in New Delhi: Simulcast Medium Wave, Tropical Band, NVIS and 26 MHz Local Broadcasting, Document 6D/10-E, March 28, 2008
Ref 6: All-Digital AM Broadcasting; Revitalization of the AM Radio Service, FCC Fact Sheet, MB Docket Nos. 19-311 and 13-249, October 19, 2019
The internet is being relied upon for many different functions. One thing I am seeing more of is STL (studio-to-transmitter link) via the public network. There are many ways to accomplish this, using Comrex BRIC-Link units, Barix units, or simply a streaming computer.
We often take for granted the infrastructure that keeps our connection to the public network running. Cable modems are very common as either primary or backup devices at transmitter sites, homes, offices, etc. The basic cable modem uses some type of DOCSIS (Data Over Cable Service Interface Specification) modulation scheme. This system breaks up the bandwidth on the coaxial cable into 6 MHz channels for downstream and upstream transmission. Generally, downstream transmission is 16 channels of 256-QAM signals. Upstream is 4 channels of QPSK or up to 64-QAM signals. Depending on your traffic-shaping plan with the cable company, this allows up to 608 Mbps down and 108 Mbps up. Those speeds can also change due to network congestion, which is the bane of coaxial-cable-based internet service.
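The aggregate numbers above come from simple channel bonding arithmetic. The per-channel payload rates below are nominal round figures I am assuming (roughly 38 Mbps usable for a 6 MHz 256-QAM downstream channel and 27 Mbps for a 64-QAM upstream channel after overhead); actual rates depend on the modulation profile and framing overhead.

```python
# Back-of-the-envelope DOCSIS 3.0 bonded throughput, matching the
# 608/108 Mbps figures above. Per-channel rates are assumed nominal
# usable payload rates, not exact spec values.
DOWN_PER_CH = 38.0   # Mbps per 6 MHz downstream channel at 256-QAM
UP_PER_CH = 27.0     # Mbps per upstream channel at 64-QAM

down = 16 * DOWN_PER_CH   # 16 bonded downstream channels
up = 4 * UP_PER_CH        # 4 bonded upstream channels
print(f"Max downstream: {down:.0f} Mbps")  # 608 Mbps
print(f"Max upstream: {up:.0f} Mbps")      # 108 Mbps
```

Congestion, traffic shaping, and RF impairments all eat into these theoretical maximums, which is why measured speeds rarely get close.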
The internet should now be considered a public utility, especially after the COVID-19 emergency, distance learning, telecommuting, and all the other changes we are experiencing. I know in the past, ISPs were reluctant to accept that role, as it carries many responsibilities. That being said, when the public network goes down, many things grind to a halt.
Sometimes the problem is at the cable office or further upstream. Loss of a backbone switch, trunk fiber, or DOCSIS equipment will cause widespread outages which are beyond anything a field engineer can deal with.
Then there are the times when it is still working, but not working right. In that situation, there are several possible issues that could be creating a problem, and a little information can go a long way toward returning to normal operation. One thing that can be done with most newer cable modems is to log into the modem itself and look at the signal strength on the downstream channels. Most cable modems use 192.168.100.1 as their management IP address. The user name and password should be on the bottom of the modem. I also Googled my modem manufacturer and model number and found mine that way.
Navigate around until you find a screen that looks like this:
There is a lot of helpful information to look at. The first thing is the Pwr (dBmV) level. DOCSIS 3 modems are looking for -7 dBmV to +7 dBmV as the recommended signal level. They can deal with -8 to -10 dBmV / +8 to +10 dBmV as acceptable. -11 to -15 dBmV / +11 to +15 dBmV is the maximum, and anything beyond -15/+15 dBmV is out of tolerance.
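Those ranges are symmetric around 0 dBmV, so they are easy to check programmatically. Here is a hypothetical helper (the function name and structure are mine; the thresholds are the ones listed above) for grading a downstream power reading:

```python
# Hypothetical helper grading a DOCSIS 3 downstream power reading
# against the tolerance ranges described above. Ranges are symmetric,
# so we classify by the absolute value of the reading.
def grade_downstream_power(dbmv: float) -> str:
    mag = abs(dbmv)
    if mag <= 7:
        return "recommended"      # -7 to +7 dBmV
    if mag <= 10:
        return "acceptable"       # 8 to 10 dBmV either side
    if mag <= 15:
        return "maximum"          # 11 to 15 dBmV either side
    return "out of tolerance"     # beyond +/-15 dBmV

print(grade_downstream_power(-3))    # recommended
print(grade_downstream_power(12))    # maximum
print(grade_downstream_power(-16))   # out of tolerance
```

Something like this could run on a monitoring box that scrapes the modem's status page and alerts before the connection fails completely.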
The next column to look at is the SNR (Signal-to-Noise Ratio). DOCSIS 3 needs more than 30 dB, and preferably 33 dB or greater.
The last two columns are the codeword errors. This is a Forward Error Correction (FEC) system which verifies the received data and attempts to correct any corrupted bits. The lower the codeword error number, the better the data throughput. Codeword errors are often due to RF impairments and can be a strong indicator of cable or connector issues. Another possible cause is improper signal strength, which can be either too high or too low.
Upstream data is transmitted on 4 channels.
The only statistic that is useful on the upstream channels is the Pwr, which should be between +40 and +50 dBmV.
I have found that a few simple parts and tools can sometimes restore a faltering cable connection. First, I carry several attenuator pads (3 dB, 6 dB, and 10 dB) with type F connectors. This has actually cured an issue where the downstream signal was too hot, causing codeword errors. Next, some good Ideal weatherproof crimp-on F connectors for RG-6 coax and a good crimp tool should also be in the tool kit. I have had to replace mouse-chewed RG-6 from the outside cable drop into the transmitter building. Fortunately, there was some spare RG-6 in the transmitter room.
If these attempts do not fix the issue, then of course, be prepared to waste a day waiting for the cable company to show up.
These are the stock power supplies for third- and fourth-generation Nautel V series FM transmitters, which were produced starting around 2005 and discontinued sometime around 2009. First- and second-generation V series transmitters used Nautel-made power supplies.
The OEM PA power supplies were made by Tectrol and were designed to put out 2,120 watts per unit. The V-10 transmitters have eight PA supplies and one IPA supply, with an option for a hot-standby IPA supply. Like all such things, occasionally they fail for various reasons.
Unfortunately for Nautel, Tectrol stopped making these supplies and no longer supports them. Nautel won’t fix them either, however, they will sell a $3,200.00 (per supply) retrofit for a new supply.
We take care of seven of these transmitters and, overall, they are fairly reliable. They are not terribly old, either. However, spending $28,000.00 to replace the UG-39 power supplies seems... somewhat steep. One station uses four V-10 transmitters combined to make a 40 kW transmitter. For that station, it would cost $115,000.00 to replace all of the power supplies on a transmitter that is barely 13 years old. In this time of economic instability, buying a new transmitter is not an option either.
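The arithmetic behind those figures is straightforward. I am assuming the count per V-10 is the eight PA supplies plus the one IPA supply (the hot-standby IPA option not included):

```python
# Sanity check of the retrofit cost figures above: 8 PA supplies plus
# 1 IPA supply per V-10, at $3,200 per retrofit supply. The supply
# count per transmitter is my assumption.
COST_PER_SUPPLY = 3200
SUPPLIES_PER_V10 = 8 + 1        # eight PA supplies plus one IPA supply

one_tx = SUPPLIES_PER_V10 * COST_PER_SUPPLY
four_tx = 4 * one_tx
print(one_tx)    # 28800 -> roughly $28,000 per transmitter
print(four_tx)   # 115200 -> roughly $115,000 for the combined 40 kW rig
```

Either way you count it, that is a lot of money for supplies on transmitters that are barely into middle age.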
Necessity being the mother of invention, and since we had a few of these defective power supplies kicking around, I decided to destructively reverse-engineer one and determine the failure mode or modes. Special thanks to COVID-19 for giving us lots of spare time. Pete the Bench Guy made up a test jig with a connector and some test points. With this, he can feed 240 VAC into the unit, apply 0 to +5 VDC to the control pin to vary the output voltage, look for faults, check the ready indicators which the transmitter uses, etc.
Thus far, our success rate with these things has climbed from about 50% to 80% to 90%. The failure modes vary: blown MOSFETs in the H-bridge, bad PDM chips in the controller, fried resistors, and a few other unusual things. After repair, each unit burns in for 24 hours in a nearby V-10 transmitter before we send it off to wherever it is supposed to go.