Milbert Audiology Musicology Biology Technology Blog
Life vibrates. Don't overlook the articles.
The 600 Ohm Audio Standard, where did it go?
Sat 23 Sep
THE 600 OHM AUDIO STANDARD, WHERE DID IT GO?
BY: Frank McClatchie
The standard for audio was born out of the radio industry's need for a common reference, and it has always been tied to the 600 Ohm standard impedance, with the level measured in dBm: Decibels relative to one milli-Watt of power delivered to the "load". One milli-Watt = 0 dBm. In recent years, however, the audio industry has changed its method of delivering audio power.
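A quick sketch of that relationship (a minimal illustration, not from the original article): dBm expresses power in decibels relative to one milli-Watt, so 1 mW sits at 0 dBm and 10 mW at roughly +10 dBm.

```python
import math

def dbm(power_watts):
    """Express power in dBm: decibels relative to 1 milli-Watt (0.001 W)."""
    return 10 * math.log10(power_watts / 0.001)

print(dbm(0.001))  # 1 mW  -> 0.0 dBm
print(dbm(0.010))  # 10 mW -> ~10.0 dBm
```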
Most audio systems no longer adhere to the 600 Ohm standard, and yet they still measure the audio level in dBm as though the impedance were still 600 Ohms. Let's look at where the 600 Ohm standard originated. There is a lot of history behind the 600 Ohm audio transmission standard. It got started in the early days of radio, when there was great difficulty with the loss of audio level being transmitted down a simple pair of twisted wires.
Early on, engineers discovered that in order to deliver the maximum amount of audio power to the receiving terminal at all frequencies, it was necessary to match the end-of-line "load" to the characteristic impedance of the twisted-pair wires, and also to match the audio source's driving impedance to that same impedance.
Long 16-gauge twisted-pair audio transmission wires were found to exhibit impedances in the vicinity of 600 Ohms at voice and musical frequencies up to 15 kHz, so of course the audio source and the load at the receiving site also had to be 600 Ohms in order to achieve maximum power transfer to the receiving equipment. Because the twisted-wire cable introduced power loss, it was necessary to apply a vacuum tube amplifier at each end of the transmission wires. Yes, that's right: the 600 Ohm standard originated in the days of vacuum tubes.
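The maximum-power-transfer point can be checked numerically. In this hypothetical sketch, the open-circuit source voltage is chosen so that a matched 600 Ohm load receives exactly 1 mW (0 dBm); power delivered peaks when the load matches the 600 Ohm source impedance and falls off on either side.

```python
def power_delivered(vs, rs, rl):
    """Power (W) delivered to load rl from a source with open-circuit
    voltage vs and source impedance rs (simple voltage divider)."""
    v_load = vs * rl / (rs + rl)
    return v_load ** 2 / rl

vs, rs = 1.5492, 600.0  # chosen so the matched load receives ~1 mW
for rl in (150.0, 300.0, 600.0, 1200.0, 2400.0):
    print(f"{rl:6.0f} ohm load: {power_delivered(vs, rs, rl) * 1000:.4f} mW")
```

The matched 600 Ohm case prints roughly 1.0000 mW; every mismatched load receives less.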
The main problem with vacuum tube amplifiers was that they had very high output impedances, even higher intrinsic input impedances, and operated at much higher output voltages, none of which matched the desired 600 Ohm impedance or signal levels at all. This required very expensive matching transformers to couple the signal efficiently to the 600 Ohm wires.
As radio studios became more complicated, with dozens of microphones, considerable recording equipment, and audio mixing panels, the coupling between these various components became much more involved. Large studios like CBS "Black Rock" in New York had to do something to simplify how large networks with long wire runs could be connected while still keeping audio levels and frequency response correct.
They discovered that if the originating source impedance was kept very low (as close to zero as possible) and the end-of-line load impedance was made very high, an audio signal could be transmitted over long distances without degrading the frequency response. And because the end-of-line termination is very high impedance, almost no power is absorbed at the receiver, so the loss is minuscule.
Of course, the concept of delivering one milli-Watt of audio power to a very high receiving impedance was now completely "out the window", so it was decided to "pretend" that one milli-Watt was being delivered to the load, even though essentially zero power was actually delivered. Basically, you take the measurement as though the line were terminated with 600 Ohms.
Then along came transistors, which operated at much lower output impedances, so out went the matching transformers that vacuum tube amplifiers had required. Later on, Operational Amplifiers came on the scene, providing essentially zero output impedance and completely eliminating any need for output matching transformers.
So much for matching source and load impedances! The low impedance of the driving outputs simply negates the capacitive effects of the wires, and the lack of termination eliminates the end-of-line losses. There is no such thing today as a milli-Watt of audio power delivered at all. There is an audio Voltage level, but essentially zero power: since P = E² / R, and the "load" resistance is essentially infinite compared with 600 Ohms, the power is effectively zero. In practical fact, the received power is gone!
When the 600 Ohm impedance became zero at the sending end and infinite at the receiving location, the 600 Ohm standard disappeared. The only remnant of the original standard is the Voltage, which can be measured the same as though there were an actual 600 Ohm source and load. Thus even though no actual power is delivered to the load, a level of 0.7746 Volts for 0 dBm is present at the load. A zero dBm tone on a modern audio circuit only means that a Voltage of 0.7746 Volts exists at that point, and not any particular power level as the dBm would indicate.
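That 0.7746 V figure falls directly out of the old definition: V = sqrt(P × R) = sqrt(0.001 W × 600 Ω). A quick check (the 100 kΩ bridging-input value below is an assumed, typical figure, not from the article):

```python
import math

# 0 dBm = 1 mW; into 600 ohms, that corresponds to:
v_ref = math.sqrt(0.001 * 600)     # ~0.7746 V
print(f"{v_ref:.4f} V")

# The same 0.7746 V across a modern high-impedance (bridging) input
# absorbs almost nothing -- P = E^2 / R with an assumed 100 kohm load:
p_bridging = v_ref ** 2 / 100_000
print(f"{p_bridging * 1e6:.1f} microwatts")  # ~6 uW, essentially zero
```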
On an old 600 Ohm audio circuit with a 600 Ohm source impedance and a 600 Ohm load impedance, the actual voltage measured at the load would double when the load was removed. In a modern audio system, connecting or disconnecting a 600 Ohm load resistor does not change the terminating Voltage appreciably, which is one of the great advantages of the modern system. Since the receiving termination is very high impedance, many terminations can be connected at the same location without changing the actual Voltage.
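Both behaviors follow from a simple voltage divider between the source impedance and the load. This sketch (with assumed values; the 1 Ohm "modern" output impedance is illustrative) contrasts the legacy matched system with a modern low-impedance output:

```python
def load_voltage(v_source, r_source, r_load):
    """Voltage across the load: open-circuit source voltage divided
    between the source impedance and the load impedance."""
    return v_source * r_load / (r_source + r_load)

vs = 1.5492  # assumed open-circuit volts (0 dBm into a matched 600 ohm system)

# Legacy matched system: removing the 600 ohm load doubles the voltage.
print(load_voltage(vs, 600, 600))   # ~0.7746 V with the load connected
print(load_voltage(vs, 600, 1e12))  # ~1.5492 V with the load removed

# Modern near-zero-impedance output: the 600 ohm load barely matters.
print(load_voltage(vs, 1, 600))     # ~1.5466 V with the load connected
print(load_voltage(vs, 1, 1e12))    # ~1.5492 V with the load removed
```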
So an audio transmission system today neither delivers milli-Watts of power to the receiver nor does it have 600 Ohm input and output impedances. However, you may still see the 600 Ohm specification listed on an equipment spec sheet as a reference; manufacturers do that to reassure customers that their equipment is compatible with 600 Ohm systems. As for the dBm level measurement...
"Lets just pretend its OK".