Can coaxial digital cables be used in place of analog RCA in 2 channel audio?

sreekanth

Hi,


Need some advice on cables.

I picked up an Audiolab 8200 CDQ player based on online reviews, mainly on the recommendation of What Hi-Fi.

To my dismay, it did not sound good at all, even after 6 months of burn-in, and I had almost given up on the player.

I recently picked up the Mogami 2549 RCA cables with Kondo silver terminations from Dr. Bass (FM), and they have made a huge difference to the SQ from my Audiolab 8200 CDQ (CDP to NAD 165 preamp).

This made me want to pick up more Mogamis, for the pre to power amp connection.

I checked online, and could only find the Mogami 2964, which are supposed to be 75 Ohm digital coaxial cables and not analog cables.

My question is: can I use a digital coaxial cable for 2-channel audio in place of analog interconnects, and will it degrade the signal in any way?

Regards,

Sreekanth
 
If you want to know whether you CAN, the answer is yes, you CAN. It will not damage anything.

However, if you are asking whether it is common practice, the answer is NO, it is not.

Give it a try. If you get good sound, keep it. If not, change it.
 
I checked online, and could only find the Mogami 2964, which are supposed to be 75 Ohm digital coaxial cables and not analog cables.

This raises a valid question: what is the characteristic impedance of an unbalanced analog audio interconnect cable supposed to be?

As far as I know, the Neglex 2549 are balanced cables, meant for microphones, and therefore have a characteristic impedance of 600 Ohms. (Note that the characteristic impedance of digital "balanced" cables that carry AES/EBU signals is 110 Ohms. I have put balanced in quotes because the AES/EBU signal is not balanced in the analog sense).

When a balanced cable is used with an RCA connector, which is inherently unbalanced, one can either ignore one lead or use the two leads together to carry the signal. I have never been able to figure out the characteristic impedance of RCA connectors, but I know they are freely used for composite and component video in TV sets and in cable and DTH set-top boxes. In those applications, the links are 75 Ohm, like the 75 Ohm BNCs favoured in pro gear.
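If it helps to put numbers on that 75 Ohm figure: the characteristic impedance of a coax is fixed by its geometry and dielectric, not by its length or by the connector on the end. Here is a quick sketch in Python; the dimensions are generic RG-59-ish assumptions, not the specs of any particular Mogami or Belden part.

```python
import math

def coax_z0(d_centre_mm: float, d_shield_mm: float, eps_r: float) -> float:
    """Lossless approximation of a coax's characteristic impedance:
    Z0 ~= (60 / sqrt(eps_r)) * ln(D / d), where D is the inner diameter of
    the shield and d is the diameter of the centre conductor."""
    return (60.0 / math.sqrt(eps_r)) * math.log(d_shield_mm / d_centre_mm)

# Generic RG-59-style numbers (assumptions for illustration only):
# 0.57 mm centre conductor, 3.7 mm dielectric, solid polyethylene eps_r ~= 2.25
print(f"Z0 ~= {coax_z0(0.57, 3.7, 2.25):.1f} Ohm")   # prints roughly 75 Ohm
```

Double the conductor diameter without changing anything else and Z0 drops well below 75 Ohm, which is why "digital" coax is really about holding that geometry to tight tolerances along the whole run.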

FWIW, I have used - by mistake - a pair of 110 Ohm digital cables (Belden 1800B) terminated in XLRs to carry analog audio from pre to power, and it sounded just fine :) The more correct cables to use would be the 1800F (mic cable) or an analog cable such as the 8541.
 
I found this on Blue Jeans Cable website:
Digital Cables and Analog Cables -- What's the Difference?

The line I was looking for:
Turning back to the world of coaxial cable, we can now answer a few questions. Can analog cables be used in digital applications? Yes, up to a point; but the looser tolerances of older analog cable designs will limit their run lengths, at least when used in high-bandwidth applications like SDI video. Can digital cables be used in analog applications? Yes, absolutely; the same tight tolerances which make digital cables appropriate for digital applications make them superb for analog applications. One may not "need" the improvement, but it will never hurt, and can help.
 
This raises a valid question: what is the characteristic impedance of an unbalanced analog audio interconnect cable supposed to be?

Simplistic logic, but shouldn't interconnect impedance ideally be zero? Impedance mainly introduces attenuation (it lowers signal strength), so IMHO that should be the big thing to guard against. Since the signal coming out of an unbalanced RCA port is fairly weak, you would want to make sure you don't weaken it even more with a high-resistance wire.

Of course, interconnects claim to guard against other things like external noise, crosstalk, capacitance etc., but purely from an impedance perspective, shouldn't we be choosing the wire with the smallest impedance, unless the wire itself allows you to match impedance between the source/preamp and the amplifier?
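To put rough numbers on the attenuation worry: at line level the interconnect forms a simple voltage divider against the amplifier's input impedance, and the series resistance of a metre or two of copper is tiny next to it. A quick sketch, with the impedance figures being typical assumptions rather than measurements of any particular gear:

```python
import math

def level_db(r_source: float, r_cable: float, z_input: float) -> float:
    """Level at the amp input relative to the source's open-circuit voltage,
    treating source output impedance + cable resistance + amp input impedance
    as a plain voltage divider (reasonable at audio frequencies over short runs)."""
    return 20.0 * math.log10(z_input / (r_source + r_cable + z_input))

# Assumed figures: 100 Ohm source output impedance, 0.1 Ohm of cable
# resistance for a short run, 10 kOhm amplifier input impedance.
with_cable    = level_db(100.0, 0.1, 10_000.0)
without_cable = level_db(100.0, 0.0, 10_000.0)
print(f"{with_cable:.4f} dB vs {without_cable:.4f} dB")
# Both come out around -0.086 dB, and essentially all of that is the 100 Ohm
# source impedance; the wire's resistance contributes on the order of
# 0.0001 dB, so plain resistance is not what separates one interconnect
# from another.
```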

Edit: I should have read the informative article before shooting off my mouth. The answer is a lot more subtle than I thought.
 
That argument is seemingly logical, but in the larger scheme of electrical engineering every cable is treated as a transmission line, and every transmission line has a characteristic impedance, which is not zero.

Why bother about matching characteristic impedances? Well, when a load sees a matching source, maximum power transfer happens (which, incidentally, still means only half of the total power ends up in the load; the rest is dissipated in the source). At any other load impedance, even less power reaches the load. Further, a mismatch produces reflections and standing waves on the line, measured as the voltage standing wave ratio (lower is better).
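For anyone who wants to see that in numbers, here is a small sketch of the power delivered to a few loads and the resulting VSWR; the 1 V source and 75 Ohm source impedance are illustrative assumptions only.

```python
def power_delivered_mw(v_source: float, r_source: float, r_load: float) -> float:
    """Power dissipated in the load (in mW) for a simple resistive source/load pair."""
    i = v_source / (r_source + r_load)
    return 1e3 * i * i * r_load

def vswr(z_load: float, z0: float) -> float:
    """Voltage standing wave ratio for a purely resistive load on a line of impedance Z0."""
    gamma = abs(z_load - z0) / (z_load + z0)
    return (1.0 + gamma) / (1.0 - gamma)

# Assumed: a 1 V source with a 75 Ohm output impedance driving a 75 Ohm line.
for r_load in (25.0, 75.0, 300.0):
    print(f"load {r_load:>5.0f} Ohm: {power_delivered_mw(1.0, 75.0, r_load):.2f} mW, "
          f"VSWR {vswr(r_load, 75.0):.1f}")
# The 75 Ohm load gets the most power (3.33 mW, exactly half of the total the
# source delivers) and a VSWR of 1.0; the mismatched 25 Ohm and 300 Ohm loads
# get less power and show VSWRs of 3.0 and 4.0 respectively.
```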
 