USB Type A-to-B cable recommendation

My criteria were simple:

1. Want a short cable, if possible less than 1 m.
2. Durable
3. Built to last, with good overall quality

I have been using it for the last 4 to 5 months. No issues so far; music from the PC is as good as from the Oppo 95. Since I haven't compared it with other USB cables, I can't comment on how they differ.
 
WOW - I still can't believe a thread has been started for a USB cable! :D

I agree with Venkat that we should go with the cheapest cable with a ferrite core at both ends. That's the MAX you need!

USB cables from the right source, with proper build quality, are available for as little as Rs. 20 to 30!

A thick HDMI cable with a nylon covering and ferrite cores at both ends costs just Rs. 150. So a decent USB cable with ferrite cores at both ends should not cost more than Rs. 100 to 150 at most. Do not throw more money than that at it; you may never get the benefit out of it.

In fact, any freebie regular USB cable works very well even when you encounter EMI/RFI, as it carries computer data with CRC checks tightly integrated into the protocol. So you may never "hear" any difference between a Rs. 20 one-metre USB cable and ones with ferrite cores, or the most expensive ones available on this planet and beyond. I still can't swallow people spending 6k or more on USB cables - OUCH. Someone please wake me up!!!
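To make the error-detection point concrete, here is a minimal Python sketch of a CRC-16 check catching a single flipped bit in a packet. The polynomial 0x8005 is the one commonly cited for USB data packets, but the packet contents and framing here are only illustrative, not the actual USB protocol.

```python
def crc16(data: bytes, poly: int = 0x8005, init: int = 0x0000) -> int:
    """Plain bitwise CRC-16 (generic parameters, for illustration only)."""
    crc = init
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ poly) & 0xFFFF if crc & 0x8000 else (crc << 1) & 0xFFFF
    return crc

packet = bytes(range(64))          # stand-in for one data payload
good = crc16(packet)

corrupted = bytearray(packet)
corrupted[10] ^= 0x04              # flip a single bit, as EMI/RFI might
bad = crc16(bytes(corrupted))

print(hex(good), hex(bad), good == bad)   # mismatch -> the receiver knows the packet is damaged
```

This only shows detection; what the link does about a damaged packet is a protocol question rather than a cable question.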

Talking of durability and build quality, that Rs. 20 cable or the freebie you got with a device may never fail, ever!!! I have a bunch of those lying here and there, and none of them has ever shown any sign of wear despite me abusing them to the max, as they are sturdily built.
 
Guys, I have been doing a little bit of research on the technical side of USB cables (specifically for audio applications), and here is an interesting link with an attempt at a technically objective answer from Gordon Rankin, who is considered an inventor of and authority on asynchronous USB DAC tech with his company Wavelength Audio. Of course, I do appreciate his caveat that one needs to listen and audition before proclaiming a cable good or bad.

Best USB cable to use between computer and dac? | Computer Audiophile

And here is a heated discussion on Audio Asylum about the benefits of a $550 cable:
Computer Audio Asylum: Synergistic Research USB Cable by Mercman

Anyway, my decision is to stick with my no-name cables for now, with a plan to get a cable in the 2k range soon. I will compare the two, and if I indeed notice any positive difference with the higher-priced cable, then I will plan to get a 6k-and-above one. This way I satisfy myself (who happens to be the most important person to me :ohyeah:). Thank you so far for all your opinions, comments and ridicule - all taken in a positive spirit for the advancement of knowledge :clapping:
Cheers,
Sid
 
Further to my last post, here is a quick logical summary of why USB cables connected between hard drive and computer, and between computer and DAC, have different requirements.
As stated by Steve N. of Empirical Audio:
"The interface between HD and computer does not transfer data real-time, so jitter is not important. Only error-free data transfer is required.

The computer to DAC interface on the other hand transfers data real-time, so jitter matters. The timing accuracy at the D/A converter chip is critical to reproducing the original sampled recording with the same accuracy as it was recorded. If the timing accuracy (jitter) is poor, then distortion will result."

And putting other things aside, jitter is a bad thing in audio reproduction. The question is whether it is readily audible in a computer-based source setup.
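For a rough sense of scale, here is a minimal back-of-envelope sketch in Python (my own numbers, not Steve N.'s): the worst-case amplitude error caused by a timing error on a sine of frequency f and amplitude A is roughly the slew rate times the timing error, i.e. 2*pi*f*A*dt.

```python
import math

f = 20_000   # worst-case audio frequency, Hz
dt = 1e-9    # assumed 1 ns of clock timing error (illustrative figure)
A = 1.0      # full-scale amplitude

err = 2 * math.pi * f * A * dt      # worst-case per-sample amplitude error
print(20 * math.log10(err))         # about -78 dB re full scale for these numbers
```

Whether that is audible in practice depends on how much jitter actually reaches the converter clock, which is exactly the open question here.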
Cheers,
Sid
 

I am slowly transitioning from a dedicated CDP to a computer-based audio setup. I have ripped about 50 CDs and a couple of DVD-As and do not hear any audible distortion whatsoever.

My setup is HP Desktop -> MHDT Paradisea DAC -> (Vincent Hybrid / Musical Fidelity Solid State) Integrated -> LSA statement speakers.

I was hesitant initially about moving to computer-based audio, and in the process compared the output of the above system with CDPs (Rega Apollo, Audio Aero Prima, Musical Fidelity, Raysonic) and couldn't hear any differences (the keyword here is differences, since I don't know what jitter should sound like!!!) :confused:

Theoretically, what Steve is quoted as saying above is true; HOWEVER, a high-priced USB cable will not fix the jitter issue if it exists. From my understanding, it has to be handled by either the soundcard or the DAC.
 
Theoretically, what Steve is quoted as saying above is true; HOWEVER, a high-priced USB cable will not fix the jitter issue if it exists. From my understanding, it has to be handled by either the soundcard or the DAC.

What I interpreted from Steve's comment was that timing errors can be introduced by an inadequate USB cable. Of course the sound quality will be determined by how well the DAC addresses this jitter, but I guess the question should be: why introduce it in the first place if it can be avoided?

BTW, John Atkinson of Stereophile has this explanation for the audibility of jitter:


"The audible effect of jitter suggested by these simulations would be to add a signal-related grundge and lack of resolution as the analog signal's noise floor rises and falls with both the signal and the jitter, while any periodicity in the jitterat the power-line frequency and its harmonics, for examplewill throw up frequency-modulation sidebands around every spectral component of the music. The "clean" nature of the original analog signal will be degraded, "fuzzed up" if you like, to produce the typical, flat-perspectived, often unmusically grainy CD sound."

Cheers,
Sid
 
I consider myself decently smart and very alert at this time, but I cannot understand anything in John's description below, as much as I respect his knowledge. Not surprisingly, it is in line with many other vague definitions of the phenomenon. Finally, an ENT specialist friend of mine, who is also an audio enthusiast, told me that our hearing degrades from about 35 years of age or so. This also tends to have a significant impact on observing aural nuances.

The following article may be helpful for a better understanding. It's been a while since I read it; something for tonight, I guess :)

Jitter & the Digital Interface | Stereophile.com

What I interpreted from Steve's comment was that timing errors can be introduced by an inadequate USB cable. Of course the sound quality will be determined by how well the DAC addresses this jitter, but I guess the question should be: why introduce it in the first place if it can be avoided?

BTW, John Atkinson of Stereophile has this explanation for the audibility of jitter:


"The audible effect of jitter suggested by these simulations would be to add a signal-related grundge and lack of resolution as the analog signal's noise floor rises and falls with both the signal and the jitter, while any periodicity in the jitterat the power-line frequency and its harmonics, for examplewill throw up frequency-modulation sidebands around every spectral component of the music. The "clean" nature of the original analog signal will be degraded, "fuzzed up" if you like, to produce the typical, flat-perspectived, often unmusically grainy CD sound."

Cheers,
Sid
 
Actually, Marsilians, rather than reading descriptions, hearing it (jitter) would make things easier, and the Stereophile Test CD 2 has a track with audible examples of jitter (I think track 24). I have listened to that, and it does, at least to my ears, have a detrimental effect that seems to degrade overall resolution. One thing I use as a yardstick (and I am by no means an expert, just a novice who is a danger to himself and others :ohyeah: in audio) when judging a hi-fi system, amongst other things, is overall system resolution, and my poor layman's understanding of jitter is that it will be a roadblock to achieving this. Hence my original quest for the right USB cable. In fact, at this stage in the game I am contemplating ditching the USB approach entirely, using an MSB Tech USB-to-SPDIF bridge and running a 75-ohm coax to the DAC, but that is a $200 investment as well. So an el-cheapo USB cable will be the first step, and I will make informed upward revisions from there onwards.
Cheers,
Sid
 
Sid, may I ask what you plan to use the USB cable for? If it is to tap the digital input on the Ayon, RoC was raving about how great the M2Tech sounded on Vista/W7 as opposed to XP when he connected it to the CD1s. He had almost decided to sell it after using it on XP but now it is firmly back in his digital chain. May be worth your while to pursue that approach.
 
"The interface between HD and computer does not transfer data real-time, so jitter is not important. Only error-free data transfer is required. The computer to DAC interface on the other hand transfers data real-time, so jitter matters. The timing accuracy at the D/A converter chip is critical to reproducing the original sampled recording with the same accuracy as it was recorded. If the timing accuracy (jitter) is poor, then distortion will result."

I have been saying this for quite some time here, and I have seen people get aggressive, screaming that digital is digital. Jitter is quite easy to identify - it comes in many forms: a sudden change in the speed of the song, a sudden gruffness in voices or instruments, a sudden skipping of a few notes (like a TT jumping a groove). These are effects you can hear. Jitter also works at a subtle level to change the soundstage, depth, and tonality of the instruments and/or voice. If you have heard the track a number of times, one can easily recognise that there is something wrong. If you play 'Joy of Life' by Kenny G, you can easily discern the changes in note as he reaches peaks in the scale and very quickly descends. In a few parts he does this very, very rapidly, and a good ear can easily identify errors, if any, in the recording or playback. Once, when Capt. Rajesh and a few others were at my place and we were experimenting with cables, we could easily make out the difference. The Captain termed it best by saying the 'bite' was missing. In addition, the timing between one note changing to another is split-second, and the first note should not die too early.

Of course, to my ears, terms like 'Quantum Tunnelling' sound like complete hogwash. Something like a Wireworld cable would be quite satisfactory.

Ideally the DAC's clock must be completely disconnected from the source's clock, and the DAC must buffer and re-sample the whole song. This is an expensive and laborious process.
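As an illustration of the buffering idea (a toy model in Python, not how any particular DAC is implemented): samples that arrive with irregular timing can be written into a buffer and read back out on a steady local clock, so the output timing no longer depends on the input timing as long as the buffer neither runs dry nor overflows.

```python
import random
random.seed(0)

fs = 44_100
period = 1.0 / fs

# Source-side sample arrival times with +/-20% wobble (simulated incoming jitter).
arrivals, t = [], 0.0
for _ in range(1000):
    t += period * random.uniform(0.8, 1.2)
    arrivals.append(t)

# Buffered, locally clocked output: start a little late (so the buffer never
# runs dry in this short run) and emit one sample per fixed local-clock period.
start = arrivals[50]
outputs = [start + i * period for i in range(len(arrivals) - 50)]

in_gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]
out_gaps = [b - a for a, b in zip(outputs, outputs[1:])]
print(max(in_gaps) - min(in_gaps))    # clearly non-zero: jittery input timing
print(max(out_gaps) - min(out_gaps))  # essentially zero: output ticks on the local clock
```

Keeping that buffer from slowly draining or overflowing over a long track is the hard part; asynchronous USB tackles it by letting the DAC pace the source rather than the other way around.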

In fact, at this stage in the game I am contemplating ditching the USB approach entirely, using an MSB Tech USB-to-SPDIF bridge and running a 75-ohm coax to the DAC, but that is a $200 investment as well. So an el-cheapo USB cable will be the first step, and I will make informed upward revisions from there onwards.

Don't be in a hurry; take this step carefully. If you are using a sound card, you may not even need a USB cable. And if you are using USB to transfer the songs, the DAC must have respectable asynchronous processing inside. Of course, dCS is too expensive, but a lot of others have licensed the technology and are using it at affordable prices.

BTW, I have heard jitter even with Tara Labs S/PDIF cables!


Cheers
 
The following article, by Alan Taffel, from http://media.avguide.com/Digital_Source_Components_Buyers_Guide.pdf may be good reading.

Alan Taffel / AVGuide said:
No matter what anyone - or any manual - tells you, USB is not plug-and-play. Not if you want to get the best sound from this interface. Overcoming your PC operating system's inherent limitations is the first challenge. If you are running Windows XP and follow the (dCS's) Debussy manual's instructions, for instance, you will likely end up (knowingly or not) invoking Windows DirectSound, which means bits will pass through the dreaded kernel mixer. Following the manual to the letter, I achieved what I have come to view as typical USB sound: smeared rhythms, closed extension, and screechy strings. In short, yuck. Regardless of your Windows OS, what you want to do is bypass all its junk by using the far superior, professional-standard ASIO driver set. (Another option is the recent WASAPI, but I did not have time to experiment with it.) Most music-playing software packages, like Media Monkey, support ASIO.

The problem is that the Debussy doesn't. However, a nifty, freely downloadable package called ASIO4ALL solves the problem. Not only are these drivers bit-perfect, they dynamically adapt to the source material's sample rate. This is an important provision, because standard PC (and Mac) drivers asynchronously upsample data to the highest supported sample rate - a sonically injurious process. ASIO4ALL will work with virtually any playback software except iTunes. Once you've heard the way ASIO restores USB's air and dynamics, you will never go back.

After sorting out software, there is still the matter of cables. As I have stressed before, USB cables make a demonstrable difference. For my tests with the Debussy, I experimented with five of them, ranging from the Brand X variety that comes with printers to audiophile models. The winner this round - just as in the last time I conducted a USB cable survey - was the unpretentious Belkin Gold series. The difference this cable makes is not remotely subtle. Depending on what you are comparing it to, it can be the difference between music and wallpaper. Here, blessedly, is one area of the high end that does not require spending a fortune; the Belkin costs under five dollars.
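On the bit-perfect point, here is a minimal Python sketch (my own illustration, using a crude linear-interpolation resampler as a stand-in for an OS mixer's sample-rate conversion) showing that a 44.1 kHz -> 48 kHz -> 44.1 kHz round trip does not give back the original samples:

```python
import numpy as np

fs_in, fs_out = 44_100, 48_000
n = 44_100                                     # one second of a 1 kHz test tone
t_in = np.arange(n) / fs_in
x = np.round(np.sin(2 * np.pi * 1000 * t_in) * 32767).astype(np.int16)

# Crude linear-interpolation SRC, standing in for whatever the kernel mixer does.
t_up = np.arange(n * fs_out // fs_in) / fs_out
up = np.interp(t_up, t_in, x.astype(np.float64))
back = np.interp(t_in, t_up, up)
y = np.round(back).astype(np.int16)

print(np.array_equal(x, y))                    # False: the round trip is not bit-identical
print(int(np.max(np.abs(x.astype(np.int32) - y.astype(np.int32)))))  # worst-case sample error
```

A real resampler is far better than this one, but it still rewrites every sample; bypassing the mixer (ASIO, or WASAPI in exclusive mode) is what leaves the data untouched.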
 
Bluu & Venkat - My idea of a computer based setup consists of a Xp based laptop (I want to use my Hp mini with an atom processor, not sure if that will work first of all) with an USB connection to the Ayon Cd2s which has an asynchronously implemented DAC with USB connection capability upto 24/96. Down the road I will upgrade to an Mac lap top or possibly even a mac mini. Another option is buy a SPDIF bridge and use that instead. Few users have reported that the SPDIF bridge with coax cable sounded superior to the USB option.
Cheers
Sid
 
Another option is to buy an SPDIF bridge and use that instead. A few users have reported that an SPDIF bridge with a coax cable sounded superior to the USB option.

Precisely! Stello, Lindemann, Halide-Bridge and M2Tech come to mind.
 
Precisely! Stello, Lindemann, Halide-Bridge and M2Tech come to mind.

Thanks, Bluu - and of these, the cheapest option would be the M2Tech hiFace USB bridge, retailing for $190. So a no-brainer, proven approach, with many users here on HFV in favour of it, would be: instead of spending $150 or so on a good USB cable (with deeply divided opinions on the benefits), spend $40 more and go for this. Another step up would be the hiFace Evo. Anyway, these options are in the back of my mind, and I will try to get one for an audition before I pull the trigger.
Cheers,
Sid
 
Jitter is quite easy to identify - it comes in many forms: a sudden change in the speed of the song, a sudden gruffness in voices or instruments, a sudden skipping of a few notes (like a TT jumping a groove). These are effects you can hear.
Some of that is not, and could not be, "jitter".

Jitter also works at a subtle level to change the soundstage, depth, and tonality of the instruments and/or voice.
But that could be :)

Apparently it can cause actual wrong notes to appear in the music. I take a look at the Stereophile article linked above from time to time, but have not yet managed to understand it fully. However, some of the diagrams reveal quite a lot, even to the innumerate such as myself: it seems that jitter can sometimes be heard as a hiss, or can even inject "wrong notes" into the music. The diagrams in the Stereophile article, as well as others, explain this a million times better than I ever could. Possibly this discordant-note injection might explain what you have perceived as faulty pitch or incorrect tempo? Just making suggestions there.

It seems that the effects of jitter can be subtle, or unsubtle, but they do not extend to dropouts or skipping. I might be better able to suggest what does cause those when I have managed to cure all the problems with my own system. I am fairly sure the answer lies in the areas of interrupt handling and buffering.

This stuff is hellishly difficult to trace because, unlike in analogue, there is no way we can listen to what is happening along the path to the DAC, and events that may make a very audible difference to our final sound may be happening in fractions of time that we cannot see, measure, or detect with normal system monitoring software.

Jitter is not a PC-only problem. It potentially applies to any DA or AD conversion, and one of the things we will all be reading in the articles we find is that the data on our CDs may very well have been subject to jitter. Somehow, though, we never thought about it much before we started using the PC as source.

Another one for the "Link Library": Digital Problems, Practical Solutions, Sound On Sound Magazine. It's certainly off-topic for USB cables, but reading it, along with listening to the provided sound samples, covers a lot of digital audio basics. I had no real idea before this, for instance, what dithering is about. Because it illustrates it with sound rather than maths (the graphs are there too, though), it makes things very clear.

Here's a thing: he regards jitter as a problem pretty much solved:
A lot of fuss is still made about jitter, but while it is potentially a serious issue it's rarely a practical problem these days simply because equipment designers and chip manufacturers have found very effective ways of both preventing it and dealing with it.

...

If the clock jitter is entirely random, the resulting distortion will also be random, and a random signal is noise. Since a high-frequency signal changes faster than a low-frequency one, small timing errors will produce larger amplitude errors in a high-frequency signal. So random jitter tends to produce a predominately high-frequency hiss. I've yet to hear that on any current digital system, though; clocking circuits these days are just too good for this to be a practical problem.

On the other hand, if the jitter variations are cyclical or related to the audio, the distortions will be tonal (similar to aliasing) or harmonic, and they'd tend to be far more obvious and audible. But I've not heard that on any current digital audio system either: other than in very low cost equipment with extremely inferior clocking structures, A-D and D-A jitter just isn't a practical problem anymore.
OK, though, he goes on to say...
Another source of jitter (the strongest source these days) is cable-induced. If you pass digital signals down a long cable (or fibre), the nice square-wave signals that enter degrade into something that looks more like shark fins at the other end, with slowed rise and fall times. This is caused by the cable's capacitance (or the fibre's internal light dispersion), so the longer the cable, the worse the degradation becomes. That's why digital cables need to be wide-bandwidth, low-capacitance types.

This matters because most digital signals incorporate embedded clocks along with the audio data, and that clocking information is determined from the rise and fall between the data pulses. If the clocking edges are vertical, the clocking moments are obvious. However, if the clocking edges slope, the timing point becomes ambiguous and we now have embedded jittery clocks!

When passing digital audio between one system and the next, the precise clock timing actually doesn't matter that much, as long as the average sample rate is the same for both. All that's needed is to be able to determine at each clock moment what the binary value of each bit is in the binary word.

However, when sampling or reconstructing an analogue signal, the clocking moments are critically important, as explained. So if a D-A relies on using the jittery embedded clocking information from its input signal to reconstruct the analogue output, there could be a problem with jitter noise or distortions. Fortunately, most modern D-As incorporate sophisticated jitter-removal systems to provide isolation between the inherently jittery incoming clocks embedded in the digital signal, and the converter stage's own reconstruction clock.
But please note the last line! Modern? He was writing in 2008. :)
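Out of curiosity, the "random jitter becomes noise" claim is easy to play with numerically. A minimal Python sketch (my own, with assumed jitter figures): sample a full-scale 10 kHz sine with a slightly wobbly clock and measure how far the result deviates from ideal sampling.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, f, n = 96_000, 10_000, 1 << 16

t_ideal = np.arange(n) / fs
for jitter_rms in (1e-9, 1e-10):              # 1 ns and 100 ps RMS clock jitter (assumed)
    t_jit = t_ideal + rng.normal(0.0, jitter_rms, n)
    err = np.sin(2 * np.pi * f * t_jit) - np.sin(2 * np.pi * f * t_ideal)
    print(jitter_rms, 20 * np.log10(np.sqrt(np.mean(err ** 2))))   # error level in dB re full scale
```

With those figures the error comes out around -87 dB and -107 dB relative to full scale, which fits his point that competent clocking pushes random jitter well below audibility; periodic or signal-correlated jitter is the nastier case, because it shows up as discrete sidebands rather than a smooth hiss.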

Frankly, though, I do wonder if PC architecture development isn't actually moving away from being suitable for audio. I have experienced many more problems with the PC equipment I've owned over the past five years than I did with what I had in, say, 2003 :( --- and that is including using the same sound card in those different computers.

Venkat, yes, digital is digital, but that absolutely does not mean that digital audio is without problems! :)


 
Venkat - I respect your outlook. However, the phenomenon we are talking about at length is essentially speculation, as below:

1. We assume that a "regular" USB (or any other) cable would have "jitter" (despite being tightly synchronised to a clock and having a CRC mechanism in place). Now, even if we assume that a cheap cable will have bad conductors, insulation and connectors - how does that amount to a "hold and proceed" kind of phenomenon that results in jitter??? How can a bad conductor or cable cause the data transfer to get delayed when the data is broadcast at a synchronous clock rate? If the cable cannot pass data as fast as intended, the data would simply collide with the subsequent datagrams and the link would eventually become completely useless! And why do we always tend to assume that a cheaper cable would have worse conduction and speed than an expensive one?

2. We assume that such a phenomenon may exist based on our listening tests, instead of analysing the logic behind it.

3. As you have already indicated, it is not necessarily true that an "expensive" cable would give "better" results than the cheapest one! This applies to any regular product as well, and hence we tend to rely on reviews and our own perception and analysis.

My contention here is that it is not really wise to spend on something so speculative, which really does not have a solid basis and is more a matter of perception.

To give an example with SPDIF: my Tata Sky+ HD STB has optical and coaxial outputs, and I currently have both connected to my AVR and have tested them extensively by switching between them. The outcome? While they both sound quite similar, on the optical connection I hear harsher and more disturbed harmonics, especially around the mid-high frequencies and above (the mid-highs are harsher and the high frequencies more rolled off). So my Rs. 10 RCA cable performs better than the Rs. 200 optical cable I bought (I accept both were the cheapest I could find!).

The above difference could well be attributed to the analog characteristics of my AVR (there could be different preamp stages after the DAC).

I have used several makeshift connections - a 3.5 mm headphone plug instead of an RCA, a common ground connected elsewhere, etc. - and I never heard any difference in quality. Now, even if that makes a difference, I am sure it would not be as big as the one I observe when using the highest-quality analog connections between the source and the AVR. So I am happy as long as I get far superior audio by using a Rs. 10 RCA cable to transfer SPDIF data to the AVR reliably, without any noticeable jitter (I have never observed jitter of the kind you are talking about - except when I watch Tata Sky, which has jitter on every channel!). Jitter and most other glitches are very easily heard when listening through headphones, and hence I can safely conclude that I have never, ever heard them in the past several years of listening.

I recently went with a Rs. 150 HDMI cable which looks very solidly built to last, especially as it has a thick nylon mesh housing on the cable and ferrite loops at both ends. It has worked flawlessly with all the HD sources I have tried to date. So I am content that I saved hard-earned money to invest in something more sensible that would definitely give me better quality - better speakers, AVR, etc. Or I would rather spend it on quality philanthropy!
 
Thanks, Bluu - and of these, the cheapest option would be the M2Tech hiFace USB bridge, retailing for $190. So a no-brainer, proven approach, with many users here on HFV in favour of it, would be: instead of spending $150 or so on a good USB cable (with deeply divided opinions on the benefits), spend $40 more and go for this.

You can also take a look at the Jkeny modded Hiface. Pretty steep in price, but just an interesting variant of the original Hiface - Hiface Modifications & Ancillaries

Review on head-fi (as always, take it with a pinch of salt) - Review: Jkeny’s modified Hiface
 
Prankey, we all seem to forget that all forms of digital transmission are error-prone. Cat 5, USB, and SATA were all designed for speed or convenience, not for accuracy. Why? Simply because digital transmission is a two-way process, and both sides can talk to each other. I am sure you have heard of data loss? Accuracy is achieved through checksums and re-transmission. The receiving side can calculate and make an exact assessment of whether the received packet is bit-perfect. If not, it simply asks for a re-transmission.

In a digital audio/video system, the talking is one-way. And that is what makes all the difference. Once a packet is received, the receiving station can do nothing but process the data. And that is when all the trouble starts. Also remember, USB is also carrying 5V power and ground all the time. And the data-carrying cable should be certified for data transmission speeds of up to 12 Mbps. Badly constructed cables can let the power bleed into the data and create interference. How? Simply by not being shielded well and not grounded well. If the data cables used are of low quality, the data speeds will be forced down, and there will be data losses. In a computer-to-computer transmission, this does not matter. In a computer-to-DAC kind of environment, as I said before, data lost is data lost.
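The two-way versus one-way point is the crux, so here is a minimal toy sketch in Python (purely illustrative, not actual USB semantics): with an acknowledge-and-retry link a corrupted packet gets re-sent, while with a one-way stream the receiver can only detect the damage and move on.

```python
import random
random.seed(1)

def send(packet):
    """Simulated lossy link: roughly 20% of packets arrive corrupted."""
    return None if random.random() < 0.2 else packet

packets = list(range(20))

# Two-way, file-copy style: a bad packet is detected and requested again.
received_two_way = []
for p in packets:
    got = send(p)
    while got is None:                 # checksum failed -> ask for a re-transmission
        got = send(p)
    received_two_way.append(got)

# One-way, streaming style: a bad packet is simply gone.
received_one_way = [got for p in packets if (got := send(p)) is not None]

print(received_two_way == packets)     # True: eventually bit-perfect
print(len(received_one_way), "of", len(packets), "packets survived the one-way stream")
```

As far as I know, USB isochronous audio transfers carry a CRC but are not retried, which is exactly the asymmetry being described here.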

Yes, I do agree that all this talk of $500-1000 USB cables is hogwash. Gold-plating the connectors is the biggest con in the world, as the external part of the USB plug has no relevance for data transmission. The internal contacts and the cable used make all the difference. These can be made well for a reasonable price, and that is all that matters.

Cheers
 
Also remember, USB is also carrying 5V power and ground all the time. And the data-carrying cable should be certified for data transmission speeds of up to 12 Mbps. Badly constructed cables can let the power bleed into the data and create interference.
Cheers

Sorry, I need some education here. I'm not sure how USB power, which is DC, can bleed into the data unless we are talking about a short circuit. It will just create a steady magnetic field. A DC current does not radiate electromagnetic energy. The data lines can be affected by that steady magnetic field, but I would want to see some real measurements. My hunch is that the effect should be minimal. Does anyone have any real data?

12 Mbps USB is ancient (the 1.1 spec). Any cable sold now should be certified for 480 Mbps (USB 2.0) or 4.8 Gbps (USB 3.0).
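For context, the bandwidth audio actually needs is small. A quick calculation (my own, ignoring USB framing overhead):

```python
# Raw PCM payload rates, ignoring protocol overhead (an assumption).
for rate, bits, ch in [(44_100, 16, 2), (96_000, 24, 2), (192_000, 24, 2)]:
    mbps = rate * bits * ch / 1e6
    print(f"{rate} Hz / {bits}-bit / {ch} ch -> {mbps:.2f} Mbps")
```

Even 24/192 stereo works out to about 9.2 Mbps, which fits inside old 12 Mbps USB 1.1 and is a rounding error next to 480 Mbps USB 2.0.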
 
Bleeding power into digital data? Isn't digital data immune to interference? I mean, we have been hearing that 1s and 0s are 1s and 0s, no matter what.
 