TAS reader and frequent AVGuide commenter ScottB submitted this piece, originally in response to a thread that developed around a USB cable review. I thought he eloquently summarized a reasonable view of more broadly polarizing issues. This won't stop the food fight, but for those who actually want to think about what we know and how we know it, I thought it was a thought-provoking read.
Here's Scott's piece:
Some relevant information about me: I'm an MSME who spent most of his career in software, culminating in 6 years as CTO of a F500 software company. I'm hardly a credulous subjectivist, either by training or by temperament. But a lot of the righteous certainty expressed (often hyperbolically) by the "objectivists" on this thread, and more generally in this forum and others, is just plain fundamentalist ignorance, born of oversimplification, misapplication of technical knowledge, and failure to think curiously and creatively.
The criticisms of the review fall into two camps that one often sees in other threads and comments: 1) you can't possibly have heard any differences, because it's impossible for cables to change the sound, and/or 2) the only valid way to determine audible differences is a double-blind test (DBT).
Let's look at each of these in turn. First, as a number of posters have already pointed out, the USB audio protocol used by most DACs today is a synchronous protocol, very different from the packet-based protocols used to transmit data between a CPU and other types of peripherals. Synchronous digital transmission protocols, like SPDIF, AES, and synchronous USB, transmit both data and clock (timing) in the same analog-like waveform. Accurately recovering the data from that waveform is trivially simple compared with accurately recovering the timing. That is significant.
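To make that data-versus-timing asymmetry concrete, here is a toy sketch (not a real SPDIF decoder) of biphase-mark coding, the channel code SPDIF uses: a transition at every bit-cell boundary, plus a mid-cell transition for a 1. Adding random timing error to every edge (the noise magnitude is an arbitrary illustration) leaves the recovered bits perfectly intact, while the recovered edge timing carries every bit of that error.

```python
import random
random.seed(1)

bits = [random.randint(0, 1) for _ in range(64)]
cell = 1.0  # nominal bit-cell duration (arbitrary units)

# Biphase-mark coding: a transition at every cell boundary,
# plus a mid-cell transition when the bit is a 1.
edges = [0.0]
t = 0.0
for b in bits:
    if b:
        edges.append(t + cell / 2)
    t += cell
    edges.append(t)

# Perturb every edge by a small random timing error
# (standing in for cable filtering, reflections, noise).
noisy = [e + random.uniform(-0.05, 0.05) * cell for e in edges]

# Data recovery: classify each inter-edge interval as a half cell
# (two of them make a 1) or a full cell (a 0). This is robust:
# intervals stay within +/-0.1 cell of nominal, far from the threshold.
intervals = [b - a for a, b in zip(noisy, noisy[1:])]
decoded = []
i = 0
while i < len(intervals):
    if intervals[i] < 0.75 * cell:   # two half-cell intervals -> a 1
        decoded.append(1)
        i += 2
    else:                            # one full-cell interval -> a 0
        decoded.append(0)
        i += 1

print("data recovered exactly:", decoded == bits)
# But the recovered *timing* is no longer uniform:
peak_error = max(abs(e - n) for e, n in zip(edges, noisy))
print(f"peak edge timing error: {peak_error:.3f} cells")
```

The bits survive because the decision thresholds have huge margins; the clock does not, because every edge position is the clock.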
All digital audio systems exhibit some degree of timing error (aka jitter), which only becomes significant when the signal is transformed back to analog. Errors in timing of the data stream fed to the reconstruction filter(s) create distortion in the analog output. This distortion is inharmonic (its components are not harmonically related to the signal), and thus audible at much lower magnitudes than the harmonic distortion introduced by analog components and speakers. The magnitude of jitter-related distortion depends in a complex way on the characteristics of the jitter itself, as well as the encoded audio signal. None of this is controversial or unscientific; digital transmission jitter has been studied extensively, in both audio and other contexts.
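As a rough illustration of why jitter products are inharmonic, here is a sketch that applies sinusoidal jitter to the sampling instants of a pure tone. The numbers (10 ns peak jitter, 300 Hz jitter frequency) are arbitrary, chosen only to make the sidebands easy to find; phase-modulation theory predicts sidebands at f0 ± fj, roughly 20·log10(π·f0·tj) ≈ −90 dB below the tone here, at frequencies that are not harmonics of the tone.

```python
import numpy as np

fs = 48_000        # sample rate, Hz
f0 = 1_000         # test tone, Hz
fj = 300           # jitter modulation frequency, Hz (illustrative)
tj = 10e-9         # peak timing error, 10 ns (illustrative)
n = np.arange(fs)  # one second of samples

# Ideal sample instants vs. instants perturbed by sinusoidal jitter.
t_ideal = n / fs
t_jittered = t_ideal + tj * np.sin(2 * np.pi * fj * t_ideal)

# The converter outputs the value sampled at the *wrong* instant
# on the *right* uniform grid -- that mismatch is the distortion.
x = np.sin(2 * np.pi * f0 * t_jittered)

spectrum = np.abs(np.fft.rfft(x * np.hanning(len(x)))) / len(x)
freqs = np.fft.rfftfreq(len(x), 1 / fs)

def level_db(f):
    """Spectrum level near frequency f, in dB relative to the main tone."""
    i = np.argmin(np.abs(freqs - f))
    peak = spectrum[max(i - 2, 0):i + 3].max()
    return 20 * np.log10(peak / spectrum[np.argmin(np.abs(freqs - f0))])

# Sidebands appear at f0 - fj and f0 + fj: 700 Hz and 1300 Hz,
# neither of which is harmonically related to the 1 kHz tone.
print(f"sideband at {f0 - fj} Hz: {level_db(f0 - fj):.1f} dB")
print(f"sideband at {f0 + fj} Hz: {level_db(f0 + fj):.1f} dB")
```

Real jitter is rarely a clean sinusoid, but the mechanism is the same: timing error modulates the signal, scattering energy to unrelated frequencies.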
Can cables impact jitter? Yes. In fact, from the perspective of pure physics, it's impossible for a cable not to impact timing in a synchronous digital transmission system, because the signal itself is effectively an analog waveform containing very high frequency components. The interaction between the electrical characteristics of the cable and the source and sink it connects will change the shape of that waveform to some degree, and thus contribute to timing errors (jitter). How much those timing errors affect the final analog signal depends in a very complex way on interactions between the cables, the transmitting and receiving circuits, the signal data itself, and the type and implementation of the various clock recovery (de-jittering) schemes on the receiving end. But in general it is not possible to make a perfect de-jittering algorithm, and therefore not possible to make a synchronous digital transmission cable which will be perfectly "transparent" in the context of a complete system. So, in an absolute sense, the question of whether USB or SPDIF digital cables can impact the sound is pretty simple: of course they can. In fact, to some degree they must.
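A back-of-the-envelope sketch of one such mechanism: model the driver as a source impedance charging the cable's lumped capacitance, and look at when the slowed edge crosses the receiver's decision threshold. The component values below are invented for illustration, not measurements of any real cable. The point is that edge shape translates directly into crossing time; a constant skew by itself is harmless, but anything that makes the effective RC or the edge's starting voltage vary from bit to bit (inter-symbol interference, reflections, noise) turns that skew into jitter.

```python
import math

# Toy model: a logic edge driving a cable treated as a lumped RC low-pass.
r_source = 50.0        # driver source impedance, ohms (illustrative)
c_cable_a = 100e-12    # cable A: 100 pF total capacitance (illustrative)
c_cable_b = 150e-12    # cable B: 150 pF (illustrative)

def threshold_crossing(rc, threshold=0.5):
    """Time for an ideal step through an RC low-pass to reach `threshold`
    of its final value: v(t) = 1 - exp(-t/RC)."""
    return -rc * math.log(1 - threshold)

t_a = threshold_crossing(r_source * c_cable_a)
t_b = threshold_crossing(r_source * c_cable_b)

print(f"cable A crossing: {t_a * 1e9:.2f} ns")
print(f"cable B crossing: {t_b * 1e9:.2f} ns")
print(f"static skew:      {(t_b - t_a) * 1e9:.2f} ns")
```

Nanosecond-scale shifts from tens of picofarads, in a system where audible jitter effects are measured in nanoseconds or less, is why "the cable can't possibly matter" is not a conclusion you get from the physics.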
(BTW, for an object lesson in the dangers of oversimplifying systemic interactions between components, see the following set of measurements on speaker cables: http://www.audiodesignline.com/howto/showArticle.jhtml?articleID=201807390 in which the experimenters were shocked, shocked to find out that cables and amplifiers behave unexpectedly non-linearly when connected to a real load instead of a resistor).
OK, so yes, there can be differences in the analog audio signal resulting from "digital" cables. Are those differences audible? Well, that question can't be answered in general, because it's so dependent on both the specific system, and the listener. Here's where we get to that DBT part.
There is nothing wrong with applying DBT testing, per se, to audio equipment, just as long as we keep in mind what it is we are testing. Specifically, DBT testing as applied to music listening conflates four different tests into one test. If a double-blind A/B/X test does not yield a statistically significant difference between A and B, at least these possibilities exist:

1) There are no differences in the physical sound between A and B; or

2) the differences in sound between A and B fall below the perceptual threshold of the test subject's hearing mechanism; or

3) A and B sound different at a level which could rise above the perceptual threshold, but the differences are obscured by the rest of the test environment (equipment, room, source content, etc.); or

4) the differences between A and B rise above the perceptual threshold, but are not sufficiently recognized or remembered to enable the test subject to reliably identify the source.
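For concreteness, the statistics side of an ABX test is just a one-sided binomial tail: how likely is a score at least this good by pure guessing? A quick sketch (the trial counts are arbitrary examples). Note what a non-significant result does and does not tell you: it only means chance wasn't ruled out, and it cannot distinguish which of the four possibilities above is responsible.

```python
from math import comb

def abx_p_value(correct, trials):
    """Probability of scoring `correct` or better in `trials` ABX trials
    by pure guessing (p = 0.5 per trial): a one-sided binomial tail."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# A listener who gets 12 of 16 right is unlikely to be guessing
# (p < 0.05 under the usual convention)...
print(f"12/16: p = {abx_p_value(12, 16):.4f}")

# ...but 10 of 16 is entirely consistent with chance -- which, per the
# four possibilities above, is not the same as "no audible difference."
print(f"10/16: p = {abx_p_value(10, 16):.4f}")
```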