Adul
Elite Member
Originally posted by: mshan
I would argue that my "premium" HDMI cable isn't enhancing the signal vs. basic Monoprice ones; it is just passing the signal with less degradation and distortion. (Hence my comment that the Monoprice cable may look slightly coarse, but that you may not realize it until you compare it to a cable that is less price-constrained in its construction than a basic Monoprice one.)
It is hard to believe that the HDMI interface, in real-world applications, is perfect and completely distortion-free, unless you have a catastrophic all-or-nothing dropout.
If a basic Monoprice cable ever so slightly blurs the signal (jitter?), then a lower-jitter cable (Wireworld Ultra-Violet) could pass a sharper, lower-noise, higher-resolution signal with seemingly more accurate colors, simply because it is truer to the original source.
The signal degradation would matter if it were analog, but this is a pure digital signal. What degradation is going to happen to a digital signal? It is not susceptible to noise or interference in the same way an analog signal is.
But a digital signal, because of the way its information is stored, can be quite robust. While the signal will always degrade to some degree in the cable, if the receiving circuit can actually reconstitute the original bitstream, reception of the signal will be, in the final analysis, perfect. No matter how much jitter, how much rounding of the shoulders of the square wave, or how much noise, if the bitstream is accurately reconstituted at the receiving end, the result is as though there'd been no degradation of the signal at all.
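To make that point concrete, here is a minimal sketch of the idea, not an actual HDMI/TMDS implementation: the transmitted waveform gets rounded edges and additive noise, yet as long as the receiver can still tell "high" from "low" at each sampling instant, the recovered bitstream is bit-for-bit identical to the original. All names and parameters below (oversampling factor, noise level, threshold) are illustrative assumptions.

```python
import random

random.seed(42)

BITS = [random.randint(0, 1) for _ in range(1000)]  # original bitstream
SAMPLES_PER_BIT = 10                                # oversampling factor
HIGH, LOW = 1.0, 0.0
THRESHOLD = 0.5                                     # receiver decision level

def transmit(bits):
    """Turn bits into an 'analog' waveform with rounded edges and noise."""
    wave = []
    prev = LOW
    for b in bits:
        target = HIGH if b else LOW
        for i in range(SAMPLES_PER_BIT):
            # crude low-pass effect: edges ramp instead of switching instantly
            frac = min(1.0, (i + 1) / 3)
            level = prev + (target - prev) * frac
            noise = random.gauss(0, 0.1)            # additive channel noise
            wave.append(level + noise)
        prev = target
    return wave

def receive(wave):
    """Sample each bit cell at its center and slice against the threshold."""
    bits = []
    for k in range(len(wave) // SAMPLES_PER_BIT):
        center = k * SAMPLES_PER_BIT + SAMPLES_PER_BIT // 2
        bits.append(1 if wave[center] > THRESHOLD else 0)
    return bits

recovered = receive(transmit(BITS))
print("bit errors:", sum(a != b for a, b in zip(BITS, recovered)))
# With noise this mild, this typically prints "bit errors: 0" -- the waveform
# is visibly degraded, but the reconstituted bitstream matches the original.
```

The flip side is also worth noting: if the degradation ever gets bad enough that samples cross the threshold, errors appear abruptly rather than as a gradual "softening" of the picture, which is why digital links tend to fail all-or-nothing rather than getting subtly blurrier.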