40

Assuming two cables both conform to the same standards (e.g. HDMI2.0, high-speed, ...) is it possible that a premium cable will outperform a standard cable when using an objective benchmark that measures the quality of picture/audio? Or what experiment would show that a premium cable has a benefit over an 'ordinary' cable?

13
  • 9
    When using a subjective measure, almost anything is possible. (-:
    – Jim Mack
    Commented Oct 8, 2018 at 15:15
  • 1
    I hope this doesn't get closed. It raises issues related to high-speed cabling in general that also apply to things like eSATA, USB SuperSpeed, Ethernet, PCIe, etc.
    – Dave Tweed
    Commented Oct 8, 2018 at 15:17
  • 2
    I don't mean to be defensive, but I don't think that this question is about use of electronic devices. Look at the currently leading answer to see how technical and engineering-related this topic is.
    – zr.
    Commented Oct 8, 2018 at 17:02
  • 1
    @Chupacabras: See my answer. "Standard" cable is built just well enough to meet the given specifications at the given length(s).
    – Dave Tweed
    Commented Oct 8, 2018 at 20:26
  • 2
    Most HDMI cables work fine for 1080p. However, I have found that when pushing past this (4K 60 fps, higher color depth, etc.) you run into a lot of cables that don't. This is even more so the case with DisplayPort (anecdotal - my experience of dealing with a lot of cables), where I found that unless it is a cable from a reputable brand (and in the case of DP listed on the DP website as being approved), it tends to simply not work for anything above 1080p, or even worse, work intermittently.
    – Joren Vaes
    Commented Oct 9, 2018 at 5:13

10 Answers

46

You would do a BERT (bit error rate test) on the cable. Better yet, look at the eye diagram at the far end of the cable.
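
For a sense of what a BERT result boils down to, here is a minimal sketch in Python, assuming you could capture matching transmitted and received bit streams (a real instrument does this by sending a known pseudo-random pattern); the names and numbers are purely illustrative:

    # Minimal BER sketch: count differing bits between what was sent and what
    # arrived. A real BERT instrument does this in hardware at line rate.
    def bit_error_rate(sent: bytes, received: bytes) -> float:
        assert len(sent) == len(received)
        errors = sum(bin(a ^ b).count("1") for a, b in zip(sent, received))
        return errors / (8 * len(sent))

    tx = bytes(1_000_000)          # stand-in for a known test pattern
    rx = bytearray(tx)
    rx[1234] ^= 0x10               # pretend the cable flipped one bit
    print(bit_error_rate(tx, bytes(rx)))   # -> 1.25e-07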

HDMI is a digital format, which means that there's a threshold effect — cable quality does not affect the picture quality at all until it gets so bad that it actually causes bit errors.

"Premium" cable is (supposedly) built to tighter tolerances (reduced ISI), with thicker wire (reduced attenuation) and/or with better shielding (reduced external interference) so that you can have longer runs of it before that starts to happen.

Bit errors flip individual bits, and the visual effect depends on exactly what that bit is used for. A bit error in one of the MSBs of a color channel will cause a pixel to be unexpectedly brighter or darker than it should be — this is commonly called "salt and pepper noise" because in a B&W system, the random white and black pixels look like salt and pepper have been sprinkled on the image.
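
To make the "MSB of a color channel" point concrete, a two-line illustration (the values are chosen arbitrarily):

    # Flipping the MSB of an 8-bit colour value moves it by 128 out of 255,
    # which is why isolated bit errors show up as conspicuous bright or dark
    # dots rather than a subtle overall quality change.
    value = 0x47                # 71: a dark grey level
    corrupted = value ^ 0x80    # MSB flipped -> 199: suddenly bright
    print(value, corrupted)     # 71 199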

14
  • 35
    That statement should be made stronger: the cable does not affect image quality in any way whatsoever so long as there are no bit errors. Bit errors will cause "digital snow" on the image. You will not see any change whatsoever in color, detail, contrast, etc. depending on the cable. Either you get a perfect picture, a picture with digital snow, or no picture at all. Commented Oct 9, 2018 at 5:40
  • 8
    Length as a variable should be emphasized more. There are tests that show that there is a difference in error rates at longer lengths (greater than 50 ft).
    – TREE
    Commented Oct 9, 2018 at 15:25
  • 6
    Bit errors have a much more drastic effect if the signal has been encrypted with HDCP.
    – Ross Ridge
    Commented Oct 9, 2018 at 20:33
  • 3
    The only time I have seen digital snow was in a dentist's office where they used cheap passive HDMI over dual cat 5 baluns. The screens would even go blank entirely depending on the content. Lowering the resolution helped somewhat. I recommended replacing the cat 5 with proper long HDMI cables, which solved the problem. Commented Oct 9, 2018 at 23:36
  • 4
    HDMI uses 8 bits of BCH for error correction for every 64-bit subpacket, so I don't think it's true that a bit error will simply result in a pixel being slightly brighter or darker, especially considering the protocol uses TMDS with compressed and interleaved audio, video, and auxiliary data. So if a bit error is not stopped by 8b/10b encoding and not dealt with or detected by ECC, then an entire packet will be corrupt and the resulting corruption will be more than just a single slightly modified pixel.
    – forest
    Commented Oct 10, 2018 at 5:45
27

HDMI cables are tested at an Authorized Testing Center (ATC) and given a certification based on how much bandwidth they can handle (which is to say, how high of a frequency signal they can transmit without the signal degrading beyond some parameters specified in the standard).

Signals in a cable degrade. The signal that is input to the cable is not precisely the same as the signal that is received, due to various effects which depend chiefly on cable length, the physical properties of the cable stock, and the signal frequency. The longer the cable is, the more distorted the signal will be, and the worse the cable stock is, the more distorted the signal will be per metre of cable that it passes through.

In analog signaling, any amount of distortion changes the image; it's just a question of by how much. If we're transmitting an image across, say, a VGA cable, you have 3 signal lines, one for each channel of a pixel (red, green, and blue). Each pixel is transmitted in sequence, and the voltage on each line at any given time represents the brightness of one channel of the current pixel. (VGA's full-scale level is actually about 0.7 V, but the exact number doesn't matter here; let's call it 1.0 V for easy arithmetic.) Since it's analog, a signal voltage of 0 means 0 brightness, 1 V means 100% brightness, 0.5 V means 50% brightness, and so on. The voltage on the line is analogous to the value being communicated. Of course, if you transmit 0.55 V and, due to distortion, the receiver picks up 0.51 V, the image will come out ever so slightly different than intended. And more distortion means larger inaccuracy in the results.
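
To put numbers on that, a tiny sketch using the simplified 1.0 V full scale from above (the mapping to an 8-bit brightness value is just for illustration):

    # Analog case: any voltage error maps directly into a brightness error.
    FULL_SCALE = 1.0    # simplified full-scale level used in the text above

    def brightness_8bit(volts):
        return round(255 * volts / FULL_SCALE)

    print(brightness_8bit(0.55))   # intended level: 140
    print(brightness_8bit(0.51))   # received level: 130 -- a real, if small, error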

In digital signaling, nothing changes, except that we only signal 0 V or 1 V. We don't use any of the in-between voltage levels. (Some digital signals use several levels, maybe 4 or 5 instead of 2, but the point is that we only use a few discrete levels instead of a continuous spread; for simplicity, we'll stick with 2-level signaling here.) Since we are not using any in-between levels, the receiver knows automatically that if it receives 0.8 V or 0.9 V, it is really supposed to be 1 V. So distortion is corrected by the receiving device. Of course, there is a trade-off: since each signal can only represent 2 different values instead of dozens or hundreds, you need many more signal cycles to communicate the same amount of information. That's why a 3-channel analog video system like VGA only needs to operate at around 150 MHz on each channel to transmit 1080p 60 Hz, while a comparable digital equivalent like HDMI (which also uses 3 channels, one for each color channel in RGB mode) has to operate at around 1.5 GHz on each channel to transmit 1080p 60 Hz. But anyway...
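
The digital case, reduced to its essence (a toy 2-level slicer; the threshold and sample values are made up):

    # Digital case: the receiver only decides "closer to 0 V or to 1 V", so
    # moderate distortion is removed completely at the decision point.
    def slice_bit(volts):
        return 1 if volts > 0.5 else 0

    received = [0.93, 0.08, 0.81, 0.17]        # distorted versions of 1, 0, 1, 0
    print([slice_bit(v) for v in received])    # [1, 0, 1, 0] -- recovered exactly

    # Only when distortion pushes a sample across the threshold does a bit
    # error appear; until then the output is bit-for-bit identical.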

So distortion in the signal has no effect on the image quality of a digital transmission, because even though the voltages of the signal might be altered slightly during transmission, the system can tell what it was supposed to be, as long as it's even remotely close to the intended value. However, it's important to note that digital signals aren't immune to interference. The only difference is that the effects of the interference are corrected by the receiver.

Because of this ability to correct interference, the quality of an image transmitted over a digital interface like HDMI is not affected by the cable, as long as the distortion is small enough to be correctable. Different HDMI cables do have different amounts of signal distortion, but since the distortion is corrected, it's irrelevant, UNLESS the distortion is so high that the receiver starts interpreting values incorrectly. So how does that happen? Well like I said, the distortion in the cable is affected by the cable length, cable stock quality, and signal frequency. That means:

  • (Mainly applicable to manufacturers) For a given signal and cable stock, if you make longer and longer cables from that stock, the signal will eventually fail to transmit correctly. In this case, you would need a better quality cable stock if you wanted to make a cable of that length that can handle that signal
  • (Again mainly a consideration for cable manufacturers) For a given signal and cable length, if you make the cable out of poor enough quality cable stock, it will fail to transmit correctly. However, that cable stock may still work at the desired frequencies for shorter cables, and it will also likely work for lower frequency transmissions, so you can simply label it with a lower rated speed and sell it
  • (This is applicable to consumers) For a given cable, with a certain length and construction, if you signal at higher and higher frequencies, eventually it will fail to transmit correctly. So a cable that is ok at 10.2 Gbit/s may not work at 18 Gbit/s. To transmit at the higher signal frequency, you would need either a higher quality cable or a shorter cable, or some combination of the two.

If you have a cable and you transmit higher and higher frequencies, you won't get decreased image quality, it will just fail to work once you pass a certain point (or, it will work intermittently, if you are right up against the limits).
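
As a rough mental model of that threshold (the coefficients below are invented for illustration, not taken from the HDMI spec): cable loss grows with length and, roughly, with the square root of frequency, and the link simply stops working once the loss exceeds what the receiver can cope with.

    import math

    # Toy loss-budget model; the numbers are invented, not from any HDMI spec.
    LOSS_DB_PER_M_AT_1GHZ = 0.9   # assumed loss figure for one cable stock
    RECEIVER_BUDGET_DB = 20.0     # assumed maximum loss the receiver tolerates

    def link_ok(length_m, freq_ghz):
        loss_db = LOSS_DB_PER_M_AT_1GHZ * length_m * math.sqrt(freq_ghz)
        return loss_db <= RECEIVER_BUDGET_DB

    print(link_ok(5, 3.4))    # True:  short cable at an HDMI 1.4-class frequency
    print(link_ok(15, 6.0))   # False: 15 m at an HDMI 2.0-class frequency fails outright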

In realistic terms, pretty much any HDMI cable can handle 10.2 Gbit/s (1080p 144 Hz or 1440p 75 Hz, or 4K 30 Hz), and even 18 Gbit/s (4K 60 Hz) at shorter lengths, no matter how cheap the cable stock used by the manufacturer. However, when you start combining long cable lengths and high frequencies (i.e. if you want a 15 metre cable for 4K 60 Hz, requiring 18 Gbit/s), you will get failures if the cable is not a high enough quality.
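
The arithmetic behind those bit-rate figures, for the curious: TMDS sends 10 bits per 8-bit byte on each of three data channels, and the pixel clock includes the blanking intervals.

    # Rough TMDS bandwidth arithmetic; total frame sizes include blanking
    # (standard CTA-861 timings).
    def tmds_gbit_per_s(total_w, total_h, refresh_hz):
        pixel_clock = total_w * total_h * refresh_hz
        return pixel_clock * 10 * 3 / 1e9      # 10 bits/symbol, 3 channels

    print(tmds_gbit_per_s(2200, 1125, 60))   # 1080p60 -> ~4.46 Gbit/s
    print(tmds_gbit_per_s(4400, 2250, 60))   # 4K60    -> ~17.8 Gbit/s, the "18 Gbit/s" tier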

But it's not really a big deal, because the creators of the HDMI standard have certifications for specific bandwidth thresholds.

Cables that have been tested at an Authorized Testing Center to reliably handle signals with fundamental frequencies of up to 3.4 GHz on each channel (i.e. 10.2 Gbit/s aggregate, or the maximum speed of HDMI 1.3/1.4) are given a High Speed HDMI cable certification.

Cables that have been tested at an ATC to reliably handle signals up to 6.0 GHz per channel on 3 channels (i.e. 18.0 Gbit/s, or maximum speed of HDMI 2.0) are given a Premium High Speed HDMI cable certification.

Cables that have been tested at an ATC to reliably handle signals up to 12.0 GHz per channel on 4 channels (48.0 Gbit/s aggregate, or maximum speed of HDMI 2.1) are given an Ultra High Speed HDMI cable certification.

Please note that version numbers are not a proper or officially recognized way of describing cables, so "HDMI 2.1 cable" has no official meaning and does NOT mean the cable has been certified at an authorized testing center. In fact advertising version numbers on cables has been explicitly banned by the HDMI Licensing Authority and any such cables are automatically considered non-compliant. Genuine certified cables have a special logo which you can read more about on the HDMI website. There are many cables which have not passed certification, and they will advertise terms like "4K certified" or "HDMI 2.0 certified" or whatever, rather than the real title which is "Premium High Speed HDMI cable" and so forth. So watch out for those. Always look for the certification logo.

Anyway, as for the original question... Will a premium cable outperform a standard cable, if they have both passed the same certification? Well, it depends what you mean by "premium cable".

If you mean "a Premium High Speed certified HDMI cable", well if both cables have passed the certification, then they are both premium HDMI cables.

If you just meant "a really good quality HDMI cable vs. a normal quality HDMI cable": well again, if they have both passed the same certification, there will be no difference within the bounds of that certification. If two cables both passed the Premium High Speed HDMI cable certification, then they were both tested to reliably handle 18 Gbit/s speeds. If you use them at 18 Gbit/s or below, there will be no difference between them.

How the cables perform at speeds higher than that is a mystery. It's entirely possible that one cable just barely passes the certification and stops working at 25 Gbit/s, while the "high quality" cable keeps working up to 50 Gbit/s; you never know. So you could make an argument for "future-proofing" by buying cables that can handle speeds well beyond what the specification demands today. But I don't think this is very wise, because:

  1. There's no such thing as a "bandwidth meter" that a normal person can buy, so the only way to "check" is by having hardware that can operate at that speed
  2. So, when you buy cables with "future proof extra bandwidth", you won't be able to check that it's true for many years (read: long after your warranty has expired)
  3. Cable vendors have already demonstrated that they don't give a shit about outright lying about the speeds their cables can handle when customers don't have an easy way to check

For further reading, I would suggest this.

Got work to do, so there may be some minor mistakes above, I don't really have time to proofread right now :)

1
  • 4
    \$\begingroup\$ "HDMI cables are tested at an Authorized Testing Center (ATC) and given a certification based on how much bandwidth they can handle (which is to say, how high of a frequency signal they can transmit without the signal degrading beyond some parameters specified in the standard)." which is all well and good if you only have one cable in the channel, but often you will have three cables source->wallport->wallport->sink. If you want the overall channel to meet the spec then the individual components need to exceed it. \$\endgroup\$ Commented Oct 9, 2018 at 19:21
11

If the cables actually conform to the standards specified, then there will be no difference between a "premium" or "ordinary" cable, since the signals in question are digital.

However, in reality you may find cables that are advertised as conforming to the standard but do not actually do so.

10
  • 11
    Standards imply a guarantee of minimum performance. One cable can certainly differ from another at the high end. Commented Oct 8, 2018 at 15:05
  • 2
    @ScottSeidman Yes, I absolutely agree. I guess I implicitly made the assumption that the user would be operating the cables within their rated limits, but we both know that's not always the case :)
    – Shamtam
    Commented Oct 8, 2018 at 16:45
  • 2
    I agree with it too, @Chupacabras -- but it doesn't address the question being asked. Commented Oct 8, 2018 at 19:57
  • 2
    Define "outperform." If both cables have a bit error rate which is zero for all practical purposes when used as per the specification, then arguing that one is "better" than the other is "marketing BS", not "electrical engineering."
    – alephzero
    Commented Oct 9, 2018 at 0:23
  • 3
    @Chupacabras You're assuming the transmitter and receiver are also meeting standards, which they may not. This is more common than you think; I recall an errata that some SATA ports on a mass-market Intel chipset were identified as marginal fail. Additionally, the user may be doing something violating standards, like daisy chaining cables or using one that is much too long. In all of these cases, cables (and transceivers) which are superior to standards help.
    – user71659
    Commented Oct 9, 2018 at 3:08
5

Assuming two cables both conform to the same standards (e.g. HDMI2.0, high-speed, ...) is it possible that a premium cable will outperform a standard cable when using an objective benchmark that measures the quality of picture/audio?

No, it's not possible.

A digital transmission line is specified to have zero error rate (to be more specific: an error rate small enough that redundancy, error-correction or retransmission mechanisms can deal with all of the errors, thus ultimately achieving error-free transmission). If the cable meets the specification, that is the case: the signal at the receiver is exactly the signal that was sent. There is no room to improve on perfection.
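
For a sense of scale, here is some back-of-the-envelope arithmetic; the 1e-12 bit error rate is an illustrative assumption, not a figure from the HDMI specification:

    bit_rate = 18e9            # bit/s, an HDMI 2.0-class link
    ber = 1e-12                # assumed residual bit error rate (illustrative)
    errors_per_hour = bit_rate * ber * 3600
    print(errors_per_hour)     # ~65 flipped bits per hour, out of ~6.5e13 bits sent
    # Each such error touches at most one pixel for one 1/60 s frame, far below
    # anything a viewer could perceive as a change in picture quality.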

Or what experiment would show that a premium cable has a benefit over an 'ordinary' cable?

If you stay within the spec, there is no benefit. Benefits of a better (signal-wise) cable can materialize if you go beyond the spec. E.g., as Joren mentioned, many old cables specced for 1080p in fact work with 4K; likewise, you can experiment with 4K cables at 8K or 120 Hz.

The caveat is that you asked about a premium cable, not "a cable that has more bandwidth" or "less attenuation", and what constitutes "premium" is debatable and purely subjective. E.g., I consider ultra-thin (about 5 mm) cables "premium", because they offer me extra flexibility; my reasoning is that it takes extra work for the manufacturer to meet the same spec in a smaller diameter. On the other hand, there are plenty of cables where "premium" means extra-thick and extra-stiff. In that case the premium lies in supposedly greater physical durability of the cable, the personal satisfaction of getting physically "more of a cable", and possibly meeting future specs thanks to the (today) superfluous copper, insulation and shielding.

5
  • However, HDMI has neither error correction nor retransmission.
    – Dave Tweed
    Commented Oct 10, 2018 at 11:51
  • @DaveTweed There is TERC4 "with high inherent error avoidance". So, even if it's not called "error correction algorithm", it does the same thing.
    – Agent_L
    Commented Oct 10, 2018 at 12:03
  • No, it doesn't. And it isn't used on video data anyway, only auxiliary data.
    – Dave Tweed
    Commented Oct 10, 2018 at 12:17
  • "Premium" in regards to HDMI is an industry accepted term of art, for the cables that have been through accreditation and passed.
    – Ryan Leach
    Commented Oct 12, 2018 at 4:27
  • @Agent_L error avoidance isn't really error correction, though it does help reduce the error rate. At any rate, no error correction is used on HDMI video data. TERC4 and BCH are only used for audio and control data, not video data. Commented Oct 12, 2018 at 5:11
3

Low-end cables usually function just fine in situations with a low noise floor, and if you don't mind other cables being influenced by your HDMI signal. Depending on how low you go in quality, they may not even pass the EMC compliance tests required in many countries.

A high-end cable has a much better chance of being properly shielded and is usually a bit more resistant to mechanical stress.

For short pieces of cable (say 3 meters) the signal-quality difference isn't enough to cause noticeable effects on whatever is on the other end of the line. However, if you have sensitive electronics nearby, they might be affected.

If you benchmark them as explained in Dave's answer, then yes, you will find a major difference in quality between cables.

1
  • 1
    On my old sat system I used to be able to receive channels depending on how the (cheap) HDMI cable was positioned (I did have a very bad reception, sat cable and decoder)...
    – gbr
    Commented Oct 9, 2018 at 17:28
3

Other answers cover testing at a single point in time, typically during production or at point of purchase.

One aspect of better-quality cables though is that they will typically be better constructed, and hence will last longer. This may relate to:

  • tolerances or quality of metalwork in the connector housing;
  • quality of contacts in the connector;
  • quality of soldering (or other connection method) between the wires and the connector contacts;
  • quality of strain-relief so that flexing the cable does not snap it at a "weak point" as it enters the connector;
  • quality of signal wires so that flexing the cable does not break them;
  • and quality of screen construction so that flexing the cable does not reduce its effectiveness.

Note that I say "better constructed" and not "more expensive". Typically it does cost more to do this properly, but without disassembling the cable it is impossible to tell whether the basic construction is better or worse for one cable than another. Unless the "premium" cable manufacturer can tell you how all this was put together, there is no guarantee that your "premium" cable is any better than your "ordinary" one.

1
    Concur - damage resistance will help the cable survive better over the years, which by some definition is out-performing a failed or cut cable.
    – Criggie
    Commented Oct 10, 2018 at 22:07
3

Even if both cables have identical signal-propagation properties, if

  • the equipment is sensitive to common-mode currents (conducted EM interference), and
  • the premium cable has better common-mode suppression chokes (ferrite rings),
  • or it has better shielding and a good path for dumping the common-mode current,

then the premium cable is still the winner, because it is a worse antenna for EMI.

Those large coated ferrite beads you see on the ends of good cables are purely for EMC; they are invisible to the signals. They may even be mandatory for meeting FCC requirements, in which case they will come with the equipment.

This may apply to analog cables (interconnects, speakers) too, because external cables ("antennas") are very significant components in EMC issues. If you live close to a broadcast station, this will matter more. You can also add such CM chokes later; see for example https://product.tdk.com/info/en/products/emc/emc/clamp/index.html

Something else: HDMI and DVI (TMDS) receivers contain equalizers that adapt dynamically to the cable length and quality. Early implementations had fixed equalizers tuned for an average cable; with those, you should choose a cable that is neither too bad nor too good!
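
To illustrate why a fixed equalizer cares about how good the cable is, here is a toy model (all coefficients are invented): the cable smears a fraction `a` of the previous symbol into the current one, and the equalizer subtracts its own estimate of that fraction back out.

    # Toy ISI + equalizer model; coefficients are invented for illustration.
    def cable(bits, a):
        out, prev = [], 0.0
        for b in bits:
            out.append(b + a * prev)   # each symbol drags along a bit of the last one
            prev = b
        return out

    def equalize(samples, a_est):
        out, prev = [], 0
        for s in samples:
            x = s - a_est * prev       # remove the ISI the equalizer *assumes*
            bit = 1 if x > 0.5 else 0
            out.append(bit)
            prev = bit
        return out

    tx = [1, 0, 1, 1, 0, 0, 1, 0]
    print(equalize(cable(tx, 0.9), a_est=0.9))  # matched estimate: [1, 0, 1, 1, 0, 0, 1, 0]
    print(equalize(cable(tx, 0.9), a_est=0.3))  # fixed EQ expecting a milder cable: bit errors

An adaptive receiver effectively tunes `a_est` to the cable it sees; a fixed one only gets it right for cables reasonably close to what it was tuned for.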

2
  • 1
    So a "premium cable" can be better if something else in the system is "broken". Commented Oct 10, 2018 at 10:53
  • 1
    @IanRingrose: Can be, although I did encounter one setup at my wife's friend's house where for some reason a cheap cable would work and a name-brand one would not for connecting two particular pieces of equipment, though the name-brand cable worked fine for other pieces of equipment.
    – supercat
    Commented Oct 11, 2018 at 20:32
1

YES. But perhaps not in the way you expect.

At standard lengths, with equipment that conforms to the standards, any compliant cable will work exactly the same. No "slight visual quality loss" is possible, since HDMI is a digital standard. If there is any quality loss (e.g. with a very poor, nonconforming cable), it will be severe (heavy snow), or the devices simply will not detect that they are connected.

So what could you gain by using a higher-quality cable? If you push your connection to the limits (using a very long cable), there is a possibility that a merely standard cable will not work properly (the signal would be too noisy or attenuated), but a high-quality cable would. The same holds for patching two cables together (not recommended in any case, because it degrades the connection).

So, if you do things you should not, the better cable has a better chance of working than the merely "standard" cable. Only in that case is there a difference.

0

For a reasonable cable length (say, less than a few meters) no difference will be seen; you only stress the cable/EQ system when you stretch out to several meters or more.

0

Potentially!

If the "premium" branding of the cable, uses the HDMI trade marks / brandings, then unless it is counterfeit or fraudulent, you can be assured that the cable has passed the required testing.

[Image: Premium HDMI Cable Certification mark]

The kicker is this line from your question:

Assuming two cables both conform to the same standards (e.g. HDMI2.0, high-speed, ...) is it

Who tested it? Who verified that the claim is true and that the cable actually meets the standards? If people incorrectly claim that it meets the standards, what can be done about it?

Making the claim that your cable passed accreditation when it didn't is a much stronger court case than claiming it matched a standard according to your own testing.

https://www.hdmi.org/manufacturer/premiumcable/Premium_HDMI_Cable_Certification_Program.aspx
