A few weeks ago, we had the opportunity to speak with Luxtera, a company that has come to prominence in the display industry for its role in the recently ratified DisplayPort standard. Luxtera was one of the main forces behind the inclusion of hybrid devices--cables that convert electrical signals to optical ones and transmit them across a fiber optic link--in the DisplayPort 1.1 standard. In short, the company is built on the premise that traditional copper interconnects cannot keep up with the ever-increasing demand for inexpensive bandwidth in areas such as displays and networking.

To get some idea of why this is the case, let's take a look at the bandwidth of a number of interfaces.


Unsurprisingly, these higher bandwidth requirements are driven by increases in consumer display resolutions, in refresh rates, and in colour bit depths for future Deep Colour displays. Interestingly, most of us aren't stressing these interfaces anywhere near as hard as is theoretically possible today:


It should be noted that both DisplayPort and HDMI also transmit audio data over the same lanes as the video data, so the audio bitrate should be added on top. It is, however, far smaller than the video bitrate. The good news is that, in theory, both of these standards have enough bandwidth to handle the display technologies of tomorrow. Of course, there's a catch, and a very big catch at that: cable length.
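Before getting to that catch, a rough back-of-the-envelope comparison shows just how small the audio contribution is. This is only a sketch: it assumes HDMI's maximum uncompressed audio format of 8 channels of 192kHz/24-bit PCM, and counts only active pixels for the video, ignoring blanking intervals and encoding overhead:

# Rough comparison of audio and video bitrates over an HDMI-style link.
# Assumes 8 channels of 192kHz/24-bit PCM audio (HDMI's uncompressed
# maximum) and active pixels only for the video.

def video_bitrate_gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

def audio_bitrate_mbps(channels, sample_rate_hz, bits_per_sample):
    return channels * sample_rate_hz * bits_per_sample / 1e6

video = video_bitrate_gbps(1920, 1080, 60, 24)   # ~2.99 Gbps
audio = audio_bitrate_mbps(8, 192000, 24)        # ~36.9 Mbps
print(f"video: {video:.2f} Gbps, audio: {audio:.1f} Mbps "
      f"({100 * audio / (video * 1000):.1f}% of the video bitrate)")

Even in that worst case, the audio stream amounts to barely one percent of the video stream, which is why we'll largely ignore it from here on.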

The problem of cable length for the HDMI standard has been noted by other websites. Basically, their testing revealed that most cables on the market could handle 720p/1080i at distances of up to 50 feet (15 meters) without visible errors on the screen. That may already be too short for some users, but matters get much worse once you consider that the maximum usable cable length shrinks as the video and audio bitrates increase. 1080p at 24bpp/60FPS uses less than one third of the maximum bandwidth of the HDMI standard. Cables sold today that can support 1080p at decent lengths are thus most likely inadequate for future display standards, and their length would have to be reduced to potentially subpar levels to get an error-free signal. At the same time, it is very unlikely that the mainstream will migrate to anything higher-end than 'plain' 60Hz 1080p for the foreseeable future, so this problem will not affect the majority of the market. But for A/V enthusiasts interested in high-end products and the potential advantages of HDMI 1.3, this could be a serious limitation.
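That "one third" figure is easy to verify with a quick calculation, comparing the active-pixel payload against HDMI 1.3's raw link rate of 10.2Gbps (three TMDS channels running at up to 340MHz, 10 bits per clock). Blanking intervals and encoding overhead would push the real fraction somewhat higher, but the ballpark holds:

# How much of HDMI 1.3's raw capacity does 1080p/60Hz/24bpp consume?
hdmi_1_3_gbps = 3 * 340e6 * 10 / 1e9          # 10.2 Gbps raw link rate
payload_gbps = 1920 * 1080 * 60 * 24 / 1e9    # ~2.99 Gbps of active pixels
print(f"1080p/60Hz/24bpp uses {100 * payload_gbps / hdmi_1_3_gbps:.0f}% "
      f"of HDMI 1.3's raw link rate")         # ~29%, i.e. under a third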

So, was there anything that could have been done to avoid this situation when the standard was created? Yes, several things, but all of them have significant drawbacks. Many would have made easy backwards compatibility with DVI nearly impossible, and every possibility we considered would result in significantly higher costs for devices as well as cables. For example, coaxial cables could have been used instead of twisted pairs, or the number of data lanes/wires per cable could have been increased. What about adding error correction and an extra lane to retransmit the necessary data? Or the standard could simply have relied on more expensive and exotic signalling techniques in general.

However, none of this was done. One plausible explanation is that when the original revision of HDMI was created, 720p and 1080i were the only mainstream video display standards, and while 1080p was obviously going to become mainstream eventually, what would happen after that was very much anyone's guess. In fact, given how major a transition the move to 720p/1080p was, and how infrequently and slowly such transitions tend to occur, it is probably also a fair guess that the majority of consumers will not migrate to anything higher-end than 1080p/24bpp/60Hz for the next several years. So, increasing the cost and bulk of cables and connectors in order to future-proof the standard would probably not have been considered such a wonderful idea.

What's especially troublesome is that it doesn't seem like much, if anything, was done to improve usable cable lengths with HDMI 1.3. Since the potential bandwidth of the standard is much higher than what any current display will use, it also isn't really possible to tell exactly what the maximum cable length will be at 10Gbps. Let's just say we'd be quite surprised if it was anything to write home about, though. It only gets worse for future generations, with bandwidth requirements of 20Gbps or maybe even 30-40Gbps. For copper cables in general, these kinds of bitrates will be extremely hard to achieve, and in the Ethernet market, where the next standard might actually be 100Gbps, all but impossible.
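To see where those 20-40Gbps figures might come from, consider a few hypothetical future display modes. These are purely illustrative picks on our part, not anything any standards body has committed to, and again only active pixels are counted:

# Illustrative bandwidth requirements for hypothetical future display
# modes (active pixels only; blanking and encoding would add more).
modes = [
    ("2560x1600 @ 120Hz, 30bpp", 2560, 1600, 120, 30),   # ~14.7 Gbps
    ("3840x2160 @ 60Hz,  48bpp", 3840, 2160, 60, 48),    # ~23.9 Gbps
    ("3840x2160 @ 120Hz, 36bpp", 3840, 2160, 120, 36),   # ~35.8 Gbps
]
for name, w, h, hz, bpp in modes:
    print(f"{name}: {w * h * hz * bpp / 1e9:.1f} Gbps")

Any plausible combination of higher resolution, higher refresh rate, and Deep Colour bit depths lands squarely in that 20-40Gbps range.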

It should thus be fairly obvious that copper cables and the associated circuitry at both ends are nearing their physical limits at 10Gbps, whether in terms of cable length for video interconnect standards such as HDMI or in terms of cost and power dissipation for Ethernet. And it will only get worse beyond that. Sadly, current alternatives have problems of their own. Fiber optic cable itself, the obvious choice, is quite cheap. The optical components necessary for the transceivers at each end, on the other hand, may cost hundreds of dollars, even if we take economies of scale into consideration.

But an optical solution with much lower production costs could be a very interesting proposition, and that's exactly what Luxtera has produced.