What is HDMI and is there any difference between HDMI and HDMI 2.0? Here's everything you need to know about this awesome TV connectivity standard.
HDMI, which stands for High-Definition Multimedia Interface, is the name for the most commonly used cable connection for televisions. But as is the way with all things techy, it’s never that simple. In fact, HDMI has evolved over the years right up to the current HDMI 2.0 standard, and there are a few differences between the old and new versions.
If you’re buying a new TV or monitor and wish to connect a games console, Blu-ray player or some other streaming device, you’ll need to know what’s different in HDMI 2.0.
But no worries as this guide will clarify what HDMI is, what HDMI 2.0 offers and what those differences mean for you.
What is HDMI and what are the different versions?
HDMI is a cable connection used to deliver video and audio to screens. HDMI cables can also be used as pass-through connections, to hook up surround sound amplifiers to your TV.
HDMI 1.0 was the original version, released in 2002 for the DVD era. The next big jump came with version 1.4, which added 3D support and an audio return channel to the cable’s abilities. As long as the devices at either end support the latest HDMI version, the current Category 2 (High Speed) cable, which is essentially just a dumb pipe, will transfer the data.
Newer HDMI connections are backwards compatible since they use the same 19-pin connector, so you’ll nearly always get a signal even over HDMI 1.4. But that won’t carry all of the data required for the highest-quality image possible, which HDMI 2.0 can manage.
What is HDMI 2.0?
The big jump forward for this TV tech is in HDMI 2.0, which was unveiled back in 2013. This took 4K data transmission to the next level thanks to the new 18Gbps bandwidth.
Sure, HDMI 1.4 was already able to transfer 4K signals, but they were limited. While it could carry the full 3,840 x 2,160 resolution, the picture was capped at 30Hz, which isn’t enough for TV broadcasts and UHD gaming. For example, Sky Q currently offers Ultra HD Sky Sports games broadcast in 4K resolution at 50 frames per second, something you need HDMI 2.0 to enjoy.
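To see where the 18Gbps figure comes from, here’s a rough back-of-the-envelope sketch. It assumes the standard 4K timing of 4400 x 2250 total pixels per frame (the visible 3,840 x 2,160 plus blanking intervals) and HDMI’s TMDS signalling, which sends 10 bits on the wire for every 8 bits of data across three data channels; the function name is just for illustration.

```python
# Rough sketch: raw HDMI (TMDS) bit rate for a given video mode.
# Assumes 4400 x 2250 total pixels per 4K frame (visible area plus
# blanking) and 8b/10b encoding: 3 channels x 10 bits per pixel clock.

def tmds_rate_gbps(total_h, total_v, fps):
    pixel_clock = total_h * total_v * fps  # pixel clocks per second
    return pixel_clock * 3 * 10 / 1e9      # 3 channels, 10 bits each

# 4K at 60Hz: ~17.82 Gbps -> needs HDMI 2.0's 18Gbps bandwidth
print(round(tmds_rate_gbps(4400, 2250, 60), 2))  # 17.82

# 4K at 30Hz: ~8.91 Gbps -> fits within HDMI 1.4's 10.2Gbps
print(round(tmds_rate_gbps(4400, 2250, 30), 2))  # 8.91
```

The numbers line up with the article: 4K at 30Hz squeezes under HDMI 1.4’s ceiling, while anything faster, like 50 or 60 frames per second, needs HDMI 2.0’s headroom.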
HDMI vs HDMI 2.0: How is HDMI 2.0 better for HDR?
One of the limitations of HDMI 1.4 is its colour depth, which at 4K resolutions tops out at 8-bit. Now that High Dynamic Range (HDR) televisions have arrived with 10-bit and 12-bit colour options, the newer HDMI 2.0 standard is required.
That means billions rather than millions of colours can be displayed on a compatible HDR TV, creating far richer and more lifelike images than old HDMI versions could support.
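The millions-versus-billions claim is simple arithmetic: each extra bit per channel doubles the shades of red, green and blue, and the total palette is the product of the three. A quick sketch:

```python
# Sketch: total displayable colours for a given bit depth per channel.
# Each of the three channels (R, G, B) gets 2**bits shades.

def colours(bits_per_channel):
    return (2 ** bits_per_channel) ** 3

print(colours(8))   # 16777216      -> ~16.7 million (8-bit)
print(colours(10))  # 1073741824    -> ~1.07 billion (10-bit HDR)
print(colours(12))  # 68719476736   -> ~68.7 billion (12-bit HDR)
```

So the step from 8-bit to 10-bit alone is a 64x jump in the colour palette, which is what HDR content takes advantage of.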
Of course, if you’re receiving your 4K HDR signal via the TV’s internet connection, streamed through the likes of Netflix, Amazon Prime Instant Video or YouTube, the HDMI cable won’t be involved anyway.
When it comes to 4K Ultra HD Blu-ray players, or 4K console gaming, it’s essential that you have HDMI 2.0 if you want to take full advantage of the higher quality content.