HDMI goes the USB way

HIGHLIGHTS

USB went through several standard revisions over the years

First, it was 1.0, followed by 1.1 and then came 2.0 which continues to be the most popular standard since its release in the year 2000

USB 3.0 brought a lot of improvements in power and transfer rates but it soon underwent multiple revisions

Remember the old days when USB speeds were easy to understand and the features weren't spread across 10 different ancillary standards? I sure do. USB went through several standard revisions over the years. First, it was 1.0, followed by 1.1, and then came 2.0, which continues to be the most popular standard since its release in the year 2000. USB 3.0 brought a lot of improvements in power delivery and transfer rates, but it soon underwent multiple revisions. It started with the USB 3.0 SuperSpeed mode, which then became USB 3.1. Yeah, the USB-IF (USB Implementers Forum, i.e. those who manage the USB standard) simply renamed USB 3.0 SuperSpeed as USB 3.1 Gen 1 and brought out a faster version called USB 3.1 Gen 2. And if that wasn’t enough, they chucked it all and came out with USB 3.2 and started renaming everything again. And the cherry on top of the cake? There’s now an added layer of designation: it’s now USB 3.2 Gen 1×1 or Gen 2×1 or Gen 2×2. And now that USB4 has started to come out, I’m pretty certain they’ll find even better ways to confuse people.
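If you're struggling to keep track of the renames, here's a quick cheat sheet as a tiny Python snippet. The speed figures are from the published USB specs; the layout itself is just my own way of lining up which names point at the same signalling mode:

```python
# Which USB 3.x names refer to the same underlying speed.
# Speeds are per the USB specs; the mapping is just a reader's cheat sheet.
usb_renames = {
    "USB 3.0 SuperSpeed (5 Gbps)": ["USB 3.1 Gen 1", "USB 3.2 Gen 1x1"],
    "USB 3.1 Gen 2 (10 Gbps)":     ["USB 3.2 Gen 2x1"],
    "USB 3.2 Gen 2x2 (20 Gbps)":   [],  # introduced with USB 3.2, no earlier name
}

for original, later_names in usb_renames.items():
    renames = " -> ".join(later_names) if later_names else "no rename (new in USB 3.2)"
    print(f"{original}: later sold as {renames}")
```

Same cable, same speed, three different names on the box, depending on when it was printed.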


The thing about maintaining standards is that you get to charge every company that wants to sell a USB product a certain amount for certification. You can only advertise or label your cable as USB after you’ve received the certification. I’m not saying that this hellish jumble of similarly-specced standards was deliberately set up to allow the USB standard maintainers to milk hardware manufacturers, but it sure seems that way.

The folks running the HDMI standard now want to copy this confusing way of handling standards. HDMI version 2.0 was released around the same time that the USB-IF brought out USB 3.1. HDMI 2.0 bumped the bandwidth of TMDS encoding up to 18 Gbps, allowing it to carry 4K at 60 Hz with 24-bit colour depth. HDMI 2.0a soon followed with support for HDR video and static metadata. This was followed by HDMI 2.0b with support for more HDR metadata, including Hybrid Log-Gamma. All of these features might sound like Greek and Latin but bear with me.

Then HDMI 2.1 was launched in 2017 and added support for 4K at 120 Hz and 8K at 60 Hz. In fact, with Display Stream Compression it can go all the way up to 10K at 120 Hz. Support also includes HDR dynamic metadata. And there’s also Variable Refresh Rate (VRR) and Auto Low Latency Mode (ALLM). HDMI 2.1 can carry up to 48 Gbps of data thanks to a new signalling scheme called FRL or Fixed Rate Link.
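To see why the bandwidth jump matters, here's a rough back-of-the-envelope calculation in Python. It only counts active pixels at 24-bit colour and ignores blanking intervals and encoding overhead, so real links need headroom beyond these numbers, but the comparison still shows where HDMI 2.0's 18 Gbps runs out and why FRL's 48 Gbps exists:

```python
# Rough reader's arithmetic, not figures from the HDMI spec:
# raw active-pixel data rate = width * height * refresh * bits-per-pixel.
# Real signals need more than this (blanking, encoding overhead).

def raw_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

modes = {
    "4K @ 60 Hz":  (3840, 2160, 60),
    "4K @ 120 Hz": (3840, 2160, 120),
    "8K @ 60 Hz":  (7680, 4320, 60),
}

for name, (w, h, hz) in modes.items():
    print(f"{name}: ~{raw_gbps(w, h, hz):.1f} Gbps raw "
          f"(HDMI 2.0 tops out at 18 Gbps, HDMI 2.1 FRL at 48 Gbps)")
```

4K at 60 Hz squeaks under HDMI 2.0's ceiling; 4K at 120 Hz and 8K at 60 Hz simply don't fit without FRL or compression.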

Thus far, it seems as if each new revision of the standard brought a decent upgrade in resolution and frame rates, aside from other features. It remained very clear what each revision brought to the table. And now comes the irritating part.

When HDMI 2.1 was announced, HDMI 2.0 ceased to exist. SHOCKING! Yes? No? Confusing, very much so.


HDMI 2.0 is now treated as a subset of HDMI 2.1 and is clubbed under it. There are a lot of features that distinguish HDMI 2.1 from its predecessors, and as per the HDMI standards body, all of those features are deemed optional. So every manufacturer under the sun can relabel their HDMI 2.0 products as HDMI 2.1, and consumers will end up paying a premium for the higher standard, only to realise later that they’ve got none of the features that were defined in the new HDMI 2.1 standard. So features such as:

FRL (Fixed Rate Link) – Responsible for higher bandwidths (OPTIONAL)
VRR (Variable Refresh Rate) – For avoiding frame tearing (OPTIONAL)
ALLM (Auto Low Latency Mode) – For reducing input lag (OPTIONAL)
Everything under HDMI 2.1 standard (OPTIONAL)

Manufacturers are now required to mention only HDMI 2.1 as the interface and avoid all mention of HDMI 2.0. So how does a customer figure out which HDMI features they can actually use on the new TV that they’re about to buy? By trusting the marketing mumbo-jumbo of the TV manufacturer. And we all know how that will pan out.

Display manufacturers are expected to be transparent about the HDMI 2.1 features that they support. So, if you’re buying a TV, you’ll have to read the fine print, which is purposely tucked away from plain sight, and then work out what each technical term means. That then has to be compared against the HDMI 2.1 features of the device you’re going to connect to the TV, such as a gaming console or a media player. Does that sound fun to you?
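In practice, that comparison is the only defence you have. The sketch below, with entirely made-up spec-sheet entries, is essentially the exercise every buyer is now being asked to perform by hand:

```python
# Hypothetical example: compare the HDMI 2.1 features a TV's fine print
# actually lists against what the source device can output.
# The feature lists here are invented for illustration.
tv_features      = {"HDR10", "ALLM"}                          # from the TV's fine print
console_features = {"HDR10", "ALLM", "VRR", "FRL 48 Gbps"}    # from the console's spec sheet

usable = tv_features & console_features        # features both ends support
missing = console_features - tv_features       # what the "HDMI 2.1" label won't deliver

print("Features you'll actually get:", ", ".join(sorted(usable)))
print("Features the HDMI 2.1 badge won't deliver:", ", ".join(sorted(missing)))
```

A simple set intersection, except you're the one doing it in a showroom, squinting at two spec sheets.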

Buying advice

Laptop for work

Hey Agent001, my company is giving us ₹65K to purchase a laptop. I’m thinking about adding some extra money from my own pocket and getting a decent laptop for work. I’ve already got a desktop PC that I use for gaming so I don’t need the laptop to be for gaming. Although, a little bit of gaming is always going to happen. I’m willing to increase the budget to approximately ₹90K-1 lakh. Should I go for a 2-in-1 laptop or an ultrathin notebook, or should I ditch Windows and go for a MacBook?
–Sid

Hey Sid, I’d ideally recommend understanding the software ecosystems your company is currently invested in while considering options for your purchase. For example, if you’re in the media creation industry and colour accuracy is an important aspect, then you need a calibrated display that comes closest to the colours being used in your line of work. Similarly, if your entire company is using software that relies on GPU acceleration and the APIs are proprietary, then you will have to opt for a laptop that has the appropriate GPU. So, if your work software uses CUDA, you’ll need NVIDIA hardware, but if it uses OpenCL, then either NVIDIA or AMD GPUs will work fine. Along the same lines, if everyone in your department uses Windows, then it’s best that you stick to Windows, and the same goes for macOS. Since you haven’t mentioned the kind of work that you’re into, I can’t give you a proper recommendation, so I’m going to make a few assumptions and work out the recommendations.


If you want an Apple device, then your only option within this budget is the MacBook Air. The MacBook Pro starts at ₹1.3 lakh, so you’ll need to double your budget if you want to get even the base Pro model. The Air with the Apple M1 chip is just as good as, if not better than, the Intel CPU-based laptops for media creation workloads. Again, your budget can only accommodate the base model, which is very good to begin with.

If you need to get a PC, then check out the Acer Swift X laptops, which come with an AMD Ryzen 5 5600U and NVIDIA RTX 3050 graphics. There’s also the Swift 5 with the Intel Core i7-1165G7 if you’re a Team Blue fan; the downside is that this one does not come with a discrete GPU. Then there are the ZenBook 13 OLED laptops from ASUS, which have really impressive displays if a good display panel is higher on your priority list. Right now, those models will offer you either an AMD Ryzen 7 5800U or an Intel 11th Gen Core i7-1165G7 as the processor. Both are great, except AMD has the better integrated graphics. All laptops mentioned here are in the ₹85-95K price range and I hope you find one that’s to your liking.

Agent 001
