What is HDR for TVs? High-dynamic-range formats and benefits, explained
HDR can make movies and shows appear closer to how content creators intended, with higher contrast and more colors. Here's what HDR means for your TV.
Steven Cohen/Business Insider
High dynamic range (HDR) is a feature found on many modern TVs, and it can dramatically enhance picture quality. HDR allows a display to produce images with higher brightness, better contrast, and expanded colors. In other words, HDR can make images pop more intensely, appear more realistic, and look closer to how content creators intended.
But while HDR is now a common feature on many of the best TVs, performance varies greatly across different models. Just because a display says it supports HDR doesn't mean it's capable of fully taking advantage of the feature. Some budget-friendly sets, like the Roku Plus Series, do a solid job of offering entry-level HDR performance, but to get the most impressive HDR experience, you'll need a top OLED display, like the LG G4, or a bright QLED TV with Mini LEDs, like the TCL QM7.
To help sort through all the ins and outs of high-dynamic-range display technology, we've put together a comprehensive guide detailing everything you need to know about HDR TVs and different HDR formats.
What is HDR for TVs?
High dynamic range (HDR) is a display technology that enables a TV or monitor to produce enhanced contrast levels when playing HDR-encoded videos. This allows bright elements of an image to look brighter and dark elements to look darker while preserving more detail in all the steps between both ends of the spectrum.
The enhanced contrast of HDR creates an added sense of depth with more visible details and realistic intensity in specular highlights and shadows. A good example would be a scene featuring sunlight glimmering off the ocean. Without HDR, a sequence like this could look comparatively dull with lost detail and clipped highlights, but with the expanded range that HDR allows, extreme highlights like this are given room to breathe and really pop. The reflecting light can look more radiant and more detailed simultaneously.
Brightness levels for HDR are measured in a unit called nits. The more nits a TV can produce, the brighter its HDR highlights can look. In general, most HDR movies and TV shows are mastered with a max of 1,000 to 4,000 nits in mind, though HDR videos can technically be mastered for up to a whopping 10,000 nits. However, very few consumer displays can hit such a high number. Most premium HDR TVs max out at around 1,500 to 3,500 nits, while midrange models achieve 700 to 1,500 nits, and entry-level options offer around 400 to 700 nits.
What is Wide Color Gamut?
HDR is often bundled with another display feature called Wide Color Gamut (WCG), which enables a TV to produce an expanded range of colors. Though technically two separate things, a wide color gamut is almost always used when mastering a video in HDR, so WCG is often considered part of HDR tech.
When using WCG, HDR videos are encoded within a color space called Rec. 2020, though most HDR movies and TV shows are actually graded for a narrower standard called DCI-P3. DCI-P3 is the same range of colors used for modern digital theater projection, while Rec. 2020 can offer an even wider range than that.
So, with HDR and WCG, you can enjoy the full spectrum of colors you see in theaters at home.
How does HDR compare to SDR?
Before the first consumer HDR displays hit the market in 2015, TVs were built to adhere to a standard dynamic range (SDR) specification. SDR displays and content are designed with a max of only 100 nits in mind, and they're typically produced for a limited color gamut called Rec. 709. Compared to the DCI-P3 color space used on most HDR videos, Rec. 709 offers a more restricted gamut of colors.
This means that SDR displays produce comparatively dim and low-contrast images with a narrower range of colors when compared to an HDR TV playing HDR programming. As a result, an SDR version of a movie or TV show will often look a bit flat and slightly muted compared to its HDR counterpart.
SDR TVs in HD and 4K resolution are still manufactured today, and all HDR-compatible TVs can still display SDR signals accurately. While HDR mastering has become a popular choice for new on-demand streaming content and 4K Blu-ray movies, SDR is still the norm for cable, satellite, live TV streaming, and over-the-air broadcasts.
How do I watch HDR movies and TV shows?
To watch HDR videos, you need an HDR-capable display and access to HDR-encoded content. Every element in your home entertainment chain also has to support HDR, so if you watch videos through a streaming stick, it needs to be HDR compatible, and if you connect your media player to your TV through an AV receiver or soundbar, those components need to support HDR passthrough. Likewise, all devices need to be paired together using premium- or ultra-high-speed HDMI cables. Check out our guide to the best HDMI cables for top picks.
HDR videos are available on all the best streaming services, including Disney Plus, Netflix, Hulu, and Amazon Prime Video. HDR is also used on most 4K Ultra HD Blu-ray discs. Some live sporting events are also shown in HDR through certain providers, but the vast majority of cable, satellite, and livestreaming broadcasts are still presented in SDR.
What should I look for in an HDR TV?
All of the best 4K TVs include some level of HDR support, and even some 1080p HDTVs offer HDR capabilities. However, performance varies dramatically between cheaper models and more expensive displays.
If you want the best HDR performance, you'll want to buy either an OLED TV or a QLED display with local dimming and expanded colors. Top QLED models that use smaller LEDs in their backlights are often called Mini LED TVs, since those smaller diodes enable more precise dimming capabilities.
These display options all offer the contrast control, peak brightness, and color capabilities needed to really showcase the benefits of HDR playback. For details on the differences between OLED and QLED TVs, check out our QLED vs. OLED comparison.
When it comes to brightness, you'll want to choose a display model that comes close to outputting 1,000 nits or higher to get the most impactful HDR quality. However, TVs with that level of performance tend to be a bit pricey, and you can still get worthwhile entry-level HDR out of cheaper models that max out in the 500 to 600 nits range.
Some top HDR TVs available right now include the Samsung S90D OLED, LG G4 OLED, TCL QM7 QLED, Sony Bravia 9 QLED, and Samsung S95D OLED. Meanwhile, the Hisense U6N QLED and Roku Plus Series QLED are both great options on a budget. For more recommendations, check out our various TV buying guides:
- Best budget TVs
- Best TVs under $500
- Best smart TVs
- Best OLED TVs
- Best 100-inch TVs
- Best 85-inch TVs
- Best 75-inch TVs
- Best 65-inch TVs
- Best 55-inch TVs
- Best 50-inch TVs
Are there different HDR formats?
There are four primary HDR content formats: HDR10, HDR10+, Dolby Vision, and HLG.
HDR10 is the most basic and common HDR format. It's supported on all HDR TVs and is used as the standard HDR format on all 4K Ultra HD Blu-ray discs and streaming apps with HDR content. In other words, it can be thought of as the default HDR base layer that more advanced HDR formats can be added on top of.
HDR10 videos can be mastered for a peak of up to 10,000 nits, though most HDR10 content has been graded for 1,000 to 4,000 nits. HDR10 videos are encoded with information called "static metadata," which tells a TV what colors to show and how bright the video's images are supposed to look. Static metadata is a bit limited, however, as it can only provide information that addresses the video as a whole instead of each individual scene.
When an HDR10 movie is played on an HDR TV that can't support the full range of brightness and color that its metadata calls for, the display must adapt on its own to scale highlights and color volume to land within its capabilities. This kind of adjustment is called "tone mapping," and different display manufacturers handle tone mapping differently.
In practice, this can lead to issues with certain scenes in HDR10 videos appearing too blown out or too dark since a TV's tone mapping may not match what a content creator intended. And this is where Dolby Vision and HDR10+ come in.
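To make the idea concrete, here's a simplified sketch of what tone mapping does, written in Python. This is a toy illustration of the general technique, not any TV manufacturer's actual algorithm: it passes darker tones through unchanged, then gently rolls off highlights so a 4,000-nit master never exceeds what a 1,000-nit display can show. The knee point and roll-off curve are assumptions chosen for clarity.

```python
def tone_map(pixel_nits: float, display_peak: float = 1000.0) -> float:
    """Compress a pixel's mastered brightness (in nits) into a display's range.

    Toy example only: real TVs use far more sophisticated, and often
    proprietary, tone-mapping curves.
    """
    # Pass through everything below a "knee" point so dark and midtone
    # detail is left untouched (here, 75% of the display's peak).
    knee = 0.75 * display_peak
    if pixel_nits <= knee:
        return pixel_nits

    # Softly compress everything above the knee into the remaining
    # headroom; the output approaches display_peak but never exceeds it.
    excess = pixel_nits - knee
    headroom = display_peak - knee
    return knee + headroom * (excess / (excess + headroom))
```

With this curve, a 100-nit midtone comes out at 100 nits, while a 4,000-nit specular highlight lands just under the display's 1,000-nit ceiling instead of clipping to flat white. Because each manufacturer picks its own knee and roll-off, the same HDR10 scene can look different from one TV to the next.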
Dolby Vision and HDR10+ are both "dynamic metadata" HDR formats. This means that HDR brightness and color information can be detailed on a scene-by-scene or even shot-by-shot basis. As a result, Dolby Vision and HDR10+ videos can provide more detailed tone mapping instructions to a TV so that the creators' original intent is more accurately preserved. These formats are supported on select TVs, discs, and streaming services.
Finally, HLG is an HDR format used for TV broadcasts. This format does not use metadata at all, and it's backward compatible with SDR displays, so broadcasters can send an HLG signal to all customers and have it look correct on both SDR and HDR TVs. If an HDR TV broadcast were sent to an SDR TV using any of the other HDR formats we've discussed, it would display with inaccurate colors and contrast.
Are Dolby Vision and HDR10+ better than HDR10?
Yes, HDR10+ and Dolby Vision can both deliver a more accurate high-dynamic-range image than the standard HDR10 format. However, their improvements are often subtle and are best appreciated on entry- and midlevel HDR TVs, which can benefit the most from their dynamic tone mapping instructions.
Popular HDR TV models typically include support for one or both of these dynamic metadata formats. But in most cases, we don't think it should be a dealbreaker if one is included and the other isn't.
Is Dolby Vision better than HDR10+?
Dolby Vision and HDR10+ offer the same primary benefits compared to HDR10, and neither format has a major technical advantage over the other. However, Dolby Vision has an edge when it comes to industry support.
Six of the seven major TV brands in the US sell TVs with Dolby Vision capabilities, while five of those seven support HDR10+. Here's a chart detailing which TV brands sell models with Dolby Vision and HDR10+ support.
Brand | Dolby Vision | HDR10+ | HDR10 |
Hisense | ✓ | ✓ | ✓ |
LG | ✓ | — | ✓ |
Panasonic | ✓ | ✓ | ✓ |
Samsung | — | ✓ | ✓ |
Sony | ✓ | — | ✓ |
TCL | ✓ | ✓ | ✓ |
Vizio | ✓ | ✓ | ✓ |
When it comes to content, Dolby Vision movies and TV shows are also more prevalent than HDR10+ programs. More studios support Dolby Vision on Ultra HD Blu-ray discs, and more streaming services use it. Here's a breakdown of Dolby Vision and HDR10+ support among major streaming services.
Streaming service | Dolby Vision | HDR10+ | HDR10 |
Amazon Prime Video | ✓ | ✓ | ✓ |
Apple TV Plus | ✓ | ✓ | ✓ |
Disney Plus* | ✓ | — | ✓ |
Max | ✓ | — | ✓ |
Hulu | ✓ | ✓ | ✓ |
Paramount Plus | ✓ | ✓ | ✓ |
Peacock | ✓ | — | ✓ |
*Disney Plus has announced plans to launch HDR10+ support later this year.
Do video games support HDR?
HDR gaming is supported on the PS4, PS4 Pro, PS5, Xbox One S, Xbox One X, Xbox Series S, Xbox Series X, and compatible computers. However, only select games are designed with native HDR output.
Windows PCs can support games using the HDR10, HDR10+, and Dolby Vision formats. The Xbox Series X and Series S support HDR10 and Dolby Vision, while the PS5 is currently limited to HDR10.
If you're using a PC, you'll need an operating system, monitor, and graphics card that all support HDR to display HDR games properly.