High-dynamic-range television (HDR-TV) is a technology that uses high dynamic range (HDR) to improve the quality of display signals. It is contrasted with the retroactively-named standard dynamic range (SDR). HDR changes the way the luminance and colors of videos and images are represented in the signal, and allows brighter and more detailed highlight representation, darker and more detailed shadows, and more intense colors.[1][2]
HDR allows compatible displays to receive a higher-quality image source. It does not improve a display's intrinsic properties (brightness, contrast, and color capabilities). Not all HDR displays have the same capabilities, so HDR content will look different depending on the display used; the standards specify how the content is to be converted to match a given display's capabilities.[3]
HDR-TV is a part of HDR imaging, an end-to-end process of increasing the dynamic range of images and videos from their capture and creation to their storage, distribution and display. HDR is frequently used together with wide color gamut (WCG) technology, which increases the gamut and number of distinct colors available, whereas HDR increases the range of luminance available for each color. HDR and WCG are separable but complementary technologies. A standards-compliant HDR display also has WCG capabilities, as mandated by Rec. 2100 and other common HDR specifications.
The use of HDR in television sets began in the late 2010s. By 2020, most high-end and mid-range TVs supported HDR, and some budget models did as well. HDR-TVs are now the standard for most new televisions.
There are a number of different HDR formats, including HDR10, HDR10+, Dolby Vision, and HLG. HDR10 is the most common format and is supported by all HDR-TVs. Dolby Vision is a more advanced format that offers additional features such as dynamic, scene-by-scene metadata. HDR10+ is a newer format that is similar to Dolby Vision but royalty-free. HLG is an HDR format designed for broadcast and used by some TV broadcasters.
Before HDR, improvements in display fidelity were typically achieved by increasing the pixel quantity, density (resolution) and the display's frame rate. By contrast, HDR improves the perceived fidelity of the existing individual pixels.[4] Standard dynamic range (SDR) is still based on and limited by the characteristics of older cathode ray tubes (CRT), despite the huge advances in screen and display technologies since CRT's obsolescence.[1]
SDR formats are able to represent a maximum luminance level of around 100 nits. For HDR, this number increases to around 1,000–10,000 nits.[1][5] HDR can represent darker black levels[2] and more saturated colors.[1] The most common SDR formats are limited to the Rec. 709/sRGB gamut, while common HDR formats use Rec. 2100, which is a wide color gamut (WCG).[1][6]
In practice, HDR is not always used at its limits. HDR contents are often limited to a peak brightness of 1,000 or 4,000 nits and P3-D65 colors, even if they are stored in formats capable of more.[7][8] Content creators can choose to what extent they make use of HDR capabilities. They can constrain themselves to the limits of SDR even if the content is delivered in an HDR format.[9]
The benefits of HDR depend on the display capabilities, which vary. No current display is able to reproduce the maximal range of brightness and colors that can be represented in HDR formats.
Benefits
The highlights—the brightest parts of an image—can be brighter, more colorful, and more detailed.[2] The larger capacity for brightness can be used to increase the brightness of small areas without increasing the overall image's brightness, resulting in, for example, bright reflections from shiny objects, bright stars in a dark night scene, and bright and colorful light-emissive objects (e.g. fire, and sunset).[2][1][9]
The shadows or lowlights—the darkest parts of an image—can be darker and more detailed.[2]
The colorful parts of the image can be even more colorful if a WCG is used.[1]
The color dynamism and wider range of colors frequently attributed to HDR video are actually a consequence of WCG. This has become a point of significant confusion among consumers, with HDR and WCG mistaken for each other or treated as interchangeable. While HDR displays typically have WCGs and displays with WCGs are usually capable of HDR, one does not imply the other; there are SDR displays with WCGs. Some HDR standards specify WCG as a prerequisite of compliance. Regardless, when a WCG is available on an HDR display, the image as a whole can be more colorful due to the wider range of colors.[1]
More subjective, practical benefits of HDR video include more realistic luminance variation between scenes (such as sunlit, indoor, and night scenes), better surface material identification, and better depth perception, even with 2D imagery.[2]
Preservation of content creator intent
When a display's capabilities are insufficient to reproduce all the brightness, contrast and colors that are represented in the HDR content, the image needs to be adjusted to fit the display's capabilities. Some HDR formats (such as Dolby Vision and HDR10+) allow the content creator to choose how the adjustment will be done.[6] Other HDR formats, such as HDR10 and hybrid log–gamma (HLG), do not offer this possibility, so the content creator's intent is not guaranteed to be preserved on less capable displays.[10]
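When no creator-controlled metadata is available, the display itself decides how to compress out-of-range values. The sketch below is only a minimal illustration of that idea, assuming a simple highlight roll-off; it is not the mapping defined by any HDR standard or used by any particular display (BT.2390, for example, specifies a more elaborate curve), and the knee position is an arbitrary assumption.

```python
# Illustrative only: a simple highlight roll-off for mapping HDR content
# luminance onto a less capable display. The knee position is an assumption.

def tone_map_nits(luminance_nits, display_peak_nits=600.0, knee=0.75):
    """Map scene luminance (nits) into [0, display_peak_nits].

    Values below knee * display_peak_nits pass through unchanged;
    values above are smoothly compressed so they never exceed the peak.
    """
    knee_nits = knee * display_peak_nits
    if luminance_nits <= knee_nits:
        return luminance_nits
    # Compress the remaining range with a simple asymptotic roll-off.
    headroom = display_peak_nits - knee_nits
    excess = luminance_nits - knee_nits
    return knee_nits + headroom * excess / (excess + headroom)

# Example: a 4,000-nit specular highlight shown on a 600-nit display.
print(round(tone_map_nits(4000.0), 1))  # stays just below 600 nits
```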
For optimal quality, standards require video to be created and viewed in a relatively dark environment.[11][12] Dolby Vision IQ and HDR10+ Adaptive adjust the content according to the ambient light.[13][14]
Formats
Since 2014, multiple HDR formats have emerged including HDR10, HDR10+, Dolby Vision, and HLG.[6][15] Some formats are royalty-free and others require a license. The formats vary in their capabilities.
Dolby Vision and HDR10+ include dynamic metadata while HDR10 and HLG do not.[6] The dynamic metadata are used to improve image quality on limited displays that are not capable of reproducing an HDR video to its fullest intended extent. Dynamic metadata allows content creators to control and choose the way the image is adjusted.[16]
HDR10
The HDR10 Media Profile, more commonly known as HDR10, is an open HDR standard announced on 27 August 2015 by the Consumer Technology Association.[17] It is the most widespread of the HDR formats,[18] and is not backward compatible with SDR displays. It is technically limited to a maximum peak brightness of 10,000 nits; however, HDR10 content is commonly mastered with a peak brightness between 1,000 and 4,000 nits.[7]
HDR10 lacks dynamic metadata.[19] On HDR10-capable displays that have a lower color volume than the content (such as a lower peak brightness capability), the HDR10 metadata provides information to help the display adjust to the video.[6] The metadata is static and constant with respect to each individual video, and does not inform the display exactly how the content should be adjusted. The interaction between display capabilities, video metadata, and the ultimate output (i.e. the presentation of the video) is mediated by the display, with the result that the original producer's intent may not be preserved.[10]
Dolby Vision
Dolby Vision is an end-to-end ecosystem for HDR video, covering content creation, distribution, and playback.[20] It uses dynamic metadata and is capable of representing luminance levels of up to 10,000 nits.[6] Dolby Vision certification requires displays for content creators to have a peak luminance of at least 1,000 nits.[8]
HDR10+
HDR10+, also known as HDR10 Plus, is an HDR video format announced on 20 April 2017.[21] It is the same as HDR10 but with the addition of a system of dynamic metadata developed by Samsung.[22][23][24] It is free to use for content creators and has a maximum $10,000 annual license for some manufacturers.[25] It has been positioned as an alternative to Dolby Vision without the same expenses.[18]
HLG
HLG is an HDR format that can be used for video and still images.[26][27] It uses the HLG transfer function, Rec. 2020 color primaries, and a bit depth of 10 bits.[28] The format is backwards compatible with SDR UHDTV, but not with older SDR displays that do not implement the Rec. 2020 color standards.[29][2] It does not use metadata and is royalty-free.
PQ10 (PQ format)
PQ10, sometimes referred to as the PQ format, is an HDR format that can be used for video and still images.[30][31] It is the same as the HDR10 format without any metadata.[30] It uses the perceptual quantizer (PQ) transfer function, Rec. 2020 color primaries and a bit depth of 10 bits.[29]
HDR Vivid
HDR Vivid is an HDR format developed by the China Ultra HD Video Alliance (CUVA) and released in March 2021.[32][33][34] It uses dynamic metadata standardized in CUVA 005-2020.[35][36]
Other formats
Technicolor Advanced HDR: An HDR format which aims to be backwards compatible with SDR.[18] As of December 2020, there is no commercial content available in this format.[18] It is an umbrella term covering SL-HDR1, SL-HDR2 and SL-HDR3.[37]
SL-HDR1 (Single-Layer HDR system Part 1) is an HDR standard that was jointly developed by STMicroelectronics, Philips International B.V., and Technicolor R&D France.[38] It was standardised as ETSI TS 103 433 in August 2016.[39] SL-HDR1 provides direct backwards compatibility by using static (SMPTE ST 2086) and dynamic metadata (using SMPTE ST 2094-20 Philips and 2094-30 Technicolor formats) to reconstruct an HDR signal from an SDR video stream that can be delivered using existing SDR distribution networks and services. SL-HDR1 allows for HDR rendering on HDR devices and SDR rendering on SDR devices using a single-layer video stream.[39] The HDR reconstruction metadata can be added either to HEVC or AVC using a supplemental enhancement information (SEI) message.[39] Version 1.3.1 was published in March 2020.[40] It is based on a gamma curve.
SL-HDR2 uses a PQ curve with dynamic metadata.[41]
EclairColor HDR is an HDR format that is only used in professional cinema environments. It requires certified screens or projectors and is only rarely used. It is based on a gamma curve.[43]
Notes
For Dolby Vision base layers, UHD Blu-ray compatibility means HDR10 with further restrictions.
HLG content can also be displayed on SDR displays supporting Rec. 2020 (such as UHD-TVs).
The PQ10 format is the same as HDR10 without the metadata.[28]
Technical characteristics of Dolby Vision depend on the profile used, but all profiles support HDR with Dolby Vision dynamic metadata.[45]
HLG backward compatibility is acceptable for SDR UHDTV displays that can interpret the BT.2020 colour space. It is not intended for traditional SDR displays that can only interpret BT.709 colorimetry.[29][2]
12-bit is achieved via reconstruction by combining a 10-bit base layer with a 10-bit enhancement layer. Current profiles only allow a 1920×1080 enhancement layer for 4K video.[45][46]
The dynamic metadata of Dolby Vision and HDR10+ are not the same.
Displays
TV sets with enhanced dynamic range and upscaling of existing SDR/LDR video/broadcast content with reverse tone mapping have been anticipated since the early 2000s.[54][55] In 2016, HDR conversion of SDR video was released to market as Samsung's HDR+ (in LCD TV sets)[56] and Technicolor SA's HDR Intelligent Tone Management.[57]
As of 2018, high-end consumer-grade HDR displays can achieve 1,000 cd/m2 of luminance, at least for a short duration or over a small portion of the screen, compared to 250-300 cd/m2 for a typical SDR display.[58]
Video interfaces that support at least one HDR Format include HDMI 2.0a, which was released in April 2015 and DisplayPort 1.4, which was released in March 2016.[59][60] On 12 December 2016, HDMI announced that HLG support had been added to the HDMI 2.0b standard.[61][62][63] HDMI 2.1 was officially announced on 4 January 2017, and added support for Dynamic HDR, which is dynamic metadata that supports changes scene-by-scene or frame-by-frame.[64][65]
Compatibility
As of 2020, no display is capable of rendering the full range of brightness and color of HDR formats.[28] A display is called an HDR display if it can accept HDR content and map it to its own display characteristics,[28] so the HDR logo only provides information about content compatibility, not display capability.
Displays that use global dimming, such as most edge-lit LED displays, cannot display the advanced contrast of HDR content. Displays with local dimming (full-array LED backlighting) or per-pixel emission (OLED) can reproduce this contrast more properly.[66]
Certifications
VESA DisplayHDR
The DisplayHDR standard from VESA is an attempt to make the differences in HDR specifications easier to understand for consumers, with standards mainly used in computer monitors and laptops. VESA defines a set of HDR levels; all of them must support HDR10, but not all are required to support 10-bit displays.[67] DisplayHDR is not an HDR format, but a tool to verify HDR formats and their performance on a given monitor. The most recent standard is DisplayHDR 1400, which was introduced in September 2019, with monitors supporting it released in 2020.[68][69] DisplayHDR 1000 and DisplayHDR 1400 are primarily used in professional work like video editing. Monitors with DisplayHDR 500 or DisplayHDR 600 certification provide a noticeable improvement over SDR displays, and are more often used for general computing and gaming.[70]
Technical details
HDR is mainly achieved by the use of the PQ or HLG transfer function.[1][5] A WCG is also commonly used alongside HDR, up to the Rec. 2020 color primaries.[1] A bit depth of 10 or 12 bits is used to avoid visible banding across the extended brightness range. In some cases, additional metadata are used to handle the variation in display brightness, contrast and color capabilities. HDR video is defined in Rec. 2100.[5]
Color space
ITU-R Rec. 2100
Rec. 2100 is a technical recommendation by ITU-R for the production and distribution of HDR content using 1080p or UHD resolution, 10-bit or 12-bit color, HLG or PQ transfer functions, full or limited range, the Rec. 2020 wide color gamut, and YCbCr or ICtCp as color space.[11][73]
Transfer function
See also: Transfer functions in imaging
SDR uses a gamma curve transfer function that is based on CRT characteristics and represents luminance levels up to around 100 nits.[1] HDR uses the newly developed PQ or HLG transfer functions instead of the traditional gamma curve.[1] If the gamma curve had simply been extended to 10,000 nits, a bit depth of 15 bits would have been required to avoid banding.[74]
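As a concrete illustration of the PQ curve, the sketch below converts a normalized PQ signal value into absolute luminance in nits using the EOTF and the constants published in SMPTE ST 2084 / Rec. 2100; the printed example values are only indicative.

```python
# PQ (SMPTE ST 2084 / Rec. 2100) EOTF: normalized signal value -> nits.
# The constants are the published ST 2084 values.

M1 = 2610 / 16384            # 0.1593017578125
M2 = 2523 / 4096 * 128       # 78.84375
C1 = 3424 / 4096             # 0.8359375
C2 = 2413 / 4096 * 32        # 18.8515625
C3 = 2392 / 4096 * 32        # 18.6875

def pq_eotf(signal: float) -> float:
    """Convert a normalized PQ signal in [0, 1] to luminance in cd/m2 (nits)."""
    p = signal ** (1 / M2)
    numerator = max(p - C1, 0.0)
    denominator = C2 - C3 * p
    return 10000.0 * (numerator / denominator) ** (1 / M1)

print(round(pq_eotf(1.0)))      # 10000 nits at full signal
print(round(pq_eotf(0.75), 1))  # around 983 nits: the ~1,000-nit region
print(round(pq_eotf(0.5), 1))   # around 92 nits: roughly the SDR reference range
```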
HLG is a transfer function developed by the NHK and BBC.[82] It is backward compatible with SDR's gamma curve, and is the basis of the HDR format known as HLG.[28] The HLG transfer function is also used by other video formats such as Dolby Vision profile 8.4 and for HDR still picture formats.[45][83][84] HLG is royalty-free.[85]
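For comparison, a minimal sketch of the HLG OETF as published in ARIB STD-B67 / Rec. 2100: the lower half of the signal range follows a square-root segment (which provides the SDR backward compatibility), while highlights are encoded logarithmically.

```python
import math

# HLG (ARIB STD-B67 / Rec. 2100) OETF: normalized linear scene light -> signal.
# The constants a, b and c are the published Rec. 2100 values.
A = 0.17883277
B = 1 - 4 * A                  # 0.28466892
C = 0.5 - A * math.log(4 * A)  # ~0.55991073

def hlg_oetf(light: float) -> float:
    """Convert normalized linear scene light in [0, 1] to an HLG signal in [0, 1]."""
    if light <= 1 / 12:
        return math.sqrt(3 * light)
    return A * math.log(12 * light - B) + C

print(hlg_oetf(1 / 12))        # 0.5 -> the square-root segment ends at half signal
print(round(hlg_oetf(1.0), 3)) # 1.0 at peak scene light
```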
Color primaries
HDR is commonly associated with a WCG (a system chromaticity wider than BT.709). Rec. 2100 (HDR-TV) uses the same system chromaticity that is used in Rec. 2020 (UHDTV).[5][87] HDR formats such as HDR10, HDR10+, Dolby Vision and HLG also use Rec. 2020 chromaticities.
HDR contents are commonly graded on a P3-D65 display.[6][8]
Bit depth
Because of the increased dynamic range, HDR content needs a greater bit depth than SDR to avoid banding. While SDR uses a bit depth of 8 or 10 bits,[86] HDR uses 10 or 12 bits,[5] which, combined with more efficient transfer functions such as PQ or HLG, is enough to avoid banding.[90][91]
Matrix coefficients
Rec. 2100 specifies the use of the RGB, YCbCr or ICtCp signal formats for HDR-TV.[5]
ICtCp is a color representation designed by Dolby for HDR and wide color gamut (WCG)[92] and standardized in Rec. 2100.[5]
IPTPQc2 with reshaping is a proprietary format by Dolby and is similar to ICtCp. It is used by Dolby Vision profile 5.[45]
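As an illustration of how ICtCp is constructed, Rec. 2100 converts linear Rec. 2020 RGB into an LMS-like space, applies the PQ non-linearity to each component, and then mixes the result into an intensity component (I) and two chroma components (CT, CP). The sketch below uses the integer coefficients published in Rec. 2100 for the PQ variant, and assumes display-referred linear RGB normalized so that 1.0 corresponds to 10,000 nits.

```python
# ICtCp (Rec. 2100, PQ variant): linear Rec. 2020 RGB -> (I, CT, CP).
# Coefficients are the integer values given in Rec. 2100.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_inverse_eotf(y: float) -> float:
    """Normalized linear luminance (1.0 = 10,000 nits) -> PQ signal in [0, 1]."""
    p = y ** M1
    return ((C1 + C2 * p) / (1 + C3 * p)) ** M2

def rgb2020_to_ictcp(r: float, g: float, b: float):
    # Linear RGB -> LMS (crosstalk matrix already folded in).
    l = (1688 * r + 2146 * g + 262 * b) / 4096
    m = (683 * r + 2951 * g + 462 * b) / 4096
    s = (99 * r + 309 * g + 3688 * b) / 4096
    # Apply the PQ non-linearity to each component.
    lp, mp, sp = (pq_inverse_eotf(x) for x in (l, m, s))
    # Mix into intensity and two chroma axes.
    i = (2048 * lp + 2048 * mp) / 4096
    ct = (6610 * lp - 13613 * mp + 7003 * sp) / 4096
    cp = (17933 * lp - 17390 * mp - 543 * sp) / 4096
    return i, ct, cp

# A neutral grey (equal R, G, B) gives CT and CP of about 0; 0.1 is 1,000 nits,
# so I comes out near 0.75, matching the PQ value for 1,000 nits.
print(tuple(round(v, 4) for v in rgb2020_to_ictcp(0.1, 0.1, 0.1)))
```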
Signaling color space
Coding-independent code points (CICP) are used to signal the transfer function, color primaries and matrix coefficients.[93] It is defined in both ITU-T H.273 and ISO/IEC 23091-2.[93] It is used by multiple codecs including AVC, HEVC and AVIF. Common combinations of H.273 parameters are summarized in ITU-T Series H Supplement 19.[94]
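For instance, the H.273 code points most relevant to the HDR formats discussed above are color primaries 9 (BT.2020/BT.2100), transfer characteristics 16 (PQ) or 18 (HLG), and matrix coefficients 9 (BT.2020 non-constant luminance) or 14 (ICtCp). The dictionary below is only an illustrative lookup of those published values, not part of any standard API.

```python
# Common CICP (ITU-T H.273) triplets used to signal HDR video:
# (colour_primaries, transfer_characteristics, matrix_coefficients).

CICP_EXAMPLES = {
    "HDR10 / PQ10":         (9, 16, 9),   # BT.2020 primaries, PQ,  BT.2020 non-constant luminance
    "HLG (Rec. 2100)":      (9, 18, 9),   # BT.2020 primaries, HLG, BT.2020 non-constant luminance
    "Rec. 2100 ICtCp (PQ)": (9, 16, 14),  # BT.2020 primaries, PQ,  ICtCp
    "SDR BT.709":           (1, 1, 1),    # BT.709 primaries, transfer and matrix
}

for name, (primaries, transfer, matrix) in CICP_EXAMPLES.items():
    print(f"{name}: primaries={primaries}, transfer={transfer}, matrix={matrix}")
```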
Static metadata
Static HDR metadata give information about the whole video.
SMPTE ST 2086 or MDCV (Mastering Display Color Volume): It describes the color volume of the mastering display (i.e. the color primaries, the white point and the maximum and minimum luminance). It has been defined by SMPTE[10] and also in AVC[95] and HEVC[96] standards.
MaxFALL (Maximum Frame Average Light Level): the highest frame-average light level of the content.
MaxCLL (Maximum Content Light Level): the light level of the brightest pixel in the content.
The metadata do not describe how the HDR content should be adapted to HDR consumer displays that have a lower color volume (i.e. peak brightness, contrast and color gamut) than the content.[10][96]
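MaxCLL and MaxFALL can be derived directly from the content itself. The sketch below is a simplification of the CTA-861.3 procedure (which works on the per-pixel maxRGB light level); frames are reduced here to flat lists of per-pixel light levels in nits.

```python
# Illustrative computation of the static metadata values MaxCLL and MaxFALL.
# Each frame is simplified to a flat list of per-pixel light levels in nits.

def max_cll_and_max_fall(frames):
    max_cll = 0.0   # the brightest pixel anywhere in the content
    max_fall = 0.0  # the highest frame-average light level in the content
    for pixels in frames:
        max_cll = max(max_cll, max(pixels))
        max_fall = max(max_fall, sum(pixels) / len(pixels))
    return max_cll, max_fall

# Two tiny example "frames": a dim scene, then one with a small 1,500-nit highlight.
frames = [
    [5.0, 10.0, 8.0, 12.0],
    [20.0, 1500.0, 30.0, 25.0],
]
print(max_cll_and_max_fall(frames))  # (1500.0, 393.75)
```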
Dynamic metadata
Dynamic metadata are specific for each frame or each scene of the video.
The dynamic metadata of Dolby Vision, HDR10+ and SMPTE ST 2094 describe what color volume transform should be applied to content shown on displays that have a different color volume from the mastering display. The transform is optimized for each scene and each display, allowing the creative intent to be preserved even on consumer displays with a limited color volume.
SMPTE ST 2094 or Dynamic Metadata for Color Volume Transform (DMCVT) is a standard for dynamic metadata published by SMPTE in 2016 as six parts.[24] It is carried in HEVC SEI, ETSI TS 103 433, CTA 861-G.[97] It includes four applications:
ST 2094-10 (from Dolby Laboratories), used for Dolby Vision.
ST 2094-20 (from Philips). Colour Volume Reconstruction Information (CVRI) is based on ST 2094–20.[39]
ST 2094-30 (by Technicolor). Colour Remapping Information (CRI) conforms to ST 2094-30 and is standardized in HEVC.[39]
ST 2094-40 (by Samsung), used for HDR10+.
ETSI TS 103 572 is a technical specification published in October 2020 by ETSI for HDR signaling and carriage of ST 2094-10 (Dolby Vision) metadata.[98]
HDR Vivid uses dynamic metadata standardized in CUVA 005-2020.[35][36]
Dual-layer video
Some Dolby Vision profiles use a dual-layer video composed of a base layer and an enhancement layer.[45][46] Depending on the Dolby Vision profile (or compatibility level), the base layer can be backward compatible with SDR, HDR10, HLG or UHD Blu-ray, or with no other format when the most efficient IPTPQc2 color space (which uses full range and reshaping) is used.[45]
ETSI GS CCM 001 describes a Compound Content Management functionality for a dual-layer HDR system, including MMR (multivariate multiple regression) and NLQ (non-linear quantisation).[46]
Adoption
Guidelines
Ultra HD Forum guidelines
UHD Phase A is a set of guidelines from the Ultra HD Forum for the distribution of SDR and HDR content using Full HD 1080p and 4K UHD resolutions. It requires a color depth of 10 bits per sample, a color gamut of Rec. 709 or Rec. 2020, a frame rate of up to 60 fps, a display resolution of 1080p or 2160p and either standard dynamic range (SDR) or high dynamic range that uses HLG or PQ transfer functions.[99] UHD Phase A defines HDR as having a dynamic range of at least 13 stops (2¹³ = 8192:1) and WCG as a color gamut that is wider than Rec. 709.[99]
UHD Phase B will add support for 120 fps (and 120/1.001 fps), 12-bit PQ in HEVC Main12 (which will be enough for 0.0001 to 10,000 nits), Dolby AC-4 and MPEG-H 3D Audio, and IMAX sound in DTS:X (with 2 LFE channels). It will also add ITU's ICtCp and CRI.[100]
Still images
HDR image formats
The following image formats are compatible with HDR (Rec. 2100 color space, PQ and HLG transfer functions, Rec. 2100 or Rec. 2020 color primaries):
HSP, CTA 2072 HDR Still Photo Interface (a format used by Panasonic cameras for photo capture in HDR with the HLG transfer function)[83]
Other image formats, such as JPEG, JPEG 2000, PNG and WebP, do not support HDR by default. They could support it through the use of an ICC profile,[102][103] but existing applications usually do not take into account the absolute luminance value defined in ICC profiles.[103] W3C is working to add HDR support to PNG.[104][105]
ISO/AWI 21496 defines a generic way to add HDR information to SDR formats. A "gain map" layer records the luminance ratio between the HDR source and its tone-mapped SDR rendering, so that the HDR source signal can be (partially) reconstructed from the SDR layer and this map. Software that does not support the gain map simply shows the fallback SDR rendering.[106] It was formerly known as Apple EDR (Enhanced Dynamic Range).[107]
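A simplified sketch of the gain-map idea is shown below; it is not the exact ISO/AWI 21496 math (which also involves offsets and a gamma applied to the map), and the minimum and maximum gains are assumed values. Each stored map value selects, on a logarithmic scale, how much the linear SDR pixel is brightened to recover the HDR pixel.

```python
import math

# Simplified gain-map reconstruction: recover a linear HDR pixel value from a
# linear SDR pixel value and a gain-map sample in [0, 1]. Illustration only;
# the min/max gains below (0 to 2 stops) are assumptions, not spec values.

def apply_gain_map(sdr_linear, gain_map_value, min_gain_log2=0.0, max_gain_log2=2.0):
    # Interpolate the gain on a log2 (stops) scale between the map's extremes.
    log2_gain = min_gain_log2 + gain_map_value * (max_gain_log2 - min_gain_log2)
    return sdr_linear * math.pow(2.0, log2_gain)

print(apply_gain_map(0.25, 1.0))  # 1.0  -> brightened by the full 2 stops (4x)
print(apply_gain_map(0.25, 0.0))  # 0.25 -> unchanged: the SDR fallback rendering
```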
Adoption of HDR in still images
Apple: iPhone 12 and later support the aforementioned "gain map" HDR technique for still images.[107]
Canon: EOS-1D X Mark III and EOS R5 are able to capture still images in the Rec. 2100 color space by using the PQ transfer function, the HEIC format (HEVC codec in an HEIF file), the Rec. 2020 color primaries, a bit depth of 10 bits and 4:2:2 YCbCr subsampling.[108][109][110][111][81] The captured HDR pictures can be viewed in HDR by connecting the camera to an HDR display with an HDMI cable.[111] Captured HDR pictures can also be converted to SDR JPEG (sRGB color space) and then viewed on any standard display.[111] Canon refers to those SDR pictures as "HDR PQ-like JPEG".[112] Canon's Digital Photo Professional software can show the captured HDR pictures in HDR on HDR displays or in SDR on SDR displays.[111][113] It can also convert HDR PQ pictures to SDR sRGB JPEG.[114]
Panasonic: Panasonic's S-series cameras (including the Lumix S1, S1R, S1H and S5) can capture photos in HDR using the HLG transfer function and output them in the HSP file format.[115][27][83] The captured HDR pictures can be viewed in HDR by connecting the camera to an HLG-compliant display with an HDMI cable.[115][83] Panasonic has released a plug-in allowing the editing of HLG stills (HSP) in Photoshop CC.[116][117] The company also released a plug-in for displaying thumbnails of those HDR images on a PC (for Windows Explorer and macOS Finder).[117]
Sony: Sony α7S III and α1 cameras can capture HDR photos in the Rec. 2100 color space with the HLG transfer function, the HEIF format, Rec. 2020 color primaries, a bit depth of 10 bits and 4:2:2 or 4:2:0 subsampling.[84][120][121][122] The captured HDR pictures can be viewed in HDR by connecting the camera to an HLG-compliant display with an HDMI cable.[122]
Others:
Krita 5.0, released on 23 December 2021, added support for HDR HEIF and AVIF images with Rec. 2100 PQ and HLG encoding.[123][124]
Web
Work is in progress at W3C to make the Web compatible with HDR,[125] including HDR capabilities detection[126] and HDR in CSS.[127]
History
2014
In January 2014, Dolby Laboratories announced Dolby Vision.[15]
In August 2014, PQ was standardized in SMPTE ST 2084.[128]
In October 2014, the HEVC specification incorporated a code point for PQ.[129] Its first version had already included the Main 10 profile, which supports 10 bits per sample.[130]
In October 2014, SMPTE standardized the Mastering Display Color Volume (MDCV) static metadata in SMPTE ST 2086.[131]
2015
In March 2015, HLG was standardized in ARIB STD-B67.[132]
On 8 April 2015, The HDMI Forum released version 2.0a of the HDMI Specification to enable transmission of HDR. The specification references CEA-861.3, which in turn references SMPTE ST 2084 (the standard of PQ).[59] The previous HDMI 2.0 version already supported the Rec. 2020 color space.[133]
On 24 June 2015, Amazon Video was the first streaming service to offer HDR video using the HDR10 format.[134][135]
On 27 August 2015, Consumer Technology Association announced HDR10.[17]
On 17 November 2015, Vudu announced that they had started offering titles in Dolby Vision.[136]
2016
On 1 March 2016, the Blu-ray Disc Association released Ultra HD Blu-ray with mandatory support for HDR10 and optional support for Dolby Vision.[137]
On 9 April 2016, Netflix started offering both HDR10 and Dolby Vision.[138]
From June to September 2016, SMPTE standardized multiple dynamic metadata formats for HDR in SMPTE ST 2094.[139]
On 29 July 2016, SKY Perfect JSAT Group announced that on 4 October, they would start the world's first 4K HDR broadcasts using HLG.[140]
On 9 September 2016, Google announced Android TV 7.0, which supports Dolby Vision, HDR10, and HLG.[141][142]
On 26 September 2016, Roku announced that the Roku Premiere+ and Roku Ultra will support HDR using HDR10.[143]
On 7 November 2016, Google announced that YouTube would stream HDR videos that can be encoded with HLG or PQ.[144][145]
On 17 November 2016, the Digital Video Broadcasting (DVB) Steering Board approved UHD-1 Phase 2 with an HDR solution that supports HLG and PQ.[146][147] The specification was published as DVB BlueBook A157 and by the ETSI as TS 101 154 v2.3.1.[146][147]
2017
On 2 January 2017, LG Electronics USA announced that all of LG's SUPER UHD TV models support a variety of HDR technologies, including Dolby Vision, HDR10, and HLG (Hybrid Log Gamma), and are ready to support Advanced HDR by Technicolor.
On 12 September 2017, Apple announced the Apple TV 4K with support for HDR10 and Dolby Vision, and that the iTunes Store would sell and rent 4K HDR content.[148]
2019
On 26 December 2019, Canon announced the adoption of the PQ format (PQ10) for still photography.[31]
2020
On 13 October 2020, Apple announced the iPhone 12 and iPhone 12 Pro series, the first smartphones that can record and edit Dolby Vision video directly from the camera roll.[149] The iPhone uses Dolby Vision profile 8.4, which is cross-compatible with HLG.[150]
2021
In June 2021, Panasonic announced a plug-in for Photoshop CC to allow for the editing of HLG stills.[116]
2022
On 4 July 2022, Xiaomi announced the Xiaomi 12S Ultra, the first Android smartphone that can record Dolby Vision video directly from the camera roll.[151][152]