In this article:
- What is High Dynamic Range (HDR)?
- How are Wide Dynamic Cameras or HDR cameras applied within the Security Industry?
- What is the difference between HDR and Backlight Compensation?
- What is meant by HDR Display?
- What does HDR do to improve on earlier display technology?
- What are the different HDR formats?
What is High Dynamic Range (HDR)?
In photography, videography and TV/film, HDR stands for High Dynamic Range. Dynamic range is simply the range from the lightest tones to the darkest tones within an image, be that a photographic print, a video frame or otherwise. It’s a measure of light intensity from the highlights to the shadows.
For example, let’s take the human eye — it’s capable of a wide dynamic range, which is why we can see details in shadows as well as details in highlights at the same time.
If the sun is setting in a valley, our eyes can see where the sun is highlighting the peaks of the valley, but our eyes can also equally appreciate the darker shadows that are cast.
The higher the dynamic range of a camera sensor, the closer the image comes to what the eye can see. More detail can be captured in shadows that might otherwise appear pure black, and detail is preserved in highlights that might otherwise be washed out to white.
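Dynamic range is often quantified in photographic "stops", each stop being a doubling of light. A minimal sketch of that calculation (the example luminance values are illustrative, not taken from any particular sensor datasheet):

```python
import math

def dynamic_range_stops(brightest_nits: float, darkest_nits: float) -> float:
    """Dynamic range in photographic stops (each stop doubles the light)."""
    return math.log2(brightest_nits / darkest_nits)

# A scene with 10,000-nit highlights and 0.1-nit shadows spans ~16.6 stops.
print(round(dynamic_range_stops(10000, 0.1), 1))  # 16.6
```

A sensor whose dynamic range falls short of the scene's span must sacrifice either the shadows or the highlights in a single exposure.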
High Dynamic Range has its origins alongside the term Wide Dynamic Range (WDR).
In the film and television industry, and in the fields of displays, mobile phone screens and photography, the technology has generally been referred to as HDR.
Since security sensors entered the CMOS era, the Wide Dynamic Range function used within the security industry has also come to be referred to as HDR.
Sony, which leads the development of security sensors, also uses the names DOL-HDR and DOL-WDR for its wide dynamic function; both mean the same thing.
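DOL-style WDR sensors capture a long and a short exposure of the same scene in quick succession and merge them, keeping shadow detail from the long exposure and highlight detail from the short one. The following is a deliberately simplified sketch of that idea; real DOL-HDR pipelines blend and tone-map far more carefully, and the exposure ratio here is illustrative:

```python
def merge_exposures(long_exp, short_exp, ratio=16, clip=255):
    """Merge a long and a short exposure of the same scene (8-bit pixel values).

    Where the long exposure is clipped (blown-out highlights), fall back to
    the short exposure scaled up by the exposure ratio; elsewhere keep the
    cleaner long exposure.
    """
    merged = []
    for lo, sh in zip(long_exp, short_exp):
        merged.append(sh * ratio if lo >= clip else lo)
    return merged

# The shadow pixel is kept from the long exposure; the blown highlight
# is recovered from the scaled short exposure.
print(merge_exposures([40, 255], [2, 200]))  # [40, 3200]
```

The merged result has a wider range of values than either source frame, which is why it must then be tone-mapped back down for display.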
How are Wide Dynamic Cameras or HDR cameras applied within the Security Industry?
In security video surveillance, a camera's wide dynamic range has always been an important function; in some industries and scenarios it is essential.
For example, the miniature pinhole cameras installed in bank ATMs to monitor users typically face from a dim interior towards bright daylight outside, so the contrast between light and dark is strong and an ordinary camera struggles to capture a face clearly. The solution is a camera with a wide dynamic range function.
Other environments where HDR would be used are:
- Entrances with daylight outside and a shaded interior.
- Multi-storey car parks or tunnels with daylight outside and low levels of interior brightness.
- Outdoor scenes with direct sunlight and dark shadows.
- Office buildings or shopping centres that have numerous windows that reflect light.
What is the difference between HDR and Backlight Compensation?
Backlight compensation solves the problem of a monitoring target appearing too dark to be seen clearly against strong background light. When backlight compensation is turned on, the target object in the foreground becomes clearly visible, but the previously bright areas are overexposed.
HDR means that the entire frame is clearly visible, with no overexposed areas and no areas that are noticeably darkened.
What is meant by HDR Display?
In essence, the same meaning and function translates to the displays industry.
There are HDR TVs, HDR monitors, HDR laptops, tablets and smartphones. The rich colour detail that’s possible with HDR10, Dolby Vision and other HDR formats makes the technology ideal for everything from still photography to action video to imagined game environments.
What does HDR do to improve on earlier display technology?
In layman's terms, an HDR display produces greater luminance and colour depth than screens built to meet older standards.
Here are some HDR basics:
- HDR display luminance
A display's luminance describes the amount of light it emits, which in turn determines the gap between the brightest and darkest pixels on the screen. The increased light produced by an HDR display makes its brightest pixels far brighter than before, further differentiating them from the darkest ones, allowing subtler pixel-to-pixel changes and better image reproduction.
Luminance is measured in candela per square metre (cd/m²), or "nits". Several different standards have been published to define what qualifies as an HDR display, generally starting at 400 nits for laptops and rising to 1,000 or even 10,000 nits for high-end professional monitors.
- HDR display colour depth
Colour depth is the number of bits of data each pixel of a display can use to produce the colour in an image or video. Before HDR, most displays topped out at 8-bit colour; the new HDR formats can process 10-bit (or even 12-bit) colour, increasing the potential on-screen colour variations exponentially. Occasionally, even 16-bit HDR image data is encountered, which can be considered extremely high-quality data.
Whether directly or through what's called dithering, 8-bit colour depth allows for 256 different shades of each primary colour, making it possible to generate about 16.7 million colour variations. 10-bit colour bumps the number of shade options from 256 to 1,024, increasing the maximum colour variations to more than 1 billion.
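The arithmetic behind those figures is simple: each of the three primaries (red, green, blue) gets 2 to the power of the bit depth shades, and the total palette is that number cubed.

```python
def colour_variations(bits_per_channel: int) -> int:
    """Total colours an RGB display can show at a given per-channel bit depth."""
    shades = 2 ** bits_per_channel  # shades per primary (R, G, B)
    return shades ** 3              # all R/G/B combinations

print(colour_variations(8))   # 16777216   (~16.7 million)
print(colour_variations(10))  # 1073741824 (~1.07 billion)
```

Moving from 8-bit to 10-bit colour therefore multiplies the palette by 4³ = 64.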
However, there is more to HDR than just luminance and colour depth. HDR content also includes more metadata than typical content, providing details about how to process each image or scene to achieve the intended colors.
Some HDR formats use metadata to guide the display of an entire movie or scene, while other formats, such as Dolby Vision, promise frame-by-frame metadata.
Do not confuse HDR with other display-related acronyms such as Ultra High Definition (UHD) and 4K, as these terms only relate to display resolution (how many lines of pixels a display contains, which in turn determines how detailed each video or still image can be).
HDR is often associated with UHD/4K, but that's because, for now, the technology is mostly limited to high-end IPS displays that are best capable of presenting it.
What are the different HDR formats?
HDR10: The most widely used open standard for HDR, with 10-bit colour and static metadata.
Dolby Vision: Proprietary Dolby HDR technology promising 12-bit equivalent colour and scene-by-scene metadata
HDR10+: An HDR format with frame-by-frame metadata. As the name suggests, HDR10+ takes the good parts of HDR10 and improves upon them. The maximum brightness is quadrupled to 4,000 nits, which increases contrast too. In HDR10, the metadata fed by the content source is static, meaning there is one set of values established for a whole piece of content.
HDR10+ makes this metadata dynamic, allowing it to change for each frame of video. This means every frame is treated to its own set of colors, brightness and contrast parameters, making for a much more realistic-looking image.
Areas of the screen that might have been oversaturated under HDR10 will display their full details with HDR10+.
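The static-versus-dynamic distinction can be illustrated with a small data structure. The field names below are hypothetical, chosen only to show the shape of the idea; they are not taken from the HDR10 or HDR10+ specifications:

```python
from dataclasses import dataclass

@dataclass
class HdrMetadata:
    """Illustrative HDR mastering metadata (hypothetical field names)."""
    max_luminance_nits: int
    min_luminance_nits: float

# Static metadata (HDR10-style): one record governs the whole title,
# so a dark cave scene and a bright beach scene share the same grading.
static = HdrMetadata(max_luminance_nits=1000, min_luminance_nits=0.05)

# Dynamic metadata (HDR10+/Dolby Vision-style): one record per frame or
# scene, so each can be graded to its own brightness and contrast.
dynamic = [
    HdrMetadata(max_luminance_nits=400, min_luminance_nits=0.01),   # dark scene
    HdrMetadata(max_luminance_nits=4000, min_luminance_nits=0.05),  # bright scene
]
print(len(dynamic))  # 2
```

With the static record, the display must pick one compromise mapping; with the per-scene list, it can adapt its tone mapping as the content changes.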
Other HDR formats: Additional HDR formats developed by other companies include HLG (Hybrid Log Gamma, from BBC and NHK), and Advanced HDR from Technicolor. Both of these initiatives are more focused on live TV broadcasts than recorded/streamed content.
Regardless of the format, those who have viewed HDR alongside non-HDR content report increased brightness, expanded contrast, more accurate colours and refined detail compared to displays with older technology.