Android added basic support for high dynamic range (HDR) video with the release of Android 7.0 back in 2016. In 2022, you’d be hard-pressed to find a phone or TV above the mid-range class without some form of HDR support. Of course, since there are multiple HDR formats and hardware differences among devices that claim to support HDR, your experience with HDR video may not be the same as everyone else’s. Regardless of your device’s display and chipset capabilities, though, if it runs Android, it’ll have the same problem as every other Android device: HDR and SDR content don’t blend well together.

If it sounds like I’m making up a problem that doesn’t exist, I’m not. When you’re watching HDR content, most of the time it’s taking up your entire screen. Android, however, has a lot of UI elements that you can interact with while an HDR video is playing, whether that be the status bar, a heads-up notification, a screenshot overlay, in-app captions, etc. Sometimes those UI elements are way brighter than most of the pixels in whatever HDR video you’re watching. It’s hard to demonstrate this issue through text, so here’s a little experiment: open this YouTube video on a device that supports HDR, enter full screen, and then pull down the status bar. You’ll hopefully see what I’m talking about.

If you don’t notice it, then try opening this YouTube video in full screen and turn on subtitles/closed captions. The captions aren’t hard-coded into the video but are instead overlaid on top, and they probably look a lot brighter than the video itself. You’d especially notice this if you turn on subtitles while watching an HDR film with a lot of dark parts on a screen that can hit 1,000 nits of peak brightness. Maybe you have noticed it, but I wouldn’t be surprised if most of you hadn’t before I pointed it out.

Despite how niche this problem may seem, Google appears to be tackling it with a new feature they’re calling “SDR dimming.” This feature has been in development for over a year, and it isn’t clear how it works or when it’ll launch. Still, I’ll attempt to explain what I think it does with help from my former colleague Dylan Raga, Display Reviewer for XDA-Developers.

Thanks for reading Android Dessert Bites, a weekly column that dives deep into the Android platform topics that matter to system engineers, app developers, or power users.

Following the release of Android 12’s source code last year, I discovered a code change titled “make sdr white point do a thing.”

AOSP commit implementing SDR dimming

The title and description aren’t very descriptive, to say the least, but a comment in the updated code fortunately explains what the feature is intended to do. When SDR dimming is enabled, “SDR layers should be dimmed to the desired SDR white point instead of being treated as native display brightness,” according to the comment.

AOSP comment explaining the enableSdrDimming function

To understand what SDR dimming is trying to solve, it’s important to understand how the problem arises in the first place. One of the key benefits of HDR over SDR is the increased dynamic range of the luminance component of a video, hence the “high” in high dynamic range. While most SDR content is mastered for 100 nits of peak luminance, a lot of HDR content is mastered for 1,000 nits of peak luminance or more! However, the average luminance of most HDR content is still well below 100 nits, and it turns out to be very similar to that of most SDR content. You can think of HDR content as being divided into two segments: the SDR segment, which covers 0 to 100 nits, and the highlights segment, which covers values above 100 nits. The goal isn’t to make HDR content as a whole way brighter than SDR content but rather to make certain parts of the video really stand out. While 90% of an HDR film may look identical in brightness to its SDR version, highlights can appear much brighter in the HDR version than in its SDR counterpart, thus creating that higher dynamic range that we’ve come to expect.

In order to properly display HDR content that’s mastered for 1,000 nits of peak luminance, the display’s brightness needs to be bumped up so that 100% white is at least 1,000 nits. When this happens, though, it affects everything else that appears on screen, not just highlights. Any content that’s in SDR, which includes pretty much all Android UI elements, will appear way too bright because its pixel intensities are as high as those reserved for HDR highlights, when they should top out at, say, 100 nits, the same as SDR. The solution, then, is to lower the pixel intensities of SDR layers only, while keeping the display brightness high so that HDR content can maintain its brighter highlights.

So what does it mean to lower the pixel intensity? As Dylan Raga explains, Android, like other platforms, uses what’s called a transfer function to map software pixel values to display luminance. For SDR content, the most common transfer function used on phones is gamma 2.2. Gamma 2.2 is a relative mapping function, though, which means that the output luminance that pixel values are mapped to is relative to the display’s peak brightness. For example, a pixel value of 255/255 (100%) will map to whatever peak luminance the display’s current brightness level produces.
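To make the “relative” part concrete, here’s a small sketch of a display-referred gamma 2.2 mapping. The function and the peak-brightness figures are illustrative, not Android’s actual implementation:

```python
# Sketch of a relative (display-referred) gamma 2.2 transfer function.
# The output luminance scales with whatever the display's current peak
# brightness happens to be.

def gamma22_to_nits(pixel_value: int, display_peak_nits: float) -> float:
    """Map an 8-bit SDR pixel value to luminance, relative to display peak."""
    normalized = pixel_value / 255
    return (normalized ** 2.2) * display_peak_nits

# The same pixel value lands at a different luminance as brightness changes:
print(gamma22_to_nits(255, 100.0))   # full white at a 100-nit peak -> 100.0
print(gamma22_to_nits(255, 1000.0))  # full white at a 1,000-nit peak -> 1000.0
print(gamma22_to_nits(128, 100.0))   # mid-gray lands around a fifth of peak
```

Note how full white has no fixed meaning in nits; crank the display brightness up for HDR and every SDR pixel gets brighter with it, which is exactly the problem described above.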

HDR uses a different transfer function called perceptual quantizer (PQ) or ST.2084. ST.2084 is, in contrast to gamma 2.2, an absolute mapping function, which means that pixel values are mapped to absolute luminances in nits, regardless of the display brightness. It takes an input, typically between 0 and 1023, and outputs luminance values ranging from 0.0001 nits (≈0%) to 10,000 nits (100%, or pixel value 1023/1023).
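The PQ curve itself is fully specified, so the “absolute” behavior is easy to demonstrate. This is a straightforward sketch of the ST.2084 EOTF using the constants from the SMPTE ST 2084 specification:

```python
# Sketch of the ST.2084 (PQ) EOTF, which maps a normalized signal in [0, 1]
# to an absolute luminance between 0 and 10,000 nits, independent of the
# display's brightness setting. Constants are from SMPTE ST 2084.

M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(signal: float) -> float:
    """Absolute mapping: the same signal always yields the same nits."""
    p = signal ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)

# A 10-bit code value of 1023 maps to the full 10,000-nit ceiling:
print(pq_to_nits(1023 / 1023))  # -> 10000.0
```

Unlike the gamma 2.2 sketch earlier, there’s no display-peak parameter here at all: a given code value always means the same luminance, which is what lets HDR content specify highlights in absolute terms.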

Per Dylan Raga:

A complication occurs when both HDR and SDR layers are on the screen since Android must pick only one of these mappings for the display. The other layer must then be blended in by precisely matching the different transfer functions. On many phones on Android 12, during HDR playback SurfaceFlinger reports that the display maintains its SDR transfer function, which means that the HDR tone mapping is converted into SDR so that the separate layers appear consistent.

If you recall the two segments for HDR, the goal of SDR dimming is to blend the separate layers by mapping the SDR transfer function to the SDR segment of the HDR transfer function, not the entire HDR luminance range, which is what currently occurs on Android 12.

For a display that supports up to 1,000 nits for HDR content, such as the one rumored to be used by the Pixel 7 Pro, any pixel value above 75% intensity on the ST.2084 scale should map to 1,000 nits, since 75% is the ST.2084 pixel value corresponding to 1,000 nits, which is also the reported peak brightness of the display. Light-themed SDR surfaces are typically near 100% pixel intensity, so rather than blasting them at 1,000 nits, SDR dimming can map them down to about 51% ST.2084 pixel intensity, which is the pixel value corresponding to 100 nits. Mapped back to gamma 2.2, this is a maximum pixel value of 35%, or 90/255.

100 nits is just the reference value for SDR video content, though, and what may end up happening is that Android will dim UI elements in accordance with the position of the brightness slider. Say, for example, that a position of 25% on the brightness slider results in a peak display luminance of 50 nits under normal conditions. When HDR content is played, the display’s brightness can increase up to its peak while Android dims all SDR layers down to 50 nits, which corresponds to a gamma 2.2 pixel value of 65/255, or about 25%. This is useful because the standard reference value of 100 nits may be too bright or too dark depending on the ambient lighting condition. In order for this to work, though, Android would need to know the exact display brightness level at every brightness slider value, but fortunately this mapping can already be found in the driver config of many devices.
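The slider-aware variant can be sketched in a few lines. To be clear, the slider-to-nits table below is entirely made up for illustration; real devices expose a similar mapping in their display driver configs, as noted above:

```python
# Sketch of slider-aware SDR dimming as speculated above. The slider-to-nits
# table is hypothetical; real devices keep an equivalent mapping in their
# display driver configuration.

SLIDER_TO_NITS = {0.25: 50.0, 0.50: 150.0, 1.00: 500.0}  # made-up values

def dimmed_sdr_white(slider_position: float, hdr_peak_nits: float) -> int:
    """Return the gamma 2.2 pixel value SDR white should be dimmed to."""
    target_nits = SLIDER_TO_NITS[slider_position]
    fraction = (target_nits / hdr_peak_nits) ** (1 / 2.2)
    return round(fraction * 255)

# At 25% slider (50 nits) on a display driven to a 1,000-nit HDR peak,
# SDR white is dimmed down to roughly pixel value 65:
print(dimmed_sdr_white(0.25, 1000.0))  # -> 65
```

The design choice here is that the SDR layers track whatever luminance the user had already chosen via the slider, rather than a fixed 100-nit reference, so UI brightness stays consistent when HDR playback kicks in.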

This is how SDR dimming should work if we’re understanding the issue correctly and if Google wants to make it aware of the brightness slider position. Keep in mind, however, that this explanation involves speculation based on Dylan’s knowledge of how displays work.

Although this feature can seemingly be controlled by a pair of system properties, I didn’t notice any changes when I set both to true on a rooted Pixel 6 Pro. There aren’t any other code changes in AOSP related to SDR dimming from what I can tell, and while the “SilkFX” app mentioned in the description is publicly available, it crashed for me when accessing its HDR test activities (because of missing API calls). Furthermore, since the feature requires changes to SurfaceFlinger, the low-level system service in Android’s graphics stack that composites buffers of graphical data and sends them to the display, it’s difficult to find what’s changed in Android 13 since I’d have to look through native libraries and binaries. Once Android 13’s source code is available, I’ll be able to follow up on the progress of SDR dimming in Android.


Thanks for reading this week’s edition of Android Dessert Bites. If you want to learn more about this column as well as read previous editions, you can find them here.