This article documents all of the changes we found in Android 13 so developers can prepare their applications or devices and users can know what to expect from the new version. Although Google has not publicly documented many of the changes in Android 13, we have painstakingly combed through the Android developer docs, the AOSP Gerrit, and other sources to put together a comprehensive changelog of everything new in Android 13.

We highly recommend using the table of contents to navigate between sections; it includes hyperlinks to each section header. You can return to the ToC at any time by clicking the link in the sidebar on the left if you’re browsing on desktop, or by tapping the arrow on the bottom right of the screen if you’re browsing on mobile.


Table of Contents


What is Android 13?

Android 13, also known by its internal dessert name Android Tiramisu, is the latest version of the Android operating system. Building on the foundation laid by Android 12, described by many as the biggest Android OS update since 2014, this year’s Android 13 release refines the feature set and tweaks the user interface in subtle ways. However, it also includes many significant behavioral and platform changes under the hood, as well as several new platform APIs that developers should be aware of. For large screen devices in particular, Android 13 also builds upon the enhancements and features introduced in Android 12L, the feature drop for large screen devices.

Android 13 is now publicly available. Ahead of the release, Google shared multiple preview builds so developers could test their applications. The early preview builds provided an early look at Android 13 and introduced many, though not all, of the new features, API updates, user interface tweaks, platform changes, and behavioral changes to the Android platform. Since Android 13 Beta 3 in June, however, the APIs and app-facing system behaviors have been frozen. Google calls this “Platform Stability”, a term that lets developers know they can begin updating their apps without any fear of breaking changes. And since Android 13 Beta 4’s release in mid-July, developers have been able to publish compatible versions of their apps.

When is the Android 13 release date?

Google released Android 13 on August 15, 2022. The source code is now available on AOSP, so system engineers can now begin compiling their own builds based on the latest Android release.

There were two developer previews and four betas during the development of Android 13. Android 13 reached Platform Stability with the third beta release in June 2022, at which point Android 13’s SDK and NDK APIs and app-facing system behaviors were finalized. Since the Android 12L release came with framework API level 32, Android 13 was released with framework API level 33.

The release timeline for Android 13. Source: Google.

The Developer Previews were intended for developers only and could thus only be installed manually. With the launch of the Android 13 beta program, however, Pixel users can enroll in the program to have the release roll out to their devices over the air. Pixel devices eligible to install the Android 13 Beta include the Pixel 4, Pixel 4 XL, Pixel 4a, Pixel 4a (5G), Pixel 5, Pixel 5a with 5G, Pixel 6, and Pixel 6 Pro.

Although the initial public release of Android 13 is now available, the Android 13 beta program has not ended. Users who have already enrolled in the beta program can remain enrolled, and users who have not already enrolled in the program can enroll themselves. The beta program will proceed with testing the Android 13 QPR1, Android 13 QPR2, and Android 13 QPR3 releases. The program will conclude with the QPR3 release as Android 14 will already be in preview at that time.

At Google I/O 2022, multiple OEMs launched their own Android 13 Developer Preview/Beta programs for a select few devices. These OEMs include ASUS, Lenovo, Nokia (HMD Global), OnePlus, OPPO, Realme, Sharp, Tecno, Vivo, Xiaomi, and ZTE, and cover devices including the ASUS ZenFone 8, Lenovo Tab P12 Pro, Nokia X20, OnePlus 10 Pro, OPPO Find X5 Pro, OPPO Find N, Realme GT 2 Pro, Sharp AQUOS sense6, Tecno CAMON 19 Pro, Vivo X80 Pro, Xiaomi 12, Xiaomi 12 Pro, Xiaomi Pad 5, and ZTE Axon 40 Ultra. The full list of Android 13 beta builds available from OEMs can be found on this page.

Users who do not have a Pixel device or one of the aforementioned OEM devices can try the Android 13 Beta by installing an official Generic System Image (GSI). Alternatively, Android 13 can be installed on PCs through the Android Emulator.


What are the new features in Android 13?

Accessibility audio description

Under Accessibility settings, there’s a new “Audio Description” toggle. The description of this toggle reads as follows: “Select audio sound track with audio description by default.” The value of this toggle is stored in Settings.Secure.enabled_accessibility_audio_description_by_default.

Audio Description toggle in Android 13’s Accessibility settings.

In the Android 13 developer documentation, there’s a new isAudioDescriptionRequested method in AccessibilityManager that apps can call to determine if the user wants to select a sound track with audio description by default. As the documentation explains, audio description is a form of narration used to provide information about key visual elements in a media work for the benefit of visually impaired users. Apps can also register a listener to detect changes in the audio description state.
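Based on the documented API surface, a minimal sketch of how an app might query this preference and listen for changes could look as follows (the helper function names are our own; the AccessibilityManager calls are as documented for API level 33):

```kotlin
import android.content.Context
import android.view.accessibility.AccessibilityManager

// Sketch: check whether the user wants audio-described sound tracks by default.
fun shouldEnableAudioDescription(context: Context): Boolean {
    val a11yManager = context.getSystemService(AccessibilityManager::class.java)
    return a11yManager?.isAudioDescriptionRequested ?: false
}

// Sketch: register a listener so the app can react when the user flips the toggle.
fun watchAudioDescription(context: Context, onChange: (Boolean) -> Unit) {
    val a11yManager = context.getSystemService(AccessibilityManager::class.java)
    a11yManager?.addAudioDescriptionRequestedChangeListener(context.mainExecutor) { enabled ->
        onChange(enabled)
    }
}
```

A media app could call the first function when selecting a default audio track and use the listener to swap tracks mid-playback if the preference changes.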

Accessibility magnifier can now follow the text as you type

Under Settings > Accessibility > Magnification, a new “Follow typing” toggle has been added that makes the “magnification area automatically [follow] the text as you type.” The value of this toggle is stored in Settings.Secure.accessibility_magnification_follow_typing_enabled. Here is a video showing this feature in action.

Quick Settings tiles for color correction & one-handed mode

Google has added several new Quick Settings tiles and tile changes in Android 13. These include:

  • A Quick Setting tile to toggle color correction.
  • A Quick Setting tile to toggle one-handed mode.
    • One-handed mode is disabled by default in AOSP but can be enabled with ‘setprop ro.support_one_handed_mode true’. One-handed mode settings won’t appear on large screen devices, but the Quick Setting tile can be appended to the set of active tiles by adding “onehanded” to Settings.Secure.sysui_qs_tiles.
  • The Quick Setting tile for “Device Controls” will have its title changed to “Home” when the user has selected Google Home as the Controls provider.
  • A Quick Setting tile to launch a QR code scanner. Read this section for more information on this feature.

In Android 13 preview builds, a Quick Setting tile to launch Privacy Controls, where users can toggle camera, microphone, and location availability, was added. Privacy Controls also contained a shortcut to launch security settings. The tile was provided by the PermissionController Mainline module but was removed in later preview builds. Instead, it seems the activity launched by the tile will be accessible by tapping the privacy indicators for the microphone, camera, or location.

Bluetooth LE Audio support

Bluetooth LE Audio is the next-generation Bluetooth standard defined by the Bluetooth SIG. It promises lower power consumption and higher audio quality using the new Low Complexity Communications Codec (LC3). The new standard also introduces features such as location-based audio sharing, multi-device audio broadcasting, and hearing aid support. 

There are multiple products on the market with hardware support for BLE Audio, and to prepare for the release of new BLE Audio-enabled audio products, Google has built support for LE Audio into Android 13. Android 13’s Bluetooth stack supports BLE Audio end to end, from an LC3 encoder and decoder to support for detecting and switching to the codec in Developer Options. Developers do not have to make any changes to their applications to take advantage of the new capabilities afforded by Bluetooth LE Audio.

One of the key features of BLE Audio is Broadcast Audio, which lets an audio source device broadcast audio streams to many audio sink devices. Android 13, of course, will support this feature. Devices with BLE Audio support will see an option to broadcast media when opening the media output picker. A dialog will inform users that they can “broadcast media to devices near [them], or listen to someone else’s broadcast.” Other users who are nearby with compatible Bluetooth devices can listen to media that’s being broadcast by scanning a QR code or entering the name and password for the broadcast.

MIDI 2.0 support

Musicians will be delighted to learn that Android 13 introduces support for the MIDI 2.0 standard. MIDI 2.0 was introduced in late 2020 and adds bi-directionality, so MIDI 2.0 devices can communicate with each other to auto-configure themselves or exchange information on available functionality. The new standard also makes controllers easier to use and increases controller resolution to 32 bits.

Spatial audio with head tracking support

Android 13 improves upon the initial spatial audio implementation introduced in Android 12L. The audio framework adds support for both static spatial audio and dynamic spatial audio with head tracking. Spatial audio produces immersive audio that seems like it’s coming from all around the user. However, spatial audio only works with media content that has a multichannel audio track for the decoder to output a multichannel stream.

Audio can be spatialized when played back through wired headphones or the phone’s speakers. However, spatial audio support must be implemented by the device maker. The system property ‘ro.audio.spatializer_enabled’ should be set to true if an audio spatializer service is present and enabled, while Settings.Secure.spatial_audio_enabled holds the value of the spatial audio toggle.

Devices with an audio spatializer service may show a toggle in Bluetooth settings to enable spatial audio for a connected device, and the description in settings warns that spatial audio “only works with some media.”

<string name="bluetooth_details_spatial_audio_title">Spatial Audio</string>
<string name="bluetooth_details_spatial_audio_summary">Audio from compatible media becomes more immersive</string>
<string name="spatial_audio_speaker">Phone speaker</string>
<string name="spatial_audio_text">Spatial Audio creates immersive sound that seems like it’s coming from all around you. Only works with some media.</string>
<string name="spatial_audio_wired_headphones">Wired headphones</string>
<string name="spatial_summary_off">Off</string>
<string name="spatial_summary_on_one">On / %1$s</string>
<string name="spatial_summary_on_two">On / %1$s and %2$s</string>

If connected to a Bluetooth audio product with a head tracking sensor, Android 13 can also show a toggle in Bluetooth settings to enable head tracking. Head tracking shifts the position of audio as you move your head around so it sounds more natural. Devices that can interface with Bluetooth products containing head tracking sensors should declare the feature ‘android.hardware.sensor.dynamic.head_tracker’.

<string name="bluetooth_details_head_tracking_title">Head tracking</string>
<string name="bluetooth_details_head_tracking_summary">Audio changes as you move your head around to sound more natural</string>

OEMs can use Android’s standardized platform architecture to integrate multichannel codecs. This architecture enables low latency head tracking and integration with a codec-agnostic spatializer. At I/O, Google stated that Android 13 will include a standard spatializer and head tracking protocol in the platform.

App developers can use Android’s Spatializer APIs to detect device capabilities and multichannel audio. The Spatializer class includes APIs for querying whether the device supports audio spatialization, whether audio spatialization is enabled, and whether the audio track can be spatialized. If the audio can be spatialized, then a multichannel audio track can be sent. If not, then a stereo audio track should be sent. 
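A rough sketch of that decision logic, using the Spatializer class (the 5.1-channel, 48 kHz AudioFormat below is an illustrative assumption; apps should build a format matching their actual content):

```kotlin
import android.content.Context
import android.media.AudioAttributes
import android.media.AudioFormat
import android.media.AudioManager

// Sketch: decide whether to send the decoder a multichannel or stereo track.
fun preferMultichannel(context: Context): Boolean {
    val audioManager = context.getSystemService(AudioManager::class.java)
    val spatializer = audioManager.spatializer

    // Bail out to stereo if the device can't spatialize, or the user disabled it.
    if (!spatializer.isAvailable || !spatializer.isEnabled) return false

    val attributes = AudioAttributes.Builder()
        .setUsage(AudioAttributes.USAGE_MEDIA)
        .build()
    val format = AudioFormat.Builder()
        .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
        .setSampleRate(48000)
        .setChannelMask(AudioFormat.CHANNEL_OUT_5POINT1)
        .build()

    // True if this particular attributes/format combination can be spatialized.
    return spatializer.canBeSpatialized(attributes, format)
}
```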

Media apps that have updated their ExoPlayer dependency to version 2.17+ can configure the platform for multichannel spatial audio. ExoPlayer enables spatialization behavior and configures the decoder to output a multichannel audio stream on Android 12L or later when possible.

Cinematic wallpapers

Android 13 adds new system APIs that Google will be using to generate “3D wallpapers” that “[move] when your phone moves.” Within the latest version of the WallpaperPicker app included in Android 13 DP2 on Pixel devices, there are strings that hint at a new “Effects” tab being added to the interface. This tab will let users apply cinematic effects to their wallpaper, including the 3D wallpaper effect.

<string name="cinematic_effects_toogle">3D wallpapers</string>
<string name="effect_error_dialog_body">We were unable to apply the effects.\nTry with another photo.</string>
<string name="effect_error_dialog_title">Oh no!</string>
<string name="tab_effects">Effects</string>
<string name="wallpaper_effects_subtitle">Make your photo a 3D wallpaper that moves when your phone moves</string>
<string name="wallpaper_effects_title">3D Wallpapers</string>

Under the hood, this feature makes use of the new WallpaperEffects API. A new permission has been added to Android, android.permission.MANAGE_WALLPAPER_EFFECTS_GENERATION, which must be held by the app implementing the system’s wallpaper effects generation service in order to generate wallpaper effects. This permission was added because the wallpaper effects generation service is trusted and can thus be activated without the explicit consent of the user.

The system’s wallpaper effects generation service is defined in the new configuration value config_defaultWallpaperEffectsGenerationService. On Pixel, this value is set to com.google.android.as/com.google.android.apps.miphone.aiai.app.wallpapereffects.AiAiWallpaperEffectsGenerationService. This points to a component within Android System Intelligence; however, there is no evidence of this component in current versions of the ASI app, so it’s likely that only internal versions have it. Since no service with this name exists on any of our test devices, the wallpaper effects generation service is disabled, and we are unable to test this feature at the moment.

However, we are able to test another aspect of this feature: wallpaper dimming. Android’s WallpaperService has added several methods related to a new wallpaper dimming feature. It checks whether wallpaper dimming is enabled through the value of persist.debug.enable_wallpaper_dimming before dimming the wallpaper set by the user. This feature is not enabled yet, but there’s a CLI used for testing that lets us see how different wallpapers appear at different dimming values. It’s accessed through the ‘cmd wallpaper’ command as follows:

$ cmd wallpaper

Wallpaper manager commands:
  help
    Print this help text.

  set-dim-amount DIMMING
    Sets the current dimming value to DIMMING (a number between 0 and 1).

  dim-with-uid UID DIMMING
    Sets the wallpaper dim amount to DIMMING as if an app with uid, UID, called it.

  get-dim-amount
    Get the current wallpaper dim amount.
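For example, to preview the current wallpaper at half brightness and then restore it (requires a device or emulator connected over adb; as noted above, the persist.debug.enable_wallpaper_dimming property may need to be set first):

```shell
# Enable the dimming feature flag, then dim the wallpaper to 50%
adb shell setprop persist.debug.enable_wallpaper_dimming true
adb shell cmd wallpaper set-dim-amount 0.5

# Read the current dim amount back
adb shell cmd wallpaper get-dim-amount

# Restore full brightness
adb shell cmd wallpaper set-dim-amount 0
```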

Although Google’s service for implementing wallpaper effects is likely proprietary, the API seems to be open for any device maker to hook their own service into. The UI implementation in WallpaperPickerGoogle is also likely Google’s proprietary work, but other device makers could adapt the open source WallpaperPicker to add an Effects tab and a cinematic effects toggle as well.

Material You dynamic color styles

Google introduced dynamic color, one of the key features of Google’s new Material You design language, in Android 12 on Pixel phones. Dynamic color support is set to arrive on more devices from other OEMs in the near future, according to Google, due in large part to new GMS requirements. Google’s dynamic color engine, codenamed monet, grabs a single source color from the user’s wallpaper and generates 5 tonal palettes, each composed of 13 tonal colors of various luminances. These 65 colors make up the R.color attributes that apps can use to dynamically adjust their themes.

Each of these colors has hue and chroma values that are left undefined and generated at runtime by monet. This is what Google is seemingly taking advantage of in Android 13 for a new feature that will likely let users choose from a handful of additional Material You tonal palettes, called “styles.”

In Android 13, Google is working on new styles that adjust the hue and chroma values when generating the 5 Material You tonal palettes. These new styles are called TONAL_SPOT, VIBRANT, EXPRESSIVE, SPRITZ, RAINBOW, and FRUIT_SALAD. The TONAL_SPOT style will generate the default Material You tonal palettes as seen in Android 12 on Pixel. VIBRANT will generate a tonal palette with slightly varying hues and more colorful secondary and background colors. EXPRESSIVE will generate a palette with multiple prominent hues that are even more colorful. SPRITZ generates an almost grayscale, low color palette.

The specs of these new styles are defined in the new com.android.systemui.monet.Styles class. These new style options are hooked up to SystemUI’s ThemeOverlayController, so Fabricated Overlays containing the 3 accent and 2 neutral tonal palettes can be generated using these new specs. The WallpaperPicker/Theme Picker app interfaces with SystemUI’s monet by providing values to Settings.Secure.THEME_CUSTOMIZATION_OVERLAY_PACKAGES in JSON format.

Users can run the following shell command to generate a tonal palette using these style keys:

adb shell settings put secure theme_customization_overlay_packages '''{\"android.theme.customization.theme_style\":\"STYLE\"}'''

where STYLE is one of TONAL_SPOT, VIBRANT, EXPRESSIVE, SPRITZ, RAINBOW, or FRUIT_SALAD.
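Building the JSON payload in a shell variable first avoids most of the quote-escaping headaches; for instance (the VIBRANT choice here is just an example):

```shell
# Build the JSON payload for a chosen Material You style
STYLE="VIBRANT"
PAYLOAD="{\"android.theme.customization.theme_style\":\"$STYLE\"}"
echo "$PAYLOAD"

# Apply it on a connected device (uncomment; requires adb):
# adb shell settings put secure theme_customization_overlay_packages "$PAYLOAD"
```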

In Beta 1, Google is using these styles as strategies to generate a whole range of new theme options. In the “Wallpaper & style” app on Pixel devices, there are now up to 16 “wallpaper colors” and 16 “basic colors” to choose from.

These styles are employed as follows:

  • Wallpaper colors
    • Option #1, 5, 9, and 13 are based on TONAL_SPOT
    • Option #2, 6, 10, and 14 are based on SPRITZ
    • Option #3, 7, 11, and 15 are based on VIBRANT
    • Option #4, 8, 12, and 16 are based on EXPRESSIVE
  • Basic colors
    • Option #1-4 are based on TONAL_SPOT
    • Option #5-12 are based on RAINBOW
    • Option #13-16 are based on FRUIT_SALAD

Running the following command will reveal the current mThemeStyle as well as the 5 tonal palette arrays:

dumpsys activity service com.android.systemui/.SystemUIService | grep -A 16 "mSystemColors"

Resolution switching

Android 13 introduces support for switching the resolution in the Settings app. A new “Screen resolution” page will appear under Settings > “Display” on supported devices that lets the user choose between FHD+ (1080p) or QHD+ (1440p), the two most common screen resolutions seen on handhelds and tablets.

Screen resolution settings in Android 13

The availability of these options depends on the display modes exposed to Android. The logic is contained within the ScreenResolutionController class of Settings.

Under the hood, Google has tweaked Android’s display mode APIs so that the resolution and refresh rate can be persisted for each display in a multi-display device, such as foldables. In addition, new APIs can now be used to set the display mode (or only the resolution or refresh rate). These settings are persisted in the following values:

  • Settings.Global.user_preferred_resolution_width
  • Settings.Global.user_preferred_resolution_height
  • Settings.Global.user_preferred_refresh_rate

Turn on dark mode at bedtime

Android’s dark mode is adding a new trigger: bedtime. Users can activate dark mode at their configured bedtime schedule on supported devices. On GMS devices, the bedtime schedule is typically configured via Google’s Digital Wellbeing app.

This feature was hidden from users in the earlier Android 13 preview builds but could be enabled by toggling the feature flag “settings_app_allow_dark_theme_activation_at_bedtime” in Developer Options. This feature flag could also be toggled by sending the following shell command:

adb shell settings put global settings_app_allow_dark_theme_activation_at_bedtime true

As of Beta 2, the “turns on at bedtime” option is available to all users.

Hub mode

Google believes that tablets are the future of computing, so it recently invested in a new tablet division within Android, which helped oversee some of the new features in Android 12L, the feature update for large screen devices. Some of the major changes in Android 12L focused on improving the overall tablet experience, and in Android 13, Google is preparing to improve one particular use case.

Android 13 Developer Preview 1 reveals early work on a new “hub mode” feature, referred to internally as “communal mode”, that will let users share apps between profiles on a common surface. Code reveals that users will be able to pick from a list of apps that support hub mode, though it isn’t clear what requirements an app needs to meet to support it. Once selected, the apps will be accessible to multiple users on the common surface. The primary user can restrict app sharing to specific Wi-Fi networks, however; these networks are considered “trusted networks”.

<string name="communal_mode_connected_network">Connected network</string>
<string name="communal_mode_previously_connected_networks">Previously connected networks</string>
<string name="communal_mode_settings_title">Hub mode</string>
<string name="communal_mode_shared_apps_empty">There are no apps which support hub mode</string>
<string name="communal_mode_shared_apps_title">Shared apps</string>
<string name="communal_mode_switch_title">Use hub mode</string>
<string name="communal_mode_trusted_networks_title">Trusted networks</string>

It isn’t yet entirely clear what form the common surface will take. Initially, we believed it would be the lock screen, which has seen some other multi-user improvements in Android 13. However, new code related to “dreams”, Android’s code-name for interactive screensavers, not only points towards a revamp of the old feature but also ties into the new “hub mode”. Coupled with new dock-related code, both in Android 13 and in the kernel, this suggests that Google is planning something big for tablets that are intended to sit fixed in place on a dock.

Since hub mode is still a work-in-progress, we are not able to demonstrate the feature. Enabling the feature first requires that the build declare support for the feature ‘android.software.communal_mode.’ Then, one needs to set SystemUI’s boolean flag ‘config_communalServiceEnabled’ to true. From there, however, there are several missing pieces, including the communalSourceComponent and communalSourceConnector packages as well as much of the code for the common surface. We also couldn’t find the interface for adding applications to the allowlist for communal mode, which is stored in Settings.Secure.communal_mode_packages.

However, we were at least able to access the screen for choosing “trusted networks”.

Screen saver revamp

Google introduced screen savers to Android back in Android 4.2 Jelly Bean, but since the feature’s introduction, it has received few major enhancements. As an aside, screen savers used to be called “daydreams” but were renamed in Android 7.0 Nougat to avoid confusion with Daydream VR, the now-defunct phone-based VR platform. Google still refers to screen savers as “dreams” internally, though, which is important for us to note. That’s because Android 13 introduces a lot of new dream-related code in SystemUI, suggesting that significant changes are on the way.

New classes in Android 13 reveal work on a dream overlay service that is intended to allow “complications” to run on top of screen savers. In Wear OS land, a complication is a service that provides data to be overlaid on a watch face. It appears that dreams will borrow this concept, with some of the available complications including air quality, cast info, date, time, and weather.

<string name="dream_complication_title_aqi">Air Quality</string>
<string name="dream_complication_title_cast_info">Cast Info</string>
<string name="dream_complication_title_date">Date</string>
<string name="dream_complication_title_time">Time</string>
<string name="dream_complication_title_weather">Weather</string>
<string name="dream_complications_toggle_summary">Display time, date and weather on the screen saver</string>
<string name="dream_complications_toggle_title">Show additional information</string>

In Developer Preview 2, the screen saver settings page was revamped to show previews. The available screen savers are shown in a grid with a customize button at the center of each item. A preview button at the bottom lets users see what the screen saver is like. Meanwhile, Beta 1 introduces a toggle to turn off the feature, replacing the “never” option from “when to start.”

In addition, Google appears to be adding a page to the setup wizard so users can select a screen saver when setting up their device. No other changes to the settings were implemented, but it’s likely that Google is planning other enhancements to the screen saver experience.

We’ll have to wait for the company to release more builds to learn more, however. Given the evidence we’ve seen, we’re confident in saying that the company is preparing major enhancements to screen savers, though whether these changes will land in time for the final Android 13 release, we cannot say.

Switch to admin user when docked

The first Developer Preview of Android 13 revealed a new “hub mode” feature in development that will let users share apps between profiles. The second Developer Preview reveals a new setting related to this feature that will seemingly let secondary profiles automatically switch to the primary user after docking the device. Switching to the primary user presumably allows the device to then enter “hub mode”.

The new setting is called “switch to admin user when docked.” It’s available in Android 13’s multi-user settings, but it isn’t shown to users unless the framework config value ‘config_enableTimeoutToUserZeroWhenDocked’ is set to ‘true’. The setting allows users to choose how long Android should wait before automatically switching to the primary user after the device is docked. Timeout values of “never”, “after 1 minute”, and “after 5 minutes” are currently supported.

NFC & NFC-F payment support for work profiles

Android 13 introduces NFC payment support for work profiles. Previously, only the primary user could perform contactless payments and access Settings > Connection preferences > NFC > Contactless payments. Work profiles can now also use NFC-F (FeliCa) on supported devices.

Wi-Fi Trust on First Use

Android 13 adds support for Trust On First Use (TOFU). When it is not possible to configure a server’s Root CA certificate in advance, TOFU enables installing the Root CA certificate received from the server during the initial connection to a new network. The user must approve installing the certificate. This simplifies configuring TLS-based EAP networks. TOFU can be enabled when configuring a new network in Settings > Network & Internet > Internet > Add network > Advanced options > WPA/WPA-2/WPA-3-Enterprise > CA certificate > Trust on First Use. Enterprise apps can enable or disable TOFU for a Wi-Fi configuration through the enableTrustOnFirstUse API.

Game dashboard

Google has long recognized the mobile gaming industry for the money-making goliath that it is, but only recently did the company begin developing Android features that cater to mobile gaming. At the 2021 Google for Games Developer Summit, Google unveiled the Game Dashboard, a collection of tools and data to help gamers track their progress, share their gameplay, and tune their device’s performance.

Specifically, Google’s Game Dashboard integrates achievements and leaderboards data from Play Games, provides a shortcut to stream gameplay directly to YouTube, and has toggles to show a screenshot button, a screen recorder button, a Do Not Disturb button, and a FPS counter in the floating overlay that appears in-game. Lastly, there is also a setting to set the game mode, essentially a performance profile that optimizes the game’s settings to prolong battery life or maximize frame rate. While Android applies some interventions of its own, such as WindowManager backbuffer resizing to reduce GPU load, game developers largely define what happens when a particular game mode is set, including whether or not to support that particular mode, through the new Game Mode API.

The problem with Game Dashboard’s original implementation is that it is exclusive to the Pixel 6 series on Android 12. Relatively few users (compared to the entire Android user base) can currently use it, so game developers aren’t in a rush to support the Game Mode API. To solve this, Google is looking to expand the availability of Game Dashboard by decoupling it from SystemUIGoogle, the Pixel-exclusive version of AOSP’s SystemUI, and integrating it into Google Play Services. As Google Play Services is available on all GMS Android devices, Game Dashboard can be rolled out to far more devices.

Although Game Dashboard was introduced with Android 12, some of the APIs it relies on were marked as internal and could only be accessed by apps signed with the platform certificate. Since Google Play Services is signed by Google and not by OEMs, that means it would be unable to access the necessary internal APIs on OEM builds of Android 12. Thus, Game Dashboard in Google Play Services requires Android 13, which makes the necessary APIs available to system apps.

For example, SurfaceControlFpsListener, the API that’s used to get the FPS count of a task, is a hidden API in Android 12. In Android 13, it has been promoted to a system API, which lets system apps access it in addition to platform apps. This API is guarded by a new permission called ACCESS_FPS_COUNTER, which has a protection level of “signature|privileged”. Hence, it can be granted to Google Play Services, which is typically bundled as a privileged system app.

In addition to ACCESS_FPS_COUNTER, Android 13 also introduces the MANAGE_GAME_ACTIVITY permission, which guards the new GameSession APIs used by the provider of a GameService. MANAGE_GAME_ACTIVITY also has a protection level of “signature|privileged”, allowing privileged system apps to hold it. 

Lastly, the existing MANAGE_GAME_MODE permission has been changed from a “signature” permission to a “signature|privileged” permission. GameManager’s setGameMode API checks this permission before letting apps set the game mode. 
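On the game developer side, opting into specific game modes is declarative. A minimal sketch, following the documented Game Mode API pattern (the resource file name is conventional; the attribute values shown are illustrative):

```xml
<!-- AndroidManifest.xml, inside <application>: point to the game mode config -->
<meta-data
    android:name="android.game_mode_config"
    android:resource="@xml/game_mode_config" />

<!-- res/xml/game_mode_config.xml: declare which modes the game supports -->
<game-mode-config xmlns:android="http://schemas.android.com/apk/res/android"
    android:supportsBatteryGameMode="true"
    android:supportsPerformanceGameMode="true" />
```

Declaring support tells the platform not to apply its own interventions (like backbuffer resizing) and lets the game adjust its own settings when the mode changes.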

Apart from these changes making it possible for privileged system apps to implement a game dashboard, Android 13 also enables game dashboard support on large screen devices like tablets. In Android 12L, the floating game overlay does not appear when the taskbar is visible. This is not the case in Android 13, which properly reports the taskbar state and appropriately registers the game dashboard listeners.

Per-app language preferences

In the Settings app under the System > Languages & input > Languages submenu, users can choose their preferred language. However, this language is applied system-wide, which may not be what multilingual users prefer. Some applications offer their own language selection feature, but not every app does. In order to reduce boilerplate code and improve compatibility when setting an app’s runtime language, Android 13 is introducing a new platform API called LocaleManager, which can get or set the user’s preferred language option.
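As a rough sketch of how an app might use this API (assuming a valid context; apps using AndroidX can instead call AppCompatDelegate.setApplicationLocales() for backward compatibility):

```kotlin
import android.app.LocaleManager
import android.os.LocaleList

// Set this app's language to French, for this app only (Android 13+).
val localeManager = context.getSystemService(LocaleManager::class.java)
localeManager.applicationLocales = LocaleList.forLanguageTags("fr")

// Read back the user's per-app language preference.
val current: LocaleList = localeManager.applicationLocales
```

The value set here is persisted by the system and survives app restarts, which is what removes the boilerplate apps previously needed for in-app language switching.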

Users can access the new per-app language preferences in Android 13 by going to Settings > System > Languages & input > App Languages. Here, the user can set their preferred language for each app, provided those apps include strings for multiple languages. The app’s language can also be changed by going to Settings > Apps > All apps > {app} > Language.

In order to help app developers test the per-app language feature, the first few Android 13 preview builds listed per-app language preferences for all apps by default. However, the list of languages that’s shown to the user may not match the list of languages that an app actually supports. Developers must list the languages their apps actually support in the locales_config.xml resource file and point to it in the manifest with the new android:localeConfig attribute. Apps that do not provide a locales_config.xml resource file will not be shown in the per-app language preferences page. The system can be forced to show all apps in the per-app language preferences page by disabling the “settings_app_locale_opt_in_enabled” feature flag in developer options.
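A minimal locales_config.xml might look like the following (the language list is illustrative; an app should list exactly the locales it ships translations for):

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- res/xml/locales_config.xml: the languages this app ships translations for -->
<locale-config xmlns:android="http://schemas.android.com/apk/res/android">
    <locale android:name="en"/>
    <locale android:name="fr"/>
    <locale android:name="ja"/>
</locale-config>
```

The file is then referenced from the manifest by adding android:localeConfig="@xml/locales_config" to the application element.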

Media Tap To Transfer

Android 13 ships with support for a “Media Tap To Transfer” feature. The feature is intended to let a “sender” device (like a smartphone) transfer media to a “receiver” device (like a tablet). The actual media transfer is handled by an external client, such as Google Play Services, which tells SystemUI about the status of the transfer. SystemUI then displays the status on both devices through a chip on top of the status bar.

Although the feature has “tap” in its name, it’s actually architected to be agnostic to any particular communication protocol. It’s up to the client to determine how the transfer is initiated, whether that be via NFC, Bluetooth, ultra-wideband, or a proprietary protocol. 

Android 13 has a few new shell commands to prototype what the status bar chips look like, including:

cmd statusbar media-ttt-chip-sender
cmd statusbar media-ttt-chip-receiver

System Photo Picker

To prevent apps with the broad READ_EXTERNAL_STORAGE permission from accessing sensitive user files, Google introduced Scoped Storage in Android 10. Scoped Storage narrows storage access permissions to encourage apps to request access only to the specific files or directories they need for their use case. Apps can use the media store API to access media files stored within well-defined collections, but they must hold the READ_EXTERNAL_STORAGE permission (pre-Android 13) or one of READ_MEDIA_AUDIO, READ_MEDIA_VIDEO, or READ_MEDIA_IMAGES (Android 13+) in order to access media files owned by other apps. Alternatively, they can use the Storage Access Framework (SAF) to load the system document picker to let the user pick which files they want to share with that app, which doesn’t require any permissions.

Android’s system document picker app — simply called “Files” — provides a barebones file picking experience. Android 13, however, is introducing a new system photo picker that extends the concept behind the Files app with a new experience for picking photos and videos. The new system photo picker will help protect photo and video privacy by making it easier for users to pick the specific photos and videos to share with an app. Like the Files app, the new system photo picker can share photos and videos stored locally or on cloud storage, though apps have to add support for acting as a cloud media provider. Google Photos, for example, will appear as a cloud media provider in a future update.

Apps can use the new photo picker APIs in Android 13 to prompt the user to pick which photos or videos to share with the app, without that app needing permission to view all media files.
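One hedged sketch of this, using the AndroidX ActivityResultContracts.PickVisualMedia contract (androidx.activity 1.7.0 or later), which wraps the platform photo picker where it is available:

```kotlin
import androidx.activity.result.PickVisualMediaRequest
import androidx.activity.result.contract.ActivityResultContracts

// Inside a ComponentActivity: register a launcher for the photo picker.
val pickMedia = registerForActivityResult(ActivityResultContracts.PickVisualMedia()) { uri ->
    if (uri != null) {
        // The app receives read access to just this item;
        // no storage or media permission is required.
    }
}

// Launch the picker, limited to images.
pickMedia.launch(PickVisualMediaRequest(ActivityResultContracts.PickVisualMedia.ImageOnly))
```

Because the picker runs in the MediaProvider process and only returns URIs the user explicitly selected, this flow sidesteps the media permissions entirely.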

The new photo picker experience is already available in Android 13 builds but will also gradually roll out to Android devices running Android 11 or higher (excluding Android Go Edition) through an update to the MediaProvider module, which contains the Media Storage APK with the system photo picker activity. According to Google, this feature is available on Android 11+ GMS devices with the Google Play system update for May.

The feature can also be manually enabled through the following shell commands on devices with a recent MediaProvider version:

Non-root: cmd device_config put storage_native_boot picker_intent_enabled true
Root: setprop persist.sys.storage_picker_enabled true

It’s easier to install apps to guest profiles

When creating a guest user in Android 13, the owner can choose which apps to install to the guest profile. No data is shared between the owner and guest profiles, however, which means that the guest profile will still need to sign in to those apps if need be.

This feature was hidden from users starting in Developer Preview 2.

App drawer in the taskbar

The taskbar that Google introduced for large screen devices in Android 12L could only show up to 6 apps on the dock. In Android 13, an app drawer button has been added to the taskbar that lets users see and launch their installed apps. 

This feature is controlled by the Launcher3 feature flag ENABLE_ALL_APPS_IN_TASKBAR and is enabled by default on large screen devices.

Clipboard editor overlay

In Android 11, Google tweaked the screenshot experience by adding an overlay that sits in the bottom left corner of the screen. This overlay appears after taking a screenshot, and it contains a thumbnail previewing the screenshot, a share button, and an edit button to open the Markup activity.

In Android 13, Google has expanded this concept to clipboard content. Now, whenever the user copies text or images, a clipboard overlay will appear in the bottom left corner. This overlay contains a preview of the text or image that has been copied as well as a share button that, when tapped, opens the system share sheet. Tapping the image or text preview opens the Markup activity (for images) or a lightweight text editing activity (for text). If the text that’s been copied contains actionable information such as an address, phone number, or URL, then an additional chip may be shown to send the appropriate intent.

Developers can optionally mark clipboard content as sensitive, preventing it from appearing in the preview. This is done by adding the EXTRA_IS_SENSITIVE flag to the ClipData’s ClipDescription before calling ClipboardManager#setPrimaryClip().
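A brief sketch of how that might look (oneTimeCode and clipboardManager are placeholders for the app’s own values):

```kotlin
import android.content.ClipData
import android.content.ClipDescription
import android.os.PersistableBundle

// Copy a one-time passcode and mark it sensitive so the
// clipboard overlay hides the preview.
val clip = ClipData.newPlainText("", oneTimeCode)
clip.description.extras = PersistableBundle().apply {
    putBoolean(ClipDescription.EXTRA_IS_SENSITIVE, true)
}
clipboardManager.setPrimaryClip(clip)
```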

On devices with GMS, the clipboard overlay may show an additional button to initiate Nearby Share. This feature enables quickly sharing text or an image file that has been copied to the clipboard. This feature was announced at Google I/O 2022 but has not yet rolled out to users.

Disable the long-press home button action

Under Settings > System > Gestures > System navigation, a new submenu has been added for the 3-button navigation that lets you disable “hold Home to invoke assistant”. After disabling this feature, a long press of the home button will no longer launch the default assistant activity.

Drag to launch multiple instances of an app in split-screen

Android 13 supports dragging to launch multiple instances of the same activity in split-screen view. The FLAG_ACTIVITY_MULTIPLE_TASK flag is applied to the launch intent so that activities supporting multiple instances can be shown side by side.

Launch an app in split screen from its notification

Early Android 13 builds made it possible to launch an app in split-screen multitasking mode by long-pressing its notification and then dragging and dropping to either half of the screen. This feature was actually introduced in Android 12L but was disabled by default. Since it is still quite inconsistent and likely only intended for large screen devices, the feature was disabled by default in Beta 3.2 for handhelds. This video shows the feature in action.

Predictive back gesture navigation

Android 13 promises to make back navigation more “predictive”, though not in the sense of using machine learning to improve back gesture recognition as is already the case on Pixel devices. Instead, Android 13 is attempting to address the ambiguity of what happens when performing the back gesture. The feature will let users preview the destination or other result of a back gesture before they complete it, letting them decide whether they want to continue with the gesture or stay in the current view.

To complement this feature, the launcher is adding a new back-to-home transition animation that will make it very clear to the user that performing a back gesture will exit the app back to the launcher. The new back-to-home animation scales the app window as the user’s finger is swiping inward, similar to the swipe up to home animation. The user will see a snapshot of the home screen or app drawer as they’re swiping, indicating that completing the gesture will exit the app back to the launcher.

In order to make this new animation possible, Android 13 is changing the way apps handle back events. First of all, the KeyEvent#KEYCODE_BACK and Activity#onBackPressed APIs are being deprecated. Instead, the system lets apps register back invocation callbacks through the new OnBackInvokedCallback platform API or the OnBackPressedCallback API in the AppCompat (version 1.6.0-alpha03 or later) library. If the system detects there aren’t any registered handlers, then it can play the new predictive back gesture animation because it can “predict” what to do when the user completes the back gesture. If there are layers that have registered handlers, on the other hand, then the system will invoke them in the reverse order in which they were registered. 
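A minimal sketch of the platform API, assuming an Activity on Android 13 or higher:

```kotlin
import android.window.OnBackInvokedCallback
import android.window.OnBackInvokedDispatcher

// Register a handler for the back gesture while custom handling is needed,
// e.g. while an in-app drawer is open.
val backCallback = OnBackInvokedCallback {
    // Close the drawer instead of leaving the activity.
}
onBackInvokedDispatcher.registerOnBackInvokedCallback(
    OnBackInvokedDispatcher.PRIORITY_DEFAULT, backCallback
)

// Unregister as soon as the custom handling is no longer needed so the
// system can play the predictive back-to-home animation again.
onBackInvokedDispatcher.unregisterOnBackInvokedCallback(backCallback)
```

Unlike onBackPressed, registration here is ahead of time rather than reactive, which is what allows the system to know in advance whether the app will consume the gesture.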

Previously, the system wouldn’t always be able to predict what would happen when the user tries to go back, because individual activities could have their own back stacks that the system isn’t aware of and apps could override the behavior of back navigation. The way back events are handled in Android 13 enables a more intuitive back navigation experience while also letting apps continue to handle custom navigation.

Apps can opt in to the new predictive back gesture navigation system by setting the new enableOnBackInvokedCallback Manifest attribute to “true”. Then, in order to test the new back-to-home animation, developers can toggle “predictive back animations” in developer options. The new back dispatching behavior will be enabled by default for apps targeting Android 14 (API level 34).

A demo of the new back-to-home animation can be seen in this video from Google’s codelab.

Predictions row and search bar in the taskbar’s app drawer

Google is working to bring feature parity between the taskbar’s app drawer on large screen devices and the app drawer on handheld devices. The taskbar’s new app drawer now shows a predictions row and will support showing a search bar. The former is enabled by default while the latter is controlled by a feature flag (ENABLE_ALL_APPS_ONE_SEARCH_IN_TASKBAR) during testing. However, the search bar currently doesn’t appear with this flag enabled in Beta 1.

Predictions row in the taskbar's app drawer
As of Beta 1, the taskbar’s app drawer now shows the app predictions row.

Bandwidth throttling

Simulating slow network conditions can be useful for development and debugging, but Android hasn’t provided an easy way to throttle network speeds until the latest Android 13 release. In Android 13, a new setting in Developer Options lets developers set a bandwidth rate limit for all networks capable of providing Internet access, whether that be Wi-Fi or cellular networks. This setting is called “network download rate limit” and has 6 options, ranging from “no limit” to “15Mbps.”

For more information on this feature, please refer to this article.

7-day view in privacy dashboard

Android 12 introduced the “Privacy dashboard” feature, which lets users view the apps that have accessed permissions marked as “dangerous” (i.e. runtime permissions). The dashboard only shows data from the past 24 hours, but in Android 13, a new “show 7 days” button will show permissions access data from the past 7 days.

This feature is not enabled by default in current Android 13 preview builds, but Google confirmed at I/O that the feature will be available. The feature is currently controlled by a device_config flag that can be toggled using the following ADB shell command:

cmd device_config put privacy privacy_dashboard_7_day_toggle true

Clipboard auto clear

Android offers a clipboard service that’s available to all apps for placing and retrieving text. Many keyboard apps like Google’s Gboard extend the global clipboard with a database that stores multiple items. Gboard even automatically clears any clipboard item that’s older than 1 hour.

Although any app can technically clear the primary clip in the global clipboard (so long as they’re either the foreground app or the default input method on Android 10+), Android itself does not automatically clear the clipboard. This means that any clipboard item left in the global clipboard could be read by an app at a later time, though Android’s clipboard access toast message will likely alert the user to this fact.

Android 13, however, has added a clipboard auto clear feature. This feature will automatically clear the primary clip from the global clipboard after a set amount of time has passed. By default, the clipboard is cleared after 3600000 milliseconds (60 minutes) have passed, matching Gboard’s functionality.

The logic for this new feature is contained within the ClipboardService class of services.jar. Here is a demonstration of the clipboard auto clear feature in Android 13 with a timeout of 5 seconds:

This feature is enabled by default starting in the Android 13 Beta.

Control smart home devices without unlocking the device

Android 11 introduced the Quick Access Device Controls feature which lets users quickly view the status of and control smart home devices like lights, thermostats, and cameras. Apps can use the ControlsProviderService API to tell SystemUI which controls it can show in the Device Controls area. The device maker can choose where to surface the Device Controls area, but in AOSP Android 12, it can be opened through a shortcut on the lock screen or Quick Settings panel. However, if the user opens Device Controls while the device is locked, then they will only be able to see and not control any of their smart home devices.

In Android 13, however, apps can let users control their smart home devices without having them unlock their devices. The isAuthRequired method has been added to the Control class, and if it returns “false”, then users can interact with the control without authentication. This behavior can be set per-control, so developers do not need to expose all device controls offered by their app to interaction without authentication. The following video demonstrates the new API in action:
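As a sketch, a control that opts in to lock screen interaction might be built like this (the control ID, title, and pendingIntent are placeholder values):

```kotlin
import android.service.controls.Control
import android.service.controls.DeviceTypes

// Build a light control that the user may toggle from the lock screen.
val control = Control.StatefulBuilder("light-1", pendingIntent)
    .setTitle("Hallway light")
    .setDeviceType(DeviceTypes.TYPE_LIGHT)
    .setStatus(Control.STATUS_OK)
    .setAuthRequired(false) // interaction allowed without unlocking
    .build()
```

Since setAuthRequired is set per-control, an app could leave it at the default (true) for sensitive devices like locks and cameras while relaxing it for lights.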

Starting with Beta 1, a new setting is available under Settings > Display > Lock screen called “control from locked device.” When enabled, users can “control external devices without unlocking your phone or tablet if allowed by the device controls app.”

Control from locked device toggle in Lock screen settings of Android 13

QR code scanner shortcut

QR codes have been an indispensable tool during the COVID-19 pandemic, as they’re a cheap and highly accessible way for a business to lead users to a specific webpage without directly interacting with them. In light of the renewed importance of QR codes, Google is implementing a handy shortcut in Android 13 to launch a QR code scanner.

Specifically, Android 13 implements a new Quick Setting tile to launch a QR code scanner. Android 13 itself won’t ship with a QR code scanning component, but it will support launching a component that does. The new QRCodeScannerController class in SystemUI defines the logic, and the component that is launched is contained within the device_config value “default_qr_code_scanner”. On devices with GMS, Google Play Services manages device_config values, and hence sets the QR code scanner component as com.google.android.gms/.mlkit.barcode.ui.PlatformBarcodeScanningActivityProxy.

The Quick Setting tile is part of the default set of active Quick Settings tiles. Its title is “QR code” and its subtitle is “Tap to scan.” The tile is grayed out if no component is defined in the device_config value “default_qr_code_scanner”. Within the Settings.Secure.sysui_qs_tiles settings value that keeps track of the tiles selected by the current user, the value for the QR code scanner tile is “qr_code_scanner”.

There is also a lock screen entry point for the QR code scanner, which is controlled by the framework flag ‘config_enableQrCodeScannerOnLockScreen.’ This value is set to false by default. Currently, Android 13 does not provide a user-facing setting to control the visibility of the lock screen entry point.

Unified Security & Privacy settings

During Google I/O, Google announced that it will introduce a unified Security & Privacy settings page in Android 13. This new settings page will consolidate all privacy and security settings in one place, and it will also provide a color-coded indicator of the user’s safety status and guidance on how to boost security. The “Security” settings page on Pixel devices already shows a color-coded indicator of the user’s safety status and provides guidance, but it does not integrate privacy settings.

Unified security & privacy settings in Android 13
The new Security & Privacy settings in Android 13. Source: Google.

The new Security & Privacy settings page is contained within the PermissionController APK delivered through the PermissionController module. The component is com.google.android.permissioncontroller/com.android.permissioncontroller.safetycenter.ui.SafetyCenterActivity (for the Google-signed module), but the activity won’t launch unless the feature flag is enabled. This feature flag can be enabled by sending the following command:

cmd device_config put privacy safety_center_is_enabled true

While this makes the Security & Privacy settings page appear in top-level Settings, the page itself is not fully functional as of now. Google has yet to announce the full roll out of this feature, which will arrive via an update to the aforementioned Mainline module at a later date.

Toggle to show the vibrate icon in the status bar

Android places an icon in the status bar to reflect the sound mode, but in Android 12, the vibrate icon no longer appeared when the device was in vibrate mode. Many users complained about this change, and in response, Google has added a toggle in Android 13 under Settings > Sound & vibration, labeled “Always show icon when in vibrate mode”, that restores the vibrate icon in the status bar. The icon even appears in the status bar on the lock screen. The toggle’s value is stored in Settings.Secure.status_bar_show_vibrate_icon.

The setting to show the vibrate icon in the status bar in Android 13
Vibrate icon in status bar on Android 13
The vibrate icon appearing in Android 13’s status bar

This feature has been backported to Android 12 QPR3.

Vibration sliders for alarm and media vibrations

Under Settings > Sound & vibration > Vibration & haptics, sliders to configure the alarm and media vibration levels have been added. 

Alarm and media vibration sliders in Android 13's Settings
Alarm and media vibration sliders in Settings.

In conjunction with this change, the Settings configuration flag controlling the supported intensity level (config_vibration_supported_intensity_levels) has been updated to be an integer, so device makers can specify how many distinct levels are supported.


What are the UI changes in Android 13?

Consolidated font and display settings

The “font size” and “display size” settings under Settings > Display have been consolidated into a single page, called “display size and text.” The unified settings page also shows a preview for how changes to the font and display size affect icon and text scaling. It also includes two toggles previously found in Accessibility settings: “bold text” and “high contrast text.”

Display size and text settings in Android 13
Display size and text settings in Android 13

Low light clock when docked

Android has multiple features to display useful information while the device is idling, including a screen saver and ambient display. The former is set to receive a major revamp in Android 13 as part of Google’s overall effort to improve the experience of docked devices, while the latter is set to be joined by a simpler variant.

Android 13 includes a new “low light clock” that simply displays a TextClock view in a light shade of gray. This view is only shown when the device is docked, the ambient lighting is below a certain brightness threshold, and the SystemUI configuration value ‘config_show_low_light_clock_when_docked’ is set to ‘true.’

Bottom search bar in the launcher app drawer

Android 13 DP2 on Pixel has a new feature flag that, when enabled, shifts the search bar in the app drawer to the bottom of the screen. The search bar remains at the bottom until the keyboard is opened, after which it’ll shift to stay above the keyboard.

This feature is disabled by default but can be enabled by setting ENABLE_FLOATING_SEARCH_BAR to true. It remains to be seen if this behavior is exclusive to Google’s Pixel Launcher fork or if this will be available in AOSP Launcher3.

Custom interface for PCs

Android can run on a variety of hardware, including dedicated devices like kiosks, but Google only officially supports a handful of device types. These device types are defined in the Compatibility Definition Document (CDD), and they include handheld devices (like phones), televisions, watches, cars, and tablets. When building Android for a particular device, device makers need to declare the feature corresponding to the device type; for example, television device implementations are expected to declare the feature ‘android.hardware.type.television’ to tell the system and apps that the device is a television.

Since Android apps can also run on Chromebooks, Google created the ‘android.hardware.type.pc’ device type a few years back so apps can target traditional clamshell and desktop computing devices and the framework can recognize apps that have been designed for those form factors. However, it wasn’t until Android 12L that Google decided to revamp the UI for large screen devices, and in Android 13, Google is taking another step in that direction.

On PC devices, the launcher’s taskbar is tweaked to show dedicated buttons for notifications and quick settings. These buttons are persistently shown on the right side of the taskbar, where the 3-button navigation keys would ordinarily be displayed on other large screen devices.

In addition, I noticed that all apps are launched in freeform multi-window mode by default. Freeform multi-window was introduced in Android 7.0 Nougat and to this day remains hidden behind a developer option. Google may be getting ready to enable freeform multitasking support by default on large screen devices like PCs, but this remains to be seen.

Kids mode navigation bar

Within Launcher3 is a new navigation bar mode called “kids mode.” When enabled on large screen devices, the drawables and layout for the back and home icons are changed, the recents overview button is hidden, and the navigation bar is kept visible when apps enter immersive mode. When in immersive mode, the buttons fade after a few seconds until they’re pressed again.

Kids mode nav bar in Android 13

This feature is controlled by the boolean value Settings.Secure.nav_bar_kids_mode.

Unified search bar for the home screen and app drawer

During the development of Android 12L, Google experimented with unifying the home screen and app drawer search experiences. This experiment was gated by the ENABLE_ONE_SEARCH flag, but it was removed from the Launcher3 codebase prior to the AOSP release.

This unified search bar returned in Android 13 with the release of Beta 1, but it was disabled by default. To enable it, the following command needed to be sent:

cmd device_config put launcher enable_one_search true

As of Beta 2, however, this search bar is now available by default.

Lock screen rotation enabled on large screen devices

Android’s framework configuration controlling lock screen rotation had been set to “false” by default for years, but it is now enabled by default in Android 13. However, the lock screen will only rotate on large screen devices.

Lock screen in landscape orientation on Android 13

Redesigned media output picker UI

In Android 10, Google introduced an output picker that lets users switch audio output between supported audio sources, such as connected Bluetooth devices. This output picker is accessed by tapping the media output picker button in the top-right corner of the media player controls. Now in Android 13, Google has revamped the media output picker UI.

Android 13's redesigned media output picker.
Redesigned media output picker UI in Android 13

The highlight of the new media output picker UI is the larger volume slider for each connected device.

Redesigned media player UI

In Android 11, Google reworked the media player controls to support multiple sessions and integration with the notifications shade. Now in Android 13, Google has revamped the media player UI.

The new media player UI features a larger play/pause button that’s been shifted to the right side, a (squiggly) progress slider that’s at the bottom left in line with the rest of the media control buttons, and the media info on the left side. The album art is displayed in the background, and the color scheme of the media output switcher button is extracted from the album art.

The UI of the long-press context menu for the media player has also been updated. The shortcut to settings has been moved to a gear in the upper right corner, and the “hide” button is now filled.

Squiggly progress bar

The progress bar in the media player now shows a squiggly line up to the current timestamp.

A short screen recording showing Android 13’s squiggly progress bar.

In Beta 1, the squiggly progress bar was centered at the bottom of the media player. In Beta 2, the progress bar has been shortened and is now shown at the bottom left.

Fullscreen user profile switcher

In an effort to improve the experience of sharing a device, Google has introduced numerous improvements to the multi-user experience. One change that’s in development is a fullscreen user profile switcher.

Android 13's fullscreen user profile switcher for large screen devices
Android 13’s fullscreen user profile switcher

This interface is likely intended for large screen devices that have a lot of screen real estate. It’s currently disabled by default but can be enabled through the configuration value config_enableFullscreenUserSwitcher.

Revamped UI for adding a new user

The UI for creating a new profile has been redesigned in Android 13. Users now have a few options of varying colors to choose from when choosing a profile picture, or they can take a photo using the default camera app or choose an image from the gallery.

Status bar user profile switcher

Google is experimenting with placing a status bar chip that displays the current user profile and, when tapped, opens the user profile switcher. This chip is not enabled by default in current Android 13 builds, but it can be enabled by setting the SystemUI flag flag_user_switcher_chip to true. Given the limited space available on smartphones, it’s likely this feature is intended for large screen devices like tablets.

User switcher on the keyguard

In Android 13, the keyguard screen (ie. the lock screen PIN/password/pattern entry page) can show a large user profile switcher on the top (in portrait mode) or on the left (in landscape mode). This feature is disabled by default but is controlled by the SystemUI boolean ‘config_enableBouncerUserSwitcher’.

Button rearrangement in the notification shade

Google has moved the power, settings, and profile switcher buttons in the notification shade. Previously, they were located directly underneath the Quick Settings panel. Now, they are located at the very bottom, tucked to the right.

Do Not Disturb may be rebranded to Priority mode

Do Not Disturb mode, the feature that lets users choose what apps and contacts can interrupt them, was renamed to Priority mode in Developer Preview 2. Apart from the branding change, the schedules page has been redesigned to use switches instead of toggles and now shows summaries for schedules and calendar events (instead of just whether they’re “on” or “off”). Schedules list the days and times for which they’re active, while calendar events show what events they’re triggered on.

Android 13 Beta 1 brought back the original Do Not Disturb branding, so it seems that the “Priority mode” branding isn’t here to stay.

Enabling silent mode disabled all haptics

When setting the sound mode to “silent”, all haptics were disabled in the Android 13 developer previews, even those for interactions (such as gesture navigation). On Android 12L, “vibration & haptics” are similarly grayed out with a warning that says “vibration & haptics are unavailable because [the] phone is set to silent”, but in our testing, haptics for interactions still worked. This is not the case in the Android 13 developer previews, however. Fortunately, this change has been reverted in the Android 13 beta release.


What are the behavioral changes in Android 13?

Media controls are now derived from PlaybackState

MediaStyle is a notification style used for media playback notifications. In its expanded form, it can show up to 5 notification actions, which can be chosen by the media application. Prior to Android 13, the system displayed media controls based on the list of notification actions added to the MediaStyle notification.

Starting with Android 13, the system will derive media controls from PlaybackState actions rather than the MediaStyle notification. If an app doesn’t include a PlaybackState or targets an older SDK version, then the system will fall back to displaying actions from the MediaStyle notification. This change aligns how media controls are rendered across Android platforms.
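A sketch of what this means in practice: the actions an app declares on its PlaybackState now determine which buttons the system renders (mediaSession is a placeholder for the app’s own MediaSession instance):

```kotlin
import android.media.session.PlaybackState

// The actions declared here, not the notification actions,
// determine which buttons Android 13 shows in the media controls.
val playbackState = PlaybackState.Builder()
    .setActions(
        PlaybackState.ACTION_PLAY_PAUSE or
        PlaybackState.ACTION_SKIP_TO_PREVIOUS or
        PlaybackState.ACTION_SKIP_TO_NEXT
    )
    .setState(PlaybackState.STATE_PLAYING, /* position */ 0L, /* speed */ 1.0f)
    .build()
mediaSession.setPlaybackState(playbackState)
```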

Screenshots of media controls on a phone and tablet running Android 13. Credits: Google.

Android 13 can show up to five action buttons based on the PlaybackState. In its compact state, the media notification will only show the first three action slots. The following table lists the action slots and the criteria the system uses to display each slot.

How Android 13 decides which buttons to show in the media player notification

Control an app’s ability to turn on the screen

A new appop permission has been added to Android 13 that lets users control whether or not an application can turn on the screen. Users can go to “Settings > Apps > Special app access > Turn screen on” to choose which apps can turn the screen on. All apps that hold the WAKE_LOCK permission appear in this list, save for SystemUI.

Defer boot completed broadcasts for background restricted apps

Android allows applications to start up at boot by listening for the ACTION_BOOT_COMPLETED or ACTION_LOCKED_BOOT_COMPLETED broadcasts, which are both automatically sent by the system. Android also lets users place apps into a “restricted” state that limits the amount of work they can do while running in the background. However, apps placed in this “restricted” state are still able to receive the ACTION_BOOT_COMPLETED and ACTION_LOCKED_BOOT_COMPLETED broadcasts. This will change in Android 13.

Android 13 will defer the ACTION_BOOT_COMPLETED and ACTION_LOCKED_BOOT_COMPLETED broadcasts for apps that have been placed in the “restricted” state and are targeting API level 33 or higher. These broadcasts will be delivered after any process in the UID is started, which includes things like widgets or Quick Settings tiles.

App developers can test this behavior in one of two ways. First, developers can go to Settings > Developer options > App Compatibility Changes and enable the DEFER_BOOT_COMPLETED_BROADCAST_CHANGE_ID option. This is enabled by default for apps targeting Android 13 or higher. Alternatively, developers can manually change the device_config flag that controls this behavior as follows:

cmd device_config put activity_manager defer_boot_completed_broadcast N

where N can be 0 (don’t defer), 1 (defer for all apps), 2 (defer for UIDs that are background restricted), or 4 (defer for UIDs that have targetSdkVersion T+).

These conditions can be combined by bitwise OR-ing the values. By default, the flag is set to ‘6’ to defer the ACTION_BOOT_COMPLETED and ACTION_LOCKED_BOOT_COMPLETED broadcasts for all apps that are both background restricted and targeting Android 13.
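The flag semantics above can be sketched as follows. This is a model of how the bit values could combine based on the description, not the actual ActivityManager logic:

```kotlin
// Flag bits per the description: 1 = defer for all apps,
// 2 = defer for background-restricted UIDs, 4 = defer for targetSdkVersion T+.
// With the default of 6 (2 or 4), both conditions must hold.
val DEFER_ALL = 1
val DEFER_BG_RESTRICTED = 2
val DEFER_TARGET_T_PLUS = 4

fun shouldDefer(flag: Int, bgRestricted: Boolean, targetsTPlus: Boolean): Boolean {
    if (flag == 0) return false                                  // never defer
    if ((flag and DEFER_ALL) != 0) return true                   // defer for everyone
    if ((flag and DEFER_BG_RESTRICTED) != 0 && !bgRestricted) return false
    if ((flag and DEFER_TARGET_T_PLUS) != 0 && !targetsTPlus) return false
    return true                                                  // all set conditions hold
}
```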

Foreground service manager and notifications for long-running foreground services

Android 13’s new Foreground Services (FGS) Task Manager shows the list of apps that are currently running a foreground service. This list, called Active apps, can be accessed by pulling down the notification drawer and tapping on the affordance. Each app will have a “stop” button next to it.

The FGS Task Manager lets users stop foreground services regardless of target SDK version. Here’s how stopping an app via the FGS Task Manager compares to swiping it away on the recents screen or pressing “force stop” in settings.

How stopping an app via the foreground service task manager in Android 13 differs from swiping up in recents and force closing it in settings
Comparing behavior with “swipe up” and “force stop” user actions. Source: Google.

The system will send a notification to the user inviting them to interact with the FGS Task Manager after any app’s foreground service has been running for at least 20 hours within a 24-hour window. This notification will read as “[app] is running in the background for a long time. Tap to review.” However, it will not appear if the foreground service is of type FOREGROUND_SERVICE_TYPE_MEDIA_PLAYBACK or FOREGROUND_SERVICE_TYPE_LOCATION.
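The notification condition above boils down to a runtime threshold plus a type exemption. A minimal sketch (not framework code; the constant names are the ones quoted in the text):

```kotlin
// Notify after at least 20 hours of foreground-service runtime within a
// 24-hour window, except for the two exempted foreground service types.
val exemptFgsTypes = setOf(
    "FOREGROUND_SERVICE_TYPE_MEDIA_PLAYBACK",
    "FOREGROUND_SERVICE_TYPE_LOCATION",
)

fun shouldShowLongRunningFgsNotification(
    runtimeHoursInWindow: Double,
    fgsType: String,
): Boolean = runtimeHoursInWindow >= 20.0 && fgsType !in exemptFgsTypes
```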

Certain applications are exempted from appearing in the FGS Task Manager. These include system-level apps, safety apps holding the ROLE_EMERGENCY role, and all apps when the device is in demo mode. Certain apps cannot be closed by the user even if they appear in the FGS Task Manager, including device owner apps, profile owner apps, persistent apps, and apps that have the ROLE_DIALER role.

With the addition of a dedicated space for foreground service notifications, Google says that Android 13 will let users dismiss foreground service notifications. After dismissing the notification, users can still find the app in the FGS Task Manager. Also, whenever there’s a change in the list of running foreground services, a dot appears next to the affordance, hinting to the user that they should review the list once more.

For more information on the new system notification for long-running foreground services, visit this page. For more information on the new foreground services task manager, visit this page.

High-priority FCM quota decoupled from app standby buckets

FCM, short for Firebase Cloud Messaging, is the preferred API for delivering messages to GMS Android devices. Developers can send a notification message, which results in a notification being posted on behalf of the client app; a data message, which client apps can process and respond to in a number of ways; or a notification message with a data payload. Developers can set the priority of these messages to “normal priority” or “high priority” depending on their needs. Normal priority messages are delivered immediately when the device is awake but may be delayed when the device is in doze mode, while high priority messages are delivered immediately, waking the device from doze mode if necessary.

Google Play Services, the system app that implements FCM on Android, is exempt from doze mode on GMS Android devices, which is how high priority FCM messages are able to be delivered immediately. Due to the potentially adverse effect that high priority FCM messages can have on battery life, FCM places some restrictions on message delivery in order to preserve battery life. 

For example, FCM may not deliver messages to apps when there are too many messages pending for an app, when the device hasn’t connected to FCM in over a month, or when the app was manually put into the background restricted state by the user. In Android 9, Google introduced app standby buckets, further restricting the behavior of apps based on how recently and how frequently they’re used. Apps that are placed into the active or working set buckets face no restrictions to FCM message delivery, while apps placed in the frequent, rare, or restricted buckets have a daily quota of high priority FCM messages. This, however, changes in Android 13.

Android 13 decouples the high priority FCM quota from app standby buckets. As a result, developers sending high priority FCM messages that result in display notifications will see an improvement in the timeliness of message delivery, regardless of app standby buckets. However, Google warns that apps that don’t successfully post notifications in response to receiving high priority FCMs may see some of their high priority messages downgraded to normal priority messages.

Job priorities

Android’s JobInfo API lets apps submit info to the JobScheduler about the conditions that need to be met for the app’s job to run. Apps can specify the kind of network their job requires, the charging status, the storage status, and other conditions. Android 13 expands these options with a job priority API, which lets apps indicate their preference for when their own jobs should be executed.

The scheduler uses the priority to sort jobs for the calling app, and it also applies different policies based on the priority. There are five priorities, ranked from lowest to highest: PRIORITY_MIN, PRIORITY_LOW, PRIORITY_DEFAULT, PRIORITY_HIGH, and PRIORITY_MAX.

  • PRIORITY_MIN: For tasks that the user should have no expectation or knowledge of, such as uploading analytics. May be deferred to ensure there’s sufficient quota for higher priority tasks. 
  • PRIORITY_LOW: For tasks that provide some minimal benefit to the user, such as prefetching data the user hasn’t requested. May still be deferred to ensure there’s sufficient quota for higher priority tasks.
  • PRIORITY_DEFAULT: The default priority level for all regular jobs. These have a maximum execution time of 10 minutes and receive the standard job management policy.
  • PRIORITY_HIGH: For tasks that should be executed lest the user think something is wrong. These jobs have a maximum execution time of 4 minutes, assuming all constraints are satisfied and the system is under ideal load conditions.
  • PRIORITY_MAX: For tasks that should be run ahead of all others, such as processing a text message to show as a notification. Only Expedited Jobs (EJs) can be set to this priority.
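Two of the policies above, that PRIORITY_MAX is reserved for expedited jobs and that the scheduler sorts a caller's jobs by priority, can be modeled directly. This is an illustrative sketch; the rank values are for sorting only and are not the platform's internal constants:

```kotlin
// Five priority levels, lowest to highest, mirroring the list above.
enum class JobPriority(val rank: Int) {
    PRIORITY_MIN(1), PRIORITY_LOW(2), PRIORITY_DEFAULT(3),
    PRIORITY_HIGH(4), PRIORITY_MAX(5),
}

data class PendingJob(val name: String, val priority: JobPriority, val expedited: Boolean = false)

// PRIORITY_MAX may only be used by expedited jobs (EJs).
fun isAllowedPriority(job: PendingJob): Boolean =
    job.priority != JobPriority.PRIORITY_MAX || job.expedited

// The scheduler sorts the calling app's jobs so higher-priority work runs first.
fun executionOrder(jobs: List<PendingJob>): List<PendingJob> =
    jobs.sortedByDescending { it.priority.rank }
```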

Notifications for excessive background battery use

When an app consumes a lot of battery life in the background during the past 24 hours, Android 13 will show a notification warning the user about the excessive background battery usage. Android will show this warning for any app the system detects high battery usage from, regardless of target SDK version. If the app has a notification associated with a foreground service, though, the warning won’t be shown until the user dismisses the notification or the foreground service finishes, and only if the app continues to consume a lot of battery life. Once the warning has been shown for an app, it won’t appear for another 24 hours.

Android 13 measures an app’s impact on battery life by analyzing the work it does through foreground services, Work tasks (including expedited work), broadcast receivers, and background services.

Prefetch jobs that run right before an app’s launch

Apps can use Android’s JobScheduler API to schedule jobs that should run sometime in the future. The Android framework decides when to execute the job, but apps can submit info to the scheduler specifying the conditions under which the job should be run. Apps can mark jobs as “prefetch” jobs using JobInfo.Builder.setPrefetch() which tells the scheduler that the job is “designed to prefetch content that will make a material improvement to the experience of the specific user of the device.” The system uses this signal to let prefetch jobs opportunistically use free or excess data, such as “allowing a JobInfo#NETWORK_TYPE_UNMETERED job run over a metered network when there’s a surplus of metered data available.” A job to fetch top headlines of interest to the current user is an example of the kind of work that should be done using prefetch jobs.

In Android 13, the system will estimate the next time an app will be launched so it can run prefetch jobs prior to the next app launch. Internally, the UsageStats API has been updated with an EstimatedLaunchTimeChangedListener, which is used by PrefetchController to subscribe to updates for when the system thinks the user will next launch the app. If a prefetch job hasn’t started by the time the app has been opened (i.e., is on TOP), then the job is deferred until the app has been closed. Apps cannot get around this by scheduling a prefetch job with a deadline, as apps targeting Android 13 are not allowed to set deadlines for prefetch jobs. Prefetch jobs are allowed to run for apps with active widgets, though.
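The run-eligibility rules described above reduce to two small checks. This is an illustrative model of the stated behavior, not JobScheduler's actual implementation:

```kotlin
// A prefetch job is deferred while the app is on TOP, unless the app
// has an active widget.
fun canRunPrefetchJobNow(appIsOnTop: Boolean, hasActiveWidget: Boolean): Boolean =
    !appIsOnTop || hasActiveWidget

// Apps targeting Android 13 (T) may not give prefetch jobs a deadline.
fun isValidPrefetchJob(targetsTPlus: Boolean, hasDeadline: Boolean): Boolean =
    !(targetsTPlus && hasDeadline)
```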

The Android Resource Economy

With every new release, Google further restricts what apps running in the background can do, and Android 13 is no exception. Instead of creating a foreground service, Google encourages developers to use APIs like WorkManager, JobScheduler, and AlarmManager to queue tasks, depending on when the task needs to be executed and whether the device has access to GMS. For the WorkManager API in particular, there’s a hard limit of 50 tasks that can be scheduled. While the OS does intelligently decide when to run tasks, it does not intelligently decide how many tasks an app can queue or whether one task is more necessary to run than another.

Starting in Android 13, however, a new system called The Android Resource Economy (TARE) will manage how apps queue tasks. TARE will allot “credits” to apps that they can then “spend” on queuing tasks. The total number of “credits” that TARE assigns (called the “balance”) depends on factors such as the current battery level of the device, whereas the number of “credits” it takes to queue a task depends on what that task is for.

From a cursory analysis, it seems that the EconomyManager in Android’s framework lists how many Android Resource Credits each job takes, the maximum number of credits in circulation for AlarmManager and JobScheduler respectively, and other information pertinent to TARE. For example, the following “ActionBills” are listed, alongside how many “credits” it takes to queue a task: ALARM_CLOCK, NONWAKEUP_INEXACT_ALARM, NONWAKEUP_INEXACT_ALLOW_WHILE_IDLE_ALARM, NONWAKEUP_EXACT_ALARM,  NONWAKEUP_EXACT_ALLOW_WHILE_IDLE_ALARM, WAKEUP_INEXACT_ALARM, WAKEUP_INEXACT_ALLOW_WHILE_IDLE_ALARM, WAKEUP_EXACT_ALARM, and WAKEUP_EXACT_ALLOW_WHILE_IDLE_ALARM.

TARE is controlled by the Settings.Global.enable_tare boolean, while the AlarmManager and JobScheduler constants are stored in Settings.Global.tare_alarm_manager_constants and Settings.Global.tare_job_scheduler_constants respectively. TARE settings can also be viewed in Developer Options. Starting in Beta 1, the TARE settings page supports editing the system’s parameters directly without the use of the command line.

Also in Beta 1 is a big revamp to the way TARE works under the hood. One of the biggest changes in the first Beta is the separation of “supply” from “allocation” of Android Resource Credits. Previously, the credits that apps could accrue to “spend” on tasks were limited by the “balances” already accrued by other apps. There was a “maximum circulation” of credits that limited how many credits could be allocated to all apps. The “maximum circulation” has been removed and replaced with a “consumption limit” that limits the credits that can be consumed across all apps within a single discharge cycle. This lets apps accrue credits regardless of the balances of other apps. The consumption limit scales with the battery level, so the lower the battery level, the fewer actions that can be performed.
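The post-Beta-1 design can be sketched as a battery-scaled shared limit alongside independent per-app balances. All numbers below are invented for demonstration; they are not real TARE constants:

```kotlin
// The system-wide consumption limit scales linearly with battery level
// in this sketch (the real scaling policy isn't documented here).
fun consumptionLimit(baseLimitCredits: Long, batteryLevelPercent: Int): Long =
    baseLimitCredits * batteryLevelPercent / 100

// An action runs only if the app's own balance and the shared remaining
// limit can both cover its cost.
fun canAffordAction(costCredits: Long, appBalance: Long, remainingLimit: Long): Boolean =
    costCredits <= appBalance && costCredits <= remainingLimit
```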

Updated rules for putting apps in the restricted App Standby Bucket

Android 9 introduced App Standby Buckets, which define what restrictions are placed on an app based on how recently and how frequently the app is used. Android 9 launched with 4 buckets: active, working set, frequent, and rare. Android 12 introduced a fifth bucket called restricted, which holds apps that consume a great deal of system resources or exhibit undesirable behavior. Once placed into the restricted bucket, apps can only run jobs once per day in a 10-minute batched session, run fewer expedited jobs, and invoke one alarm per day. Unlike with the other buckets, these restrictions apply even when the device is charging but are loosened if the device is idle and on an unmetered network.

Android 13 updates the rules that the system uses to decide whether to place an app in the restricted App Standby Bucket. If an app exhibits any of the following behavior, then the system places the app in the bucket:

  • The user doesn’t interact with the app for 8 days. 
  • The app invokes too many broadcasts or bindings in a 24-hour period.
  • The app drains a significant amount of battery life during a 24-hour period. The system looks at work done through jobs, broadcast receivers, and background services when deciding the impact on battery life. The system also looks at whether the app’s process has been cached in memory.

If the user interacts with the app in one of a number of ways, then the system will take the app out of the restricted bucket and put it into a different bucket. The user may tap on a notification sent by the app, perform an action in a widget belonging to the app, affect a foreground service by pressing a media button, connect to the app through Android Automotive OS, or interact with another app that binds to a service of the app in question. If the app has a visible PiP window or is active on screen, then it is also removed from the restricted bucket.

Apps that meet the following criteria are exempted from entering the restricted bucket in the first place:

  • Has active widgets
  • Has the SCHEDULE_EXACT_ALARM, ACCESS_BACKGROUND_LOCATION, or ACCESS_FINE_LOCATION permission
  • Has an in-progress and active MediaSession

All system and system-bound apps, companion device apps, apps running on a device in demo mode, device owner apps, profile owner apps, persistent apps, VPN apps, apps with the ROLE_DIALER role, and apps that the user has explicitly designated to provide “unrestricted” functionality in settings are also exempted from entering the restricted bucket (and all other battery-preserving measures introduced in Android 13).
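The triggers and exemptions above can be combined into a single decision sketch. The real heuristics in the system server are more nuanced; this only encodes the rules as stated:

```kotlin
// Snapshot of the signals the text says the system considers.
data class AppStandbyState(
    val daysSinceUserInteraction: Int,
    val excessiveBroadcastsOrBindings: Boolean,
    val significantBatteryDrain: Boolean,
    val hasActiveWidget: Boolean,
    val holdsExemptPermission: Boolean,   // e.g. SCHEDULE_EXACT_ALARM
    val hasActiveMediaSession: Boolean,
)

fun shouldEnterRestrictedBucket(s: AppStandbyState): Boolean {
    // Exemptions are checked first: exempt apps never enter the bucket.
    if (s.hasActiveWidget || s.holdsExemptPermission || s.hasActiveMediaSession) return false
    // Any one trigger is enough.
    return s.daysSinceUserInteraction >= 8 ||
        s.excessiveBroadcastsOrBindings ||
        s.significantBatteryDrain
}
```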

Hardware camera and microphone toggle support

Android 12 added toggles in Quick Settings and Privacy settings to enable or disable camera and microphone access for all apps. Developers can call the SensorPrivacyManager API introduced with Android 12 to check if either toggle is supported on the device, and in Android 13, this API has been updated so developers can check whether the device supports a software or hardware toggle.

Hardware switches for camera and microphone access are typically not found on smartphones, but they do appear in many television and smart display products. These devices may have 2-way or 3-way hardware switches on the product itself or on a remote, but in Android 12, the toggle state of these switches wouldn’t be reflected in Android’s built-in camera and microphone toggles. This, however, will change in Android 13, which supports propagating the hardware switch state.

Devices with hardware camera and microphone switches should set the ‘config_supportsHardwareCamToggle’ and ‘config_supportsHardwareMicToggle’ framework values to ‘true’.
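For device makers, those framework values are simple boolean resources. A minimal device overlay might look like the following; only the two resource names come from the text, while the overlay path and packaging are assumptions:

```xml
<!-- Sketch of a device config overlay (typically under the device tree,
     overlaying frameworks/base/core/res/res/values/config.xml). -->
<resources>
    <bool name="config_supportsHardwareCamToggle">true</bool>
    <bool name="config_supportsHardwareMicToggle">true</bool>
</resources>
```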

Non-matching intents are blocked

Prior to Android 13, apps could send an intent to an exported component of another app even if the intent didn’t match an <intent-filter> element in the receiving app. This made it the responsibility of the receiving app to sanitize the intent, but many often didn’t. To tighten security, Android 13 will block non-matching intents that are sent to apps targeting Android 13 or higher, regardless of the target SDK version of the app sending the intent. This essentially makes intent filters actually act like filters for explicit intents.

Android 13 will not enforce intent matching if the component doesn’t declare any <intent-filter> elements, if the intent originates from within the same app, or if the intent originates from the system UID or root user.
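The delivery rule and its carve-outs can be summarized in one function. This models the behavior as described, not the actual framework matching code:

```kotlin
// The facts about one attempted explicit-intent delivery.
data class IntentDelivery(
    val receiverTargetsTPlus: Boolean,
    val componentDeclaresFilters: Boolean,
    val matchesAFilter: Boolean,
    val sameApp: Boolean,
    val fromSystemOrRoot: Boolean,
)

fun isDelivered(d: IntentDelivery): Boolean {
    if (!d.receiverTargetsTPlus) return true            // enforcement is T+ only
    if (!d.componentDeclaresFilters) return true        // nothing to match against
    if (d.sameApp || d.fromSystemOrRoot) return true    // exempt senders
    return d.matchesAFilter                             // filters now actually filter
}
```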

One-time access to device logs

Through the logcat command line tool accessed through ADB shell, developers can read the low-level system log files that record what’s happening to apps and services on the device. This is immensely useful for debugging and is why logcat integration is a key feature of Android Studio. Android apps can also read these low-level system log files through the logcat command, but they must hold the android.permission.READ_LOGS permission in order to do so. This permission has a protection level of signature|privileged|development, hence it can be granted to apps that were signed by the same certificate as the framework, have been added to a priv-app allowlist file, or were manually granted the permission via the ‘pm grant’ command. Once granted access to this permission, apps can read low-level system log files without restrictions.

However, starting in Android 13, the system shows a log access confirmation dialog every time an app tries to access logd, the logging daemon, via logcat. The dialog asks users if they give an app access to all device logs, warning that “some logs may contain sensitive info, so only allow apps you trust to access all device logs.” The user can tap “allow one-time access” to give the app access to all device logs, or they can tap “don’t allow” to restrict the app’s access to its own logs. Currently, though, only logs from the running app are visible instead of all device logs, which may be a bug in the current implementation.

This behavior is handled by the new LogcatManagerService and LogAccessDialogActivity classes. There is currently no way to grant an app persistent access to the logcat, and apps also don’t have the ability to show the log access confirmation dialog in context.

SAF no longer allows access to subdirectories under /Android

Android offers the Storage Access Framework (SAF) API to simplify how users browse and open files across local and cloud storage providers. An app can send an intent to launch the system-provided documents picker. The user can then browse their device’s external storage, their media store collections, or files available to them from local or cloud storage providers using the system-provided documents picker. When the user picks a file or directory, the app that invoked SAF is granted access to it.

Although Google introduced SAF all the way back in Android 4.4, the API wasn’t used by many apps that needed to gain broad access to external storage, such as file managers. Instead, apps that needed broad external storage access would simply request READ and WRITE_EXTERNAL_STORAGE permissions to gain access to files on external storage. However, the introduction of Scoped Storage changed the way apps access files on external storage.

One of Scoped Storage’s key changes was to reduce the scope of the READ and WRITE_EXTERNAL_STORAGE permissions, forcing apps that needed broad external storage access to find an alternative. That alternative came in the form of the MANAGE_EXTERNAL_STORAGE permission in Android 11, ie. “all files access”, which grants access to all directories in external storage except for /Android/data and /Android/obb. The reason is that Google considers those directories to be part of private storage, specifically referring to them as external private storage, while the rest of external storage is external shared storage. In order to prevent apps from accessing those directories through SAF, the system-provided documents picker won’t let users grant access to the /Android directory.

However, developers discovered a loophole that allowed them to request access to /Android/data and /Android/obb through SAF. By setting the initial directory when launching SAF to be /Android/data or /Android/obb, apps could individually gain access to those subdirectories even though they can’t request access to the entirety of /Android. This simple workaround is widely used by file manager apps on Android today but no longer works on Android 13, as the OS now explicitly checks if the initial directory that’s set in the intent to launch SAF should be blocked.

For more information on how this SAF loophole worked and how Android 13 closes it, you can read this article.

Sideloaded apps may be blocked from accessing Accessibility and Notification Listener APIs

Android’s Accessibility APIs are incredibly powerful as they allow for reading the contents of the screen or performing inputs on behalf of the user. These functions are often misused by banking trojans to steal data from users, which is why Google has been cracking down on misuse of the Accessibility API. Android 13 introduces further restrictions on the use of the Accessibility API, intended to target apps that are sideloaded from outside of an app store.

Android 13 may block the user from enabling an app’s accessibility service depending on how the app was installed. If the app was installed by an app that uses the session-based package installation API, then users will not be blocked from enabling the app’s accessibility service. If an app was installed by an app that uses the non-session-based package installation API, however, then users will initially be blocked from enabling the app’s accessibility service.

The reason Android doesn’t apply restrictions to apps installed via the session-based package installation API is that this installation method is often used by app stores. On the other hand, the non-session-based package installation API is often used by apps that handle APK files but that aren’t interested in acting as an app store, such as file managers, mail clients, or messaging apps.

This restriction is tied to a new appop permission called ACCESS_RESTRICTED_SETTINGS. Depending on the mode, the permission may allow or deny access to the app’s accessibility services page in settings. When an app is newly installed via the non-session-based API, ACCESS_RESTRICTED_SETTINGS is set to “deny”. This causes the settings entry for the app’s accessibility service to be grayed out and the dialog “for your security, this setting is currently unavailable” to be shown when tapping the entry. After viewing the dialog, however, the mode is set to “ignore”. The user can then go to the app info settings for the app in question, open the menu, and then press “allow restricted settings” to unblock access to the app’s accessibility service. Doing so changes the mode to “allow”, which is the mode that would have been set if the user had installed the app via an app that used the session-based API.

Developers can use the appops CLI to test this new permission and behavior. For example, by sending ‘cmd appops set <package> ACCESS_RESTRICTED_SETTINGS <mode>’, where <package> is the name of the application package and <mode> is allow|ignore|deny, it’s possible to manually change the mode for an application. The command ‘cmd appops query-op ACCESS_RESTRICTED_SETTINGS <mode>’ can also be used to query what mode ACCESS_RESTRICTED_SETTINGS is set to for each application.

Although early previews of Android 13 only gated access to accessibility service settings behind this permission, later previews expanded this to another sensitive permission commonly used by malware: notification listener. Notification listener is an API that lets apps read and dismiss all notifications outside of work profiles. The API is useful for cross-device notification syncing, but it’s valuable for malware authors due to the potential for abuse. Malicious apps can read the content of incoming texts and emails among other sensitive data. Thus, it makes sense why Android 13 also blocks users from enabling the notification listener for an app sideloaded via a non-session-based package installation method.

For more information on this change, refer to my previous article that covered the background and implementation details in more depth.

Faster hyphenation

When text reaches the end of a line in a TextView, rather than exceed the margin and go off screen, a line break will be inserted and the text will wrap around to the next line. Hyphens can be inserted at the end of the line to make the text more pleasant to read if a word is split, but enabling hyphenation comes at a performance cost. Google found that, when hyphenation is enabled, up to 70% of the CPU time spent on measuring text is on hyphenation. Thus, hyphenation was disabled by default in Android 10.

However, Android 13 significantly improves hyphenation performance by as much as 200%. This means that developers can enable hyphenation in their TextViews with little to no impact on rendering performance. To make use of the optimized hyphenation performance in Android 13, developers can use the new fullFast or normalFast frequencies when calling TextView’s setHyphenationFrequency method.
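Assuming the new frequencies are also exposed as XML attribute values, as the corresponding setHyphenationFrequency constants suggest, opting in from a layout might look like this sketch:

```xml
<!-- Layout sketch: opting into the optimized hyphenation on Android 13.
     "fullFast" corresponds to HYPHENATION_FREQUENCY_FULL_FAST. -->
<TextView
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:hyphenationFrequency="fullFast" />
```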

Improved Japanese text wrapping

Apps that support the Japanese language can now wrap text by “Bunsetsu”, the smallest coherent unit of words, instead of by character. This makes text more readable for Japanese users. Developers can take advantage of this wrapping by using android:lineBreakWordStyle="phrase" with TextViews.

Demonstrating improved Japanese text wrapping in Android 13
Japanese text wrapping with phrase style enabled (below) and without (above). Source: Google.

Improved line heights for non-Latin scripts

Support for non-Latin scripts such as Tamil, Burmese, Telugu, and Tibetan has improved in Android 13. The new version uses a line height that’s adapted for each language, preventing clipping and improving the positioning of characters. Developers just need to target Android 13 to take advantage of these improvements in their apps; however, they should be aware that these changes may affect the UI when the app is used in non-Latin languages.


What are the platform changes in Android 13?

Audio HAL 7.1 with Ultrasound & latency mode

Android 13’s audio framework has added a system API for an Ultrasound input source and content type, requiring version 7.1 of the audio HAL. This API can only be accessed by apps holding the ACCESS_ULTRASOUND permission, which has a protection level of system|signature.

In addition, audio HAL v7.1 adds APIs for controlling output stream variable latency mode. Latency mode control is required if the device plans to support spatial audio with head tracking over a Bluetooth A2DP connection. There are two types of latency modes: FREE (ie. no specific constraint on the latency) and LOW (a relatively low latency compatible with head tracking operations, typically less than 100ms).

Virtual Audio Device

Android supports creating virtual displays of arbitrary resolution and density, and by specifying the ID of the virtual display, it’s also possible to launch applications directly onto it. In order to support streaming applications from that virtual display to a remote device, Android needs to support capturing both the video and audio of applications running on virtual displays. Video capture is already well-supported by Android, but audio capture of apps running on virtual displays has not been supported until now.

In Android 13, Google has added a new createVirtualAudioDevice API in the VirtualDevice class. This API returns a VirtualAudioDevice object that callers can use to capture audio from, and inject microphone audio into, applications running on a virtual display. A VirtualAudioController service listens for changes in applications running on the virtual display as well as changes in the playback and recording config. This service notifies the VirtualAudioSession to update the AudioRecord/AudioTrack inside the AudioCapture/AudioInjection class internally.

HDR video support in Camera2 API

HDR, short for high dynamic range, offers a richer video viewing experience by allowing for much greater contrast between bright highlights and dark elements. It has long been possible to capture HDR video on Android smartphones, though this capability was usually limited to the OEM stock camera app accessing privileged APIs and camera driver functions. In Android 13, however, third-party camera apps using the Camera2 API will be able to capture HDR video.

Android 13’s updated camera HAL lets device makers expose whether or not their device supports 10-bit camera output to the Camera2 API. The new REQUEST_AVAILABLE_CAPABILITIES_DYNAMIC_RANGE_TEN_BIT constant in CameraMetadata indicates that the device supports one or more 10-bit camera outputs specified in DynamicRangeProfiles.getSupportedProfiles. Implementations that expose 10-bit camera output must at least support the HLG10 profile, though if they support other profiles, they can advertise the recommended (in terms of image quality, power, and performance) profile to apps through the CameraCharacteristics#REQUEST_RECOMMENDED_TEN_BIT_DYNAMIC_RANGE_PROFILE constant. Apps using the Camera2 API can set the dynamic range profile using the OutputConfiguration.setDynamicRangeProfile API.
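An app choosing a profile under these rules might reason as follows. This is a selection sketch only: HLG10 is the mandatory baseline for 10-bit output, and a device may advertise a recommended profile that apps should prefer. Profile names are plain strings here, whereas the real API uses DynamicRangeProfiles constants:

```kotlin
// Pick the device's recommended 10-bit profile when one is advertised
// and supported; otherwise fall back to the mandatory HLG10 baseline.
fun chooseDynamicRangeProfile(
    supportedProfiles: Set<String>,
    recommendedProfile: String?,
): String {
    require("HLG10" in supportedProfiles) { "10-bit output implies HLG10 support" }
    return recommendedProfile?.takeIf { it in supportedProfiles } ?: "HLG10"
}
```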

In addition, new APIs will allow for the capturing and accessing of HDR buffers. The Media3 Jetpack library, meanwhile, will be adding new transformer APIs to tonemap HDR videos to SDR.

Stream use cases

Camera2 in Android 13 adds support for “stream use cases” that lets device makers optimize the camera pipeline based on the purpose of the stream. For example, the camera device could have a stream use case defined for video calls so that the optimal configuration is provided for video conferencing apps using the Camera2 API. Depending on the stream use case, the camera device may tweak the tuning parameters, camera sensor mode, image processing pipeline, and 3A (AE/AWB/AF) behaviors. The new REQUEST_AVAILABLE_CAPABILITIES_STREAM_USE_CASE constant in CameraMetadata indicates that the device supports one or more stream use cases. Apps can query the list of supported stream use cases through the CameraCharacteristics#SCALER_AVAILABLE_STREAM_USE_CASES field. Google requires that implementations supporting stream use cases support DEFAULT, PREVIEW, STILL_CAPTURE, VIDEO_RECORD, PREVIEW_VIDEO_STILL, and VIDEO_CALL. Apps can set the stream use case using the OutputConfiguration.setStreamUseCase API.

ART update brings lower memory use and faster runtime performance

An update to the Android Runtime (ART) will bring improvements to JNI performance, more bytecode verification at install time, and a new garbage collection algorithm. As announced at Google I/O 2022, switching between Java and native code will be up to 2.5 times faster after the update. In addition, the reference processing implementation has been reworked to be mostly non-blocking, and the class method lookup has been improved to speed up the interpreter. The biggest change, though, is the new garbage collection algorithm, which uses Linux’s userfaultfd feature.

ART’s current GC algorithm, called concurrent copying, incurs a fixed memory overhead for every object load. This overhead applies even when GC isn’t running, because the algorithm relies on a read barrier that’s executed whenever an application thread loads an object. Furthermore, when defragmenting the heap, scattered objects are copied contiguously into another region before the space they occupied in the original region is reclaimed. This produces a Resident Set Size (RSS) cliff, as the RSS increases and then decreases during the process, which could cause the OS to kill off background processes.

Userfaultfd performs the same function as the read barrier but without its fixed memory overhead, which is why ART’s new GC algorithm uses it for concurrent compaction. Google says that, on average, about 10% of compiled code size was attributable to the read barrier alone. In addition, the new GC algorithm avoids the RSS cliff, since pages can be freed as compaction progresses rather than waiting until the end of collection. The GC algorithm is also more efficient, as the number of atomic operations that stall the processor pipeline is on the order of the number of pages rather than the number of object references. The algorithm also works sequentially in memory order and is thus hardware prefetch-friendly. Finally, it maintains allocation locality, so objects that are allocated next to each other stay that way, reducing the burden on hardware resources.

The patches implementing the new userfaultfd-based GC algorithm in ART can be found in AOSP. Since ART was made into an updatable APEX module in Android 12, it’s possible this feature will roll out to devices running Android 12+.

Multi-generational LRU support

Linux currently manages pages using two pairs of least recently used (LRU) lists, one pair for file-backed pages and another for anonymous pages. Each pair contains just one active and one inactive list. Pages that have just been accessed are placed at the top of the active list followed by other pages that have been recently accessed, while pages that haven’t been recently accessed are eventually moved to the inactive list. The problem with this approach is that Linux often places pages in the wrong list, may evict useful file-backed pages when there are idle anonymous pages to purge, and scans for anonymous pages using a CPU-heavy reverse mapping algorithm.

To solve these problems, Google developed a new page reclaim strategy called “multi-generational LRU.” Multi-generational LRU divides the LRU lists into generations spanning between an “oldest” and a “youngest” generation. The more generations there are, the more accurately Linux can determine which pages are safe to evict. And since the multi-generational LRU framework scans page tables directly, it avoids the costly reverse lookup of page table entries that the current approach requires. Google’s fleetwide profiling shows an “overall 40% decrease in kswapd CPU usage,” an “85% decrease in the number of low-memory kills at the 75th percentile,” and an “18% decrease in app launch time[s] at the 50th percentile.”
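To make the idea concrete, here is a small, self-contained sketch of generational page aging, not the kernel’s actual implementation: pages are grouped into generations ordered from youngest to oldest, an access moves a page into the youngest generation, and reclaim always evicts from the oldest non-empty generation.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.LinkedHashSet;

// Conceptual sketch only (not kernel code): generations are ordered
// youngest-first; access() rejuvenates a page, tick() ages everything by
// opening a new youngest generation, and evict() takes from the oldest.
public class MultiGenLru {
    private final Deque<LinkedHashSet<Integer>> generations = new ArrayDeque<>();
    private final int maxGenerations;

    public MultiGenLru(int maxGenerations) {
        this.maxGenerations = maxGenerations;
        generations.addFirst(new LinkedHashSet<>()); // youngest generation
    }

    // Record an access: the page joins the youngest generation.
    public void access(int page) {
        for (LinkedHashSet<Integer> gen : generations) gen.remove(page);
        generations.peekFirst().add(page);
    }

    // Age all pages by starting a new (empty) youngest generation.
    public void tick() {
        generations.addFirst(new LinkedHashSet<>());
        if (generations.size() > maxGenerations) {
            // Merge the two oldest generations rather than dropping pages.
            LinkedHashSet<Integer> oldest = generations.pollLast();
            generations.peekLast().addAll(oldest);
        }
    }

    // Reclaim one page from the oldest non-empty generation.
    public Integer evict() {
        var it = generations.descendingIterator();
        while (it.hasNext()) {
            LinkedHashSet<Integer> gen = it.next();
            if (!gen.isEmpty()) {
                Integer victim = gen.iterator().next();
                gen.remove(victim);
                return victim;
            }
        }
        return null; // nothing resident
    }

    public static void main(String[] args) {
        MultiGenLru lru = new MultiGenLru(4);
        lru.access(1); lru.access(2);
        lru.tick();          // pages 1 and 2 age by one generation
        lru.access(3);       // page 3 lands in the youngest generation
        lru.access(1);       // page 1 is touched again and rejuvenates
        System.out.println(lru.evict()); // prints 2, the coldest page
    }
}
```

With only two lists (active/inactive), pages 1 and 2 above would be indistinguishable in age; the extra generations are what let the reclaimer pick page 2 as the clearly colder victim.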

This new page reclaim strategy is in the process of being merged into the upstream Linux kernel, but it has already been backported to the android13-5.10 and android13-5.15 branches of the Android Common Kernel. The feature can be enabled by compiling the kernel with the CONFIG_LRU_GEN flag and then running the following command:

echo y > /sys/kernel/mm/lru_gen/enabled

For a high-level overview of how Linux’s virtual memory management works and how multi-generational LRU improves page reclamation, I recommend reading my Android Dessert Bites entry covering these topics.

Android’s Bluetooth stack is now a Mainline module

Android 13 has migrated Android’s Bluetooth stack from system/bt to packages/modules/Bluetooth, making it updateable as a Project Mainline module. Work is ongoing in AOSP to add new features to the Bluetooth stack — such as support for the new Bluetooth LE Audio standard.

Android’s ultra-wideband stack becomes a Mainline module

The new com.android.uwb APEX module contains Android’s ultra-wideband stack. Ultra-wideband, or UWB for short, is a short-range, high-frequency wireless communication protocol that is commonly used for precise positioning applications, such as pinpointing the location of a lost object that’s nearby. Android 12 first introduced UWB support but restricted API access. With a new Jetpack library on the way, a reference HAL in the works, and an updateable Mainline module, Android 13 will expand the use of UWB hardware for new software features.

java.home system property now points to ART module

The java.home system property, which points to the location of the Java runtime, has been updated in Android 13. In previous versions of Android, java.home pointed to /system. In Android 13, java.home points to /apex/com.android.art, which is where the Android Runtime (ART) Mainline module is mounted. This can be verified by comparing the output of System.getProperty("java.home") in Android 12L versus Android 13.
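The property read is the same on any JVM, so the check can be sketched in a few lines; on a desktop JDK this prints the JDK install path, while on Android 13 the identical call returns /apex/com.android.art and on Android 12L or earlier it returns /system.

```java
// Minimal check of the java.home system property. The value is
// platform-dependent: a JDK path on desktop, /system on Android 12L and
// earlier, and /apex/com.android.art on Android 13.
public class JavaHomeCheck {
    public static String javaHome() {
        return System.getProperty("java.home");
    }

    public static void main(String[] args) {
        System.out.println(javaHome());
    }
}
```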

This change was discovered by users of the AdAway app, which modifies the hosts file to block ads. The app queried java.home to find the system path to remount as read-write, but with the change in Android 13, the app would attempt to remount /apex/com.android.art instead of /system.

ART was made an updatable APEX module back in Android 12, so it’s possible that java.home will be changed in earlier versions of Android following an ART module update.

Tweaks to updatable NNAPI drivers

In order to improve the consistency and performance of machine learning on Android, Google announced Android’s “updateable, fully integrated ML inference stack” at Google I/O last year. In conjunction with Qualcomm, Google would roll out NNAPI driver updates to devices through Google Play Services. Google originally planned to roll out driver updates to devices running Android 12, but it delayed those plans to Android 13.

Recently, however, an AOSP code change submitted by a Google engineer stated that the Android ML team “ultimately did not move forward with its updatability plans,” which would have had “updated platform drivers delivered through GMSCore.” Although this code change suggests that Google has abandoned its plans to deliver NNAPI driver updates through Google Play Services, Oli Gaymond, Product Lead on the Android ML team, confirmed that the company still has plans to ship updatable NNAPI drivers. He noted however that “there have been some design changes” that are “currently in testing” but that driver updates will still be delivered via Play Services.

This article has more information on Google’s plans for updatable NNAPI platform drivers.

Improvements to Dynamic System Updates

Google introduced Dynamic System Updates (DSU) in Android 10 to enable installing a generic system image (GSI) without overwriting the device’s original system partition and wiping the original user data partition. This is possible because DSU creates a new dynamic partition (on devices that support this userspace partitioning scheme) and loads the downloaded system image onto the new partition. The downloaded system image and generated data image are stored within the original data partition and are deleted when the user is finished testing on the GSI.

Development on DSU has been sparse since its initial release, but several improvements are being introduced to the feature in Android 13. These changes are documented in AOSP and include performance and UI improvements. Installing a GSI through DSU will be significantly faster in Android 13 thanks to a code change that enlarges the default shared memory buffer size, bringing the time to install a GSI down to under a minute on physical devices. In addition, the progress bar now shows which partition DSU is currently installing, and it is weighted to reflect that writable partitions take less time to install.

Custom component for the Quick Access Wallet activity

Android 11 introduced the Quick Access Wallet feature to let users quickly select which card to use for contactless payments. These cards are provided by apps that implement the Quick Access Wallet API, while the surface on which these cards are displayed is provided by a system app. In Android 11, the wallet activity was provided by a standalone system app, while in Android 12, it was provided by SystemUI.

SystemUI still provides the wallet activity by default in Android 13, but device makers can specify a different component. Device makers can provide the configuration used by QuickAccessWalletController by defining a method in QuickAccessWalletClient. Alternatively, device makers can use the new boolean attribute ‘useTargetActivityForQuickAccess’ in QuickAccessWalletService to set whether the system should use the activity specified by the android:targetActivity attribute (true) or the default wallet activity provided by SystemUI (false).

Basic support for WiFi 7

WiFi 7 is the marketing name for IEEE 802.11be, the next-generation WiFi standard that promises incredibly fast speeds and very low latency. Anshel Sag, Principal Analyst at Moor Insights & Strategy, explains the most important features in the new WiFi standard that are driving these improvements. “Wi-Fi 7 adds features like 4K QAM modulation for higher peak throughput (compared to 1024). In addition to the higher-order modulation, probably the biggest feature in Wi-Fi 7 is the addition of multi-link which comes in multiple flavors and adds the ability to aggregate spectrum across multiple bands which wasn’t possible before or to switch between those bands to use the band with the least interference/latency.”

The first products with WiFi 7 support will likely launch at the end of this year or early next year, well ahead of the standard’s finalization in early 2024. In preparation for these product launches, Android 13 adds preliminary support for WiFi 7. Android 13’s DeviceWiphyCapabilities class, which “contains the WiFi physical layer attributes and capabilities of the device”, has 802.11be in the list of standards and 320MHz as a supported channel width. The DeviceWiphyCapabilities class contains standard nl80211 commands to query the WiFi standards and channel bandwidths supported by the WiFi driver, which Android’s wificond process uses to communicate with the driver.

Evidence of Android 13’s basic support for WiFi 7 can be found in the 21st edition of Android Dessert Bites.

DNS over HTTP/3 support

Instead of connecting to remote servers directly by IP address, most connections start with a Domain Name System (DNS) lookup, which resolves the hostname entered by the user into an IP address that the device then connects to. The problem with most DNS lookups is that they’re unencrypted, meaning they aren’t private. However, encryption can be applied to DNS lookups, and there are already a few encrypted DNS protocols to choose from.

Android has supported encrypted DNS since Android 9, using DNS-over-TLS (DoT) as its protocol. While DoT is supported by many DNS providers, it suffers from several problems outlined in this blog post by Google. That’s why Android is adding support for DNS-over-HTTP/3 (DoH3), an encrypted DNS protocol that runs over QUIC.

Google says that DoH3 improves significantly over DoT when it comes to performance, with studies showing a reduction in median query time by 24%. Furthermore, the DoH3 implementation is written in Rust, a memory safe programming language, and is asynchronous, resulting in both improved security and performance.

Since Google spun out Android’s DNS-related code into a Project Mainline module called DNS Resolver in Android 10, support for DNS-over-HTTP/3 is rolling out to devices running Android 10 or newer. However, only some Android 10 devices will add support for DoH3, as DNS Resolver was optional for OEMs to implement. GMS requirements mandate that devices upgrading to or launching with Android 11 ship with Google’s DNS Resolver module, hence DNS-over-HTTP/3 will be widely available on GMS devices running Android 11 or later.

DNS-over-HTTP/3 support is enabled by default on all devices running Android 13. On older Android versions, the feature is controlled by a device_config flag that’s set remotely by Google Play Services. In its July 2022 blog post, Google said that DoH3 support had already rolled out to users on Android 11+ through a Google Play system update, but you can check whether it has through the following shell command:

cmd device_config get netd_native doh

If the command returns ‘1’, then DoH3 support is enabled in DNS Resolver. If not, you can run the following command to enable it:

cmd device_config put netd_native doh 1

The initial release of DNS-over-HTTP/3 support limits the user to two “well-known DNS servers which support it”, which includes Google DNS and Cloudflare DNS. DNS Resolver is hardcoded to only use DoH3 for these two servers, but support will likely be expanded to include additional providers in the future. In addition, Google aims to support DDR, which will enable dynamically selecting the correct configuration for any server, improving the performance of encrypted DNS even further.

(Note: Android’s “Private DNS” setting does not allow inputting URLs. Simply enter ‘dns.google’ or ‘cloudflare-dns.com’ to connect to Google DNS or Cloudflare DNS respectively via DNS-over-HTTP/3. Android will add the https:// and /dns-query bits of the URL for you.)

Fast Pair coming to AOSP

Fast Pair, Google’s proprietary protocol for the nearby detection and pairing of Bluetooth devices, appears to be headed to AOSP as part of the new “com.android.nearby” modular system component. The Fast Pair service is currently implemented in Google Play Services, thus requiring Google Mobile Services to be bundled with the firmware. However, the new NearbyManager system API is being added to AOSP. This should let OEMs set up their own server to sync and serve certified Fast Pair devices’ metadata.

In early Android 13 previews, users could toggle Fast Pair scanning in Settings > Connected devices > Connection preferences > Fast Pair. However, Google removed the Fast Pair toggle from Android 13 with Beta 3, suggesting the feature has been postponed to a later release.

For more details on this platform change and how it could impact the Android ecosystem, refer to this article.

Multiple Enabled Profiles on a single eSIM

In order to use multiple subscriptions from one or more carriers, Android devices need as many physical SIM slots as there are subscriptions. This can include multiple SIM card slots, multiple eSIM modules, or a combination of SIM cards and eSIMs. This is because both SIM cards and eSIMs currently only support a single active SIM profile.

Android 13, however, includes an implementation of Multiple Enabled Profiles (MEP), a method for enabling multiple SIM profiles on a single eSIM. This takes advantage of the fact that eSIMs already support storing multiple SIM profiles, so by creating a logical interface between the eSIM and the modem and multiplexing it onto the physical interface, more than one SIM profile stored on an eSIM can interface with the modem. Support for MEP requires an update to the radio HAL but does not require changes to the underlying hardware or modem firmware.

For more information on MEP, please read this article which covers the patent behind this method and the Android APIs that local profile assistant (LPA) apps are expected to use.

New Bluetooth stack used for scanning

Android’s current Bluetooth stack, called “Fluoride”, has been in use for many years now, but starting in Android 11, Google began testing a new Bluetooth stack called “Gabeldorsche.” In an interview with ArsTechnica, Dave Burke, VP of Engineering at Android, told the publication that Gabeldorsche is “basically a future direction for [Android’s] Bluetooth stack and really it’s an initiative to re-write the Bluetooth stack piece-by-piece.” The goal is to improve security, reliability, interoperability, and automated end-to-end testing with Gabeldorsche.

To test Gabeldorsche, Google added a developer option in Android 11 that remained in Android 12, 12L, and early preview versions of Android 13 before it was removed in Beta 2. This is because the Gabeldorsche Bluetooth stack is now enabled by default in Android 13, but only “up to the scanning layer”, which includes BLE scanning, BLE advertising, ACL connection management, controller information management, HCI layer, HAL interface layer, and other required components like config storage.

raven:/ $ cmd overlay lookup com.android.bluetooth com.android.bluetooth:bool/enable_gd_up_to_scanning_layer
true
raven:/ $ dumpsys bluetooth_manager | grep "gd"
    gd_advertising_enabled: true,
    gd_scanning_enabled: true,
    gd_acl_enabled: true,
    gd_hci_enabled: true,
    gd_controller_enabled: true,

APK Signature Scheme v3.1 support

Android 9 Pie introduced support for APK Signature Scheme v3, which made it possible to rotate signing keys. According to the documentation, APK Signature Scheme v3 has the option to include a proof-of-rotation record in its signing block for each signing certificate, enabling apps to be signed with a new signing certificate that’s linked to the past signing certificate used to sign the APK. In order to support installing streamed APKs in Android 11+, Google introduced APK Signature Scheme v4, which stores the signature in a separate file.

Now in Android 13, Google is introducing APK Signature Scheme v3.1, which addresses some of the known issues with APK key rotation on earlier OS versions. This scheme lets apps support original and rotated signers in a single APK, meaning they can target Android 13 or later for rotation without needing to configure multi-targeting APKs. APK Signature Scheme v3.1 uses a new block ID that isn’t recognized on Android 12L or earlier, so earlier releases will use the original signer in the v3.0 block. Furthermore, the new scheme supports SDK version targeting, which allows key rotation to target a later release.

Better error reporting for Keystore and KeyMint

For apps that use the Android Keystore system to store cryptographic keys, Android 13 introduces a new exception that details failures in generating or using a key. Its public error codes indicate the cause of the error, while its methods indicate whether the error was caused by a system or key issue and whether retrying the operation, with either the same key or a new one, may succeed.

Binary transparency manager

With the launch of the Pixel 6 series, Google started publicly sharing append-only transparency logs that store signed hashes of system images. These transparency logs can be used to verify that the images installed on a device match the factory images that Google publishes online, enabling users to independently verify the integrity of the OS. Users can also verify the integrity of the factory images themselves by checking that there’s an associated entry in the transparency log.

Each entry in the transparency log is composed of two parts: the build fingerprint of the factory image and a hex-encoded string representing the VBMeta digest. While the build fingerprint is not cryptographically bound to the image, the VBMeta digest is, as it’s a composite digest of the binary images within a factory image. The VBMeta digest is used during the Android Verified Boot process to determine the integrity of the binaries that are loaded. If the VBMeta digest pulled from a live device or factory image matches the digest published in the transparency log, then users can be assured that they’re looking at a legitimate version of the firmware provided by Google.

In order to construct the payload for verification from an Android device, it’s necessary to read the system properties ‘ro.build.fingerprint’ and ‘ro.boot.vbmeta.digest’ using the ‘getprop’ ADB shell command. Android 13 simplifies pulling this information through the new transparency manager CLI, which also prints information about all identifiable partitions and installed APEX/Mainline modules.
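The verification payload described above can be sketched as follows. The newline-joined format and the sample fingerprint/digest values here are assumptions for illustration only; the point is simply that the two system property values together identify a build that can be compared against a transparency log entry.

```java
// Conceptual sketch, not Google's verifier. Input values correspond to
// `adb shell getprop ro.build.fingerprint` and
// `adb shell getprop ro.boot.vbmeta.digest`.
public class TransparencyPayload {
    // Assumed payload format (newline-joined) for illustration.
    public static String payload(String buildFingerprint, String vbmetaDigest) {
        return buildFingerprint + "\n" + vbmetaDigest + "\n";
    }

    public static boolean matchesLogEntry(String payload, String logEntry) {
        // A real verifier would also validate the log's signed tree head.
        return payload.equals(logEntry);
    }

    public static void main(String[] args) {
        // Made-up example values, not a real fingerprint or digest.
        String p = payload(
                "google/device/device:13/BUILD.ID/1234567:user/release-keys",
                "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef");
        System.out.println(matchesLogEntry(p, p)); // prints true
    }
}
```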

The transparency manager CLI can be accessed via the ‘cmd transparency’ command. It supports the following commands:

raven:/ $ cmd transparency
Transparency manager (transparency) commands:
    help
        Print this help text.

    get image_info [-a]
        Print information about loaded image (firmware). Options:
            -a: lists all other identifiable partitions

    get apex_info [-v]
        Print information about installed APEXs on device.
            -v: lists more verbose information about each APEX

    get module_info [-v]
        Print information about installed modules on device.
            -v: lists more verbose information about each module

The ‘get image_info’ command outputs the build fingerprint and VBMeta digest from the previously mentioned system properties. By appending the -a option, the command also outputs the names, fingerprints, and build times of other identifiable partitions.

The ‘get apex_info’ command outputs information about the installed APEX modules on the device. The ‘get module_info’ command covers installed APEX modules as well, but additionally reports on other installed modules via the PackageManager.getInstalledModules method. This information includes each module’s name, visibility, install location, and computed SHA-256 hash.
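The SHA-256 hashing the module list reports can be illustrated with the standard Java MessageDigest API; this hashes an in-memory byte array, whereas the platform hashes the installed module’s on-disk contents.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

// Sketch of producing the kind of hex-encoded SHA-256 digest shown in the
// `cmd transparency get module_info` output.
public class ModuleHash {
    public static String sha256Hex(byte[] data) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            StringBuilder sb = new StringBuilder();
            for (byte b : md.digest(data)) sb.append(String.format("%02x", b));
            return sb.toString();
        } catch (java.security.NoSuchAlgorithmException e) {
            throw new AssertionError(e); // SHA-256 is always present in the JDK
        }
    }

    public static void main(String[] args) {
        System.out.println(sha256Hex("abc".getBytes(StandardCharsets.UTF_8)));
        // prints ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad
    }
}
```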

Android starts the binary transparency service at boot to get the VBMeta digest and schedule a job to update binary measurements.

Full-disk encryption support removed from Android

Android has supported encrypting the contents of the user data partition through two different schemes: full-disk encryption (FDE) and file-based encryption (FBE). FDE encrypts the entire data partition using a key derived from the user’s PIN, pattern, or password, and the user must enter their credentials to decrypt the partition before Android can finish booting. FBE, on the other hand, allows different files to be encrypted with different keys (that are still cryptographically bound to the user’s lock screen authentication method), offering more flexibility through the Direct Boot feature.
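The “key derived from the user’s credential” step can be sketched with a standard key-stretching function. This is a conceptual illustration only: Android’s real FDE/FBE key derivation uses scrypt plus hardware-bound keys in the TEE, not plain PBKDF2, and the salt and iteration count below are made-up illustrative values.

```java
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;

// Conceptual sketch: stretch a short lock-screen PIN into 256 bits of key
// material. NOT Android's actual scheme (which uses scrypt and
// hardware-backed keys), just the general credential-to-key idea.
public class CredentialKdf {
    public static byte[] deriveKey(char[] pin, byte[] salt) {
        try {
            PBEKeySpec spec = new PBEKeySpec(pin, salt, 100_000, 256);
            return SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256")
                    .generateSecret(spec).getEncoded();
        } catch (java.security.GeneralSecurityException e) {
            throw new AssertionError(e);
        }
    }

    public static void main(String[] args) {
        byte[] key = deriveKey("1234".toCharArray(), "per-device-salt".getBytes());
        System.out.println((key.length * 8) + "-bit key derived"); // prints 256-bit key derived
    }
}
```

Because the derived key protects the whole partition under FDE, nothing on disk is readable until the user supplies the credential, which is exactly why FDE precludes Direct Boot features.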

GMS requirements mandate that devices launching with Android 10 or later use file-based encryption (FBE), so Google is removing support for converting devices from FDE to FBE. Furthermore, Android 13 fully removes support for FDE, so the OS will not recognize the encrypted data partition of devices that haven’t migrated.

Hardware support for Android’s Identity Credential API

To support the development of mobile driver’s licenses applications on Android, Google created the Identity Credential API. This API provides an interface to a secure store for user identity documents, including not just mobile driver’s licenses but any generic document type. The API can be implemented with or without hardware support, but implementing hardware support enables a greater level of security and privacy.

If a device maker chooses to implement hardware support, they must implement the Identity Credential HAL. Implementing this HAL enables the ability to store identity documents in the device’s secure hardware, which on most devices is their Trusted Execution Environment. Few device makers have implemented the IC HAL, but in Android 13, Google plans to make its implementation a requirement for new chipset launches.

For more details on this upcoming platform change, please refer to this article.

Legacy fs-verity support dropped

Fs-verity is a feature of the Linux kernel that Android uses to continuously verify the integrity of APK files using trusted digital certificates. Fs-verity support was initially introduced with Android-specific kernel patches alongside the Pixel 3’s kernel release. The feature was later upstreamed to the Linux kernel and merged in version 5.4 before being backported to Android Common Kernel branches 4.14 and higher. In the process, the API was changed, leaving two different implementations: the legacy fs-verity API and the standard fs-verity API.

Android’s legacy API for fs-verity has been dropped in Android 13. It is no longer possible to use the legacy fs-verity implementation, thus devices with the system property ro.apk_verity.mode set to ‘1’ will need to migrate to the standard fs-verity implementation. GMS requirements already mandate that OEMs build their kernels with CONFIG_FS_VERITY enabled when shipping devices with Android 11 or later, so most devices upgrading to Android 13 should not be affected by this change. In fact, new devices using ACK 4.14 or higher and EXT4/F2FS for the userdata partition automatically come with support for fs-verity.

Linux kernel version requirement update

Google maintains a fork of the upstream mainline Linux kernel in AOSP (android-mainline). Whenever a new Long-Term Support (LTS) release is declared, a new Android Common Kernel (ACK) branches off from android-mainline. For example, when Linux 5.10 was declared an LTS release back in 2020, android12-5.10 branched off from android-mainline. When a new ACK branch is created, it’s open for feature contributions for the next Android platform release. This is known as the “development phase,” and it continues until the ACK branch is declared feature complete, after which it enters the “stabilization phase.” During the stabilization phase, bug fixes and partner features are accepted, as are breaking changes to the Kernel Module Interface (KMI). Before the next platform release is pushed to AOSP, the ACK KMI branch is frozen, and Google will for the most part no longer accept KMI-breaking changes, though it will accept bug fixes and other patches that don’t affect the stable KMI for about five years after the initial branching.

When an ACK KMI branch enters the stabilization phase, it’s generally branched off again to allow for development of new features for the next platform release. For example, android13-5.10 branched off from android12-5.10 and is one of two “feature kernels” for Android 13 (the other one being android13-5.15) and will receive feature contributions until the Android 13 platform release. OEMs that wish to upgrade their devices to Android 13 don’t have to ship kernels based on android13-5.10 or android13-5.15, as Google likes to maintain backward compatibility. However, every Android platform release only supports a handful of kernel versions for devices launching with that new Android version, and Android 13 is no different.

According to Google, Android 13 launch devices may ship with kernels based on android12-5.4, android12-5.10, android13-5.10, or android13-5.15. However, it should be noted that devices with chipsets under the Google Requirements Freeze (GRF) program, which freezes Linux kernel version requirements so vendor implementations built for Android version N are certifiable up to version N+3, may still launch with Android 13 using an even older Linux kernel. For example, a device launching with Android 13 that has a chipset with a vendor implementation frozen against Android 11 could feature a kernel based on android11-5.4.

Devices upgrading to Android 13 can feature even older kernel versions. Google maintains a list of kernel versions that are supported and tested with each Android platform release, though the list hasn’t been updated yet for Android 13. It is technically possible to upgrade a device running a Linux kernel version that isn’t supported and hasn’t been tested by Google to Android 13, though it may require manually backporting patches to support new mandatory features. For example, Android 12’s use of the eBPF network traffic tool requires capabilities added in Linux 4.9, which is why Android 12 dropped support for Linux 4.4.

Memory Tagging Extension for Armv8.5+ devices

Mistakes with pointers in C or C++ that cause memory to be misinterpreted, i.e. memory safety problems, are some of the most severe bugs encountered by software engineers. The Google Chrome team found that nearly 70% of its serious security bugs are memory safety problems, so it has been working to secure Chrome by addressing these problems in three ways: implementing compile-time checks to make sure that pointers are correct, implementing runtime checks, and exploring memory-safe languages like Rust for parts of the codebase.

Memory safety bugs also represent a large proportion of high severity security vulnerabilities in the Android platform, which is why Google has been using HWASan to find memory issues, has been writing parts of Android in Rust, and has been preparing to support Memory Tagging Extension (MTE) throughout the Android software stack.

MTE is a hardware feature of Armv8.5+ CPUs that mitigates memory safety bugs by providing more detailed information about memory violations. It has low CPU overhead, so it can run continuously without significantly affecting performance.

With the first batch of SoCs with Armv9 CPUs now on the market, Google is adding a new setting in the Developer Options of Android 13 that toggles software support for MTE. The toggle, called “reboot with MTE”, is hidden by default but can be surfaced by the OEM by setting ro.arm64.memtag.bootctl_supported to true. This property can be set by OEMs that don’t want to enable MTE by default yet but want to offer users a preview that can be manually enabled.

After enabling MTE, a message will appear that reads as follows: “System will reboot and allow to experiment with Memory Tagging Extension (MTE). MTE may negatively impact system performance and stability. Will be reset on next subsequent reboot.”

Privacy Sandbox

As third-party cookies are being phased out on the web, Google is evaluating how digital advertising can also be reworked on Android. In February, Google announced a multi-year initiative to build the “Privacy Sandbox” on Android. The goal is to introduce new, more private advertising solutions that limit sharing of user data with third parties and work without cross-app identifiers, including the advertising ID provided by Google Play Services. Google also wants to reduce covert data collection from advertising SDKs integrated into apps.

The Privacy Sandbox is composed of multiple projects: the Topics API, the SDK Runtime, the Attribution Reporting API, and FLEDGE on Android. The Topics API is an implementation of interest-based advertising (IBA), a form of personalized advertising that selects ads based on interests derived from the apps the user has engaged with in the past. The SDK Runtime is a platform capability that allows third-party SDKs to run in a dedicated runtime environment, isolating them from the sandbox of the application using the SDK. The Attribution Reporting API helps advertisers measure the performance of their campaigns without using cross-party identifiers. Lastly, FLEDGE enables remarketing and custom audience targeting without sharing identifiers across apps or sharing a user’s app interaction information with third parties.

Since the Privacy Sandbox is a multi-year initiative, the APIs and services it offers are bound to change over the coming months. Furthermore, we may not even see the first public release of these APIs with the stable release of Android 13 later this year. However, to give developers an opportunity to try these APIs and share feedback, Google is maintaining a separate developer preview for the Privacy Sandbox on Android.

Release timeline

The first of these developer previews was released in April, and it brought a preview of the Topics API and SDK Runtime. Device system images were made available for the Pixel 4 through Pixel 6, as well as an Android SDK, 64-bit Android Emulator system image, and code samples. Apart from the addition of the Privacy Sandbox features, these images are feature-identical to the Android 13 beta builds.

The second developer preview of the Privacy Sandbox on Android was released on May 17th, bringing an early preview of the MeasurementManager attribution reporting APIs. These include new registerSource() and registerTrigger() methods to “register app ad events and receive event-level reporting data for app-to-app attribution.” 

The third developer preview was released on June 9, 2022, bringing new functionality for the Attribution Reporting API and FLEDGE on Android. For Attribution Reporting, new developer resources were added to test registering attribution source and trigger events, exercising source-prioritized and post-install attributions, receiving event reports, and receiving aggregatable (unencrypted) reports. Meanwhile, FLEDGE on Android adds developer resources to test joining or leaving a custom audience and observing how parameter values can affect auction outcomes, fetching JavaScript auction code from remote endpoints, configuring and initiating on-device ad auctions, and handling impression reporting.

The fourth Privacy Sandbox on Android developer preview was released on July 14, 2022, bringing improvements to all four key components. The updated SDK Runtime lets apps communicate with runtime-enabled SDKs through the addition of the sendData() method, made local storage available in the SDK Runtime process, and made it possible for SDKs to render standalone video-based ads or content. The Attribution Reporting API documentation was improved to provide additional clarity. FLEDGE on Android added support to override remote URLs for retrieving JavaScript logic, improved error reporting during ad selection, and filtered out inactive custom audiences during ad selection. Lastly, the Topics API introduced the “On Device Classifier” system to dynamically assign topics based on publicly available app info, began requiring a new permission with a “normal” protection level, and changed the return type of the getTopics() API.

Key components of the Privacy Sandbox on Android, including the Topics API, the SDK Runtime, the Attribution Reporting API, and FLEDGE on Android, will be distributed as Mainline modules. When the Privacy Sandbox project enters its beta phase, Google will begin rolling out these Mainline modules to supported devices.

The following sections provide a summary of each project within the Privacy Sandbox on Android initiative.

Topics API

The Topics API is aimed at providing advertisers coarse-grained interest signals (called topics) that are derived from a user’s app usage. Topics are human-readable interest signals drawn from a predefined taxonomy numbering somewhere between a few hundred and a few thousand entries. The taxonomy will be tailored to the types of ads that can be shown in Android apps, but the initial list is not available as of early May.

Google is training a classifier model on publicly available app information (such as app names, descriptions, and package names) to derive topics of interest. This model uses signals such as apps installed or recently used to compute topics of interest on-device. The system will use this model to compute the user’s top 5 topics once every epoch (the period of time when topics are computed). 

Apps that call the Topics API may be given a list of up to 3 topics, 1 from each of the past 3 epochs. Google says that providing up to 3 topics ensures that frequently used apps will learn at most 1 new topic each epoch, while infrequently used apps will still have enough topics to find relevant ads. Because the system assigns one of several topics to each app that invokes the API, it’s difficult for two apps to correlate information with a specific user since different apps get different topics. The Topics API will only return topics that the caller has observed in the past, however.
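The per-epoch selection behavior described above can be sketched in a few lines of Python. This is a conceptual model of the documented behavior, not the real API; the function name and data shapes are invented for illustration:

```python
import random

def topics_for_caller(caller_observed, epoch_top5s, seed=0):
    """Return up to one topic per epoch from the last 3 epochs.

    caller_observed: set of topics the caller has previously observed
    epoch_top5s: list of the user's top-5 topic lists, most recent last
    """
    rng = random.Random(seed)
    result = []
    for top5 in epoch_top5s[-3:]:
        # The API only ever returns topics the caller has observed before,
        # and at most one topic per epoch.
        eligible = [t for t in top5 if t in caller_observed]
        if eligible:
            result.append(rng.choice(eligible))
    return result
```

Because each caller draws from its own observed set, two apps invoking the API for the same user generally receive different topics, which is what makes cross-app correlation difficult.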

Apps will be able to opt out of the Topics API through new manifest elements, and users will be able to view and remove topics that are associated with their app usage. Neither of these have been implemented yet, however.

Google says that the Topics API implementation and its usage of the classifier model will be made available in AOSP. The classifier model itself will be freely available to developers who want to test which topics their apps classify to.

SDK Runtime

Currently, SDKs that are bundled with an app are executed within the host app’s sandbox. This gives them the same privileges and permissions as their host app and also lets them access the host app’s memory and storage. Unscrupulous SDKs have taken advantage of this to collect and share user data without the knowledge of the user or developer. In order to combat this, Android 13 enables support for running select third-party SDKs in a dedicated runtime environment called the SDK Runtime.

Compatible SDKs — referred to as runtime-enabled (RE) SDKs — operate in an isolated process and communicate with apps via well-defined permissions and APIs. SDKs in the SDK Runtime will by default have access to permissions commonly used by ads-related SDKs, such as INTERNET and AD_ID. They will also be granted permissions to access the new privacy-preserving APIs that provide core advertising functionality without the need for cross-app identifiers.

Runtime-enabled SDKs aren’t statically linked and packaged with apps under this design. Instead, SDK developers upload their versioned SDKs to app stores and app developers specify their dependencies by version. When the user downloads an app, the installer downloads the app’s specified dependencies from the app store.

Currently, the SDK Runtime is designed to support advertising-related SDKs. Like other projects in the Privacy Sandbox, the SDK Runtime is under active development, so Google is still seeking feedback on its design. The design doc goes into more detail on the changes to access, execution, communication, development, and distribution that SDK developers need to be aware of.

Attribution Reporting

Google’s proposed Attribution Reporting API provides support for the following features: conversion reporting, optimization, and invalid activity detection. The API helps advertisers measure the performance of their ad campaigns by showing them conversion counts and values across campaigns, ad groups, and ad creatives. The API provides per-impression attribution data that can be used to train ML models, thus allowing advertisers to optimize ad spend. Lastly, the API can provide reports to analyze invalid traffic and ad fraud.

The Attribution Reporting API supports the aforementioned use cases while improving privacy over existing mobile attribution and measurement solutions that use cross-party identifiers. It does this by limiting the number of bits available for event-level reports, enabling higher-fidelity conversion data in aggregatable reports only, rate limiting available conversions and the number of ad tech platforms that can be associated with a single attribution source, and incorporating various noise-adding techniques.

In order to use the Attribution Reporting API, ad tech platforms must first complete an enrollment process. Then, they must register attribution sources and conversions using the registerSource() and registerTrigger() methods of MeasurementManager. The API will then match conversions to attribution sources and send conversions off-device through event-level and aggregatable reports to ad tech platforms.
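The source-prioritized matching flow can be modeled in miniature. The class and method names below loosely mirror, but do not reproduce, the real MeasurementManager API; everything here is a simplified illustration:

```python
from dataclasses import dataclass

@dataclass
class Source:
    ad_tech: str    # reporting origin (the ad tech platform's enrollment)
    campaign: str   # attribution source metadata
    priority: int = 0

class AttributionStore:
    """Toy on-device store: match a conversion trigger to the
    highest-priority registered source from the same ad tech."""

    def __init__(self):
        self.sources = []
        self.reports = []   # (campaign, ad_tech) pairs queued for upload

    def register_source(self, source):       # cf. registerSource()
        self.sources.append(source)

    def register_trigger(self, ad_tech):     # cf. registerTrigger()
        candidates = [s for s in self.sources if s.ad_tech == ad_tech]
        if not candidates:
            return None                      # unattributed conversion
        winner = max(candidates, key=lambda s: s.priority)
        self.reports.append((winner.campaign, ad_tech))
        return winner
```

The real API additionally applies rate limits, noise, and delayed report scheduling, all of which are omitted here for brevity.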

Report uploads happen at the end of fixed intervals of time rather than at an exact scheduled time. By default, the reporting upload interval is 4 hours, but this can be overridden by sending the following shell command:

cmd device_config put adservices measurement_main_reporting_job_period_ms <duration in milliseconds>

FLEDGE on Android

FLEDGE, short for First “Locally-Executed Decision over Groups” Experiment, is another web API that is being adapted to Android. It’s designed for advertisers who seek to serve ads to potentially interested users, i.e. users who previously interacted with the advertiser’s app.

FLEDGE encompasses two APIs: the custom audience API and the ad selection API. The custom audience API lets apps or SDKs create and use a custom audience representing a group of users with common intentions or interests. Audience information is stored locally on the device, limiting the sharing of user information. The ad selection API orchestrates auction execution for ad tech platforms. Ad tech platforms are expected to write JavaScript code implementing the buy-side bidding logic, buy-side ad filtering and processing, and sell-side decision logic, which are run sequentially on-device.

Flow chart showing the custom audience management and ad selection workflow. Source: Google.

Remote Key Provisioning

The Android Keystore API lets apps store cryptographic keys in a container that can later be used for cryptographic operations. The key may be stored in a software or hardware-backed container; the latter is far more secure as the key material is never exposed outside of the device’s secure hardware. When assembling a device, OEMs request a Google-signed attestation private key that is provisioned to the device’s secure hardware before the device leaves the factory.

If an attacker somehow manages to compromise that key, whether via a leak from the factory or a vulnerability in the secure hardware, it would need to be revoked so it can’t be used by other devices. Doing so would prevent potentially tens of thousands of users of the same device from accessing many features, since a single key is often used to provision many devices and hardware-backed key attestation is employed by a range of apps and services that require security, such as SafetyNet Attestation, Identity Credential, Digital Car Key, and more. In order to address these issues, Google is revising its attestation infrastructure to add support for Remote Key Provisioning, which will be mandatory for Android 13 devices.

Under the new Remote Key Provisioning scheme, OEMs will no longer provision attestation private keys in the factory. Instead, a unique, static public/private keypair is generated by each device at the factory, and the OEM extracts the public portion of the keypair and submits it to Google. The public keys serve as the basis of trust for provisioning later, while the private key never leaves the secure hardware where it’s generated. When the device is powered on and connected to the Internet, it sends a certificate signing request to Google, signed with the private key in the secure hardware. Google verifies the authenticity of the request by looking up the public keys it stored earlier, and if it’s verified, a temporary attestation certificate is sent to the device. Keystore then assigns these certificates to apps requesting attestation. When the attestation certificate expires, the process repeats.

How Remote Key Provisioning works. Credits: Google.

Google says that Remote Key Provisioning is “privacy preserving” because each application receives a different attestation key, the keys themselves are regularly rotated, and backend servers are segmented so the server verifying the public key does not see the attached attestation keys. While these changes won’t impact end users, developers that rely on hardware-backed key attestation will need to be aware that the certificate chain length is longer than before, the root of trust will use an ECDSA key instead of an RSA key, that RSA key attestation will be deprecated, and that certificates will generally be valid for up to two months before they’re rotated.

Shared UID migration

Android isolates app processes from one another by assigning a unique user ID (UID) to each application at installation. This is a key part of the application sandbox, which is one of Android’s core security features. However, apps signed by the same key can have a shared user ID, which enables them to access each other’s data and run in the same process, letting them communicate directly instead of via IPC. This is achieved by setting the android:sharedUserId attribute to the same value in the manifest of both apps.

Google highly discourages use of this feature, and in Android 10 it deprecated the android:sharedUserId attribute. However, Android doesn’t support migrating off a shared user ID, so existing apps that utilize the feature cannot remove the attribute from their manifests. This will change in Android 13, which will provide apps a way to migrate off a shared UID.

Apps can set the new android:sharedUserMaxSdkVersion attribute, which is the maximum SDK version for which the app will remain in the UID defined by android:sharedUserId, to 32. Doing so tells the system that the app no longer relies on a shared user ID. Newly installed apps that declare android:sharedUserMaxSdkVersion will behave as if they never defined a shared UID, while updated apps will continue to use the existing shared UID. Google recommends that apps leave the android:sharedUserId attribute in place if it was previously defined, as removing it can cause updates to fail.
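A manifest using the migration attribute might look like the following sketch (the package and shared UID names are placeholders):

```xml
<!-- AndroidManifest.xml: this app previously declared a shared UID.
     Keeping android:sharedUserId avoids breaking updates, while
     android:sharedUserMaxSdkVersion="32" drops the shared UID for
     fresh installs on Android 13 (API 33) and above. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.app"
    android:sharedUserId="com.example.shareduid"
    android:sharedUserMaxSdkVersion="32">
    ...
</manifest>
```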

Virtualization support for isolated compilation

Android 13 includes the first release of the pKVM hypervisor and virtual machine framework. Google’s goal is to de-privilege and isolate third-party code (such as third-party code for DRM and cryptography) from Android by having it execute in a virtual machine at the same privilege level as the OS and kernel rather than at a higher level.

To accomplish this, Google has chosen to deploy KVM as the common hypervisor solution (pKVM is simply KVM with additional security features) and crosvm as the virtual machine manager. pKVM is enabled through the kernel, while crosvm is shipped as part of the new Virtualization (com.android.virt) Mainline module. Google has been using the Pixel 6 to test pKVM and the Virtualization module, but prior to the Android 13 release, neither was enabled in production builds. Starting with Android 13, however, the Pixel 6 series ships with the Virtualization module as well as KVM support out of the box. This allows the devices to securely boot operating systems in a virtual machine.

The Virtualization module contains images for a lightweight, headless build of Android called “microdroid”, which is used to execute targeted payloads. Microdroid is currently used for “isolated compilation” of boot and system_server classpath JARs, a task managed by the new CompOS module (com.android.compos). Further uses of virtualization in Android 13 have yet to be implemented.

For more information on virtualization in Android 13, refer to this article. For a guide on how to use crosvm on the Pixel 6 series, refer to this article.

Head tracking sensor support

Google began work on implementing spatial audio with dynamic head tracking support in Android 12L, but full support for the feature arrived in Android 13. Android’s Sensor class has added a new constant for head tracking sensors, STRING_TYPE_HEAD_TRACKER or android.sensor.head_tracker. On devices that declare head tracking support, denoted by the feature android.hardware.sensor.dynamic.head_tracker, the SpatializerHelper class can initialize head tracking sensors identified by a UUID reported by a connected Bluetooth A2DP device. The head tracking mode can then be set to one of the supported modes: STATIC (no head tracking), RELATIVE_WORLD (head tracking without screen tracking), or SCREEN_RELATIVE (full screen-to-head tracking). For head tracking functionality to be initialized and the corresponding APIs to be active, the device must ship with a spatializer effect that can use head tracking.

Head tracking combines accelerometer and gyroscope data to determine the rate of rotation and the orientation of the user’s head relative to an arbitrary reference frame. Android provides head tracking sensor event data in Euler vector representation: the direction of the vector indicates the axis of rotation, and its magnitude indicates the angle to rotate around that axis. The axes of this coordinate frame are centered on the head, with the X axis crossing the user’s ears, the Y axis running from the back of the user’s head through their nose, and the Z axis running from the neck through the top of the user’s head.
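Decoding that representation is simple vector arithmetic. The following sketch assumes a plain (x, y, z) rotation vector; the function name is invented for illustration:

```python
import math

def decode_head_pose(vec):
    """Split a head-tracker Euler vector into rotation axis and angle.

    vec: (x, y, z) rotation vector; its magnitude is the rotation angle
    in radians, and its direction is the axis of rotation.
    """
    angle = math.sqrt(sum(c * c for c in vec))
    if angle == 0.0:
        return (0.0, 0.0, 0.0), 0.0   # no rotation: axis is undefined
    axis = tuple(c / angle for c in vec)
    return axis, angle
```

For example, a vector of (0, 0, π/2) decodes to a 90-degree rotation about the Z axis, i.e. the user turning their head to one side.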

Because certain head movements are physically impossible, accelerometer and gyroscope data will be restricted to certain axes. Android 13 adds the new Sensor.TYPE_ACCELEROMETER_LIMITED_AXES and Sensor.TYPE_GYROSCOPE_LIMITED_AXES sensor types to denote sensors where one or two axes are not supported.

When a head tracking device is removed and then put back, the reference frame may have significantly changed, causing a discontinuity. The new firstEventAfterDiscontinuity field will be set to true when this happens, so apps can be aware of the sudden and significant change in the reference frame.

Heading sensor support

Heading sensors provide the direction the device is pointing relative to true north. These sensors combine data from accelerometers and geomagnetic field sensors to determine the direction. Since Android already supports these two sensors, it is already possible to determine the direction of a device relative to true north. Android 13, however, adds support for a new composite sensor of TYPE_HEADING, which directly provides the heading without additional calculations.

The TYPE_HEADING sensor returns values between 0.0 and 360.0, with 0 indicating north, 90 east, 180 south, and 270 west. It also returns the accuracy in degrees, defined at 68% confidence. If the sensor reports a heading of 60 degrees with an accuracy of 10 degrees, then there’s a 68 percent probability that the true heading is between 50 and 70 degrees, i.e. within one standard deviation of the mean assuming a normal distribution.
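That accuracy figure translates directly into an interval. The helper below is not part of any Android API, just the arithmetic spelled out, with wrap-around at the 0/360 boundary handled via modulo:

```python
def heading_interval(heading_deg, accuracy_deg):
    """68%-confidence interval (one standard deviation) for a
    TYPE_HEADING reading, wrapped into [0, 360)."""
    lo = (heading_deg - accuracy_deg) % 360.0
    hi = (heading_deg + accuracy_deg) % 360.0
    return lo, hi
```

So a reading of 60 ± 10 degrees yields the interval (50, 70), while a reading of 5 ± 10 degrees wraps to (355, 15).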

EROFS support

In recent years, many Android OEMs have adopted EROFS, short for “Enhanced Read-Only File System”, as the file system for their devices’ read-only partitions. EROFS, first developed and put in use by Huawei, offers significant storage space savings and performance benefits over the EXT4 file system when used for Android’s read-only partitions. The kernel driver for EROFS became part of the mainline Linux kernel with version 5.4 in late 2019 and was subsequently made available in the android11-5.4 Android Common Kernel branch. Thus, any Android device with a recent GKI version should have kernel-level support for EROFS.

Google attempted to mandate the use of EROFS for all read-only partitions of Android 13 launch devices, but the company relaxed this requirement late in the development cycle. Google will instead require that Android 13 launch devices ship with kernel support for EROFS in order to pass the Vendor Test Suite (VTS). However, if the Android 13 launch device’s vendor software was built for an older Android version, then the device’s kernel does not have to support EROFS to pass the VTS. This exception was added so devices launching on Android 13 with vendor software built for older Android versions can still pass certification, something that was made possible with the Google Requirements Freeze (GRF) program.

For more information on the history of EROFS in Android and Linux, you can read this article I previously wrote.

exFAT support

exFAT, short for Extensible File Allocation Table, is a Microsoft-developed and patented file system that was made to solve some of the limitations of FAT32. Flash memory devices like USB flash drives and SD cards are commonly formatted in exFAT, so in order to read their contents, the host device needs to have an exFAT driver. Since exFAT is proprietary, however, open source projects like Linux and AOSP were unable to include an exFAT driver for many years. That changed in 2019 when Microsoft published the technical specification for exFAT and endorsed its addition to the Linux kernel, paving the way for native exFAT support to reach all Android devices.

An exFAT kernel driver developed by Samsung became part of the mainline Linux kernel with version 5.7. Subsequently, this driver was merged downstream to the Android Common Kernel and is available in branches since android12-5.10. However, vold, Android’s volume daemon, will not mount exFAT volumes unless certain helper binaries are present. These binaries are included in AOSP builds when building the master branch but are not present when building Android 12 or 12L specifically. The Android 13 beta builds for the Pixel 6 and 6 Pro, two devices that ship with Linux 5.10 and hence have kernel drivers for exFAT, feature the aforementioned helper binaries. Hence, it’s likely that going forward, all Android 13 devices with Linux 5.10+ will support exFAT out of the box.

For more information on how exFAT came to be supported on Android and Linux, you can read this article.

Smart idle maintenance

Android 13 adds a smart idle maintenance service, which intelligently determines when to trigger filesystem defragmentation without hurting the lifetime of the UFS storage chip.

Smart idle maintenance can be manually run through the ‘sm’ shell command:

sm idle-maint [run|abort]

USB HAL 2.0 with support for limiting power transfer and audio docks

Google is updating Android’s USB HAL to version 2.0, introducing several new features. First, the new enableUsbData API lets system apps toggle USB data on a specific port rather than on all ports. Another addition, which is not present in AOSP at the moment, is the limitPowerTransfer API; when invoked, it limits power transfer to and from the USB port until the USB service detects that the USB accessory has been disconnected. Next, the new enableUsbDataWhileDocked API enables data transfer over USB while the device is docked. Lastly, support for USB digital audio docks has been introduced, represented by the device type DEVICE_OUT_DGTL_DOCK_HEADSET. These system APIs are guarded by the MANAGE_USB permission, which has the signature|privileged protection levels.

These system APIs, along with other new features in Android 13, are likely intended for tablets that can be docked.


What are the new APIs in Android 13?

Camera2 improvements: Preview stabilization, jitter reduction, 60fps streams

In addition to HDR video support and stream use cases in Camera2, Google announced at I/O several additional improvements that will make their way to the framework Camera2 API as well as the Jetpack CameraX support library.

These include a new preview stabilization API, which stabilizes the preview as well as all other non-RAW streams, in order to give clients a “what you see is what you get” effect. When this is enabled, the field of view is reduced by a maximum of 20% both horizontally and vertically for the given zoom ratio/crop region. 

Next, Google says that Android 13 reduces viewfinder jitter when a camera device outputs to a SurfaceView or SurfaceTexture output surface. The way this works is that “the camera system re-spaces the delivery of output frames based on image readout intervals.” This arrives as Android 13 adds APIs to set the timestamp base for each camera output stream.

Google also says that apps will be able to use supported stream configurations at 60fps in addition to 30fps.

LED flash brightness control

The vast majority of Android smartphones and tablets have at least one rear-facing camera accompanied by an LED flash module. Though Android supports toggling the LED flash on or off through an API, it doesn’t provide an API for modulating the brightness. This, however, will change in Android 13.

The Android 13 release includes new methods in the CameraManager class that let apps get and set the torch strength level. Only devices that report a value greater than 1 when apps use CameraCharacteristics.FLASH_INFO_STRENGTH_MAXIMUM_LEVEL will support programmatically setting the brightness of the flashlight. OEMs will need to implement a new version of the camera device HAL in order to add support for this API. Because of Google’s vendor freeze requirements, however, support for this feature will likely be very limited among devices that upgrade to Android 13.

For more information on Android 13’s new LED flash brightness control API, please refer to this article that goes more in-depth.

Block users from adding new WiFi networks

Enterprises can now block users of fully managed devices from adding a new Wi-Fi network in Android 13. Device or profile owner apps can add the new UserManager.DISALLOW_ADD_WIFI_CONFIG restriction to hide the “add network” option in Internet settings and make Settings reject any requests to add a Wi-Fi network.

WiFi SSID policy

Android 13 will let enterprises configure an allowlist or denylist of Wi-Fi SSIDs that the device can connect to. The new WifiSsidPolicy API lets device admins set a restriction policy that the network must satisfy. If the policy type is a denylist, then the device cannot connect to any networks on the list. If the policy type is an allowlist, then the device can only connect to networks on the list. Networks configured by the admin are not exempted from the restriction policy set by this API. Furthermore, this API does not prevent users from adding a network that is present on the denylist or missing from the allowlist – if they do so, the network will simply be disconnected after being added.
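The allowlist/denylist semantics can be sketched as follows. This is an illustrative model of the described behavior, not the WifiSsidPolicy API itself:

```python
ALLOWLIST, DENYLIST = "allowlist", "denylist"

def may_stay_connected(ssid, policy_type, ssids):
    """Toy model of the SSID policy: the user can still add a
    non-conforming network, but it is disconnected right after."""
    if policy_type == ALLOWLIST:
        return ssid in ssids       # only listed networks may connect
    if policy_type == DENYLIST:
        return ssid not in ssids   # listed networks are blocked
    return True                    # no policy configured
```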

Color vector fonts

Android 13 can render COLR version 1 fonts, a new and highly compact font format that supports color gradients. The system emoji have also been updated to the COLRv1 format. Android will handle rendering text using COLRv1 for most apps, but for apps that implement their own text rendering using the system’s fonts, Google recommends at least testing how emojis render. For more information on COLRv1, check out the announcement on the Chrome blog.

Stylus handwriting

Android 13 introduces an API for the current input method (i.e. keyboard) to receive stylus events when an editor is focused. To test this behavior, developers can enable the new “Stylus handwriting” setting in developer options. Android will check if this developer option is enabled before calling InputMethodManager.startStylusHandwriting to start the stylus handwriting session on the given View. Input methods that declare support for stylus input should show an inking window on ACTION_DOWN events to let the user perform handwriting input.

Developer option to enable stylus handwriting in Android 13

Text conversion APIs

Languages that use phonetic alphabets, such as Japanese and Chinese, are getting improvements to search speeds and auto-completion in Android 13. Apps can use the new text conversion API to convert characters between phonetic alphabets. This will, for example, make it so Japanese users can type search queries in Hiragana and immediately see results in Kanji.

Framerate interventions

Android’s Game Mode, first introduced in Android 12, now supports setting the FPS that a game should run at. By setting the attribute allowGameFpsOverride to true, developers can opt in to FPS override interventions. Developers can override the FPS of their game in Android 13 through the CLI for Game Mode:

cmd game set --fps [30|45|60|90|120|disabled]

Loading time improvements via GameState hints

Android’s GameManager API has added a method called setGameState that lets games communicate their current state to the platform. Games can pass the top-level state of the game, indicating if the game can be interrupted or not. Games can also tell the platform when they’re loading something (assets/resources/compiling/etc.), which passes a hint to the power HAL to boost CPU performance and improve loading times. The loading time hint is part of the new power HAL version which adds the GAME_LOADING mode to the Mode.aidl file; device makers should configure the powerhint.json file to specify the CPU performance tuning that should be done when the GAME_LOADING mode is active.

Google plans to add a test to VTS that enforces GAME_LOADING mode for all devices which ship with Android 13 or later, but we do not know if this requirement is final. Due to the Google Requirements Freeze (GRF) program, it’s possible that many devices upgrading to Android 13 will not include the updated power HAL version with the new GAME_LOADING mode.

Programmable shaders

The Android Graphics Shading Language (AGSL) is derived from the OpenGL Shading Language (GLSL) but is designed to work within Android’s rendering engine to customize painting within Android’s canvas and filter View content. Android internally uses RuntimeShaders, with behavior defined using AGSL, to implement blur, ripple effects, and stretch overscroll in previous versions of Android. With Android 13, developers can create advanced effects of their own using programmable RuntimeShader objects.

Schedule exact alarms without user prompts

Android’s AlarmManager API lets apps schedule background work, and it offers the ability to schedule inexact alarms or exact alarms. Inexact alarms are delivered by the system within a window of the trigger time specified by the app, while exact alarms run at exactly the time specified by the app. Because exact alarms have the ability to bring a device out of doze mode, which if done too frequently can adversely affect the device’s battery life, Google decided to gate exact alarm APIs behind a permission.

In Android 12, Google introduced the SCHEDULE_EXACT_ALARM permission, which apps targeting API level 31+ had to hold in order to use exact alarm APIs. On Android 12 and 12L, this permission is automatically granted at install time but can be revoked by the user by going to Settings → Apps → Special app access → Alarms & reminders. This is because the SCHEDULE_EXACT_ALARM permission has a protection level of “normal|appop” on Android 12 and 12L.

For apps that depend on scheduling exact alarms as part of their core functionality, Android 13 provides the new USE_EXACT_ALARM permission that grants access to exact alarm APIs. Unlike SCHEDULE_EXACT_ALARM, however, USE_EXACT_ALARM cannot be revoked by the user through Settings. This is because USE_EXACT_ALARM has a protection level of “normal” on Android 13.
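In manifest terms, the two permissions are declared like any other. A sketch of the two options (an app would normally declare only one):

```xml
<!-- For apps whose core function is exact alarms (alarm clocks,
     calendars): not user-revocable, but subject to store policy review. -->
<uses-permission android:name="android.permission.USE_EXACT_ALARM" />

<!-- For other apps that need exact alarms: granted at install time on
     Android 12/12L but revocable by the user in Settings. -->
<uses-permission android:name="android.permission.SCHEDULE_EXACT_ALARM" />
```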

However, Google warns that “app stores may enforce policies to audit and review the use of this permission” since this permission was introduced “only for apps that rely on exact alarms for their core functionality.”  The company specifically says that an upcoming Google Play policy will prevent apps from using the USE_EXACT_ALARM permission unless they’re an alarm app, a clock app, or a calendar app that shows notifications for upcoming events. This policy will take effect July 31, 2023.

In conjunction with this change, Google planned to make it so apps targeting Android 13 would no longer be granted the SCHEDULE_EXACT_ALARM permission at install time. This would make it so apps would have to request the user to grant the SCHEDULE_EXACT_ALARM permission through Settings, or request the USE_EXACT_ALARM permission but face scrutiny from app stores. Google even went as far as changing the protection level of the SCHEDULE_EXACT_ALARM permission to “appop” in Android 13 from its previous “normal|appop” in Android 12 and 12L. However, the company deferred this change to a future release.

For a more in-depth look at the impact of these permissions on background work in Android, read this article.

Anticipatory audio routing

Android 13 introduces new audio route APIs to AudioManager that media apps can use to anticipate how their audio will be routed. Apps can use the new getAudioDevicesForAttributes() API to retrieve the list of devices that may be used to play the specified audio track based on the provided audio attributes. The getDirectProfilesForAttributes() API helps determine if the audio stream can be played directly on those devices, i.e. without resampling or downmixing.

More granular media file permissions

Android 10 introduced the concept of “Scoped Storage” to restrict applications’ access to files on external storage directories. One of the biggest changes introduced with Scoped Storage is the restriction of what files can be accessed if an app holds Android’s READ_EXTERNAL_STORAGE permission. Starting with Android 11, which is when Scoped Storage became enforced, apps holding the READ_EXTERNAL_STORAGE permission aren’t granted broad access to the external storage but rather are given access to media files owned by other apps residing in well-defined media collections (MediaStore.Images, MediaStore.Video, and MediaStore.Audio; or MediaStore.Files).

In an effort to improve transparency and provide more control to users, Android 13 makes media file access even more granular. Apps targeting Android 13 must now request individual permissions to read audio (READ_MEDIA_AUDIO), video (READ_MEDIA_VIDEO), or image files (READ_MEDIA_IMAGES). If an app requests permissions to read video and image files at the same time, the system will combine the permissions dialog for both.

Apps targeting Android 12 or lower must continue to request READ_EXTERNAL_STORAGE to access media files owned by other apps, however. The system will warn the user that granting these apps access to certain types of media files will grant them access to all other media file types. Alternatively, apps can interact with the system document picker app or the new system photo picker for files and photos respectively in order to retrieve user-selected files/photos without the need for any permissions.
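In practice, an app targeting Android 13 that reads all three media types might declare the granular permissions alongside the legacy one for older OS versions. A manifest sketch:

```xml
<!-- Granular media permissions, used on Android 13 and above -->
<uses-permission android:name="android.permission.READ_MEDIA_IMAGES" />
<uses-permission android:name="android.permission.READ_MEDIA_VIDEO" />
<uses-permission android:name="android.permission.READ_MEDIA_AUDIO" />
<!-- Legacy permission, still needed on Android 12 and lower -->
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"
    android:maxSdkVersion="32" />
```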

Performance Class 13

Android 12 introduced the concept of a “Performance Class” to make it easier for app developers to determine if a device is capable of performing highly-demanding tasks. For example, say a social media app offers a real-time video processing effect as a feature, but the developer wants to control which devices it’s available on to ensure a good user experience. 

Before the Performance Class API, app developers could set up a device allowlist of known device models the feature works well on, query the device’s available memory and CPU information, or perform a quick benchmark, but none of these solutions are universal. The allowlist requires prior knowledge and may exclude capable devices. Having an ample amount of RAM alone doesn’t guarantee the feature will be performant, while the CPU needs to be checked against a database to ensure it’s powerful enough. Adding an in-app benchmark may require a lot of additional work.

The Performance Class API solves these issues by shifting the burden of categorizing a device’s performance onto its OEM. Google defines a set of requirements in the CDD that the device must meet in order to fall under a certain Performance Class. These requirements are tailored towards media apps and are broken down into sections on AV, camera, hardware, and performance. There is a new Performance Class for every Android release, starting with Android 11, but devices declaring that they meet a Performance Class from an older release don’t have to declare support for a newer Performance Class when they update to the newer release (which makes sense, as many of the requirements involve hardware that’s set in stone).

Google has not published the full list of requirements to meet Performance Class 13, but at Google I/O 2022, they did share a preview of what to expect. Performance Class 13 makes the following changes compared to Performance Class 12:

  • Generic
    • Increased memory requirements
  • Media
    • Higher concurrent codec sessions
    • Lower codec latency & frame drops
    • AV1 hardware decoder support
    • Secure hardware decoders
    • Round trip audio latency
    • Wired headsets & USB audio devices
  • Camera
    • Preview stabilization
    • Slow-mo recording
    • Minimum zoom ratio for ultrawide
    • Concurrent camera

The most notable change is the requirement to support AV1 decoding at a hardware level. Current generation Qualcomm Snapdragon devices do not support hardware-accelerated AV1 decoding, hence they would be unable to claim they support Performance Class 13.

For the full rundown on what’s required for Performance Class 11 and Performance Class 12, read this section of my Android 12 CDD summary. Once Google publishes the Android 13 CDD, I will update this section with the full list of requirements for Performance Class 13.

Newly restricted non-SDK interfaces

Starting in Android 9, Google decided to block apps from accessing non-SDK interfaces, i.e. hidden APIs that aren’t documented in the public SDK. The reason is that these APIs are unsupported (which is why they aren’t part of the public SDK), so apps should not rely on them. Hidden APIs are subject to change in breaking ways that aren’t documented in the Android SDK, resulting in headaches for developers and a poor experience for users.

Android maintains a list of non-SDK interfaces in a file called hiddenapi-flags.csv. This file is updated every release to include new non-SDK interfaces that ART will either block entirely or permit but throw a warning about the unsupported usage. Apps targeting Android 13, for example, are now blocked from accessing the following four APIs that were previously only considered unsupported in Android 12:

Landroid/app/Activity;->setDisablePreviewScreenshots(Z)V # Use setRecentsScreenshotEnabled() instead.

Landroid/os/PowerManager;->isLightDeviceIdleMode()Z # Use isDeviceLightIdleMode() instead.

Landroid/os/Process;->setArgV0(Ljava/lang/String;)V # In general, do not try to change the process name. If you must change the process name (for instance, for debugging), you can use pthread_setname_np() instead, though be aware that doing this might confuse the system.

Landroid/view/accessibility/AccessibilityInteractionClient;->clearCache(I)V # Use android.accessibilityservice.AccessibilityService#clearCache() instead.

(These APIs can continue to be accessed on Android 13 provided the app that calls them targets Android 12 or lower.) 

The complete list of all non-SDK interfaces for Android 13 can be downloaded here.

Because hidden APIs sometimes enable functionality that isn’t supported by any public APIs, Google asks developers to submit a request for a new public API if they cannot find an alternative to a non-SDK interface.

OpenJDK 11 support

Google has recently been experimenting with building Android with Java 11 as the default version, and the company says that it plans not only to refresh Android’s Core Libraries in Android 13 but also to backport these changes to Android 12 devices through an update to the ART module. This means that Android’s Core Libraries will align with the OpenJDK 11 LTS release, bringing both library updates and new programming language features for app and platform developers.

Quick Settings Placement API

In Android 7.0 Nougat, Google introduced the TileService API to let apps add their own custom tiles to the Quick Settings. However, in order to add a tile from a third-party app to the Quick Settings, the user needs to pull down the notifications shade, tap the Quick Settings edit button (usually taking the form of a pencil icon), and then scroll down to find the tile they want to add. While third-party apps have numerous ways to inform users about the existence of their custom tiles, users still need to manually add the tile.

An example dialog asking the user to place a tile in the Quick Settings through Android 13's new Quick Settings Tile Placement API
A screenshot of a sample app using the new tile placement API to prompt a user to add a tile to the set of active Quick Settings tiles. Source: Google.

Starting in Android 13, however, a new tile placement API will let apps prompt users to directly add their custom tile to the set of active Quick Settings tiles. When an app calls this API, a system dialog will appear that lets the user add the tile in a single tap. This will make it easier for users to discover your app’s custom Quick Settings tiles.

Computer and App Streaming device profiles

Google introduced the Role API in Android 10 to grant multiple, often unrelated permissions to apps based on the type of role the app fulfills. For example, apps holding the DIALER role are automatically granted permissions related to phone calling, contacts, messaging, and microphone. Since roles give access to a wide array of (often sensitive) permissions, and it’s not always necessary for apps to need those permissions at all times, Google devised a way for the system to temporarily grant a role.

In Android 12, Google added the Companion Device Manager (CDM) profiles feature to make it easier for apps to request and be granted access to the requisite permissions needed to manage a smartwatch. Under the hood, a system app called Companion Device Manager grants the COMPANION_DEVICE_WATCH role to apps that request it, giving those apps access to permissions they need to sync phone status and data with a smartwatch. Users only see a single permissions dialog instead of multiple, saving time and reducing friction. When the user resets their smartwatch, causing the association between the device and the smartwatch to be lost, the Companion Device Manager app revokes the role until the next time an association is made.

Smartwatches aren’t the only “companion” devices where this flow can be applied to simplify setup. Recognizing this, Google has created the new COMPANION_DEVICE_COMPUTER and COMPANION_DEVICE_APP_STREAMING roles in Android 13. The first role grants the permissions needed to access notifications, recent photos, and recent media, while the second role grants the permissions for creating a virtual display (where apps can be launched and then streamed to a PC). Only system apps can hold these roles, however, as the underlying permissions have a system|signature protection level.

COMPANION_DEVICE_COMPUTER role
The role definition for COMPANION_DEVICE_COMPUTER
COMPANION_DEVICE_APP_STREAMING role
The role definition for COMPANION_DEVICE_APP_STREAMING

For a more detailed breakdown on these new roles and the Role API in general, refer to this article.

Background access of body sensors requires new permission

Android has long allowed applications to access data from sensors that measure the heart rate, temperature, or blood oxygen levels of the body. This data can only be accessed by applications that hold the BODY_SENSORS permission, which has a protection level of “dangerous”. Until Android 13, applications that held this permission could access body sensor data while in the background. Android 13 changes this by adding a new permission called BODY_SENSORS_BACKGROUND. Apps holding the BODY_SENSORS permission on Android 13 will only have access to body sensor data while the app is in use. In order to access body sensor data while in the background, apps must hold both the BODY_SENSORS and BODY_SENSORS_BACKGROUND permissions.

The new BODY_SENSORS_BACKGROUND permission also has a protection level of “dangerous”, but unlike the BODY_SENSORS runtime permission, BODY_SENSORS_BACKGROUND is hard restricted. This means that the PackageInstaller has to allowlist the permission while installing the app so it can later be granted by the user.
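An app that monitors heart rate in the background would therefore declare both permissions in its manifest (and request both at runtime, since each is a “dangerous” permission):

```xml
<uses-permission android:name="android.permission.BODY_SENSORS" />
<!-- New in Android 13; only meaningful when BODY_SENSORS is also granted -->
<uses-permission android:name="android.permission.BODY_SENSORS_BACKGROUND" />
```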

Developer downgradable permissions

Applications need permissions to access many of Android’s APIs, but they may not necessarily need persistent access to those APIs. However, once they’ve been granted that permission — either at install-time or at runtime — they’ll retain that permission until the user uninstalls the app, the user manually revokes the permission, or the system automatically revokes the permission when the app enters hibernation.

In Android 13, Google has added a new API that enables developer downgradable permissions. Apps can trigger the revocation of one or more runtime permissions granted to the package calling the API. Apps that don’t need access to certain runtime permission-gated APIs can self-revoke those permissions so users can be assured those apps aren’t using those APIs without their knowledge.

Disable the screenshot shown in the recents overview

Android 13 introduces the setRecentsScreenshotEnabled API so developers can tell the system to never take a screenshot of an activity for use as a preview in the recents overview. This differs from the FLAG_SECURE window flag in that it only applies to screenshots the system takes for the recents overview — it does not block screenshots taken by the user or the Assistant.

Nearby device permission for Wi-Fi

Because a device’s location can be inferred by tracking nearby Wi-Fi APs and Bluetooth devices, Google decided to prevent apps from accessing Bluetooth or Wi-Fi scan results unless those apps hold location permissions. Gating these features behind location permissions made sense given that they could be used to derive a user’s physical location, but it led to confusion among users who believed their apps were tracking their location, since both ACCESS_COARSE_LOCATION and ACCESS_FINE_LOCATION are “dangerous” (i.e. runtime) permissions that require post-install user consent to be granted.

To reduce confusion, Google introduced new BLUETOOTH_SCAN, BLUETOOTH_CONNECT, and BLUETOOTH_ADVERTISE permissions under the NEARBY_DEVICES permission group in Android 12. These permissions can be requested by apps that need to interact with Bluetooth, and when one or more of them are requested by the app, the system prompts the user to allow the app access to “nearby devices”. An optional Manifest attribute called “neverForLocation” lets the app strongly assert that it won’t derive physical location.

In Android 13, Google is similarly decoupling Wi-Fi scanning from location. Android 13 introduces the new NEARBY_WIFI_DEVICES runtime permission under the NEARBY_DEVICES permission group. This permission should be requested by apps that need to manage a device’s connections to nearby Wi-Fi APs and will in fact be required to call many commonly used Wi-Fi APIs. The optional Manifest attribute “neverForLocation” will let developers strongly assert that their app won’t derive physical location from Wi-Fi scan results.
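A manifest sketch for an app targeting Android 13 that manages Wi-Fi connections but never derives physical location from scan results:

```xml
<uses-permission android:name="android.permission.NEARBY_WIFI_DEVICES"
    android:usesPermissionFlags="neverForLocation" />
<!-- Location permission is still required for Wi-Fi scans on Android 12 and lower -->
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"
    android:maxSdkVersion="32" />
```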

Non-dangerous permission to read the phone state

By holding the READ_PHONE_STATE permission, Android apps can read the current cellular network information, status of any ongoing calls, and a list of PhoneAccounts registered on the device. This information may be sensitive, so the READ_PHONE_STATE permission has a protection level of “dangerous” and hence must be granted by the user at runtime. For apps that only need to determine the cellular network type, Android 13’s new READ_BASIC_PHONE_STATE permission provides a “non-dangerous” alternative. This permission has a protection level of “normal”, hence it is granted by the system at install time.
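Because the permission’s protection level is “normal”, declaring it in the manifest is sufficient; no runtime prompt is needed:

```xml
<!-- Granted automatically by the system at install time -->
<uses-permission android:name="android.permission.READ_BASIC_PHONE_STATE" />
```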

Runtime permission for notifications

Unlike with other APIs, apps by default can post notifications without requesting any permission. Notifications are the key way for Android apps to interact with users outside of the app, so it makes sense why Google didn’t gate them behind a permission check.

While most apps utilize notifications to post useful alerts and reminders, some apps misuse notifications to send unsolicited advertisements. Android does let users turn off notifications on a per-app and per-channel basis through an interface in Settings, but this approach has multiple problems. Because notifications are opt-out rather than opt-in, and the opt-out settings are buried several layers deep in Settings, most users keep the default notification settings. Developers and marketers who send notifications to reengage users with their apps and services may find this valuable, but if too many apps post notifications, their importance is diluted and they feel overwhelming to the user.

That’s why in Android 13, Google has reworked the notification contract between apps and the Android OS by adding a runtime permission for notifications. However, in order not to be disruptive to users and developers, notification access in Android 13 is handled differently depending on the target API level of the app that’s being run. Regardless of an app’s target API level, however, Android 13 will prompt the user to grant an app permission to send (non-exempt) notifications.

Here’s how Android 13 handles notifications access based on an app’s target API level:

  • If a newly installed app’s target API level is…
    • 33, the app needs to declare the android.permission.POST_NOTIFICATIONS permission in its Manifest. This permission has a protection level of “dangerous” and hence apps are required to show a runtime prompt to the user in order to be granted the permission. Packages that have not been granted the permission will have their notifications silently dropped by the system.
    • 32 or lower, the system will show the permission dialog when the app creates its first notification channel.
  • If an existing app’s target API level is…
    • 33, the system temporarily grants the app permission to send notifications until the first time an activity in the app is launched. The app must have had an existing notification channel and its notifications must not have been explicitly disabled by the user.
    • 32, the system temporarily grants the app permission to send notifications until the user explicitly selects an option in the permission dialog. The temporary grant persists if the user dismisses the permission dialog before making a choice.

The permission dialog for the new notification permission is structured like other dialogs for runtime permissions. If the user…

  • selects “Allow,” then the app can send notifications through any channel and post notifications related to foreground services.
  • selects “Don’t allow,” then the app cannot send notifications through any channel, except for a few specific roles.
  • swipes the dialog away, then the app can only send notifications if the system has a temporary grant.

MediaStyle notifications are exempt from Android 13’s notification runtime permission.

This change puts Android in line with iOS, which also requires users to opt in to notifications from apps. Developers of Android apps will now need to put in effort to convince users to turn on notifications. Developers are encouraged to request the notification permission in context, i.e. prompt the user only after explaining why the app needs the permission. Once the app has been granted permission, developers should use the permission responsibly, as users can revoke it at any time. Apps can check if the user has enabled notifications by calling the areNotificationsEnabled() method of NotificationManager.
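As with other runtime permissions, the first step for apps targeting API level 33 is declaring the permission in the manifest; the runtime request is then made through the standard permission-request flow:

```xml
<uses-permission android:name="android.permission.POST_NOTIFICATIONS" />
```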

Upon upgrading to Android 13, the Android System will prompt the user (via a notification) to “review notification settings.” Tapping this notification will bring the user to Settings, where the user can deny notification access for apps they’ve previously installed.

Safer exporting of context-registered receivers

Android 12 required app developers to explicitly declare whether any activity, service, or broadcast receiver with intent filters statically defined in the app’s Manifest file should be exported or not. Google asked app developers to carefully consider whether they wanted to expose their manifest-declared intent receivers to other apps, and in Android 13, they’re doing the same for context-registered receivers as well.

Developers that dynamically register broadcast receivers in their apps should add either the RECEIVER_EXPORTED or RECEIVER_NOT_EXPORTED flag. This way, developers can decide if they want their receivers to be available for other apps to send broadcasts to. Google isn’t requiring that apps targeting Android 13 utilize this feature, but they highly recommend it as a security measure.

Ambient Context events

A new framework API called “Ambient Context” has been added to Android 13, but it is currently undocumented. Android is providing a client API that apps can subscribe to in order to receive notice of AmbientContext events such as coughing (EVENT_COUGH) and snoring (EVENT_SNORE). The API also provides apps with information on the start and end time of detected events, the confidence that the detected event is accurate, and the intensity level of the event (ranging from LEVEL_LOW to LEVEL_HIGH). All of this data is provided by a service in a system app that implements the provider API. Only the system can bind to this service, as it is gated behind the new BIND_AMBIENT_CONTEXT_DETECTION_SERVICE permission, and only client apps that hold the new ACCESS_AMBIENT_CONTEXT_EVENT permission can access data provided by the Ambient Context API. According to Android’s Privacy Working Group (PWG), this permission will switch from a Role to a runtime permission in Android 14.

The system service that implements the provider API is defined in the framework config value “config_defaultAmbientContextDetectionService”. On Pixel devices, this value is set to “com.google.android.as/com.google.android.apps.miphone.aiai.labs.ambientcontext.AiAiAmbientContextDetectionService”, which points to a service that doesn’t exist in public versions of the Android System Intelligence app. If this service were present, then to enable it, the device_config value “service_enabled” under the “ambient_context_manager_service” namespace would also need to be set to “true”. Then, the new “ambient_context” CLI could be used to start or stop detection or query events.

Based on our understanding, it seems that Google is providing an interface for the system intelligence app (Android System Intelligence on Pixel devices) to detect sleeping-related events and then privately share those events with apps subscribing to the client API. This way, client apps that just need sleep data won’t also need the raw sensor data (such as continuous microphone usage) needed to detect sleep events. This will enable apps to implement sleep detection features in a privacy-preserving way, in line with the API updates Google has been making as part of its Private Compute Core initiative.

Cross device calling

While every smartphone can make phone calls, the same isn’t true for every tablet. According to the Google Play Console’s device catalog, only about 40% of tablets support telephony (android.hardware.telephony). Some tablet makers like Samsung offer a cross-device calling feature so users can make and receive calls on their tablet using the telephony service from a connected phone. Google is introducing similar APIs in Android 13 that will enable calls to be forwarded from a smartphone to a tablet or other device.

The new API is called “cross device calling” and has already been partially merged to AOSP. Android’s Telecom framework now supports pushing calls to remote endpoints, which may either contain a complete calling stack capable of carrying out a call on their own or lack the required calling infrastructure entirely. A cross device call streaming app can interface with the telecom stack to share updates about the status of the call at the endpoint.

Calls that are routed to endpoints that lack the required calling infrastructure are considered “tethered” external calls. Since tethered devices can’t carry out the phone call on their own, the audio stream from the phone is re-routed to the device using Android 13’s new external call audio routing API.

Splash screen style

Android 12 introduced system-generated splash screens for application launches. However, developers discovered that launching an app from ADB, a deeplink, or a notification would result in a blank splash screen being shown without an icon. Google confirmed this behavior is intentional, as Android 12 only shows the splash screen when launching an app from the launcher.

Android 13 solves this by adding a new public API called windowSplashScreenBehavior. This API lets apps set whether to show a splash screen with or without an icon when launching another activity. The API is not available on Android 12, but Google says that developers can use the following workaround:

val options = ActivityOptions.makeBasic().toBundle() // or just Bundle()

options.putInt("android.activity.splashScreenStyle", 1) // 1 = show the icon

startActivity(intent, options)

Or wait for an update to Jetpack’s core-splashscreen library.

Themed Icons API

Google introduced the third major version of their Material design language alongside the Android 12 release last year. One of the key features of Material You — the marketing name of Google’s updated design language — is dynamic color. Dynamic color exposes 5 dynamic color tonal palettes, each consisting of 13 color values at various luminance levels, as an API that system and third-party apps can call. Apps can follow the Material guidelines for dynamic color or their own design language when deciding how to use the color palettes to theme their own UIs. Since the dynamic color tonal palettes are generated from a single source color, which is usually picked from the user’s wallpaper, the resulting theme that’s applied across system and third-party apps can vary widely and feel personalized to the user.

An app’s UI isn’t the only area where dynamic color can be used. Widgets can also be recolored, as can some app icons in Android 12 on Pixel devices. In Android 12 on Pixel, Google introduced an experimental “themed icons” feature in their Theme Picker app. When enabled, dynamic colors are applied to various Google app icons whenever the wallpaper is changed. However, Google hardcoded a list of themeable icons within an XML file called grayscale_icon_map contained within the launcher, which also contains the drawable resources for the monochrome app icons.

In Android 13, Google is extending Material You dynamic color to all app icons. Google has updated the AdaptiveIconDrawable API to support themed app icons. Developers simply need to supply a monochromatic app icon and tweak the <adaptive-icon> element in ic_launcher.xml to include the new <monochrome> inner element that points to the monochromatic drawable. Developers that have already supplied an adaptive icon will find it easy to add support for themed icons in Android 13.
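An updated ic_launcher.xml might look like the following sketch (the drawable resource names are placeholders):

```xml
<adaptive-icon xmlns:android="http://schemas.android.com/apk/res/android">
    <background android:drawable="@drawable/ic_launcher_background" />
    <foreground android:drawable="@drawable/ic_launcher_foreground" />
    <!-- New in Android 13: a monochrome layer the system tints for themed icons -->
    <monochrome android:drawable="@drawable/ic_launcher_monochrome" />
</adaptive-icon>
```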

Google says that themed app icons will initially appear on Pixel devices, but the company is working with its partners to bring them to more devices.

The feature can be enabled on Launcher3 by setting both the preference KEY_THEMED_ICONS and the feature flag ENABLE_THEMED_ICONS to true.


Miscellaneous changes

This section contains changes for Android on handheld devices that weren’t deserving of their own sections.

  • When enabling freeform mode or force desktop mode in Developer Options, a dialog now informs the user that they need to reboot before it’ll work.
  • The “show touches on screen” toggle in SystemUI’s screen recorder is functional again. The toggle had been removed in Android 12L due to a bug with how the cursor is drawn.
  • A new animation has been added to cleanly transition between the smartspace widget on the lock screen and the smartspace widget on the home screen. Smartspace is a proprietary Google widget, but a basic form of it is available to Android partners. This video clearly shows the new animation. I’ve set the animation scales to 5X to lengthen the animations so the smartspace shared element transition is more visible.
  • Android supports creating restricted profiles that are limited in what apps they can launch and content they can view. By default, the ability to create a restricted profile is only available on devices without telephony capabilities. Android 13 replaces this check with a new config flag in Settings (config_offer_restricted_profiles).
  • The USB debugging icon has been updated to reflect the Android T update.
The new icon for USB debugging in Android 13.
  • “Emergency call” on the lock screen has been changed to “Emergency.”
  • When silent mode is enabled, it’s possible to adjust the “touch feedback” level in Settings > Sound & vibration > Vibration & haptics.
  • Android 13 supports dragging app icons to swap the positions of apps in split-screen view. This does not work if the activities that are being swapped are the same (i.e. they’re multi-instance).
  • Android’s Battery Saver feature, which limits background activity and tweaks other settings to preserve battery life, can be turned on automatically when the battery level reaches a user-defined percentage. In previous versions, the minimum battery level that could be set by the user was 5%, but in Android 13, that minimum has been raised to 10%.
  • The toggle for app hibernation has been renamed. It was previously called “remove permissions and free up space” but is now “pause app activity if unused.”
  • The option to “enable Gabeldorsche”, Android’s next-generation Bluetooth stack, has been removed from developer options.
  • A setting to “allow mock modem” has been added. This is used to run the mock modem service for instrumentation testing.
  • Android 13’s Bluetooth stack has introduced support for the new Bluetooth Low Energy Audio standard, and on devices with chipsets supporting it, system engineers can toggle LE audio hardware offload in developer options.
  • Settings has added a new x-axis transition animation that can be seen here.
  • The taskbar’s app drawer icon now follows the system theme.
  • The long-press context menu of the taskbar now displays the split-screen shortcut which was previously only available from an app’s context menu on the home screen or app drawer.
  • The gestural navigation pill has become bolder and larger.
  • A chroma-limiting bug in “monet”, Android’s dynamic color engine, has been fixed. As a result, colors generated by the “Vibrant” theme are now more vibrant.
  • Android’s Easter egg has been updated in Android 13, like clockwork. It features a clock with the “Android 13” PlatLogo surrounded by various emojis. The new Easter egg is activated by going to Settings > About phone > Android version and repeatedly tapping on the “Android version” field (the exact steps may vary per-manufacturer). Once the Easter egg activity has been launched, spin the clock so it points at 1:00 and then long press on the bubbles that appear. Continue holding down on the bubbles to cycle through the emojis.
  • The setting to show the keyboard when opening the app drawer in Launcher3 was removed in early Android 13 builds but was re-added as a developer option in Beta 3 and enabled by default in Beta 3.2.

What’s new for Android Automotive?

Android 13 brings a number of changes to Android Automotive OS, the version of Android designed to run on in-vehicle infotainment (IVI) systems. According to Google’s official release notes, the Android Automotive 13 release brings improvements to the camera subsystem, car framework, connectivity, and more. I’ll summarize each change and provide additional information as well as links to AOSP commits and source code where relevant.

Screenshot of Android Automotive 13 version settings
A screenshot of Android Automotive 13’s version settings. Thanks to Snapp Automotive for providing the build.

Camera

  • Android camera2 API: Third-party apps can now access one or more vehicle cameras concurrently without impacting the Extended View System (EVS).
  • Enumerate camera devices by relative locations: Clients can enumerate and open camera devices (or video streams) according to relative locations. Hardware details such as the device node name will be hidden from clients.
  • EVS hotplug events: Android Automotive 13 supports notifying and handling the hotplugging of camera devices, i.e. unplugging and plugging in an external camera.

Car Framework

  • Car framework mainline: Through a new APEX module, Android Automotive 13 now supports updating the car stack independent of Android platform versions. 
  • Driving safety region support: Apps on Android Automotive 13 can now specify which “driving safety regions” they support, i.e., the regions where they meet all safety regulations. Developers can define these particular regions through the android.car.drivingsafetyregions metadata element in the AndroidManifest, or they can allow all regions through the android.car.drivingsafetyregion.all metadata element. OEMs can define the system’s driving safety region through the ‘ro.android.car.drivingsafetyregion’ property, and apps that do not support the current system’s region will be considered unsafe to use. Note that since the driving safety region APIs are marked as hidden, they are only available to OEM apps.
  • Migrate the vehicle HAL from HIDL to AIDL: Android Automotive’s VHAL (Vehicle Hardware Abstraction Layer) has migrated from HIDL to AIDL. Although the HIDL HAL remains supported, Google recommends that new properties should only be added to the AIDL HAL. Further, Google also recommends that all existing VHAL native clients migrate to the new C++ client.
  • Support larger payload and batch calls in VHAL: Google says that the VHAL can now pass larger payloads through shared memory. Furthermore, batching calls will allow for more efficient sending of multiple requests.
  • Touch mode: In Android Automotive 13, changes in touch mode are now represented by a new internal event called TouchModeEvent. In previous versions, application focus events were tied to touch mode state updates as they were represented by the same C++ native FocusEvent. With this change in Android 13, changes in touch mode are dispatched against all existing windows regardless of whether they’re focused or not.
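To illustrate the driving safety region declaration described above, here is a minimal AndroidManifest.xml sketch. The region identifiers in android:value are hypothetical placeholders, as the exact region string format and value syntax are defined by the OEM:

```xml
<application>
    <!-- Declare the specific driving safety regions this app supports
         (region identifiers shown are placeholders) -->
    <meta-data
        android:name="android.car.drivingsafetyregions"
        android:value="region1,region2" />

    <!-- Alternatively, declare support for all regions -->
    <!--
    <meta-data android:name="android.car.drivingsafetyregion.all" />
    -->
</application>
```

Since these APIs are hidden, only OEM apps would actually be checked against the system’s configured region.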

Connectivity

  • Enable Ultra-Wideband (UWB): Android Automotive 13 adds support for ultra-wideband, enabling “multi-anchor support for locating UWB tags with an accuracy of 10cm.” Android 13’s UWB stack is based on the FiRa specification and is delivered through a new Project Mainline module. It sits above the HAL interface that must be implemented by the UWB chip vendor.
  • Bluetooth mainline integration: Android’s Bluetooth stack has been turned into a Project Mainline module so that security updates can be pushed more quickly and implementation fragmentation can be reduced. The module contains the system Bluetooth APK, native libraries, Bluetooth framework APIs, and HIDL interfaces. In the migration process, several hidden APIs are being converted to system APIs.
  • Bluetooth Gabeldorsche: Google says that a newer version of the Bluetooth stack is now in use in Android Automotive 13. This new Bluetooth stack is called Gabeldorsche and it has actually been in testing since Android 11. Currently, Gabeldorsche is enabled “up to the scanning layer”, which includes BLE scanning, BLE advertising, ACL connection management, controller information management, HCI layer, HAL interface layer, and other required components like config storage.
  • Ethernet-based networks: Android Automotive 13 adds controls for Ethernet-based networks, including dynamic management of IP config, network capabilities, app access control lists, and the ability to toggle networks on the fly.
  • Reference TCU (Telematics Control Unit): It’s now more straightforward to integrate an external telematics ECU (Electronic Control Unit) with Android through the Telephony HAL.
  • Improved projection support: Android Automotive 13 adds an API (and a CTS test) to include VendorElements as part of a generated hostapd AP configuration. In practice, this means that when a wireless AP is created by an Android Automotive device, the vendor-specific information elements of that device and its Wi-Fi chip can be included. This is done to improve support for projecting Android Auto from a phone to an Android Automotive head unit, something which is possible using Android’s CarProjectManager API and the Android Auto Receiver app on head units with Google Automotive Services installed.
    • Android Automotive further improves this use case by creating a new “Automotive Projection” device profile. Apps can use Android’s CompanionDeviceManager API to request to be associated with a particular Automotive head unit that supports projection when the device connects to it. If the user accepts, the app is granted the SYSTEM_AUTOMOTIVE_PROJECTION role, enabling a host of necessary permissions including ones to create a virtual display device, keep that virtual display always unlocked, interface with the microphone, location, contacts, call log, etc. This role is automatically granted to the app specified by config_systemAutomotiveProjection, which on Pixel devices is the Android Auto app. Lastly, only apps holding the REQUEST_COMPANION_PROFILE_AUTOMOTIVE_PROJECTION permission, which has an internal|role protection level, can use the Automotive Projection API.
  • New API to get a list of Wi-Fi channels and country codes when Wi-Fi is off

Power

  • Suspend-to-disk: Android Automotive 13 adds support for a new “suspend-to-disk” power-off mode that preserves the contents of memory by writing them to non-volatile storage, rather than keeping them in volatile RAM as the existing suspend-to-RAM (deep sleep) mode does. Suspend-to-disk is also known as “hibernation”, and it’s implemented through the Linux kernel feature of the same name. The list of supported power-off modes can be read from /sys/power/state (‘mem’ and ‘disk’ for suspend-to-RAM and suspend-to-disk respectively). OEMs can test these power-off modes through new shell commands. Support for triggering these power-off modes has been added to the VHAL and CarPowerManagerService.
  • Control of the shutdown process: OEM apps holding the new CONTROL_SHUTDOWN_PROCESS permission can now take timely actions before and after the device enters Garage Mode. They can create a listener to receive power state changes and then do appropriate work during each step. Previously, this API wasn’t exposed to OEM apps.

Privacy

  • Post-drive permission decision reminder: Users will get reminders when they park about any recent permission decisions they made while driving. A notification that states “while driving, you gave <app> access to <permission[s]>” will appear with a button that says “check recent permissions.” 
  • Recent permission decisions: In privacy settings, the user will see a list of recent permission decisions they made, regardless of whether it happened while driving or when parked.
  • Privacy dashboard: The privacy dashboard feature, introduced on handhelds in Android 12, has been added to Android Automotive 13. Through the privacy dashboard, users can review a timeline of events for sensors (location, microphone, and camera) and sub-attribution for Google Play Services usage.

Sensors

  • New sensor types: Android adds support for two new Inertial Measurement Unit (IMU) sensor types: Limited Axes and Heading.

Telemetry

  • OEM telemetry: OEMs can now use an Android-powered infotainment system to “configure and collect In-Vehicle Infotainment (IVI) and vehicle data.”

User Management

  • Improved user lifecycle events management: A new user lifecycle filter has been added to improve performance and simplify client code.

Vehicle Integration

  • New VHAL properties: New properties for fog lights, EV charging, trailer, vehicle weight, and wheel tick have been added.

What’s new for Android TV?

Google released Android 13 Beta 1 for the ADT-3 developer kit on May 4, 2022. The system image includes Google TV applications but very few user-facing changes compared to the previous Android 12-based image.

Android TV 13 Beta about screen

Android 13 Beta 2 for Android TV was initially released on May 11, 2022 for the Android Emulator followed by a release for the ADT-3 on June 6, 2022. The images bring support for new Android 13 features like HDMI state changes and expanded picture-in-picture mode.

On June 24, 2022, Google released Android 13 Beta 3 for TV. A system image with Google TV applications was provided for the ADT-3 developer kit.

Expanded picture-in-picture mode

Picture-in-picture (PiP) mode was first introduced in Android 7.0 for Android TV devices before expanding to all other device types in Android 8.0. PiP is a multi-window mode that enables watching a video in a small window that overlays other content on screen. As of Android 12, PiP windows can be moved around, stashed to the side, or resized, though resizing is limited by aspect ratio. By default, the aspect ratio of a PiP window is 1.777778:1 (16:9) to match most video content, but developers can set a custom aspect ratio between 1:2.39 and 2.39:1. In Android 13, however, developers can create PiP windows that are even longer or wider than before.

On devices that support Android 13’s new expanded picture-in-picture multi-window mode, defined by the system feature ‘android.software.expanded_picture_in_picture’, developers can set the aspect ratio of a PiP window to be less than 1:2.39 or greater than 2.39:1. This new expanded PiP multi-window mode is intended for Android TV devices and its logic can be found in the classes under com/android/wm/shell/pip/tv in SystemUI. According to this code, expanded PiP windows can be moved around on screen via DPAD key events.
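To make the aspect ratio bounds concrete, here is a standalone Java sketch (not an Android API; the class and method names are hypothetical) of when a window’s ratio falls outside the standard 1:2.39–2.39:1 range and would therefore require expanded PiP support:

```java
public class PipAspectRatio {
    // Standard PiP bounds since Android 12: 1:2.39 up to 2.39:1.
    static final double MIN_RATIO = 1.0 / 2.39;
    static final double MAX_RATIO = 2.39;

    /** Returns true if this ratio needs Android 13's expanded PiP mode. */
    static boolean needsExpandedPip(double width, double height) {
        double ratio = width / height;
        return ratio < MIN_RATIO || ratio > MAX_RATIO;
    }

    public static void main(String[] args) {
        // A 16:9 video (ratio ~1.78) fits the standard range.
        System.out.println(needsExpandedPip(16, 9));   // false
        // A very wide 32:9 window (ratio ~3.56) needs expanded PiP.
        System.out.println(needsExpandedPip(32, 9));   // true
    }
}
```

On an actual Android 13 TV device, such an out-of-range ratio would be supplied through PictureInPictureParams rather than a helper like this.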

The setPreferDockBigOverlays API determines how the expanded PiP window is displayed on screen. This API “specifies a preference to dock big overlays like the expanded picture-in-picture on TV.” Docking “puts the big overlay side-by-side” the activity that specifies this preference so “both windows are fully visible to the user.” In docked mode, the PiP window is docked “on one of the screen edges”, while the fullscreen app “is resized to occupy all the space next to it”.

Docked mode is the default behavior when an app enters expanded PiP mode, though how the two apps are displayed side-by-side depends on whether the activity of the fullscreen app is resizable. If it isn’t, then it’s scaled down using size compatibility mode, and the system will apply borders around the window to maintain the activity’s aspect ratio.

If docked mode is disabled through the setPreferDockBigOverlays API, then the expanded PiP window will be overlaid on top of the fullscreen app, which is the normal PiP behavior.

HDMI state changes are surfaced to the MediaSession lifecycle

Changes in the state of a device connected via HDMI are now surfaced to the MediaSession lifecycle. Google says that if developers handle these events accurately, then playback should stop if an HDMI device is turned off.

Keep clear APIs

Since PiP windows may overlay important UI elements, Android 13 adds the ability to mark UI elements that shouldn’t be overlaid. This new capability, called keep clear, doesn’t guarantee that those UI elements won’t be overlaid, but the system will attempt to abide by the app’s request nonetheless. Google warns that the system may not honor the app’s keep clear requests if too many UI components are marked as keep clear. If an app has large UI components that shouldn’t be overlaid, Google recommends supporting docked mode for apps running in expanded PiP mode.

Developers can mark views as keep clear using the android:preferKeepClear attribute in XML layouts. The setPreferKeepClear API can also be used to programmatically mark a view as keep clear. If the entire view doesn’t need to be marked as keep clear, the setPreferKeepClearRects API can be used to specify regions of the view that shouldn’t be overlaid.
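For example, marking a view as keep clear declaratively might look like this in a layout file (the view ID and sizing are illustrative):

```xml
<Button
    android:id="@+id/playback_controls"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:preferKeepClear="true" />
```

The same effect can be achieved at runtime by calling setPreferKeepClear on the view.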

Keyboard layouts API

The new getKeyCodeForKeyLocation API can be used to determine the layout of a connected keyboard. It returns the key code produced by a given location on a reference QWERTY keyboard. For example, if the input is set to KeyEvent#KEYCODE_B and the value returned is KeyEvent#KEYCODE_B, then the current keyboard layout must be QWERTY. If, however, the input is set to KeyEvent#KEYCODE_Q and the API returns KeyEvent#KEYCODE_A, then the keyboard layout is French AZERTY, because the location of the “Q” key on a QWERTY keyboard corresponds to the location of the “A” key on a French AZERTY keyboard.
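The reasoning behind this API can be modeled with a plain-Java sketch. The KeyboardLayoutDetector class and its lookup table below are hypothetical stand-ins for real layout data; on Android you would query InputDevice#getKeyCodeForKeyLocation instead:

```java
import java.util.Map;

public class KeyboardLayoutDetector {
    // Hypothetical lookup table: on a French AZERTY layout, the physical
    // key at the QWERTY "Q" location produces "A" (and vice versa), and
    // the keys at the "W" and "Z" locations are swapped. "B" is unchanged.
    static final Map<Character, Character> AZERTY_AT_QWERTY_LOCATION =
            Map.of('Q', 'A', 'A', 'Q', 'W', 'Z', 'Z', 'W', 'B', 'B');

    /** Mimics getKeyCodeForKeyLocation for this toy AZERTY layout. */
    static char keyAtQwertyLocation(char qwertyKey) {
        return AZERTY_AT_QWERTY_LOCATION.getOrDefault(qwertyKey, qwertyKey);
    }

    public static void main(String[] args) {
        // The key at the QWERTY "Q" location yields "A" -> layout is AZERTY.
        System.out.println(keyAtQwertyLocation('Q')); // A
        // "B" sits in the same place on both layouts.
        System.out.println(keyAtQwertyLocation('B')); // B
    }
}
```

By probing a few such reference locations, an app can infer which physical layout is connected without asking the user.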

Low power standby mode

Android 13 adds a new “low power standby” mode that places restrictions on apps while the device is in standby. While low power standby is active, wakelocks are disabled and network access is blocked. These restrictions are lifted temporarily during doze maintenance windows.

This feature is intended for Android TV devices and is disabled by default on other configurations. Low power standby cannot be enabled unless the framework value ‘config_lowPowerStandbySupported’ is set to true. If supported, it can then be enabled by default by setting the framework config ‘config_lowPowerStandbyEnabledByDefault’ to true or toggled via Settings.Global.low_power_standby_enabled.
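Put together, an OEM enabling this feature might ship a framework resource overlay like the following sketch (the values shown are illustrative):

```xml
<resources>
    <!-- The device supports low power standby at all -->
    <bool name="config_lowPowerStandbySupported">true</bool>
    <!-- Enable low power standby out of the box -->
    <bool name="config_lowPowerStandbyEnabledByDefault">true</bool>
</resources>
```

The user-facing toggle then maps onto the Settings.Global.low_power_standby_enabled value at runtime.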

This change is likely designed to better meet the EU’s energy saving requirements.

Picture-in-picture mode support comes to Google TV

According to Google, Android 13 brings support for picture-in-picture mode to Google TV. “While PiP support was introduced in Android 8.0 (API level 26), it was not widely supported on Android TV, and not supported at all on Google TV prior to Android 13,” reads the documentation.


Security patches

During the development of Android 13, Google, its partners, and independent researchers discovered security vulnerabilities affecting the new platform release. These vulnerabilities are patched on Android 13 devices with a security patch level of 2022-09-01 or later, which is Android 13’s default security patch level for the initial release to AOSP. Devices with this patch level are protected from the security vulnerabilities listed on the Android 13 Security Release Notes page.

Future updates to Android 13 will include patches to security vulnerabilities discovered after the initial release. Android Security Bulletins published after the initial release will mention which vulnerabilities, if any, affect Android 13 as well as older Android releases. Google will continue to backport security patches to Android 13 for about 3.5 years after its first release.

To learn more about Android’s security patch process, I recommend reading this article that covers the subject.


Conclusion

At Esper, we support Android on a variety of form factors, from handhelds to large screen devices like tablets and POS terminals. Now that Android 13 is publicly available, we’ll be diligently monitoring new releases to see what new features, behavior changes, and APIs users, developers, and, more importantly, enterprises need to be aware of. Because Android is a rapidly evolving operating system, it’s easy to fall behind the latest developments. Let Esper manage the software that runs on your device fleet; we care about the nitty-gritty implementation details so you don’t have to.


Developer Preview & Beta changelogs

As mentioned earlier, Google released two developer previews and four beta builds of Android 13 prior to the initial stable release in August 2022. This article documents all of the changes introduced in Android 13 and does not distinguish between the releases in which they were introduced. However, for historical purposes, this section lists all of the changes introduced in each developer preview and beta build. Content creators and journalists are welcome to use this section as a historical reference.

Navigate this section:

What’s new in Android 13 Developer Preview 1?

Android 13 Developer Preview 1 was released on February 10, 2022. According to Google’s announcement, the first Developer Preview came with the following features, as well as some changes mentioned in the developer docs but not in the blog post:

Following the release of Developer Preview 1, we discovered the following hidden or undocumented changes:

What’s new in Android 13 Developer Preview 2?

Android 13 Developer Preview 2 was released on March 17, 2022. According to Google’s announcement, the second Developer Preview introduced the following features:

Of course, the second Developer Preview was also full of many hidden or undocumented changes, as well as some changes mentioned in the developer docs but not in the blog post. These include:

What’s new in Android 13 Beta 1?

Android 13 Beta 1 was released on April 26, 2022. According to Google’s announcement, the first beta introduced the following features:

Of course, the first Beta is also full of many hidden or undocumented changes, as well as some changes mentioned in the developer docs but not in the blog post. These include:

What’s new in Android 13 Beta 2?

Android 13 Beta 2 was released on May 11, 2022. According to Google’s announcement, the second beta introduced the following features (that we previously have not covered):

As always, the second Beta release is chock-full of hidden or undocumented changes, including many that are mentioned in the developer docs but not in the official blog post. These are:

At Google I/O, Google also released the second Android 13 beta for Android TV. The company also finally documented some of the new features coming in the Android 13 update for TVs. These include the following:

The low power standby feature we previously discovered was not brought up in Google’s documentation.

What’s new in Android 13 Beta 3?

Android 13 Beta 3 was released on June 8, 2022. According to Google’s announcement post, the third beta brought the release to Platform Stability, which means that all APIs in the SDK and NDK are finalized, as are all app-facing system behaviors. Subsequent point updates to Beta 3 brought additional changes, but these are included in this section for simplicity. These changes include the following:

What’s new in Android 13 Beta 4?

Android 13 Beta 4 was released on July 13, 2022. According to Google’s announcement post, the fourth beta is a release candidate build, so users are encouraged to thoroughly test and report bugs ahead of the stable release in August. Developers, meanwhile, are encouraged to complete final compatibility testing and publish compatibility updates.


Article changelog

This article is updated very frequently to add new information or correct existing content. As such, we maintain a changelog of the last few updates to the article so readers can quickly see what information has been added since their previous visit. This changelog is not comprehensive; rather, it summarizes the changes that have been made.

  • 7/31/2022
    • Added the following sections:
    • Updated the sections on:
      • 7-day view in privacy dashboard to reflect Google’s confirmation of the feature at I/O
      • Clipboard editor overlay to mention that the Nearby Share button will be shown as part of the new cross-device sharing feature, that developers can mark certain clip content as “sensitive”, and that the edit button has been swapped with a share button.
      • DNS over HTTPS to reflect Google’s announcement on the rollout of the new encrypted DNS protocol
      • Fast Pair in AOSP to mention its removal from settings
      • Foreground service manager to reflect that foreground service notifications are now dismissible and that a dot appears whenever the list updates
      • Kids mode for the navigation bar to reflect that the icons fade after a few seconds and the buttons persist in full screen mode
      • Launch an app in split screen from its notification to mention that this feature has been disabled by default for handhelds
      • Media Tap To Transfer to provide more detail on its architecture
      • More granular media file permissions to mention the warning that appears when granting an app targeting an older version access to media files owned by other apps
      • Per-app language preferences to mention that only apps that have a locales_config.xml resource file will be shown in the list
      • Privacy Sandbox on Android to reflect the release of developer previews 2-4, and to mention the fact that key components will be distributed as Mainline modules
      • Predictive back gesture navigation to link to a video of the new back-to-home animation and to mention the addition of the developer toggle in Beta 3
      • Quick Settings tiles for color correction, one-handed mode, Privacy Controls to mention that the Privacy Controls Quick Settings tile has been removed, but the activity is still there
      • Runtime permission for notifications to mention that Android will post a notification when upgrading to review notification settings for apps
      • Schedule exact alarms without user prompts to mention that the SCHEDULE_EXACT_ALARM permission was originally planned to not be granted by default
      • Shared UID migration to mention how developers can migrate away
      • Sideloaded apps may be blocked from accessing Accessibility APIs to reflect that Notification Listener API access may also be blocked
      • Spatial audio with head tracking support to reflect that Android 13 ships with a standard spatializer and head tracking protocol in the platform, and that ExoPlayer 2.17 can be used to try out spatial audio
      • Splash screen style to mention why the API was added
      • System Photo Picker to reflect that cloud media providers will be supported in an upcoming update, and that the picker has started to roll out on older OS versions.
      • Toggle to show the vibrate icon in the status bar to mention that the icon now appears when on the lock screen as well
      • What’s new in Android TV to reflect the release of Beta 2 and Beta 3
      • Miscellaneous changes to reflect the new Easter egg and Plat logo, the larger gesture pill, the fix for the chroma-limiting bug that made colors less vibrant, and the reappearance of the “always show keyboard” option.
  • 8/15/2022
    • Updated the article to reflect the public release of Android 13 to AOSP
  • 8/24/2022