Android 12 Compatibility Guide — All the changes for enterprise and device makers

Mishaal Rahman

October 28, 2021

Google typically releases a new version of its Android operating system in the fall of each year. This year saw the release of Android 12, Google’s most visually impressive and feature-rich Android update since 2014. After releasing the source code, Google also shared the new compatibility requirements that device makers must meet in order to distribute Android. If you wish to distribute Android software built on top of the latest Android 12 source code – either as pre-installed firmware or as an update to existing firmware – and you also wish to license Google Mobile Services (GMS), then you need to be aware of, and abide by, the latest compatibility requirements.

But wait! If Android is an open-source operating system, then why do you need to abide by Google’s compatibility requirements? To understand why, we have to briefly explain how Google distributes the Android OS and Google Mobile Services. Android is indeed an open-source operating system licensed under Apache 2.0, a license that allows anyone, from indie developers to big companies alike, to freely modify and distribute the OS. You can download the source code for Android 12 from the Android Open Source Project (AOSP) git repositories right now and compile a build for compatible hardware. This is how Esper develops Esper Enhanced Android (EEA), our customized version of the Android OS designed for dedicated fleet use.

If you only plan to build Android for your own personal hardware or other non-commercial use, then you don’t have to follow Google’s guidelines. If, however, you plan to use the trademarked “Android” name or logo on hardware, packaging, or marketing materials, or you wish to seek approval to distribute GMS, then your device must be Android-compatible. GMS includes applications like the Google Play Store, the predominant application store on Android, as well as frameworks like Google Play Services, which provides key application programming interfaces (APIs) for services like push notifications, location services, and more. Without GMS, end users of your hardware won’t have access to Google’s rich ecosystem of applications, and many applications that depend on Google’s APIs will fail to function properly. If you decide your Android device fleet needs GMS, then Esper can help you deploy a software build that passes GMS compatibility requirements.

With Android 12 comes a new set of compatibility requirements that enterprises and device makers need to be aware of. To receive a GMS license, enterprises and device makers must complete the following:

  1. Sign a Mobile Application Distribution Agreement, or MADA, with Google.
  2. Run the Compatibility Test Suite, or CTS, to verify that your firmware meets Android compatibility requirements. These requirements are enumerated in the Android Compatibility Definition Document, or CDD.
  3. Run the GMS Test Suite, or GTS, to verify that your firmware is compatible with GMS applications.
  4. Run other automated test suites, including but not limited to the Vendor Test Suite (VTS) and CTS Audio Quality Test Suite (CAT).

To reduce the time to deployment, it’s important to gain a thorough understanding of the new compatibility requirements before running any automated test suites. Fortunately, the Android 12 CDD outlines the requirements in clear, unambiguous, human-readable language. Esper has read the 140-plus-page document to provide you with a comprehensive overview of all the new compatibility requirements. Our overview only covers the changes made to the CDD, so we won’t summarize the clauses that remain unchanged. Android software builds are still expected to abide by all of the clauses in the CDD, which can be read in its entirety here.

The simplest way to meet all requirements listed within the CDD is to refrain from deviating too much from AOSP, as AOSP is both the reference and preferred implementation of Android. If, however, your business requires a custom implementation of any component, then the implementation must follow the restrictions listed in the CDD, if any exist. To unambiguously indicate the requirement level of any clause, Google employs the terms defined in IETF standard RFC 2119. Google also assigns an ID to each requirement, which consists of a Section ID/Device Type ID followed by a Condition ID and, finally, a Requirement ID. Sections 1.1.2 and 1.1.3 define the structure of the ID.
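As a quick illustration of that ID structure, here is a minimal parser for IDs of the common `<section>/<device type>-<condition>-<requirement>` shape (e.g. “7.4.3/A-1-1”). The exact grammar is defined in sections 1.1.2 and 1.1.3 of the CDD, so treat this as a sketch rather than a complete implementation:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of a CDD requirement-ID parser. Assumes the common
// "<section>/<type>-<condition>-<requirement>" shape; the authoritative
// grammar is in CDD sections 1.1.2 and 1.1.3.
class CddId {
    private static final Pattern P =
            Pattern.compile("(\\d+(?:\\.\\d+)*)/([A-Z]+)-(\\d+)-(\\d+)");

    public final String section;    // e.g. "7.4.3"
    public final String deviceType; // e.g. "A" (Automotive), "H" (Handheld)
    public final int condition;     // Condition ID
    public final int requirement;   // Requirement ID

    private CddId(String s, String t, int c, int r) {
        section = s; deviceType = t; condition = c; requirement = r;
    }

    public static CddId parse(String id) {
        Matcher m = P.matcher(id);
        if (!m.matches()) throw new IllegalArgumentException("not a CDD ID: " + id);
        return new CddId(m.group(1), m.group(2),
                Integer.parseInt(m.group(3)), Integer.parseInt(m.group(4)));
    }
}
```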

As the CDD is a constantly-evolving document, its requirements are never truly set in stone. Based on feedback from partners, Google frequently updates the document to remove requirements, alter the language to be clearer, or relax requirements. The latest version of the Android 12 CDD will always be available here.

Finally, let’s dive into the changes in the CDD since Android 11.


Section 2 – Device Types

Section 2.2 – Handheld Requirements

Section 2.2.1 – Hardware

  • A new requirement has been added that states that handheld devices “MUST support GPU composition of graphic buffers at least as large as the highest resolution of any built-in display.” This, ostensibly, is just saying that handheld devices must actually support rendering at their built-in display’s native resolution. Many television devices, for reference, render at 1080p resolution and then upscale to their native 4K resolution. Handheld devices do not commonly do this, so Google likely added this clause in response to the growing number of devices that can output to external displays, thus ensuring they can output at the native resolution of their built-in display.
  • Google is planning ahead to a future where all devices run 64-bit only versions of Android. To start, they’re trying to move lower-end devices away from 32/64-bit hybrid Android. “If Handheld device implementations include greater than or equal to 2GB and less than 4GB of memory available to the kernel and userspace, they: Are STRONGLY RECOMMENDED to support only 32-bit userspace (both apps and system code). If Handheld device implementations include less than 2GB of memory available to the kernel and userspace, they MUST support only 32-bit ABIs.” The benefits of 64-bit only Android are multifold: Better security, better performance, easier to support, and by extension, less expensive to maintain.
    • Better security: Higher available address space allows for better address space layout randomization (ASLR). Other security features like Memory Tagging Extension (MTE), Pointer Authentication (PAC), and Branch Target Identification (BTI) are available.
    • Better performance: Benchmark testing on devices with at least 4GB of RAM shows 5-10% improvements in performance and power efficiency. Larger address space also reduces crash rates.
    • Easier support: 32-bit support is being phased out, and most hardware and software optimizations are targeted at 64-bit.
    • Reduced costs: No need to test both 32-bit and 64-bit ABIs. Less RAM and storage needed than 32-bit or 32/64 hybrid modes.
  • Google is codifying audio latency requirements for all handheld devices, not just devices that self-declare support for professional audio. The clause, “if Handheld device implementations declare android.hardware.audio.output and android.hardware.microphone, they MUST have a Mean Continuous Round-Trip latency of 800 milliseconds or less over 5 measurements, with a Mean Absolute Deviation less than 100 ms, over at least one supported path,” sets a liberal value for the maximum audio latency a handheld device can have. Audio output latency is defined as the time between an audio sample being generated by an app and the sample being played through the headphone jack or built-in speaker. Audio input latency, meanwhile, is the time between an audio signal being received by a device’s audio input, such as the microphone, and that same audio data being available to an app. The Mean Continuous Round-Trip latency, lastly, is the sum of the input latency, app processing time, and output latency. Google provides guidance on how to measure the audio latency of devices, using its own devices as an example. This requirement should not be difficult for handheld devices to meet, as the average audio latency of Android handhelds has come down considerably over the years. In fact, the average audio latency of the 20 most popular Android phones has fallen under 40ms, according to Google.
  • Is the device’s vibration motor capable of supporting all primitive vibration effects, and if so, does the HAL report all of these effects so the OS knows which effects are supported? Google encourages device makers to verify the implementation status by querying the android.os.Vibrator.areAllEffectsSupported() and android.os.Vibrator.arePrimitivesSupported() methods. If the vibration motor does not implement the HAL constants for the vibration feedback effects, then the device maker should fall back to mapping constants between the HAL and the API.
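The memory thresholds quoted above can be sketched as a simple decision helper. This is hypothetical code, not an Android API; the CDD’s actual wording and requirement IDs govern:

```java
// Sketch of the memory-based ABI guidance quoted above:
// <2GB  => MUST support only 32-bit ABIs,
// 2-4GB => STRONGLY RECOMMENDED to support only 32-bit userspace,
// 4GB+  => no restriction from this particular clause.
class AbiPolicy {
    enum Guidance { MUST_32BIT_ONLY, STRONGLY_RECOMMENDED_32BIT_ONLY, NO_RESTRICTION }

    static Guidance forMemory(long memBytesAvailableToKernelAndUserspace) {
        long gb = 1024L * 1024 * 1024;
        if (memBytesAvailableToKernelAndUserspace < 2 * gb) {
            return Guidance.MUST_32BIT_ONLY;
        }
        if (memBytesAvailableToKernelAndUserspace < 4 * gb) {
            return Guidance.STRONGLY_RECOMMENDED_32BIT_ONLY;
        }
        return Guidance.NO_RESTRICTION;
    }
}
```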
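The round-trip latency clause above boils down to two statistics over five measurements: the mean and the mean absolute deviation. A plain-Java sketch of that check (the 800 ms and 100 ms constants come from the clause; the helper itself is hypothetical):

```java
// Sketch of the 2.2.1 round-trip audio latency check: over 5 measurements,
// the mean must be 800 ms or less and the Mean Absolute Deviation must be
// less than 100 ms. No Android APIs involved, just the arithmetic.
class RoundTripLatency {
    static double mean(double[] xs) {
        double sum = 0;
        for (double x : xs) sum += x;
        return sum / xs.length;
    }

    // Average of the absolute deviations from the mean.
    static double meanAbsoluteDeviation(double[] xs) {
        double m = mean(xs);
        double dev = 0;
        for (double x : xs) dev += Math.abs(x - m);
        return dev / xs.length;
    }

    static boolean meetsHandheldClause(double[] fiveMeasurementsMs) {
        return mean(fiveMeasurementsMs) <= 800.0
                && meanAbsoluteDeviation(fiveMeasurementsMs) < 100.0;
    }
}
```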

Section 2.2.3 – Software

  • All handheld devices with a secure lock screen and at least 2GB of RAM available to the kernel and userspace must support device management features. They are required to declare support for managed profiles via the android.software.managed_users feature flag.

Section 2.2.5 – Security Model

  • Android 12 has added a new system API, HotwordDetectionService, which supports a mechanism for secure, always-on hotword detection without indicating active mic access. If a handheld device implementation supports the HotwordDetectionService API or another similar mechanism, it*:
    • MUST do the following:
      • Make sure the service only sends data to the System or ContentCaptureService.
      • Make sure the service only transmits mic audio data or data derived from it to the system server through the API, or to ContentCaptureService through the ContentCaptureManager API.
      • Only transmit data out of the hotword detection service following a hotword validation request from the system server.
      • Log the number of bytes included in every transmission from the hotword detection service, to allow inspectability for security researchers.
      • Support a debug mode that logs the raw contents of every transmission from the hotword detection service, to allow inspectability.
      • Restart the process hosting the hotword detection service at least once an hour or every 30 hardware-trigger events, whichever comes first.
      • Display the mic indicator when a successful hotword result is transmitted to the voice interaction service or a similar entity.
    • And MUST NOT do the following:
      • Supply mic audio longer than 30 seconds for an individual hardware-triggered request to the hotword detection service.
      • Supply buffered mic audio older than 8 seconds for an individual request to the service.
      • Supply buffered mic audio older than 30 seconds to the voice interaction service or similar entity.
      • Let more than 100 bytes of data be transmitted out of the hotword detection service on each successful hotword result.
      • Let more than 5 bits of data be transmitted out of the service on each negative hotword result.
      • Let user-installed apps provide the hotword detection service.
      • Surface in the UI quantitative data about mic use by the service.
    • And are STRONGLY RECOMMENDED to:
      • Notify users before setting an app as the provider of the hotword detection service.
      • Disallow transmission of unstructured data out of the hotword detection service.
  • If a device implementation includes an application that uses the HotwordDetectionService API or similar mechanism, it MUST explicitly notify the user about each supported hotword phrase, MUST NOT save raw audio data, or data derived from it, and MUST NOT transmit audio data or data that can be used to reconstruct (wholly or partially) the audio, or unrelated audio contents, except to the ContentCaptureService.*
  • If a device declares support for a microphone, it:
    • MUST display a mic indicator when an app accesses audio data from it, but not when the mic is accessed by HotwordDetectionService, SOURCE_HOTWORD, ContentCaptureService or apps holding the following roles: System UI Intelligence, System Ambient Audio Intelligence, System Audio Intelligence, System Notification Intelligence, System Text Intelligence, or System Visual Intelligence.
    • MUST display the list of Recent and Active apps using the mic as returned from PermissionManager.getIndicatorAppOpUsageData() and MUST NOT hide the mic indicator for system apps that have visible UIs or direct user interaction.
  • If a device declares support for a camera, it:
    • MUST display a camera indicator when an app accesses live camera data but not when the camera is being accessed by one of the previously mentioned roles
    • MUST display the list of Recent and Active apps using the camera as returned from PermissionManager.getIndicatorAppOpUsageData() and MUST NOT hide the camera indicator for system apps that have visible UIs or direct user interaction

* One of Android 12’s marquee features is the status bar indicator for active camera and mic use. Hotword detection requires the device’s mic to always be active, but displaying the indicator the entire time the hotword detection service is active would annoy the user and make the indicator less useful. Thus, this service can bypass the requirement, provided it abides by the rules listed above.
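The per-transmission limits above (at most 100 bytes per successful hotword result, at most 5 bits per negative result) can be sketched as a simple budget check. The helper is hypothetical, not an Android API:

```java
// Sketch of the 2.2.5 data-egress limits for a hotword detection service:
// <= 100 bytes may leave the service on a successful hotword result,
// <= 5 bits on a negative result. Payload size is measured in bits here
// so both limits can be checked with one comparison.
class HotwordEgressBudget {
    static final int MAX_SUCCESS_BYTES = 100;
    static final int MAX_NEGATIVE_BITS = 5;

    static boolean allowedTransmission(boolean hotwordDetected, int payloadBits) {
        if (hotwordDetected) {
            return payloadBits <= MAX_SUCCESS_BYTES * 8;
        }
        return payloadBits <= MAX_NEGATIVE_BITS;
    }
}
```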

Section 2.2.7 – Handheld Media Performance Class

  • “Performance Class” is a new concept in Android 12. It will be useful for developers trying to figure out whether their app is running on a high-performance device, like the Galaxy S21, or a low-performance device, like the Galaxy A10. Developers need to know this so they can enable certain features only if the device is capable. There are methods to check the available memory capacity (RAM) or the number of CPU cores, but those don’t tell developers whether the device is capable of specific, highly demanding tasks such as real-time video processing effects. With the “Performance Class” API, developers can ask the OS which “class” the device falls into. This makes it easier for developers to decide which devices to enable fancy features on. Currently, many developers only explicitly support high-end phones from the top brands, leaving out capable phones from less popular brands. By targeting a more generic “performance class”, developers can enable features on any device capable of handling them, regardless of brand. And devices that can’t handle those features won’t have them enabled, saving app developers from poor user reviews.
  • There are different requirement tiers based on whether the device updates from Android 11 or launches with Android 12. If android.os.Build.VERSION_CODES.R is returned when android.os.Build.VERSION.MEDIA_PERFORMANCE_CLASS is queried, then the device MUST:
    • AV
      • Advertise the max number of hardware video decoder sessions and hardware video encoder sessions that can be run concurrently in any codec combination.
      • Support 6 instances of hardware video decoder and encoder sessions (AVC or HEVC) in any codec combination running concurrently at 720p@30.
      • Codec initialization latency must be 65ms or less for a 1080p or smaller video encoding session for all hardware video encoders when under load, 50ms or less for a 128kbps or lower bitrate audio encoding session.
      • NOT drop >1 frame in 10 sec for a 1080p30 video under load
      • NOT drop >1 frame in 10 sec during a video resolution change in a 30fps video session under load
      • Tap-to-tone latency <100ms using OboeTester tap-to-tone test or CTS Verifier tap-to-tone test.
    • Camera
      • Have a primary rear-facing camera that’s >= 12MP and supports 4K30.
      • Have a primary FFC that’s >= 4MP and supports 1080p30.
      • Rear primary camera must report FULL or better for android.info.supportedHardwareLevel, while front primary camera must report LIMITED or better.
      • Support CameraMetadata.SENSOR_INFO_TIMESTAMP_SOURCE_REALTIME for both primary cameras.
      • Have camera2 JPEG capture latency < 1000ms for 1080p resolution as measured by the CTS camera PerformanceTest under ITS lighting conditions (3000K) for both primary cameras.
      • Have camera2 startup latency (open camera to first preview frame) < 600ms as measured by CTS camera PerformanceTest under ITS lighting conditions for both primary cameras.
    • Hardware
      • Screen resolution >= 1080p, screen density >= 400dpi, and RAM >= 6GB
    • Performance
      • Sequential write performance >= 100 MB/s
      • Random write performance >= 10 MB/s
      • Sequential read performance >= 200 MB/s
      • Random read performance >= 25 MB/s
  • If android.os.Build.VERSION_CODES.S is returned when android.os.Build.VERSION.MEDIA_PERFORMANCE_CLASS is queried, then the device MUST:
    • AV
      • Advertise the max number of hardware video decoder sessions and hardware video encoder sessions that can be run concurrently in any codec combination.
      • Support 6 instances of hardware video decoder and encoder sessions (AVC, HEVC, or VP9) in any codec combination running concurrently at 720p@30. Only 2 instances required if VP9 is present.
      • Codec initialization latency must be 50ms or less for a 1080p or smaller video encoding session for all hardware video encoders when under load, 40ms or less for a 128kbps or lower bitrate audio encoding session.
      • NOT drop >2 frames in 10 sec for a 1080p60 video under load.
      • NOT drop >2 frames in 10 sec during a video resolution change in a 60fps video session under load.
      • Tap-to-tone latency <100ms using OboeTester tap-to-tone test or CTS Verifier tap-to-tone test.
    • Camera
      • Have a primary rear-facing camera that’s >= 12MP and supports 4K30.
      • Have a primary FFC that’s >= 5MP and supports 1080p30.
      • Front and rear primary cameras must report FULL or better for android.info.supportedHardwareLevel.
      • Support CameraMetadata.SENSOR_INFO_TIMESTAMP_SOURCE_REALTIME for both primary cameras.
      • Have camera2 JPEG capture latency < 1000ms for 1080p resolution as measured by the CTS camera PerformanceTest under ITS lighting conditions (3000K) for both primary cameras.
      • Have camera2 startup latency (open camera to first preview frame) < 500ms as measured by CTS camera PerformanceTest under ITS lighting conditions for both primary cameras.
      • For apps targeting API level 31 or higher, the camera MUST NOT support JPEG capture resolutions smaller than 1080p for both primary cameras.
      • Support CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_RAW and android.graphics.ImageFormat.RAW_SENSOR for the primary back camera.
    • Hardware
      • Screen resolution >= 1080p, screen density >= 400dpi, and RAM >= 6GB
    • Performance
      • Sequential write performance >= 125 MB/s
      • Random write performance >= 10 MB/s
      • Sequential read performance >= 250 MB/s
      • Random read performance >= 40 MB/s
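On the app side, gating features on the reported class might look like the following sketch. The real field is android.os.Build.VERSION.MEDIA_PERFORMANCE_CLASS, whose value is a VERSION_CODES constant (R is 30, S is 31); the feature policy below is hypothetical:

```java
// Sketch of gating app features on Media Performance Class. On a real
// device you would read android.os.Build.VERSION.MEDIA_PERFORMANCE_CLASS
// (0 if the device declares no class); here it is passed in directly.
class PerfClassGate {
    static final int NONE = 0;
    static final int R = 30; // android.os.Build.VERSION_CODES.R
    static final int S = 31; // android.os.Build.VERSION_CODES.S

    // Hypothetical policy: heavy real-time video effects need class S,
    // lighter concurrent-capture features need at least class R.
    static boolean enableRealtimeVideoEffects(int mediaPerformanceClass) {
        return mediaPerformanceClass >= S;
    }

    static boolean enableConcurrentCapture(int mediaPerformanceClass) {
        return mediaPerformanceClass >= R;
    }
}
```

Because the value is a VERSION_CODES constant, future classes compare as “greater than” today’s, so `>=` checks keep working as new tiers appear.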

Section 2.3 – Television Requirements

Section 2.3.5 – Security Model

  • If a TV declares a microphone, it MUST display the mic indicator when an app is accessing audio data from the mic (but not if the mic is being accessed by the HotwordDetectionService, SOURCE_HOTWORD, ContentCaptureService, or apps holding the roles identified in Section 9.1 with CDD identifier [C-4-X]) and MUST NOT hide the mic indicator for system apps that have visible UIs or direct user interaction. The same holds true if the TV declares a camera.

Section 2.5 – Automotive Requirements

Section 2.5.1 – Hardware

  • If an Automotive device supports OpenGL ES 3.1, then it MUST declare that support, MUST support Vulkan 1.1, and MUST include the Vulkan loader and export all symbols.
  • The requirement for Automotive devices that include a 3-axis gyroscope to also implement the TYPE_GYROSCOPE_UNCALIBRATED sensor has been dropped.

Section 2.5.3 – Software for Automotive

Section 3 – Software

Section 3.2 – Soft API Compatibility

Section 3.2.2 – Build Parameters

  • Google has defined 3 new build parameters, 2 of which device makers must define for their device:
    • SOC_MANUFACTURER – The trade name of the manufacturer of the primary SoC. This name should be provided by the SoC manufacturer.
    • SOC_MODEL – The model name of the primary SoC.
    • ODM_SKU – Optional value that contains the SKU of the device. This is a runtime-initialized property set during startup to configure device services.

Section 3.2.3 – Intent Compatibility

Section 3.3 – Native API Compatibility

Section 3.3.2 – 32-bit ARM Native Code Compatibility

  • The requirement to keep the SETEND instruction available has been dropped. This instruction in the ARM ISA allows code to change the current endianness, which is potentially dangerous.

Section 3.5 – API Behavioral Compatibility

Section 3.5.1 – Application Restriction

  • The threshold for which proprietary app restriction mechanisms are subject to CDD guidelines has been changed from “Rare” to “Restricted.” Now, if a proprietary mechanism is more restrictive than the Restricted App Standby Bucket, then it must abide by the guidelines listed in the CDD, of which there are two new clauses to account for:
    • The proprietary app restriction mechanism MUST report all app restriction events via the UsageStats API.
    • It further MUST NOT let an app be automatically placed in the RESTRICTED bucket within 2 hours of the most recent usage by a user.
  • Google now mandates that, if a device implementation extends the app restrictions implemented in AOSP, it MUST follow the implementation described here.
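The two-hour rule above amounts to a simple time check. A hypothetical helper, with times in epoch milliseconds:

```java
// Sketch of the new Restricted-bucket rule: a proprietary restriction
// mechanism must not automatically place an app in the RESTRICTED bucket
// within 2 hours of the app's most recent usage by the user.
class RestrictedBucketPolicy {
    static final long TWO_HOURS_MS = 2L * 60 * 60 * 1000;

    static boolean mayAutoRestrict(long lastUsageMs, long nowMs) {
        return nowMs - lastUsageMs >= TWO_HOURS_MS;
    }
}
```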

Section 3.5.2 – Application Hibernation

  • Android 12 adds a new feature called app hibernation. If an app hasn’t been used for a long period of time, then Android will automatically hibernate the app. The system will not only revoke permissions that the user previously granted the app, but it will also force close the app and reclaim memory, storage, and other resources. The user can bring the app out of hibernation by launching it.
    • If a device implementation includes the App Hibernation feature included in AOSP or extends it, they must meet all the requirements in Section 3.5.1 – Application Restriction, except for C-1-6 (returning true for ActivityManager.isBackgroundRestricted()) and C-1-3 (must not auto apply restrictions without evidence of poor system health behavior). In addition, the restriction:
      • Must only be applied on an app that hasn’t been used for some period of time, which is recommended to be one month or longer. The usage must be defined by user interaction via UsageStats#getLastTimeVisible() API or anything causing the app to leave the force-stopped state.
      • Must not prevent the app from being able to respond to activity intents, service bindings, content provider requests, or explicit broadcasts.
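The unused-for-a-period eligibility rule can be sketched the same way. This is a hypothetical helper; a real implementation would derive the last-used timestamp from UsageStats#getLastTimeVisible() or from the app leaving the force-stopped state:

```java
// Sketch of the hibernation eligibility rule quoted above: only apps unused
// for some period (recommended one month or longer) may be hibernated.
class HibernationPolicy {
    // One month approximated as 30 days, per the "one month or longer"
    // recommendation; the CDD leaves the exact period to the implementer.
    static final long ONE_MONTH_MS = 30L * 24 * 60 * 60 * 1000;

    static boolean eligibleForHibernation(long lastVisibleMs, long nowMs) {
        return nowMs - lastVisibleMs >= ONE_MONTH_MS;
    }
}
```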

Section 3.6 – API namespaces

  • A new clause states that device implementers may add custom APIs in native languages outside of the NDK API, but those APIs must not be in an NDK library or a library owned by another organization as described here. This was never explicitly disallowed before, but Google is now explicitly defining this as acceptable practice.

Section 3.8 – User Interface Compatibility

Section 3.8.3 – Notifications

Section 3.8.3.1 – Presentation of Notifications
  • Device implementations are only recommended to provide an affordance for the user to control notifications that are exposed to apps that have been granted the Notification Listener permission. If so, the granularity must be such that the user can control what notification types are bridged to that listener, including “conversations”, “alerting”, “silent”, and “important ongoing” notifications. Device implementers are also strongly recommended to provide an affordance for users to specify apps to exclude from notifying any specific notification listener.

Section 3.8.4 – Assist APIs

  • This section was previously called ‘Search’. The text referencing incorporating search into apps and exposing app data to the global system search has been removed.

Section 3.8.13 – Unicode and Font

  • Device implementations must not remove or modify NotoColorEmoji.ttf in the system image. (NotoColorEmoji is a color emoji font developed by Google and released under Apache 2.0.) Device implementers, however, can add a new emoji font to override the default emoji.

Section 3.8.14 – Multi-windows

  • The requirement that the default launcher be resizable if the device supports multi-window and split-screen has been dropped. This is no longer required because the launcher doesn’t need to be in split-screen under the new app-pairs split-screen model.

Section 3.9 – Device Administration

Section 3.9.1 – Device Provisioning

Section 3.9.1.1 – Device owner provisioning

* DPC = Device Policy Controller
** PO = Profile Owner
*** DO = Device Owner

Section 3.9.1.2 – Managed profile provisioning

Section 3.9.3 – Managed user support

  • If device implementations declare android.software.device_admin and provide a way to add secondary users, they are recommended to show the same AOSP DO disclosures that were shown in the flow initiated by android.app.action.PROVISION_MANAGED_DEVICE before allowing accounts to be added in the secondary User, so users know the device is managed.

Section 5 – Multimedia compatibility

Section 5.2 – Video encoding

  • If a device implementation provides HDR encoding, it is only recommended to provide a plugin for the new seamless transcoding API to convert content from HDR format to SDR.

Section 5.5 – Audio Playback

Section 5.5.1 – Raw audio playback

  • Device implementers are no longer recommended to support raw audio content playback with the sample rates of 24000 or 48000Hz.

Section 5.5.4 – Audio offload

  • If device implementations support audio offload playback, then the device implementer is recommended to trim the played gapless audio content when specified by the AudioTrack gapless API and the media container for MediaPlayer.

Section 5.6 – Audio latency

  • Google now defines the mean absolute deviation (the average of the absolute values of the deviations from the mean for a set of values) and the tap-to-tone latency (the time between when the screen is tapped and a tone is generated as a result of the tap being heard on the speaker).
  • Google clarifies that if device implementations declare android.hardware.audio.output, the device implementer is recommended to achieve a cold output latency of 100 ms or less over the speaker data path. Google has postponed the requirement to make the cold output latency be 200ms or less. Likewise for the cold input latency, with the clarification that the data path is over the microphone.
  • Google now recommends that the tap-to-tone latency be 8ms or less.
  • Google has removed the recommendation that the continuous output latency be 45ms or less and the continuous input latency be 50ms or less.
  • Google has added a recommendation that device implementations declaring android.hardware.audio.output and android.hardware.microphone have a Mean Continuous Round-Trip Latency of 50ms or less over 5 measurements, with a Mean Absolute Deviation less than 10ms, over at least one supported path.

Section 5.10 – Professional Audio

  • Google is dropping requirements around the OpenSL ES PCM buffer queue API in favor of the AAudio native audio API for low-latency needs.
  • Device implementations are recommended to meet latencies of 20ms or less over 5 measurements with a Mean Absolute Deviation less than 5ms over the speaker to microphone path.
  • Device implementations are recommended to meet the Pro Audio requirements for continuous round-trip audio latency, cold input latency, and cold output latency and USB audio requirements using the AAudio native audio API over the MMAP path.
  • The recommendation for devices with or without a 4 conductor 3.5mm audio jack to have a continuous round-trip audio latency of 10ms or less has been dropped.
  • Google has relaxed the requirement for the mean continuous round-trip audio latency from 20ms or less to 25ms or less, with a mean absolute deviation of 5ms or less, for devices that omit a 4 conductor 3.5mm audio jack and instead include a USB port with host mode.
  • Google has dropped the recommendation to meet the requirements for devices without a 3.5mm jack and with a USB audio interface using the AAudio native audio API over the MMAP path.

Section 7 – Hardware compatibility

Section 7.1 – Display and Graphics

Section 7.1.4 – 2D and 3D Graphics Acceleration

Section 7.1.4.1 – OpenGL ES
  • Device implementations that support any OpenGL ES version are now required to report the maximum version of the OpenGL ES dEQP tests they support via the android.software.opengles.deqp.level feature flag. They must at least support level 132383489 from March 1st, 2020. They must also pass all OpenGL ES dEQP tests in the test lists between version 132383489 and the version specified in the feature flag for each supported OpenGL ES version. (Devices self-report which OpenGL ES level they support. Support is determined by running the dEQP tests in all test lists from that level and earlier. These dEQP tests are found in the Android source tree at external/deqp/android/cts/master/glesXX-master-YYYY-MM-DD.txt.)
Section 7.1.4.2 – Vulkan
  • Device implementations that include support for Vulkan 1.0 MUST NOT enumerate support for the VK_KHR_video_queue, VK_KHR_video_decode_queue, or VK_KHR_video_encode_queue extensions. All extensions are defined in the Vulkan 1.0 spec.

Section 7.2 – Input Devices

Section 7.3 – Sensors

Section 7.3.6 – Thermometer

Section 7.3.8 – Proximity sensor

  • If a device implementation includes a proximity sensor, and if that sensor reports only a binary “near” or “far” reading, then it must use 0 cm as the “near” reading and 5 cm as the “far” reading, and must report the maximum range and resolution as 5 cm.
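That mapping is trivial to express; a sketch using the clause’s constants (hypothetical helper, not a sensor HAL interface):

```java
// Sketch of the binary proximity clause: a near/far-only sensor reports
// 0 cm for "near" and 5 cm for "far", with a maximum range of 5 cm.
class BinaryProximity {
    static final float NEAR_CM = 0f;
    static final float FAR_CM = 5f;
    static final float MAX_RANGE_CM = 5f;

    static float reading(boolean objectNear) {
        return objectNear ? NEAR_CM : FAR_CM;
    }
}
```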

Section 7.3.10 – Biometric sensors

  • Google has tweaked the requirements for biometric sensors that are treated as Class 1 (formerly Convenience). Device implementations with Class 1 biometric sensors:
    • MUST challenge the user for the recommended primary authentication method after no more than 20 false trials and no less than 90 seconds of backoff time for biometric verification. (A false trial is a capture of adequate quality [BIOMETRIC_ACQUIRED_GOOD] that doesn’t match an enrolled biometric.)
    • Are RECOMMENDED to lower the number of false trials allowed for biometric verification if the spoof and imposter acceptance rates are higher than 7%, as measured by the Android Biometrics Test Protocols.
    • Are still required to rate limit attempts for biometric verification, though it is now only RECOMMENDED to rate limit attempts for at least 30 seconds after 5 false trials. All rate limiting logic is RECOMMENDED to be moved to the Trusted Execution Environment (TEE).
    • MUST disable biometrics once primary authentication backoff has first been triggered.
    • MUST challenge the user for the recommended primary authentication OR a Class 3 (Strong) biometric after a 4-hour idle timeout period OR 3 failed biometric authentication attempts; the idle timeout period and the failed authentication count reset after any successful confirmation of the device credentials.
    • MUST still challenge the user for the recommended primary authentication once every 24 hours or less for devices launching with Android 10 or later, or every 72 hours or less for devices upgrading from earlier versions.
  • Google now says that device implementations with Class 2 (formerly Weak) biometric sensors must make those sensors available to third-party applications.
  • Google now says that device implementations with Class 3 (formerly Strong) biometric sensors must re-generate the Authenticator ID for all Class 3 biometrics supported on the device if any of them are re-enrolled. They must also expose biometric-backed keystore keys to third-party applications.
  • If a device implementation contains an under-display fingerprint scanner, the implementer is recommended to prevent the touchable area of the scanner from interfering with 3-button navigation.
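The Class 1 backoff and rate-limiting rules above can be sketched as a small state machine. This is a minimal illustrative model, not Android's actual implementation; the class and method names are hypothetical, while the thresholds (20 false trials, 90-second backoff, 5-trial/30-second rate limit) come from the requirements above.

```python
# Illustrative sketch of the Class 1 biometric backoff rules described above.
# Class/method names are hypothetical; thresholds come from the CDD text.

class Class1BiometricGate:
    MAX_FALSE_TRIALS = 20    # must fall back to primary auth after this many
    BACKOFF_SECONDS = 90     # minimum backoff once the limit is hit
    RATE_LIMIT_TRIALS = 5    # recommended: rate limit after 5 false trials...
    RATE_LIMIT_SECONDS = 30  # ...for at least 30 seconds

    def __init__(self):
        self.false_trials = 0
        self.locked_until = 0.0  # monotonic timestamp; 0 = not locked

    def record_false_trial(self, now: float) -> str:
        """Register a good-quality capture that did not match an enrollment."""
        self.false_trials += 1
        if self.false_trials >= self.MAX_FALSE_TRIALS:
            # Hard backoff: the user must use the primary authentication method.
            self.locked_until = now + self.BACKOFF_SECONDS
            return "primary_auth_required"
        if self.false_trials % self.RATE_LIMIT_TRIALS == 0:
            # Recommended rate limiting every 5 false trials.
            self.locked_until = now + self.RATE_LIMIT_SECONDS
            return "rate_limited"
        return "retry_allowed"

    def can_attempt(self, now: float) -> bool:
        return now >= self.locked_until and self.false_trials < self.MAX_FALSE_TRIALS

    def primary_auth_succeeded(self):
        # Successful confirmation of the device credentials resets the counters.
        self.false_trials = 0
        self.locked_until = 0.0
```

In a real implementation the CDD recommends this counter and timer logic live inside the TEE rather than in the Android userspace.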

Section 7.4 – Data Connectivity

Section 7.4.1 – Telephony

Section 7.4.2 – IEEE 802.11 (WiFi)

  • Device implementations that implement WiFi are now required to randomize the source MAC address and sequence number of probe request frames, once at the beginning of each scan, while STA is disconnected. They must use one consistent MAC address (i.e. not randomize it halfway through a scan), iterate the probe request sequence number as normal (sequentially) between the probe requests in a scan, and randomize the probe request sequence number between the last probe request of a scan and the first probe request of the next scan.
  • Device implementations are strongly recommended to randomize the source MAC address used for all STA communication to an Access Point (AP) while associating and associated. If they do, they must use a different randomized MAC address for each SSID (FQDN for Passpoint) they communicate with, must provide the user with an option to control the randomization per SSID (with non-randomized and randomized options), and must set the default mode for new WiFi configurations to be randomized.
  • Device implementations are strongly recommended to use a random BSSID for any AP they create. If so, the MAC address must be randomized and persisted per SSID used by the AP. The implementation may provide the user an option to disable this feature, and if so, randomization must be enabled by default. 
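The "different randomized MAC address for each SSID, persisted" requirement above can be sketched as a deterministic derivation. Real implementations derive this inside the WiFi stack; the derivation below (SHA-256 over a per-device secret plus the SSID) is purely illustrative, but it shows the properties that matter: stability per SSID, difference between SSIDs, and the locally-administered/unicast address bits.

```python
# Sketch of persistent per-SSID MAC randomization as described above.
# The derivation is illustrative only, not Android's actual scheme.
import hashlib

def randomized_mac_for_ssid(ssid: str, device_secret: bytes) -> str:
    """Derive a stable, randomized MAC address for one SSID."""
    digest = hashlib.sha256(device_secret + ssid.encode()).digest()
    mac = bytearray(digest[:6])
    mac[0] = (mac[0] | 0x02) & 0xFE  # locally administered, unicast
    return ":".join(f"{b:02x}" for b in mac)
```

Because the derivation is keyed by a per-device secret, the same SSID yields the same address across reconnects (satisfying persistence) while different devices, or different SSIDs, get unrelated addresses.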
Section 7.4.2.1 – WiFi Direct
  • Device implementations are recommended to randomize the source MAC address for all newly formed WiFi Direct connections.
Section 7.4.2.4 – WiFi Passpoint
  • Wi-Fi Passpoint support is now mandatory for all device implementations that support WiFi. As such, device implementations that support WiFi must declare the android.hardware.wifi.passpoint feature flag, follow the AOSP implementation, support the subset of device provisioning protocols as defined in the WiFi Alliance Passpoint R2 specification (EAP-TTLS authentication and SOAP-XML), process the AAA server certificate as described in the Hotspot 2.0 R3 specification, support user control of provisioning through the WiFi picker, and keep Passpoint configurations persistent across reboots. They are further recommended to support the Terms and Conditions acceptance feature and Venue information. If a global Passpoint disable user control switch is provided, implementations must enable Passpoint by default.
Section 7.4.2.5 – WiFi Location (WiFi Round Trip Time – RTT)
  • Device implementations that support WiFi location and expose the functionality to third-party applications must be accurate to within 2 meters at 80 MHz bandwidth at the 68th percentile.
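A percentile bound like this is a statement about a distribution of ranging errors, not about any single measurement. A minimal sketch of how a test harness might check it against a batch of measured errors (the function name and sample values are illustrative):

```python
# Sketch: checking a "within 2 meters at the 68th percentile" accuracy
# bound against a batch of measured WiFi RTT ranging errors.

def meets_rtt_accuracy(errors_m, bound_m=2.0, percentile=0.68):
    """True if at least `percentile` of the absolute errors fall within bound_m."""
    errors = sorted(abs(e) for e in errors_m)
    within = sum(1 for e in errors if e <= bound_m)
    return within / len(errors) >= percentile
```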
Section 7.4.2.8 – Enterprise WiFi Server Certificate Validation
  • If the WiFi server certificate is not validated or the WiFi server domain name is unset, device implementations are recommended NOT to provide the user an option to manually add an Enterprise Wi-Fi network in the Settings app. The AOSP R QPR1 branch first removed the ability for users to select the “do not validate” option. At the time, Google justified the removal by stating that the option is a security risk as it leaves the possibility of leaking user credentials. If an attacker performs a man-in-the-middle attack to take control of the network, then they are able to point client devices to illegitimate servers owned by the attacker, as the device is unable to perform certificate validation.

Section 7.4.3 – Bluetooth

  • Device implementations must not restrict access to any Bluetooth metadata (such as scan results) which can be used to derive the location of the device, unless the requesting application passes an android.permission.ACCESS_FINE_LOCATION permission check.
  • If an application's manifest file does not declare that the app does NOT derive location information from Bluetooth, then the device implementation MUST gate Bluetooth access behind android.permission.ACCESS_FINE_LOCATION.
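The gating decision above reduces to two inputs: whether the app holds the fine-location permission, and whether its manifest asserts it never derives location from Bluetooth. A sketch of that decision (parameter names are hypothetical, not Android API names):

```python
# Sketch of the Bluetooth scan-result gating decision described above.
# Parameter names are hypothetical; the rule mirrors the CDD text.

def may_receive_bt_scan_results(has_fine_location: bool,
                                manifest_asserts_never_for_location: bool) -> bool:
    if manifest_asserts_never_for_location:
        # The app declared it does not derive location from Bluetooth.
        return True
    # Otherwise, location-revealing Bluetooth metadata is gated on
    # android.permission.ACCESS_FINE_LOCATION.
    return has_fine_location
```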

Section 7.7 – USB

Section 7.9 – Virtual Reality

Section 7.9.2 – Virtual Reality Mode – High Performance

  • Device implementations that support VR mode must support OpenGL ES 3.2 – this was only recommended previously. They must also now support the GL_OVR_multiview_multisampled_render_to_texture extension (it is only recommended to do so in Android 11 implementations as it is only of use to tiled hardware architectures) and expose it in the list of available GL extensions.

Section 8 – Performance and Power

Section 8.3 – Power-Saving Modes

  • If a device implementation includes a feature to improve device power management that's extended from an AOSP feature (e.g. App Standby Buckets, Doze) and that feature applies stronger restrictions than the RESTRICTED App Standby Bucket (previously RARE), then:
    • The custom implementation must not deviate from the AOSP implementation, and it must use the global system settings (as well as DeviceConfig) of the App Standby and Doze power-saving modes
    • The custom implementation must provide a user affordance to display all applications that are exempted from App Standby and Doze power-saving modes or any battery optimizations, and it must implement the ACTION_REQUEST_IGNORE_BATTERY_OPTIMIZATIONS intent to ask the user to allow an app to ignore battery optimizations.
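The trigger condition above hinges on the ordering of App Standby Buckets, which run from least to most restrictive: ACTIVE, WORKING_SET, FREQUENT, RARE, RESTRICTED. A sketch of the comparison (the numeric levels and function name are illustrative, not Android constants):

```python
# Sketch of the App Standby Bucket ordering implied above. A custom
# power-management feature whose restriction level is stricter than the
# RESTRICTED bucket triggers the extra requirements in the CDD text.

BUCKET_ORDER = {"ACTIVE": 0, "WORKING_SET": 1, "FREQUENT": 2,
                "RARE": 3, "RESTRICTED": 4}

def triggers_cdd_requirements(custom_level: int) -> bool:
    """True if a custom restriction level is stricter than RESTRICTED."""
    return custom_level > BUCKET_ORDER["RESTRICTED"]
```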

Section 9 – Security Model Compatibility

Section 9.1 – Permissions

  • Device implementations that declare the android.hardware.security.model.compatible feature are required to support all requirements listed in the subsections of Section 9 – Security Model Compatibility.
  • Device implementations must support the Android Roles model in addition to the Android permissions model.
  • Android carves out two exceptions to the location permission properties: Apps that hold the RADIO_SCAN_WITHOUT_LOCATION permission and apps that hold the NETWORK_SETTINGS or NETWORK_SETUP_WIZARD permissions for device configuration/setup purposes. These apps don’t use location to derive or identify user location and as such are exempted.
  • Android allows for permissions to be marked as restricted to alter their behavior. New to Android 12 are the following requirements:
    • Device implementations cannot provide custom functions or APIs to bypass the permission restrictions defined in setPermissionPolicy and setPermissionGrantState APIs.
    • Device implementations must use the AppOpsManager APIs to record and track every programmatic access of data protected by dangerous permissions from Android activities and services, must assign roles only to apps whose functionality meets the role requirements, and must not define roles that duplicate or are supersets of the functionality of roles defined by the platform.
  • If device implementations report android.software.managed_users, they must not have the following permissions silently granted by the admin: Location (ACCESS_BACKGROUND_LOCATION, ACCESS_COARSE_LOCATION, ACCESS_FINE_LOCATION), Camera (CAMERA), Microphone (RECORD_AUDIO), Body sensor (BODY_SENSORS), and physical activity (ACTIVITY_RECOGNITION).
  • If device implementations report android.software.device_admin, they must show a disclaimer during fully managed device setup (device owner setup) that the IT admin can let apps control settings on the phone including the microphone, camera, and location, with options for the user to continue or exit setup unless the admin has opted out of control of permissions on the device.
  • If device implementations pre-install applications that hold the System UI Intelligence, System Ambient Audio Intelligence, System Audio Intelligence, System Notification Intelligence, System Text Intelligence, or System Visual Intelligence roles, they must fulfill requirements outlined in Section 9.8.6 for Content Capture, must not have android.permission.INTERNET, and must not bind to other applications except for the following: Bluetooth, Contacts, Media, Telephony, SystemUI, and components providing Internet APIs.
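The managed-users rule above is a simple deny-list: the admin may silently grant permissions except for the listed location, camera, microphone, body-sensor, and physical-activity permissions. A sketch of that check (the function name is hypothetical; the permission names come from the text):

```python
# Sketch of the "no silent grant" rule described above for devices that
# report android.software.managed_users. The function name is illustrative.

SENSITIVE_PERMISSIONS = {
    "ACCESS_BACKGROUND_LOCATION", "ACCESS_COARSE_LOCATION",
    "ACCESS_FINE_LOCATION", "CAMERA", "RECORD_AUDIO",
    "BODY_SENSORS", "ACTIVITY_RECOGNITION",
}

def admin_may_silently_grant(permission: str) -> bool:
    """True unless the permission is on the sensitive list above."""
    return permission not in SENSITIVE_PERMISSIONS
```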

Section 9.5 – Multi-user support

  • Android 12 adds support for clone user profiles with partial isolation (a single additional user profile of type android.os.usertype.profile.CLONE). This is an optional feature; when implemented, the dual app instances appear in the launcher and recents view at the same time. If it's implemented, then the device implementation must:
    • Only provide access to storage or data that is either already accessible to the parent user profile or directly owned by the additional user profile
    • Not treat the additional user profile as a work profile
    • Keep its private app data directories isolated from those of the parent user account
    • Not allow the additional user profile to be created if a Device Owner is provisioned, nor allow a Device Owner to be provisioned without first removing the clone profile

Section 9.7 – Security features

  • Google recommends that device implementations isolate each I/O device capable of Direct Memory Access (DMA) by using an Input-output memory management unit (IOMMU) such as the ARM System Memory Management Unit (SMMU), if the maker chooses to use an I/O device capable of DMA.
  • Google is also now making recommendations to reduce key classes of common bugs that contribute to poor quality and security. 
    • In order to reduce memory bugs, device implementations should be tested:
      • Using userspace memory error detection tools like Memory Tagging Extension (MTE) for ARMv9 devices, hardware-assisted AddressSanitizer (HWASan) for ARMv8+ devices, or AddressSanitizer (ASan) for other device types.
      • Using kernel memory error detection tools like KernelAddressSanitizer (KASAN)
      • Using memory error detection tools in production like MTE, GWP-ASan (“GWP-ASan Will Provide Allocation Sanity”), and Kernel Electric-Fence (KFENCE)
    • In addition, device implementations using an Arm TrustZone-based TEE should use a standard protocol for memory sharing between Android and the TEE, like Arm Firmware Framework for Armv8-A (FF-A). They should also restrict trusted apps to only accessing memory explicitly shared with them via the above protocol. If the device supports the Arm S-EL2 exception level, this should be enforced by the secure partition manager, otherwise, it should be enforced by the TEE OS.

Section 9.8 – Privacy

Section 9.8.2 – Recording

  • For the microphone and camera status bar indicators, Google says that after they’ve been displayed for 1 second, the indicator can change visually (such as becoming smaller) and is not required to be shown as originally presented and understood (AOSP already does this.) The microphone indicator may be merged with an actively displayed camera indicator (or vice versa) provided that the text/icons/colors indicate to the user that camera/microphone use has begun.
  • Device implementations that declare android.hardware.microphone or android.hardware.camera.any are strongly recommended to display the microphone/camera indicator when an application is accessing audio data from the microphone or live camera data, respectively. They are also recommended to display the list of recent and active applications using the sensor, as returned by PermissionManager.getIndicatorAppOpUsageData(), and should not hide the indicator for system apps that have visible user interfaces or direct user interaction. Note that handheld and television implementations are required to implement these, as outlined in their respective device-specific sections.

Section 9.8.6 – Content Capture (and App Search)

  • Google has updated this section of the CDD to include the new AppSearch API and SpeechRecognizer#onDeviceSpeechRecognizer, which provides the ability to perform speech recognition on-device. Any implementation of either AppSearch or SpeechRecognizer must follow the policies outlined in this section.
  • AppSearch implementations must also show a user affordance that lets the user opt out of having data collected via the API shown in the Android platform launcher. Implementations are also recommended not to request the Internet permission and, if they do, to access the Internet only through structured APIs backed by publicly available open-source implementations.

Section 9.8.8 – Location

  • Google is providing guidance on what information can be considered location data. Location can be as fine as DGPS or as coarse as country-level locations (like the mobile country code [MCC]). Wireless technologies with unique identifiers like WiFi, Bluetooth, Ultra-wideband (UWB), or cell tower ID also count.

Section 9.8.9 – Installed apps

  • Device implementations cannot give applications the ability to read or write files in another application’s private data directory in external storage. The only exceptions to this are the external storage provider authority (DocumentsUI), Download Provider, platform-signed MTP apps, and apps holding the permission android.permission.INSTALL_PACKAGES that can install other apps (however, they’re limited to reading/writing to “obb” subdirectories). This is in line with Scoped Storage best practices.
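The exceptions above can be expressed as a short access-control decision: a handful of privileged system components get full access, installers get access only to "obb" subdirectories, and everyone else is denied. A sketch (the boolean caller attributes are hypothetical; the exception list mirrors the text):

```python
# Sketch of the Scoped Storage exception list described above for access to
# another app's private directory on external storage. Attribute names are
# hypothetical; the carve-outs come from the CDD text.

def may_access_foreign_app_dir(is_documents_ui: bool,
                               is_download_provider: bool,
                               is_platform_signed_mtp: bool,
                               holds_install_packages: bool,
                               subdir: str) -> bool:
    if is_documents_ui or is_download_provider or is_platform_signed_mtp:
        return True  # privileged system components keep full access
    if holds_install_packages:
        return subdir == "obb"  # installers may only touch "obb" subdirectories
    return False  # all other apps are denied
```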

Section 9.8.10 – Connectivity Bug Report

  • The prerequisite was changed from System API BUGREPORT_MODE_TELEPHONY to android.hardware.telephony feature flag.

Section 9.8.12 – Music Recognition

  • The System API MusicRecognitionManager lets device implementations request music recognition, given an audio record, and delegate that recognition to a privileged application implementing the API. If this is implemented, then:
    • If any captured audio data is stored, device implementations must not keep raw audio or audio fingerprints on-disk or in-memory for more than 14 days, and must not share that data beyond the MusicRecognitionService except with explicit user consent each time it is shared.
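The 14-day retention rule above amounts to a periodic purge of stored fingerprints. A sketch of such a purge (the record shape and function name are hypothetical):

```python
# Sketch of the 14-day retention rule described above: stored audio
# fingerprints older than 14 days must be dropped. Record shape is
# illustrative, not a real Android data structure.

RETENTION_DAYS = 14

def purge_fingerprints(records, now_days: float):
    """Keep only records stored within the last RETENTION_DAYS days."""
    return [r for r in records if now_days - r["stored_at_days"] <= RETENTION_DAYS]
```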

Section 9.8.13 – SensorPrivacyManager

  • If a device implementation provides the user with an affordance to turn off the camera and/or microphone input, it must return true from the relevant supportsSensorToggle() API method. When an app tries to access a blocked microphone or camera, the implementation must present the user with a non-dismissable affordance indicating that the sensor is blocked, with a choice to continue blocking or to unblock. While the user keeps a sensor blocked, the implementation must pass only blank/fake data to apps and must not report an error. The AOSP implementation meets these requirements.
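The key behavioral point above is that a blocked sensor yields blank data rather than an error, so apps keep working without learning the toggle state from a failure path. A minimal sketch of that behavior (class and method names are illustrative, loosely echoing SensorPrivacyManager, not the real API surface):

```python
# Sketch of the sensor-toggle behavior described above: a blocked sensor
# returns blank data, never an error. Names are illustrative only.

class SensorPrivacySketch:
    def __init__(self):
        self._blocked = {"microphone": False, "camera": False}

    def supports_sensor_toggle(self, sensor: str) -> bool:
        # Must return true for sensors the implementation can toggle off.
        return sensor in self._blocked

    def set_blocked(self, sensor: str, blocked: bool):
        self._blocked[sensor] = blocked

    def read(self, sensor: str, real_frame: bytes) -> bytes:
        # Blocked sensors yield blank data of the expected size, not an error.
        if self._blocked.get(sensor):
            return b"\x00" * len(real_frame)
        return real_frame
```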

Section 9.9 – Data Storage Encryption

Section 9.9.2 – Encryption requirements

  • Device implementations that launched on Android 11 were allowed to implement per-user block-level encryption for data storage. Devices launching on Android 12, however, must implement only file-based encryption (FBE) and metadata encryption.

Section 9.9.3 – Encryption methods

  • Google has updated the requirements when implementing FBE and metadata encryption. Implementations must: 
    • Ensure that all non-deleted blocks of encrypted file contents on persistent storage were encrypted using combinations of encryption key and initialization vector (IV) that depend on both the file and the offset within the file. In addition, all such combinations must be distinct, except where the encryption is done using inline encryption hardware that only supports an IV length of 32 bits.
    • Ensure that all non-deleted encrypted filenames on persistent storage in distinct directories were encrypted using distinct combinations of encryption key and initialization vector (IV).
    • Ensure that all encrypted filesystem metadata blocks on persistent storage were encrypted using distinct combinations of encryption keys and initialization vector (IV).
  • Google has also updated the requirements for the keys protecting CE and DE storage areas and filesystem metadata. Keys must:
    • Be securely erased during bootloader unlock/relock (see Section 9.12 on data deletion).
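The distinctness requirements in Section 9.9.3 boil down to the property that every (file, offset) pair maps to a distinct (key, IV) combination. The derivation below (SHA-256 over a per-file nonce and a block offset) is purely illustrative, not a real FBE scheme, but it demonstrates the property being required:

```python
# Sketch of the key/IV distinctness property described above: distinct
# (file, offset) pairs must yield distinct (key, IV) combinations. This
# derivation is illustrative only, not Android's actual FBE construction.
import hashlib

def block_key_iv(file_nonce: bytes, block_offset: int) -> tuple:
    """Derive an illustrative (key, IV) pair for one block of one file."""
    material = hashlib.sha256(file_nonce + block_offset.to_bytes(8, "big")).digest()
    return material[:16], material[16:32]  # split: 16-byte key, 16-byte IV
```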

Section 9.10 – Device Security

  • Device implementations that support Verified Boot are now required to securely erase all user data during bootloader unlock/relock, as per Section 9.12 on data deletion.

Section 9.11 – Keys and credentials

  • Device implementations must support IKeymasterDevice 4.0, IKeymasterDevice 4.1, or IKeyMintDevice version 1, but are strongly recommended to support IKeyMintDevice version 1.

Section 9.11.1 – Secure lock screen and authentication

  • If a device implementation adds or modifies the authentication methods used to unlock the device, the new authentication method must be disabled when the DPC app sets the password requirements policy via the DevicePolicyManager.setRequiredPasswordComplexity() with a more restrictive complexity constant than PASSWORD_COMPLEXITY_NONE or via the DevicePolicyManager.setPasswordQuality() method with a more restrictive constant than PASSWORD_QUALITY_BIOMETRIC_WEAK. Previously, the authentication method had to be disabled when the DPC app set the password quality policy with a more restrictive quality constant than PASSWORD_QUALITY_SOMETHING.
  • If a biometric authentication method doesn’t meet the requirements for Class 3, then the method must be disabled if the DPC has set the password requirements quality policy via the DevicePolicyManager.setRequiredPasswordComplexity() method with a more restrictive complexity bucket than PASSWORD_COMPLEXITY_LOW or using DevicePolicyManager.setPasswordQuality() method with a more restrictive quality constant than PASSWORD_QUALITY_BIOMETRIC_WEAK. Previously, the biometric authentication method had to be disabled when the DPC app set the password quality policy with a more restrictive quality constant than PASSWORD_QUALITY_UNSPECIFIED.
  • If a device implementation adds an authentication method based on a physical token or location, it must be disabled when the DPC app sets the password quality with a more restrictive constant than PASSWORD_QUALITY_NONE or the password complexity with a more restrictive complexity bucket than PASSWORD_COMPLEXITY_NONE. Previously, it had to be disabled if the DPC set a quality constant more restrictive than PASSWORD_QUALITY_UNSPECIFIED.
  • If a device implementation adds an authentication method to unlock a lock screen that is not a secure lock screen, that new method must be disabled when the DPC sets the password quality to a more restrictive constant than PASSWORD_QUALITY_NONE or the password complexity with a more restrictive complexity bucket than PASSWORD_COMPLEXITY_NONE. Previously, it had to be disabled if the DPC set a quality constant more restrictive than PASSWORD_QUALITY_UNSPECIFIED.
  • If a device implementation supports a separate display power state through DeviceStateManager AND supports separate display lock states through KeyguardDisplayManager, it is strongly recommended to utilize a credential meeting requirements defined in Section 9.11.1 or a biometric meeting at least Class 1 specifications defined in Section 7.3.10 to allow independent unlocking from the default device display, constrain separate display unlock via a defined display timeout, and allow the user to globally lock all displays through lockdown from the primary handheld device.
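The gating rules in the bullets above share one shape: each added unlock method has a complexity "floor", and the method must be disabled once the DPC requires a complexity stricter than that floor. A sketch of the comparison (the function name and floor framing are illustrative; the PASSWORD_COMPLEXITY_* ordering NONE < LOW < MEDIUM < HIGH is from the platform):

```python
# Sketch of the DevicePolicyManager complexity gating described above.
# An OEM-added unlock method must be disabled when the DPC requires a
# password complexity stricter than the method's floor. Function name
# and "floor" framing are illustrative.

COMPLEXITY_ORDER = ["NONE", "LOW", "MEDIUM", "HIGH"]

def must_disable(method_floor: str, dpc_required: str) -> bool:
    """True if the DPC's required complexity exceeds the method's floor.

    method_floor examples from the text: "NONE" for token-/location-based
    methods, "LOW" for biometrics below Class 3.
    """
    return COMPLEXITY_ORDER.index(dpc_required) > COMPLEXITY_ORDER.index(method_floor)
```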

Miscellaneous Requirements

  • Handhelds, Televisions, Watch, and Automotive are all required to declare support for the android.hardware.security.model.compatible feature flag. According to AOSP, this feature flag indicates that the device supports the Android security model. Google has not provided further guidance on the purpose of declaring this feature flag.

Conclusion

Ready to start building Android 12? Download the Android source tree from Google’s Git repository and follow these steps to build the code. Alternatively, reach out to your SoC vendor to download their AOSP fork. When you’re ready, download the CTS for Android 12 and begin testing your build for compatibility issues.

If you have a specific question about the latest Android 12 compatibility requirements, then visit the Google Group for Android Compatibility or reach out to an expert at Esper today.