What’s new in Android Privacy Update | Google IO 2021


Hello, I’m Sara, and I’m joined today by my colleague Erik. We will take you through how we’re protecting our users through the privacy enhancements we’ve built into the platform. Keeping users in control of their privacy and safeguarding everything they do online is more important than ever before, particularly when it comes to the mobile operating system. Our users want an operating system they can trust with their most personal and sensitive information. They want the peace of mind that their smartphone not only brings them convenience, but also protects them from harm. Creating a safe ecosystem is a shared responsibility between all of us, and we count on you, our developer community, to bring the latest improvements in privacy to our users.

Privacy has always been core to Android’s product principles. With each release of Android, we continue to expand on an existing foundation that is designed to build trust with users. Android has a set of privacy principles that guide our product development: transparency around what data is accessed by apps and when, along with simple controls for users to make informed choices about enhancing or limiting an app’s access to their data; data minimization, which reduces the scope of permissions so users are not surprised by the data that leaves their device; and the Private Compute Core, which keeps sensitive sensor data separate from the rest of the OS and from apps.

Transparency is about honesty. Increasingly, users want to be informed about what apps are accessing on their device. In Android 12, we’re introducing a number of features that will increase transparency, starting with microphone and camera. Going forward, users will know in real time every time an app accesses their mic or camera. By simply swiping down in Quick Settings, users can tap on the indicators to view the app accessing their data.
If the access is unwarranted, users can quickly navigate to the App Permission page and revoke the permissions. Developers should review their use of microphone and camera and proactively remove unexpected accesses. For example, make sure your app is not accessing these sensors before the user has tapped on a feature that needs access.

Users often tell us they want to understand what data apps are actually using. With the new Privacy Dashboard, users will have a simple and clear timeline view of the last 24 hours of accesses for mic, camera, and location. For the remaining one-time permissions, users will see whether or not an app accessed the data in the last 24 hours. We encourage all developers to review your code paths and make sure all accesses can be justified by your use cases, including those by third-party SDKs, which will be attributed to your app. In Android 11, we added data access auditing APIs to make it easy for you to audit your current data access. Use the APIs to untangle the mapping of your code by tracking which parts of your code access private data, and to track and control data access by third-party SDKs. The API instructs the system to back-trace into an app-specified callback each time the app accesses sensitive data. The callback provides information about the type of data being accessed and can record valuable details such as stack traces and the frequency and timing of the access.

We know how sensitive content copied to the clipboard can be. Users frequently copy emails, addresses, and even passwords. Android 12 notifies users every time an app reads from their clipboard: users will see a toast at the bottom of the screen each time an app calls getPrimaryClip. The toast doesn’t show if the clipboard data originates from the same app. You can minimize access by first checking getPrimaryClipDescription to learn about the type of data in the clipboard. The recommended best practice is to only access clipboard data when the user understands why.
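To make the clipboard guidance concrete, here is a minimal pure-Kotlin sketch of the check-before-read pattern. The MIME-type parameter and the accepted-types set are illustrative stand-ins: a real app would obtain the type from clipboardManager.primaryClipDescription, and only the getPrimaryClip call itself triggers the Android 12 toast, so gating it this way avoids surprising the user.

```kotlin
// MIME types this hypothetical feature can actually handle (illustrative).
val acceptedMimeTypes = setOf("text/plain", "text/html")

// Decide whether reading the clipboard is justified, using only the
// clipboard's description. In a real app the argument would come from:
//   clipboardManager.primaryClipDescription
//       ?.takeIf { it.mimeTypeCount > 0 }?.getMimeType(0)
fun shouldReadClipboard(primaryClipMimeType: String?): Boolean =
    primaryClipMimeType != null && primaryClipMimeType in acceptedMimeTypes
```

Only when this returns true would the app go on to call getPrimaryClip and show the system toast.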
Android 12 brings a new level of transparency that will lead to greater trust with users. I’ll now hand it off to Erik, who’ll talk about how we’re adding thoughtful controls to the platform to improve decision-making.

Thanks, Sara. We also need to offer controls that give users the means to make informed decisions about who can access their private data and how much access they’re willing to share. It’s about balancing choice with safe default behavior. Over the last two releases, we’ve made the location permission fine-grained, first by separating background and foreground access. We then added an Only This Time option, which we also extended to the mic and camera permissions. And finally, we restricted access to background location. We’re seeing that users respond positively to these controls by choosing them more often: when given the option, users select foreground-only location about 80% of the time.

So in this release, we want to give users even more control over their location data by adding approximate location. Next time an app needs location, users will have a clear choice to reduce the accuracy of the location provided to the app by selecting Approximate Location. We encourage all developers that need location to review your use case and only request ACCESS_COARSE_LOCATION if your features don’t need the user’s precise location. You should also be prepared for users to reduce location accuracy: please make sure your app works with reduced accuracy when users select Approximate. Let’s see how this works in practice. Say a map app needs access to precise location for turn-by-turn navigation. Going forward, the app must request both ACCESS_FINE_LOCATION and ACCESS_COARSE_LOCATION, and be prepared for users to grant only ACCESS_COARSE_LOCATION at runtime. The user should be able to do most tasks with just approximate location. The user clicks on Navigation, which needs precise location.
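The request logic for a map-app example like this can be sketched in plain Kotlin. This is illustrative only: the function names are hypothetical, and a real app would pass the Manifest.permission constants to ActivityCompat.requestPermissions and then inspect which entries were actually granted.

```kotlin
// Permission name constants (values mirror Manifest.permission.*).
val FINE_LOCATION = "android.permission.ACCESS_FINE_LOCATION"
val COARSE_LOCATION = "android.permission.ACCESS_COARSE_LOCATION"

// On Android 12+, an app that wants precise location must request both
// permissions together, because the user may choose Approximate.
fun locationPermissionsToRequest(needsPrecise: Boolean): List<String> =
    if (needsPrecise) listOf(FINE_LOCATION, COARSE_LOCATION)
    else listOf(COARSE_LOCATION)

// After the dialog: precise location is available only if FINE was granted;
// a coarse-only grant means the app should fall back to approximate features.
fun hasPreciseLocation(granted: Set<String>): Boolean =
    FINE_LOCATION in granted
```

The key design point is the fallback path: the app should keep working, with degraded precision, when hasPreciseLocation is false.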
The app can now ask the user to grant it precise location. As always, make sure users understand your location use case before requesting access. For more details on how to implement these changes, check out our codelab on approximate location.

In addition to location, microphone and camera are the most sensitive permissions. Aside from putting stickers on cameras or adding audio blockers to phones, today there’s no guaranteed way for users to ensure no one has access to their mic and camera. That is about to change. In Android 12, we are adding two new controls that allow users to quickly cut off all access to the mic and camera on the device. If users launch an app that needs access to the sensors, they will be prompted to quickly turn the sensor back on. The system takes care of the end-to-end flow; apps don’t need to do anything differently. As part of this launch, we’re also limiting motion sensor sampling rates to 200 Hz when the mic toggle is muted. Two takeaways for developers: if you’re following existing permissions best practices, you don’t need to do anything differently to handle the toggle state; the system will take care of notifying the user. Apps targeting Android 12 and up that need access to higher sampling rates are now required to request the HIGH_SAMPLING_RATE_SENSORS permission.

We know that in many parts of the world, it is common for users to share devices with friends and family members. In a privacy-focused survey of Android users, 56% of respondents in the U.S. and 81% in India said they share their device with friends and family members. We want to ensure these users feel protected when sharing their devices, so we’re making guest mode switching much easier. With our recent improvements to guest mode, users can share their device in a secure way with a couple of taps on the lock screen. Now I’ll hand it off to Sara to talk about what we’re doing to minimize data access so that there is less to share with apps in the first place.
Thanks, Erik. There’s no better way to build trust with users than to minimize the data you require for your feature to work. One way we’re minimizing data access in Android 12 is by adding a new runtime permission for nearby connections. Until now, companion apps, such as those for watches and headphones, required the location permission to scan for nearby Bluetooth devices. We heard from users and developers that this was confusing, not to mention that it over-granted location access when the app just wanted Bluetooth. Going forward, companion apps can connect to their associated devices by requesting the new nearby devices permission.

Last year, we launched permissions auto-reset: if an app goes unused for an extended period of time, Android automatically revokes its permissions. Since the launch, permissions have been reset for 8.5 million apps. This year, we’re building on permissions auto-reset by intelligently hibernating apps that have gone unused for an extended period of time, optimizing for device storage, performance, and safety. The system not only revokes permissions granted previously by the user, but also force-stops the app and reclaims memory, storage, and other temporary resources. In this state, the system also prevents apps from running jobs in the background or receiving push notifications. Users can bring apps out of hibernation by simply launching them. As with permissions auto-reset, the user will be prompted when an app has gone into hibernation and can disable hibernation in Settings.

Last year we introduced scoped storage and package visibility. The scoped storage change restricts apps targeting SDK 30 and up from accessing the contents of external app storage directories outside their own. We added a special app access permission for certain qualifying apps that permits them to manage all files in shared storage.
Please note that as of May 5th, 2021, if your app does not require the MANAGE_EXTERNAL_STORAGE permission, you must remove it from the app’s manifest in order to successfully publish your app. If your app meets the policy requirements for acceptable use or is eligible for an exception, you will be required to declare this. Apps that fail to meet the policy requirements or do not submit a declaration form may be removed from the Play Store. Package visibility restricts apps from querying the list of apps installed on a device unless they’ve declared the QUERY_ALL_PACKAGES permission. However, the permission is restricted to qualifying apps, and policy enforcement will begin later this year. If your app does not meet the requirements for use of QUERY_ALL_PACKAGES, you must remove it from the manifest to comply with Play policy. Please visit the Play Console Help website to learn more.

To cover our fourth pillar, I’ll hand it back to Erik to share how we’re minimizing data access through the Private Compute Core. In Android 12, we’re using a Private Compute Core to deliver intelligence features to Android devices without compromising privacy. These private processes are provided by the Android operating system and guarantee two things: one, features can operate on sensitive data; two, users can be sure that their data is safe and that they remain in control. Android has been investing in intelligence features since we launched app suggestions in the Marshmallow release. Since then, we’ve launched more intelligence features with every release: Now Playing, Live Caption, Screen Attention, and more. These features all run locally on your Android device and don’t need to use the network or send the data they see to Google to work, so they don’t. In Android 12, we’re improving the protections around these features by creating a new concept of a Private Compute Core. Many features need to process sensitive data to work.
For instance, Now Playing can only recognize music nearby by listening, but the audio around you is really sensitive, so how can we protect it? To do this, Android 12 adds an OS-private computing layer that is technically isolated from the rest of the operating system, from other apps, and from the network. It’s where data processing happens in a secure and confidential way. We’re using this Private Compute Core in a few places. This isolated layer has enabled features users love, such as Screen Attention, Now Playing, and Live Caption, and we’re continuing to add more intelligent features that run locally. For example, Smart Reply lives inside Private Compute Core. When you’re typing in an app, Gboard asks Smart Reply to make suggestions based on the conversation on screen. Smart Reply needs to process the data on the screen, and thanks to Private Compute Core, this processing happens in a secure and confidential way. The sensitive data is never shared with the app, the keyboard, or Google.

In addition to adding new features, in Android 12 we’re also removing the direct network permission from Private Compute Core. Features within Private Compute Core will talk to a new open-source APK we’re calling Private Compute Services. Private Compute Services exposes very narrow, purposeful APIs to do things like download models, use federated learning, and more. And because the APK is open source, researchers and other interested parties can see what it’s doing and verify that it’s working responsibly. We think this pattern is a great way to deliver intelligence features, bringing the best of Google to Android devices: code that processes sensitive data runs inside an isolated part of the Android OS, and communicates over channels defined by open-source APIs that provide complete transparency. Android 12 is our most ambitious privacy release to date.
The team has been working hard to bring users features that add transparency to data access, empower greater control of private data, reduce access to data, and migrate more intelligence to privacy-preserving processes. We hope you found this session helpful. You can find a lot more details and documentation at developer.android.com. Thank you, everyone. Stay safe and healthy.
