What’s New in Android | Google I/O 2021

Hi, and welcome to What’s New in Android, the annual talk we give at Google I/O, a little bit differently. Instead of giving it onstage to a live audience that’s here, we thought we would invite you into the intimacy of our own homes and just have a conversation with you instead. I’m Chet Haase on the developer relations team. Oh, and I’m here. Sorry, this is Dan Sandler from the Android system UI team. That was just a preview of what this talk is going to be like. That’s right.

It’s going to be just ironclad timing, front to back. And I’m Romain Guy from the Android Toolkit team. And today we’re going to talk about everything going on in the Android… well, a bunch of stuff going on in the Android universe, starting with: we have this letter associated with the release, and every year the question is, what does that letter stand for? And I’m here to tell you… We are here to tell you that it stands for… Android 12. I’m sorry, we’re engineers. We deal with numbers and we’d better get used to it.

So let us talk about Android 12, starting with Romain. You wanted to talk about your favorite feature? Yeah, I want to talk about, I think, the best thing we’ve done in this release. Some of you may have noticed that in developer preview 1 of Android 12, we deprecated the deprecated annotation. I’m extremely happy to tell you that we’ve undeprecated the deprecated annotation after seeing your feedback. What truly happened was, there was a bug in our documentation generation script: it saw the deprecated annotation being used in the documentation, and so marked the class deprecated. So that’s fixed. I think this is a really good example of why we do these preview releases, and why the entire Android ecosystem benefits from several of these preview releases before we’re actually done with a release.

So, welcome to the current preview release of Android 12. And Dan, you want to tell us about one of the bigger elements of this release? Thanks, Chet. So Android 12: it’s our biggest overhaul to the Android user interface since the last biggest overhaul to the Android user interface. It’s actually pretty big. We dropped some hints about this in the first developer preview and the second developer preview. Some of you may have found more of those hints in the second developer preview than we meant to leave in there; good job. Some others may have found builds off the back of trucks.
Really, really have to start locking the doors on those trucks. There’s a lot more to show you in Beta 1, and hopefully in the keynote you saw some of the new hotness that isn’t even in the developer previews, not even in Beta 1 yet; that’s coming later.

Let’s talk a little bit about color. Every Android device has a very basic color scheme, right? In the device default theme you can get an accent color which, in AOSP, might be a particular color of teal, and on Pixel devices might be a particular color of blue, and so forth. If you as an app developer wanted to harmonize with that color, you could just use Theme.DeviceDefault or, more recently, Theme.DeviceDefault.DayNight, which allows you to adapt to light theme and dark theme as the user chooses. In Android 12, we expanded that palette considerably, and all the system surfaces use it and harmonize together. You can see in the screenshots that the lock screen, home screen, notifications, and settings are all using this much richer palette and are harmonized together. In fact, on Pixel devices we’re going to go a step further and select this palette automatically, based on your wallpaper. Other devices may do a different thing, but this idea of a much richer, broader system palette that you can access with device default is now a part of Android 12. Importantly, that means you can customize your device by choosing a wallpaper; you can change the mood of your device throughout the UI. Absolutely, and again, other OEMs may take this in a different direction. The API surface for you as a developer, if you want to harmonize, if you want to get in on this color scheme, is first of all, Theme.DeviceDefault is here for you.
It’ll automatically pick up whatever the current foreground, background, and accent colors are that come from that color palette. But if you want to go beyond device default, we’ve actually added a very complex and rich palette of five colors, two neutrals and three accent colors, in the entire twelve shades of the Material color numbering scheme, everything from 0 to 1000. This means that you as an app developer can use these resources (and this is a public API) to grab, say, android.R.color.system_accent2_800, and basically index into this comprehensive color palette that the whole device is using, in lots of different subtle shades. You know, Dan, it reminds me of old video game consoles like the NES, where you can use all the colors you want, as long as it’s one of the 56 colors predefined by the system. True, but we’ve also brought in another feature from those old consoles, which is that if your app crashes, you can just blow on the contacts to reboot it. So that’s a nice new feature too.

Hey, let’s talk about widgets. Don’t call it a comeback; they’ve been here for years and users love them. I saw a study recently that 84% of Android users have at least one widget on their home screen. About two thirds have two or more widgets installed on their home screens. But your widgets, and maybe some of ours, might look like they haven’t gotten a lot of love recently in terms of updated visual design. So we’ve done a lot of work in Android 12 to make widgets an even more delightful experience. We’ve done some work in the system UI to make selecting widgets a lot easier: there’s a better widget picker, you can search for widgets, and things like that. We’ve removed the required configuration step when the user places a widget, so again, that whole process is streamlined.
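Those system palette shades are ordinary public color resources, so an app can alias its own colors to them. A minimal sketch, assuming the alias names `brand_primary`, `brand_container`, and `brand_surface` are the app's own (hypothetical) names; the `@android:color/system_*` resources are the Android 12 public API, and the `values-v31` qualifier keeps this to devices running Android 12:

```xml
<!-- res/values-v31/colors.xml
     Alias app colors to the Android 12 dynamic system palette.
     The "brand_*" names are made up for this sketch. -->
<resources>
    <!-- Shade 500 of the first accent tonal palette -->
    <color name="brand_primary">@android:color/system_accent1_500</color>
    <!-- A deep shade of the second accent palette, as mentioned in the talk -->
    <color name="brand_container">@android:color/system_accent2_800</color>
    <!-- A near-white neutral for backgrounds -->
    <color name="brand_surface">@android:color/system_neutral1_50</color>
</resources>
```

A matching res/values/colors.xml with static fallback values would cover earlier releases, since these resources only exist on Android 12.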
But for you, the developer: to update your widgets to the new styles, all that stuff I described about colors, all that color palette stuff, you can use inside your widget layouts as well, so that they harmonize perfectly with the rest of the system UI and just look really, really good on the home screen. And there’s one more thing. I don’t even want to spoil it. There’s a whole new API to construct these same RemoteViews widgets that are backwards compatible to, like what, Cupcake? There’s a whole new API that you can use to build these widgets in a new way. I don’t want to spoil it here; there’s an I/O talk called Refreshing Widgets that I want you to go watch.

OK, I want to talk about launch animations. One of the jankiest things about the Android experience can be app cold start. When the system doesn’t already have a screenshot of your app’s current state to animate up, you launch the app and maybe you see a Holo gradient, maybe you see a Gingerbread title bar. It’s real bad. So in Android 12, one of the things we’re doing is adding a new splash screen, an animation that looks great for every app as a baseline; you have to do nothing to get this. Every app will get a nice new zoom-in animation that highlights your app’s icon, zooming it up from the launcher if necessary and, if possible, while your app is loading its main layout in the background. And if this animation doesn’t meet your needs, you can substitute your own, or you can customize the system version. You can specify an AnimatedVectorDrawable for your app icon, so you can actually get an animated version of your app’s icon zooming around or unfolding or whatever it wants to do. You can call Activity.getSplashScreen() to modify properties of the animation itself: the duration, the color of the screen, and things like that. The launch animations will look really good. We’re excited for you to try it.
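The splash screen customization described here is mostly declared in the launching activity's theme. A hedged sketch, assuming a `values-v31` theme overlay; the style name and the `@drawable/avd_splash_icon` AnimatedVectorDrawable are placeholders for the app's own assets:

```xml
<!-- res/values-v31/themes.xml
     Customize the Android 12 system splash screen from the theme.
     The icon drawable name is a placeholder for your own asset. -->
<style name="Theme.App.Starting" parent="android:Theme.DeviceDefault.DayNight">
    <!-- Background color of the splash screen window -->
    <item name="android:windowSplashScreenBackground">@android:color/system_neutral1_900</item>
    <!-- The animated icon shown in the center -->
    <item name="android:windowSplashScreenAnimatedIcon">@drawable/avd_splash_icon</item>
    <!-- Duration, in milliseconds, for the animated icon -->
    <item name="android:windowSplashScreenAnimationDuration">800</item>
</style>
```

Further runtime tweaks, like an exit animation, go through the Activity.getSplashScreen() object mentioned above.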
So Dan, if an application developer has their own custom splash screen experience right now, and now we’re introducing this new platform capability, are they supposed to do something to opt into this or to customize the thing that they have? You can continue to use the one that you’ve got. The nice thing about the system one is it’s ready immediately; yours has to wait for your app process to start and load all your libraries and things like that. So for the best possible experience, the system one will give you everything the user wants to see, which is an animation that starts immediately and looks really good. But again, you can plug into it in all these different ways and, if necessary, even substitute your own.

Okay, you knew this was coming. Let’s talk about notifications. I’m going to keep this short and sweet, like the best notifications. Two points I wanted to talk about. One is we’ve redesigned all the templates, which, I know, at this point I should just let you know when we don’t redesign the templates. That would probably be shorter. I could reuse the same slide from I/O that we’ve been using for the past eight or nine years in a row, and it could say the same thing: “By the way, we’ve redesigned the templates.” Not just the slides; we could just reuse what Dan said a few years ago. -Wait, we could just reuse Dan overall. -It would’ve saved me a lot of time. -Dan 1.0. -I could’ve taken a vacation day. Those who’ve been on this journey with us for a few years may notice familiar design elements from previous releases coming back, but we’ve updated those to match the new style: the new Material NEXT color palettes and themes, typography, even the corner rounding has changed. If you as a developer are using the standard templates, you know things will look great because you don’t have to do anything.
The API takes care of adapting it to the new style. But if you’ve carefully imitated the templates using your own custom RemoteViews, they’re going to look wrong again and you’re going to have a lot of work to do. In this release, you’re going to have more work to do than the last one, because we’ve actually fully deprecated fully custom RemoteViews. You can still use the decorated custom view style, where you specify just a small amount of RemoteViews layout to apply within the standard system decor. But we need to get away from this loop of developers having to reconstruct, carefully piece together, the new templates every year, and for every device as well. So fully custom RemoteViews don’t work as expected anymore. They will actually get moved into the decorated custom view style, so you’ll get the system decor around them that looks correct, but you’ll need to change what you put inside to ensure it works well in that context.

Okay, the number two thing about notifications I wanted to bring to your attention is trampolines. Continuing with our theme of the jankiest parts of Android: you’ve probably had this experience as a user, where you tap on a notification, the shade closes, and then nothing seems to happen for a few seconds. That’s because, instead of launching an activity directly from that notification, the app has sent a broadcast or kicked off a service so that it can do some other work, maybe centralize metrics, or just have a single entry point for all its notifications, before starting an activity from the background. We call that a trampoline, and it introduces this huge gap where it seems like the phone is just non-responsive. It’s measurable as latency, and we can’t generate a nice animation from the notification to your activity when those two things aren’t directly linked. So now you can’t do it.
Once your app starts targeting S, you as a developer must start an activity from the content intent of your notification. Otherwise, nothing will happen; the notification won’t work. So trampolines: can’t do it anymore.

I want to talk about toasts, which we almost never talk about. Toasts: those little pop-up gray boxes at the bottom of the screen that you’re not sure which app is sending you? We updated those again for Material NEXT. We took this opportunity to add attribution: the icon of the app that was sending the toast. So now the user knows, oh, that was my browser telling me that it had downloaded a thing, or Gmail telling me that it had archived a message, or something like that. You now know which app that toast is coming from. We’ve also reduced the amount of text you can put in a toast, so please keep it short. If you have more than just one or two very short lines of text to communicate to the user, that’s what a notification is for. Use a notification. And then finally, we’ve added some rate limiting to toasts. I think we did it in a prior release, but it’s still there and it’s worth mentioning as long as we’re talking about toasts. It’s common for the user to get into a situation where they’ve got a loop of, like, 50 toasts stacked on top of each other. You have to wait for each one to time out before you can see the next one, and they’re stuck at the bottom of the screen, and it’s really annoying. So now we’ve capped the number of toasts each app can have outstanding at a given time, and it’s a small number. Toasts: the bread and butter of the system UI of Android. Can we make sure that we call that, where they stack up on each other, a toast rack? I think that’s the phrase for it. I think loaf is what… Loaf, yes. Alright.

Something else that we’ve done to improve the visual quality of Android is to improve picture-in-picture.
So until now, when the user was going back to the home screen using the swipe-up-to-home gesture, the system had to wait for that transition to finish before it could animate the transition to picture-in-picture from your app, which meant it was very difficult for your application to control this experience. Now with Android 12, we have APIs where you can tell us ahead of time that that’s what you want to happen, so the system can neatly choreograph the animation of all those different windows to deliver a much better experience. In the same vein, we do what’s called seamless resizing when you resize a picture-in-picture window: on every frame of the resize animation, we resize the content. That works great for video content, but not so much for UI. So there’s a new API in Android 12 where you can use a crossfade instead, which will work a lot better for any kind of content that’s not as simple as a video. Particularly for UI elements, I think that really works well. Stretching your UI in weird ways doesn’t look great, but a crossfade for UI layouts: much better.

So Dan said that this is the biggest visual change we’ve made since Android 5.0. What’s great for us as engineers is that it gave us the opportunity to add new features that you’ve been requesting for many years, and one of them is the ability to blur content. So we have a new API on View called setRenderEffect, and you can use it on any view. Here I have an example where I’m applying the blur on an ImageView, but again, it will work with anything, and it works in real time. Much, much easier than trying to do it yourself. Now of course, we didn’t stop at views themselves. You can use blur in other places as well. You can blur what’s behind your window content, or you can blur what’s outside of your window content. You can do this from your theme, as you can see on screen, or you can do this programmatically using the WindowManager API.
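For the theme route mentioned here, window blurs are exposed as theme attributes on Android 12. A minimal sketch for a dialog-style window; the style name is made up, while the `android:windowBackgroundBlurRadius` and blur-behind attributes are the new API 31 attributes:

```xml
<!-- res/values-v31/themes.xml
     Blur the window's own background and the content behind it. -->
<style name="Theme.BlurryDialog" parent="android:Theme.DeviceDefault.Dialog">
    <!-- Blur applied to the window background drawable itself -->
    <item name="android:windowBackgroundBlurRadius">20dp</item>
    <!-- Blur everything behind the window -->
    <item name="android:windowBlurBehindEnabled">true</item>
    <item name="android:windowBlurBehindRadius">10dp</item>
</style>
```

Since users and devices can disable cross-window blurs, a runtime check with WindowManager.isCrossWindowBlurEnabled() (or its listener variant) lets the app fall back to a dim when blur is unavailable.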
So you can control the blur differently on views, behind the window content, and outside of the window content. Interestingly, blur is something that actually worked in the very first version of Android, because our first phone, the T-Mobile G1, had a special piece of hardware that could blur windows at composition time. But that was specific to that device, so it kind of stopped working afterwards and became just the dim behind that we know today. Yeah, we changed that one to a dimmed window, which is what people have been using since, and we kind of avoided enabling blur for years because it was really taxing on the system, right? You don’t want to make it really easy for developers to tie themselves in knots by enabling features that basically bring the device to its knees, but the increase in CPU and GPU power means that now we can start to do some of this stuff. Not only that, but it wasn’t part of our visual language, and with the big refresh that we’re doing, we decided that blur was a better fit. So that’s why we’re introducing it again.

Speaking of better fit: with the new Material design, we introduced this ripple effect. Any time you have a tappable area, like a list item or button, you see this nice ripple animation. So we’re refreshing this animation to add more subtlety to it. It’s not that subtle, actually. There’s a new sparkle effect that’s part of the ripple. It’ll be enabled by default everywhere; you needn’t do a thing in your application. If you see this effect, nothing’s wrong. We’re turning it on on your behalf. It won’t impact your application in any way. It’s completely compatible with the existing design, but it’s part of the new visual language of Android 12.

Another effect that we’re adding in Android 12, and that we’re turning on by default, is a new version of what we call the edge effect. In Gingerbread, we added an edge effect called the glow.
Whenever you reached the end of a scrolling container, either through a fling or a scroll, we would release this burst of energy in the form of a glow coming from the top or the bottom of the container. But this effect hasn’t really changed since, well, Gingerbread, and since we were redesigning the whole system for Android 12, we decided it was time for a fresh coat of paint. So now we’re doing a stretch, so it feels more organic, more natural. Again, we’re going to turn this on by default everywhere. As opposed to the ripple, this one may affect your application, because when the effect is in action, what we do is re-target all the rendering into an extra texture, which means you’re not rendering directly into the window anymore. So if you’re using things like advanced blend modes, that might change the way your application looks. We have APIs to let you control this effect, or you can redesign your application or work around it in many ways. We’ll probably talk more about that in the coming months. I was just going to say, it makes getting to the end of your inbox list a stretch goal. Oh my God. Oh my God. I’ve been dying to say that too, Chet. I was simply going to say that the other way this might be a challenge developers need to work out is if you are currently implementing your scroll by having multiple different views on screen at the same time, kind of tracking the same scroll Y: only the scrolling container will get the stretch effect. You’ll have to figure out how to get those rendered into the same view to make sure they all get stretched at the same time. Right.

On to graphics. Android 12 adds support for a new image format called AVIF. We’ve added support for new formats in the past, like HEIF or WebP, and those were better than JPEG or PNG on some axes. And I believe it will be the same with AVIF. It’s a single-frame version of the AV1 video codec.
So let’s take a look at an example. This is a picture of the Grand Canyon I took. It’s a JPEG that was about seven megabytes. Now, if we encode this image at a specific target size, in this case 800 kilobytes, we can compare the quality of AVIF and JPEG, and you can see that AVIF looks a lot better. It’s less blocky. We have less banding in the sky. So you can create images that are the same size on disk and get better quality, or you can keep the same quality as JPEG but use less space on disk; it’s your choice. On top of that, AVIF supports other options: you can do lossless compression, it has an alpha channel, and it supports HDR, animated images, and 10- and 12-bit color depth. There are encoders available online, and some web browsers, like Google Chrome, already support this format. If you want to know more, please check out What’s New in Android Media.

On to video. Android has supported the H.264 video format since, I think, the beginning of Android, but your application may not be able to support the formats that we’ve added afterwards, for instance H.265, HDR10 or HDR10+. So in Android 12, we’re adding the ability to automatically transcode media for applications that don’t handle those newer file formats. To do that, you create a new media capabilities XML file in your application. In that file, you list the formats that you can handle and the ones that you cannot handle. You point to that file in your manifest, and you’re done. Whenever you play a video, if it’s in one of the formats you do not support, such as HDR10+, we’re going to transcode it to the best available format that you can handle, like H.265. You can also do this using the API: when you create a content resolver request for media, you can tell us which formats you can handle and which you cannot. To learn more, please check out What’s New in Android Media. I love it.
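The media capabilities file described above is a small XML resource plus a manifest property pointing at it. A sketch, assuming the file lives at `res/xml/media_capabilities.xml`; the specific HEVC/HDR10 entries are illustrative:

```xml
<!-- res/xml/media_capabilities.xml
     Declare which modern formats the app handles; media in formats
     marked unsupported gets transcoded by the system on Android 12. -->
<media-capabilities xmlns:android="http://schemas.android.com/apk/res/android">
    <format android:name="HEVC" supported="true"/>
    <format android:name="HDR10" supported="false"/>
    <format android:name="HDR10Plus" supported="false"/>
</media-capabilities>

<!-- AndroidManifest.xml, inside <application>: point at the file above -->
<property
    android:name="android.media.PROPERTY_MEDIA_CAPABILITIES"
    android:resource="@xml/media_capabilities"/>
```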
I love that we took a situation where there were too many standards and solved it without adding another standard. It’s great.

On to audio. There’s a new feature called audio-coupled haptic playback. It’s a very fancy name for something that’s pretty simple and quite delightful. You can take an audio track from, say, a media player, and send it to the HapticGenerator, and it’s going to generate vibration patterns, or haptic patterns, based on that audio track. So let’s say you have a game, and you have an audio track with the footsteps of the main character or the sound of the rain; you can feed that to the HapticGenerator to automatically create these really nice vibration patterns for your user to increase immersion. Now, not all devices will be able to do this, so you can use the isAvailable API on HapticGenerator to make sure that the device the user is running on can support this feature. -I like it; it’s nice in any release to have some features that can really create some buzz.

I’ll talk about some other stuff going on in the platform. A big area in recent releases has been privacy, as we make it possible for the user to know what’s going on with their data, and also to stop applications from accessing it when they feel that isn’t necessary. So: transparency and clarity around access to data, making it clearer what’s going on in the system, and giving more control to the user. A couple of the areas in this release are around, first of all, location. I talked to a developer a couple of years ago at an event, back when we used to have live events and could have conversations in person, and this was a developer that had an application which would scan for nearby Bluetooth devices and then pair with them, and they were upset because their application was getting bad reviews on the Play Store because they needed the location permission in order to scan for those devices.
And the users did not understand why the application needed it. The developer didn’t understand why this permission needed to be used for their application, and in fact, we didn’t either. It was just a historical artifact of how all of these capabilities developed. So now in Android 12, we have a new Bluetooth permission, and when you use that, you have the ability to scan for these devices, as well as connect to them, without needing that location permission. So, yay.

Another change in the location area is that it’s now possible for the user to say, “Yes, you can have my location, but not my exact location. I don’t feel that you need that information.” They can tell the system they don’t want to share their exact location, and the app has to deal with it. Hopefully this means nothing for you: best practice for this permission, as well as all permissions, is to be able to handle the situation where the user disables it anyway, because they can always go into settings and disable that permission. This is just an easier way for them to do that in the moment when they’re asked.

For the clipboard: it’s still possible for you to request data from the clipboard, but now we’re going to tell the user when that’s happening. We already locked this down so that the only apps with access to the clipboard are the keyboard and the foreground application, but now, when that foreground application actually says, “Okay, give me that clip data,” there will be a toast on the screen that looks a lot like this, telling the user the clipboard is being accessed. The best practice here is that you may want to query the clipboard description first and see what kind of data is in there, because there’s a good chance the data is not applicable to your application anyway. You can do that query without triggering the toast; only when you request the actual data will the toast pop up in front of the user. Doesn’t that toast look nice?
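The new Bluetooth permissions mentioned above are declared in the manifest; the `neverForLocation` flag is the assertion that scan results are not used to derive location, which is what removes the location-permission requirement. A sketch:

```xml
<!-- AndroidManifest.xml: Android 12 Bluetooth runtime permissions.
     BLUETOOTH_SCAN with neverForLocation replaces the old
     location-permission requirement for scanning. -->
<uses-permission android:name="android.permission.BLUETOOTH_SCAN"
    android:usesPermissionFlags="neverForLocation" />
<!-- Needed to actually connect to devices after scanning -->
<uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
<!-- Legacy permission, limited to Android 11 and below -->
<uses-permission android:name="android.permission.BLUETOOTH"
    android:maxSdkVersion="30" />
```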
There’s the toast again. They’re all over the place.

Foreground service restrictions. So we are restricting launching foreground services from the background. Foreground services can end up being a real battery drain, because a lot of things happening in the background trigger things in the foreground, with notifications popping up all the time. There are a few changes we’re making. One is about those notifications popping up in front of the user: a lot of these foreground services are really short-lived. The application needed to run something really quickly, but it needed to be a foreground service, and therefore there would be this notification popping up for just a second or two. So now we actually delay that. If the notification would be gone within the first ten seconds, then it’s not going to appear at all, so there’s less visual noise for the user. Also, we’re generally disallowing foreground services from being launched from the background. There’s a new approach you need to take instead: the migration path is to use a new method called setExpedited. This is a new platform method, but fortunately we’ve also introduced it into WorkManager, in the unbundled library. So the way you migrate off of foreground services is to call the WorkManager API setExpedited, and on Android 12 and later, that will do the right thing and call the setExpedited API in the platform to run the quick job you needed to execute immediately. On previous releases, it will do what you were probably doing anyway, which is to create the foreground service that you need. So, very important for privacy. You should check out the following talks at I/O: What’s New in Android Privacy, Effective Background Tasks on Android, as well as Top 12 Tips to Get Ready for Android 12. What alliteration they’re having at these talks. Alright.

Drag and drop, and copy and paste, and keyboard stickers.
So these all sound very similar in my mind, right? There’s another application that has some data, and you’d like it in your application. And this is great, except to get all three of these paths going, you basically had three code paths to enable: whether the data got dragged and dropped onto your view, or copied and pasted from the clipboard, or you got this fancy keyboard sticker. Three code paths; that’s a lot of work. Maybe you didn’t get to all of them, meaning perhaps not all of them work for your application. What we’ve done in Android 12 is collapse all of them into a single API and code path: you now have this OnReceiveContentListener that you implement and then set, and it does the right thing for all three of these use cases. Now you only need to implement that one thing.

So the question is, with this platform capability, what do we do on previous releases? There are a couple of things to note. First of all, there is AndroidX support in current builds of AndroidX, but it’s a little bit limited, so I want to explain it a little. For one thing, it’s enabled for AppCompatEditText. They’re looking into enabling it for other views in the future, so stay tuned for that. Also, at the time I’m recording this, it was not plumbed all the way through to the platform, so the best practice is going to be to use the AndroidX API, which will go through the platform API; all the dots were not connected quite yet, but that should come online soon. Meanwhile, if you want to play with it now, you may want to play with the Android platform API directly. Otherwise, long-term, use the AndroidX API. Not only is it going to be easier and have some of that backward-compatible support, but there are also some utilities in there that make it easier to use in general than the platform API directly.
In the performance area, really quickly: there is a new library out there that helps you benchmark your app for larger use cases, like startup or scrolling a list, and you should check out the Measuring Jank and Startup with Macrobenchmark talk, where they’re going to go into details about that new Macrobenchmark library that just went into alpha.

Also, quickly, in the Wear area, there is a talk called Now Is the Time: What’s New with Wear. See what they did there? They’re going to be talking about a couple of new capabilities for tiles, a capability that we introduced a couple of years ago; now there’s a new API so you can get in on that UI action from your application. They’re also going to be talking about the next release of Wear.

In the Android tools area, of course, there are new releases of Android Studio all the time. 4.1 is stable. 4.2 is, I think, stable by now. Arctic Fox is chugging along, and you should check out the following talks to learn more about all of those and more: What’s New in Android Development Tools, What’s New in Design Tools, and What’s New in Android Gradle Plugin.

We also have many new announcements to make around Jetpack and Jetpack Compose, our new UI toolkit. Unfortunately, we don’t have time to cover those in this talk. I invite you to watch What’s New in Jetpack, where you’ll learn about the new stable versions of our libraries, the new alpha and beta versions, and also what it means for a Jetpack library to be alpha, beta, or stable. If you’re interested in Compose, and you should be, we have two talks for you: What’s New in Jetpack Compose and Using Jetpack Libraries in Compose.

And that is it. Thanks for tuning in. Thanks for coming to our house and joining us today, and enjoy the preview release and everything going on with Android. Thank you. Thanks, see you soon.
