"With the latest I/O conference, Google has finally publicly made public"
Public, you say?
"Contrary to other mobile platforms such as iOS, Windows or Tizen, which run software compiled natively to their specific hardware architecture, the majority of Android software is based around a generic code language which is transformed from “byte-code” into native instructions for the hardware on the device itself."
What I don't get is why people say the Windows Phone store's cloud back end compiles to native every time someone downloads an app. Can't they just keep the native code for each device, and compile it with full optimization just once? Yes, they have more devices than Apple, but they also tightly control which SoCs WP uses.
.NET on Windows (desktop) has supported AOT compilation since at least version 2.0, possibly before (I don't recall). It also caches the JIT images, so it's not 100% comparable to the way Dalvik works. Heck, even the user can generate native images for .NET programs by running the ngen.exe tool on .NET code.
Most commercial .NET programs either use AOT compilation or compile the entire program on first run.
The first time I booted the Nexus 7 2013 on the L preview, I actually killed the boot process. It was taking so long, I figured it had to be frozen. I must've screwed up the flashing, I thought. So I flashed again, and this time was more patient. The initial boot took quite a while, but it turns out it was probably related to these underlying changes in Android.
The Nexus 7 2013 never felt slow, but I didn't know it could run this fast. The browser scrolls with almost iOS-like smoothness. I say almost because there are (very) rare hiccups in the FPS, but I believe those may be due to the "preview" nature of this OS.
I am very happy and excited for where Android will go in the next year. I think Android can finally bring it to iOS and Windows Phone when it comes to interface/GUI smoothness.
I had no doubt Google would eventually get there in stock Android; the only question is whether OEMs will muck it all up during their "optimization" process before they ship out their phones.
I experienced the same thing with my Droid Bionic. After flashing it, I think I had to wait 30 minutes before I could even log in, due to ART compiling apps. I just left my phone on and plugged in and walked away.
But the experience was much better after that long wait. I assume that production firmware would already have that step included in the restore.
I have the 2012 Nexus 7, and while the performance is far from perfect (it is notably slow when loading apps and horrendous when anything is accessing the SSD), the latest Chrome Beta has brought nearly flawless performance and (seemingly) 60fps scrolling with little, if any, jank for all but very heavy sites. In fact, there are a number of apps that I use regularly that scroll flawlessly on this ageing system.
There are still apps that run slower than they should, and typically apps load slower than I would like, and many exhibit jank when scrolling (in fact anything with a list of pictures), and I'm hoping that Android L improves these situations (if the 2012 Nexus 7 is updated at all). But of late, I'm pretty happy with the optimizations made. Ever since the recent 4.4.4 update, it seems that the OS is running apps much more smoothly than ever before.
If L improves the performance as you describe, then Android will attain that fluidity and swiftness that iOS and WP have been known for across most apps, which will be very welcome.
I agree to an extent. Google is making great strides, and the L release appears to bring some very useful user experience "fixes." The only issue at this time is the color schemes we have seen recently in Google app updates and in the demoed Material Design Gmail app. The colors remind me of the colors from the basic 12-pack of Crayola crayons. They don't quite fit the slick new interface.
What has me most excited is four initiatives by Google to take back control over Android. First is the lack of customization for third parties using Android Wear, Android Auto and Android TV. Second is the introduction of stock Android phones under the One program. If the initiative takes off, that would be a lot of phones running Google-controlled and -delivered stock Android. Third is the as-yet unofficial Android Silver program, bringing Google Play Editions to carriers, with the software side apparently also to be controlled and delivered by Google. Putting "Silver" devices running stock Android in direct competition with the manufacturers' skinned phones should, and hopefully will, force the Samsungs of the world to up their game. Fourth is an iOS-style preview system for introducing new versions of Android. This sneak peek will, hopefully, allow the manufacturers to do their appropriate skinning and get updates out in a much more timely manner.
All told, exciting times for those who appreciate technology and the advances we've seen over the last 15-20 years. I'm not sure what that next BIG product category is. I'm not sold that it's smart watches. What is the elevator speech for a smart watch? It's not an intuitive buy or justification for a lot of folks.
For watches, once they are hard to distinguish from classic analog watches (thin design, top-quality screen tech, decent battery life) then the pitch is "high-tech fashion accessory" which you will be able to buy from Rolex and other expensive watch manufacturers.
But if you're talking about wearables as a class (i.e. not necessarily watches) then I think it has to be personal health monitoring. At first, it'll be just basic stuff like heart rate, blood pressure, exercise monitoring, but eventually (years from now) as the medtech improves, it may be able to do things like warn you of an impending heart attack or stroke, or perhaps a vitamin deficiency, etc.
If Google is willing to ship an update that effectively freezes the system on first launch for 30 minutes while providing no UI to explain what is going on, I don't think Apple has much to be worried about...
(Apple is not perfect on this score; in particular, there are times when an OS X shutdown leaves one watching a spinner for an uncomfortably long period while the system is doing god knows what. But they at least understand the principle of user feedback, ESPECIALLY during first boot.)
Right now, L is a Developer Preview. Not even really a beta. I'm sure Google understands user feedback is useful, but this is not something really designed for end users.
Given discussions about this and other issues in various places though, it seems many people don't understand this concept and are expecting far too much out of the preview.
On current Android devices, when you switch the runtime to ART, you get the "Android is upgrading; X of Y" progress bar on first boot. I'm sure once the L release is finalised, it will have a similar UI.
What's perplexing is why this isn't currently in place on the Dev Preview.
Seems the rumor is the old Nexus 7 might not see L... It'll be over two years old by the time L arrives officially, and being based on an old Tegra platform that not much else is using anymore, its chances are probably on the low side. I think ART was never enabled as a dev option under KK for it either, but don't hold me to that, you can check yourself tho (I've got a 2013, gave my sister a 2012 as a gift tho).
Google has just released the L-sources for the Nexus 7 2012.
Good update, although I find it interesting that all of a sudden Android apparently WAS not as smooth as iOS after all (which it indeed never was, really).
What I'm still missing (and I hope L will address this at some point) are more privacy controls. If (stock) Android grows a way to manage permissions after an app is installed I would be very glad.
The way permissions work on Android, enforcement is based on whether the app ever uses a specific permission, and the full set is announced at install time. You can't have after-the-fact permission management; the app either has all its permissions or it doesn't run. I believe it was done like that for performance reasons. It's also a hell of a lot easier for developers because you don't need to constantly check permissions before doing things.
If you don't like an app's permissions, don't run it. A system like you describe would be as horrible as the system that classic BlackBerry used, which sometimes required explaining to users how to go in and manually give apps X, X, X and X permissions.
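For illustration, here is a minimal sketch (using the long-standing android.content.Context permission API; the class and method names are hypothetical) of the per-call check developers would otherwise have to scatter through their code if permissions could be revoked at any time:

```java
import android.Manifest;
import android.content.Context;
import android.content.pm.PackageManager;

public final class PermissionGate {
    // Returns true only if this process currently holds the CAMERA permission.
    // Under the install-time model this is effectively constant for the life of
    // the app; with revocable permissions it would have to be re-checked before
    // every sensitive operation.
    public static boolean canUseCamera(Context context) {
        return context.checkCallingOrSelfPermission(Manifest.permission.CAMERA)
                == PackageManager.PERMISSION_GRANTED;
    }
}
```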
Once rooted, you will have more choice to control app permissions. I am currently using XPrivacy (Xposed Framework), which has fine-grained control over what an app can access. It almost becomes annoying with so many prompts for permission when an app runs.
Oh for crying out loud, give it a fscking rest. You're like the people who will excuse ANYTHING Apple does. The current Android permissions handling is a complete abortion, and it's obviously going to be changed to something more iOS-like in the future. And what are you going to do then, Mr "We have always been at war with EastAsia"? Complain that Google is making things worse with the new permissions system they introduce in Android P?
The "iOS browser smoothness" you are talking about is actually nowhere near the smoothness I experience with Dolphin on Jetpack on my Note 3. End of story
Dolphin is a good browser but scrolling is still a lot choppier than Chrome on either my Nexus 10 or Nexus 5. Smoothness is not exactly its best feature.
@darkich As difficult as it may be for you to believe, Dolphin 11.x with Jetpack enabled is not a consistently smooth experience across all devices. My HTC One M8 is such an example. A "heavy" website such as the newly redesigned androidcentral.com is buttery smooth on Chrome and the built-in HTC Internet app, but on Dolphin it loads slower and is a bit jittery when scrolling through the page. I realize that optimizing an app such as a web browser to be smooth across a large number of devices is difficult, but calling people flat-out liars because their experience differs from yours (on a different device, no less) just shows you don't know what you're talking about.
I know precisely what I am talking about. He was referring to scrolling performance specifically, and in general, scrolling on Dolphin Jetpack is by far the fastest and most fluid of any browser. Yes, I can also confirm that Dolphin has issues on some pages, but that doesn't change the overall picture when we talk performance and fluidity. Show me a browser that handles every page flawlessly and then I will give you a point.
As for Androidcentral, well, I just tested it on Chrome and Dolphin. A single swipe on Dolphin scrolls through the entire front page in desktop mode (in typical Dolphin Jetpack fashion). Chrome? Gets only about halfway through! And Safari is even far worse. There is just no comparison.
"Number of swipes to reach end of page" is not the same metric as "scrolling is buttery smooth at all times". In fact, they aren't even remotely related. The two of you are talking about completely different things, almost orthogonal to each other.
That was completely nonsensical. By the most basic and obvious logic, the speed of scrolling is the very first metric of its smoothness. If you have two wheels and spin them with the same amount of force, and one spins twice as long as the other, which one would you regard as the "smoother" one?
In the past I would have singled you out as being stupid, but I've seen a number of android users make the exact same utterly bizarre connection between scrolling speed and smoothness. Has it ever occurred to you that high velocity is used to hide jank and stuttering?
Setting scrolling velocity is just a decision of the developers; it is just a parameter. You can make the crappiest hardware scroll like mad. Smoothness (and getting the behaviour close to a believable, consistent physical model of inertia and friction) is really hard work that requires lots of things in the system working right to even allow trying. Android was never good (or consistently good) at that. Google has improved it with every version though.
"and getting the behaviour close to a believable, consistent physical model of inertness and friction"
.. And that is exactly where Dolphin trumps everything else, at least for me. Sure, it's not perfect at all times and on all sides (no browser is, again) but at its best, Dolphin Jetpack is the prime example of the description you gave.. It feels like a real, oily smooth mechanism
Darkich: perhaps this helps: 'smooth' is about the dropping of frames or (in)frequently stalling the drawing. It has nothing to do with how quickly you go to the bottom of a page as the browser can simply stop drawing for 1/10th of a second and show the bottom of the page and be fastest to the bottom - yet it would not be smooth at all.
So uhuznaa is right, scrolling speed has nothing to do with how smooth and fluid the UI is. It can be slow but never drop frames or fast but drop frames all the time.
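To make that concrete, here is a minimal sketch (assuming API 16+ and the stock android.view.Choreographer callback; the class name is hypothetical) that counts frames missed against the roughly 16.7 ms budget of a 60 Hz display:

```java
import android.view.Choreographer;

// Must be started from a thread with a Looper, e.g. the UI thread.
public final class FrameDropCounter implements Choreographer.FrameCallback {
    private static final long FRAME_BUDGET_NANOS = 16666667L; // ~60 fps
    private long lastFrameTimeNanos = -1;
    private long droppedFrames = 0;

    public void start() {
        Choreographer.getInstance().postFrameCallback(this);
    }

    @Override
    public void doFrame(long frameTimeNanos) {
        if (lastFrameTimeNanos > 0) {
            long delta = frameTimeNanos - lastFrameTimeNanos;
            if (delta > FRAME_BUDGET_NANOS) {
                // Every full budget we overshot by is one frame the user never saw.
                droppedFrames += (delta / FRAME_BUDGET_NANOS) - 1;
            }
        }
        lastFrameTimeNanos = frameTimeNanos;
        Choreographer.getInstance().postFrameCallback(this);
    }

    public long getDroppedFrames() {
        return droppedFrames;
    }
}
```

A browser that flings to the bottom of a page quickly but misses many of these deadlines along the way will still read as janky by this measure.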
I don't know if this is the reason but on my old iPhone 4 I can install 5 apps and continue to use an app at the same time without even noticing the installs going on in the background while doing the same on my Nexus 7 only leads to frustration. Same with loading lots of emails or anything else going on in the background. My Nexus always gets seizures and seems to hang for seconds when this happens. iOS seems to prioritize user interaction over everything while in Android user input seems to be treated as just another task to be handled sooner or later.
It is possibly part of the reason, but for what you talk about, the main reason, I think, is that the Linux kernel is not good at handling I/O while maintaining interactivity. This is actually being taken care of currently, but with the Linux kernel in Android so far behind mainline (Linux is at 3.15, Android at what, 3.5?) this might take a while to get fixed.
Throw in the fact that the Nexus 7 (2012) has a terrible NAND controller and some really cheap NAND chips, and you have the result that you have, aka a stuttering, slow, trash device.
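Whatever the kernel and NAND story, the usual app-side mitigation is to keep blocking work off the UI thread entirely; a minimal sketch with a plain background thread and a Handler (class and method names hypothetical):

```java
import android.os.Handler;
import android.os.Looper;

public final class BackgroundLoader {
    private final Handler mainHandler = new Handler(Looper.getMainLooper());

    // Runs the blocking work on a background thread and posts the result back,
    // so scrolling and touch handling keep their frame budget.
    public void load(final Runnable blockingWork, final Runnable onDone) {
        new Thread(new Runnable() {
            @Override
            public void run() {
                blockingWork.run();       // e.g. disk or network I/O
                mainHandler.post(onDone); // back on the UI thread
            }
        }).start();
    }
}
```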
Unlikely. Java applications should be the ones that worked anyway on x86. The applications least likely to work would be native applications, which a developer may not compile and distribute for x86. Those are most likely to be games, particularly since Google (bafflingly) discourages use of the NDK.
Why do you find Google discouraging the use of the NDK baffling? The whole reason is the subject of your conversation. Poor compatibility with multiple ISAs.
It's not a great reason to discourage the NDK. Many applications are written to be cross platform and run successfully on multiple architectures. It doesn't even increase your test load, since even if you're writing your application in Java you still do need to fully test it on every platform. Test is usually the expensive part. The exact wording of the page is just odd. It says that preferring C++ isn't a good reason to write your application in C++. That's a pretty obviously false assertion. Preferring C++ is a great reason to write your application in C++.
Not directly, since this applies to DEX which never had a compatibility issue. But it might convince some app developers to stop using NDK due to the improved performance of DEX binaries.
The frame drop counts seem very odd with respect to the total milliseconds delayed. (Or I'm bad at math.) At 60 FPS a frame is 16 ms. A 4ms GC sweep might drop a single frame at 60fps. The output indicates it dropped 30 frames. That's 750fps. Plausible if you're running without vsync or framerate limiting on a static screen like a splash screen, but that's not really a meaningful example, nor is it especially noticeable to the end user. More interesting would be the frequency of a frame drop in an application with extensive animation running at an average of 30fps. That's going to be a situation where you notice every frame drop.
It's a mistake in the article. Those log lines have nothing to do with each other. The Choreographer is reporting a huge amount of dropped frames because <unknown> took a really long time on the UI thread, *NOT* because specifically the GC took that time. This is actually pretty normal, as when an application is launched the UI loading & layout all happens on the UI thread, which the Choreographer reports as "dropped" frames even though there wasn't actually any frames to draw in the first place as the app hadn't loaded yet. So the 30 dropped frames there means the application took about 500ms to load, which isn't fantastic but it's far from bad.
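For reference, the arithmetic behind that figure, assuming a 60 Hz display:

```latex
\frac{1}{60\,\text{Hz}} \approx 16.7\,\text{ms per frame}, \qquad
30 \times 16.7\,\text{ms} \approx 500\,\text{ms of UI-thread work}
```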
If the compiler can statically prove that a given piece of code won't throw, there's no need to insert the exception handling support. Not all exceptions will be removed by the compiler though.
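A contrived sketch of the distinction (this is the general idea of check elimination, not ART's actual analysis; names hypothetical):

```java
public final class BoundsExample {
    // Here the compiler can prove i is always in [0, a.length), so the implicit
    // ArrayIndexOutOfBoundsException path can be dropped entirely.
    static int sum(int[] a) {
        int total = 0;
        for (int i = 0; i < a.length; i++) {
            total += a[i];
        }
        return total;
    }

    // Here the index comes from outside, so the bounds check (and the ability
    // to throw) has to stay.
    static int pick(int[] a, int index) {
        return a[index];
    }
}
```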
Re: "Because ART compiles an ELF executable, the kernel is now able to handle page handling of code pages - this results in possibly much better memory management, and less memory usage too."
There shouldn't really be any difference. Dalvik was carefully designed so that its odex format could be mmapped into RAM, allowing the kernel to do the same page handling as with ELF executables. (Actually a little better than regular ELF executables, since odex doesn't need any relocations that cause dirty pages.)
The "ProcessStateJankPerceptible" and "ProcessStateJankImperceptible" are the process coming in and out of the foreground. When it goes out of the foreground, the GC switches to a compacting mode that takes more time to run but saves more RAM. When it switches back to the foreground, it switches to the faster GC you have been looking at. The GC pauses here won't cause any jank, because these switches are never done when the app is on screen.
KSM mostly kicks in on virtualization loads, but the ksmd overhead is so small that including it on memory-starved, multiprocess devices isn't a bad idea. BTW, I believe that Android uses the dalvik-cache partition to avoid unnecessary re-JITting, but the space is limited and managed dynamically, so apps can be evicted.
I haven't noticed much improvement/difference in "UI smoothness" since enabling ART on my LG G2. What I *have* noticed is an almost 2 hour increase in my Screen-on-Time!
With Mahdi (Android 4.4.4) running Dalvik, I consistently get between 4 and 5 hours of SoT, depending on whether I'm reading books or playing games.
With Mahdi (Android 4.4.4) running ART, I now consistently get between 6 and 7.5 hours of SoT. Same apps installed. Same usage patterns. But much more screen usage between charges.
It's now at the point where I'm debating disabling things like JuiceDefender (radio management) to see how well the OS handles things by itself now.
I can't believe Google hasn't also adopted F2FS in Android L. It would've been perfect. How is it that they put it in Motorola devices a year ago, and they still can't make it default on stock Android?
Not really. It just depends on how the update is done.
If it's a "nuke'n pave" restore (like the Dev Preview or System Images), then it's not an issue. Backup your data to the PC/cloud, reformat all partitions, install, carry on.
If it's an in-place upgrade, then it becomes tricky. Unless, of course, you are using F2FS for the /data filesystem, which (really) is the only one that benefits from it. You don't need to make /sdcard (internal storage) F2FS, and you don't want to make /ext-sd (SDCard) F2FS as then you lose all non-Linux reader support. Nothing stopping you from using those as F2FS, though.
I'd really like to get a custom recovery for the G2 that allowed you to select which FS to use for each partition, and a ROM with a kernel that supported it, though. Just to try it out, and see how it works. :) Any takers? ;)
Yeah, I am really hoping for a big push towards F2FS in the coming months. I mean, Moto has shown the significant increase in performance which we can get.
> but bad programming practices such as overloading the UI thread is something that Android has to deal with on a regular basis.
I believe they've also added a new UI thread now in L. You should look into that. I think it's in one of Chet Haase's sessions, possibly in "What's new in Android".
> Google claims that 85% of all current Play Store apps are immediately ready to switch over to 64 bit - which would mean that only 15% of applications have some kind of native code that needs targeted recompiling by the developer to make use of 64-bit architectures.
Does this mean that OEMs could soon use "pure" AArch64 architectures? I think you can use ARMv8 purely in 64-bit mode, with no compatibility for 32-bit, too. I imagine that would make the chips less expensive and also more efficient for OEMs.
I'm not familiar with how Intel has its chips set up, but I think it would be a lot harder for Intel to get rid of the "32-bit" parts, and they are pretty much stuck with their chips being both 32-bit and 64-bit, at least for the next few years, until nobody in the world needs 32-bit anymore on any platform Intel chips run on, and then they could just redesign their architecture to be 64-bit only.
I've long suggested that this is exactly what Apple will do. I don't think they'll ditch 32-bit support for the A8, but I honestly would not be surprised if the A9 comes without 32-bit support and iOS9 has a 32-bit SW emulator to handle old apps. Then by iOS 11 or so they just ditch the 32-bit emulator.
Other vendors have the problem that they don't have a tight control over the entire eco-system. Qualcomm, for example, are not making Android chips, they're making ARM chips --- for anyone who wants an ARM chip. It's something of a gamble to just ditch 32-bit compatibility and tell anyone who wants that "Sorry, you should go buy from one of these competitors". Most companies (foolishly, IMHO) weigh the cost of backward compatibility as very low, and the cost of losing a sale (even if it's to a small and dying industry segment) as very high; so I suspect they're not even going to think about such an aggressive move until years after Apple does it.
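On a practical note, an app can already see which ABIs a device exposes at runtime; a minimal sketch using android.os.Build (SUPPORTED_ABIS is new in the L APIs, CPU_ABI is the older field; the class here is hypothetical and must be compiled against the L SDK):

```java
import android.os.Build;
import android.util.Log;

import java.util.Arrays;

public final class AbiLogger {
    private static final String TAG = "AbiLogger";

    public static void logAbis() {
        if (Build.VERSION.SDK_INT >= 21) {
            // Android L exposes the full list, including 64-bit ABIs such as arm64-v8a.
            Log.i(TAG, "Supported ABIs: " + Arrays.toString(Build.SUPPORTED_ABIS));
        } else {
            // Older releases only expose the primary and secondary 32-bit ABIs.
            Log.i(TAG, "Primary ABI: " + Build.CPU_ABI + ", secondary: " + Build.CPU_ABI2);
        }
    }
}
```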
Can somebody confirm or deny that the ART from KitKat is the same as the ART from L? What I have read points to ART from KitKat being different from ART on L.
"Google was not happy with this and introduced a new memory allocator in the Linux kernel, replacing the currently used “malloc” allocator" - Malloc allocator is not in the kernel. I dont think there was any change to the linux kernel in this. Malloc and Rosalloc are both done in user space in the ART lib. Both probably use the sbrk() system call to get memory from the kernel. Also a quick look at Rosalloc.cc code shows it is written in C++. So definitely cannot be in the linux Kernel.
The article mentions that startup times for devices will be worse with ART, but I don't understand why; surely if the code has already been compiled it will simply be cached somewhere, so it's just a case of executing it directly. In fact, this should mean that startup should be faster than normal.
In fact, the space requirement is another question mark; once an application has been compiled, does the byte code even need to be retained? Surely it can be discarded in that case? Though I suppose it's required to ensure that signatures don't change, it seems like the OS could enforce that differently (i.e., as long as the byte code validated pre-compilation, the compiled code is considered signed as well)?
I dunno, it just seems to me like there are plenty of ways to not only avoid slow-downs or extra storage use, but in fact there are ways to use ahead of time compilation to accelerate startup and reduce storage use.
I think you're correct. First time device startup and app installations will be longer, but once the compilation is done startup times shouldn't be slower.
It only makes sense that the application's first startup will take a long time. That first startup is where the ahead-of-time compilation is happening. Where else would it happen? Application startups after that will be much quicker, though, since the AOT compilation was already done beforehand.
Use ANY other benchmark. Who the hell knows how antutu works? For micro benchmarks try geekbench. If you're willing to do some compiling, linaro has a bunch of benchmarks it uses to determine progress.
I didn't realize you had to pay for it. Regardless, antutu is junk. Why? Because we don't know exactly what it does, or how it does it. The other option I mentioned is pick some of the linaro benchmark tools and compile them. I won't call you crazy for not buying apps because I don't know your situation. What I do, however, is try free versions and if they are good I buy them. They don't cost much and I don't waste battery with ads I'll ignore.
I thought it was clear that the ART in L is NOT the one in KitKat, and has been revamped quite a bit. The final one, 5 months from now, will probably have big changes, too.
Too much has been made regarding AOT and JIT. Note that Dalvik generally only JITs the DEX once, storing the result in /data/dalvik-cache.
The big difference between Dalvik and ART is simply that ART was rewritten from the ground up based upon everything they learned from the Dalvik experience.
It never ceases to amaze me how many problems that were solved decades ago in computing are problems on modern computing platforms.
Real compilation of code has been around forever -- the norm, in fact, for desktop and server computing with a few notable exceptions. Yet somehow taking what effectively amounts to interpreting code (just-in-time compilation is very similar to interpretation) and switching to compiling it ahead of execution is being touted as a new idea.
The fact that Android has pretty much been completely reliant upon JIT running in a VM has always made me scratch my head. As clearly spelled out in the article, it causes huge performance issues, along with significant hits to battery life. And we're talking about mobile devices where we've got relatively low-power CPUs and GPUs, little memory, and finite battery capacity. But it has been the way that Android has worked from the beginning. Crazy that it hasn't really been addressed until now.
And the idea that operating systems and development languages should be in charge of garbage collection, and people being surprised that it causes performance hits, seems odd to me too. Managing your own memory isn't that hard to do. And it is a hell of a lot more efficient doing it yourself than making the language or OS figure out how to do it. It's a "clean up your own mess and put things back where you want them" vs. "make someone else do it and let them try to figure out where things go" situation. It might make development easier for entry-level developers, but it certainly isn't an efficient way to do things when performance and user experience are important.
Because the developers that I work with aren't accustomed to managing memory, we're constantly running into issues. We've got scripts that allocate dozens or hundreds of megabytes of RAM and don't free it when they're done. They'll go through 3, 4, or 5 more of these processes within a single script, not freeing memory they're done with along the way, so by the time the script is done running hundreds of megabytes that aren't needed are still tied up. Because the language can't be sure if data is going to be used again it hangs around until the script has finished running.
Create dozens or hundreds of instances of one of those scripts and you've got a performance nightmare. Relying on a language or OS to do garbage collection will have the same net result.
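Even in a garbage-collected language you can be deliberate about this; a minimal sketch of the "clean up your own mess" habit in plain Java (names hypothetical):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public final class BatchProcessor {
    // Processes files one at a time, keeping only one file's contents reachable
    // at any moment instead of letting every batch accumulate for the whole run.
    public static void processAll(List<String> paths) throws IOException {
        for (String path : paths) {
            List<String> lines = new ArrayList<String>();
            // try-with-resources releases the file handle deterministically,
            // rather than waiting for a finalizer or a future GC pass.
            try (BufferedReader reader = new BufferedReader(new FileReader(path))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    lines.add(line);
                }
            }
            handle(lines);
            // lines goes out of scope here, so this batch is collectable
            // before the next file is read.
        }
    }

    private static void handle(List<String> lines) {
        // placeholder for the real per-file work
    }
}
```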
Nothing you say is wrong, but I think you hit the nail on the head with this sentence when it comes to Android: "It might make development easier for entry-level developers, but it certainly isn't an efficient way to do things when performance and user experience are important."
I personally think Android didn't care that performance was so bad in the early days. The point of Android, from what I can tell, was to make things open source and make it easy for developers. As you said, having the OS manage memory itself is meant to make programming easy. I think that's what made it attractive to the likes of Motorola, Samsung, and HTC in the beginning. I think that's what made it popular with the OEMs, and eventually, that's what users were getting used to.
Yes, precompiled code in interpreters is nothing new. But ART is changing what Android can do. It's not a new concept, I agree with you. But again, Android has had different priorities from the beginning than, say, writing purely in C and/or assembly for mission-critical or safety-critical systems where real time had better be real time or else that car/plane/space shuttle will crash, or even other less critical embedded systems like HDDs and SSDs where performance and power matter more than anything. I think Android has always been about the ease of its development environment, just like Java, and that's just where they put their priorities first. Now that their development environment is pretty well established, I think they're making the right steps toward improving performance, first with the JIT compiler in 2.2, then "Project Butter" in Jelly Bean, and now making the default runtime ART instead of Dalvik in Android "L". They just had different priorities, and well... look at where Android is now.
I think you're completely right about ease of development being the priority for Android early on, after all they had to establish a market and needed apps quickly and easily. After Google bought the OS it suddenly got lots of developer attention and they just ran with the setup as it was. If Google had made lots of changes at that time they might as well have rolled their own.
The answer is in the article: it was about memory management, really, and once it was baked in, all the development went towards improving what already existed.
After Oracle sued them (a suit still pending) over Dalvik and creating their own VM, it became abundantly clear that they needed to tear down the whole thing and start over.
Google adopted Java for Android because it was a mature programming language, popular with developers, that they didn't have to create from scratch and had features (i.e. running in a VM) that made it easy to create secure apps that would run on a multitude of different hardware platforms. Java also had an affordable (i.e. free) development environment (Eclipse) that Google could build their development tools around.
Clearly, with the incredible growth Android has enjoyed over the last six years, the decision to go with Java was anything but a mistake.
As for compiler technology, the necessity of running the same apps on multiple hardware architectures precluded the use of traditional desktop and server compilers, and the technology behind JIT compilers certainly hasn't been standing still over the last decade. The performance and battery deficits caused by the current VM environment are certainly not as bad as you think they are, given that modern Android tablets come pretty close to matching iOS, which only has one hardware platform and architecture to worry about and where the software can be tightly integrated with that sole platform. It's not as good, no, but it's good enough for Samsung to sell millions of phones in direct competition with the iPhone.
Yes, the time has come for Google to move on, but there should be nothing amazing about their use of a Java-based platform that has served them very well over the past six years. It was the right decision at the time.
I think they could have produced a much better product if they had used C++ instead - native performance and battery life when it was needed in the early days, and probably faster-than-iOS performance today.
I think you are absolutely right there. I doubt that merely doing AOT compiling is going to produce faster results and that's exactly what I experienced when I switched from Dalvik to ART in 4.4. Of course there are going to be more improvements in L since the code itself has improved. I mean who was launching an app on Android and wishing it would *launch* faster? There may have been apps that took their time launching. But not too many. On the other hand, better garbage collection and other improvements will certainly help in run-time performance. AOT is not doing anything much compared to JIT.
I always wondered why Google didn't buy Sun. The two companies have similar DNA (certainly a better match than Oracle and Sun), and Android could have used all the expertise Sun had in building JVMs and Real Time Java, in Android and across the rest of Google. They could have sold off the hardware division to IBM/Oracle and not have had to deal with the heartache and drama of the lawsuit.
You'd be amazed at how much a compiler can evolve while still in development. Most of the performance advantage from ART comes from AOT compilation. It can take the whole code and optimize it aggressively. For example, when compiling with GCC at the most aggressive optimization levels, you can get the whole program executing in the main function, with loop unrolling and vectorization, while taking into account the effects of having functions inlined, optimizing references to variables and parameter passing.
A JIT can only focus on the "hot spots", improving some parts of the program but it can't improve it as a whole because there's not enough performance history storage space to achieve that.
Then, you've got the new Garbage Collection algorithms which improve interactivity quite a lot.
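A toy illustration of the kind of whole-program transformation an AOT compiler gets to make (the general idea only, not ART's actual output; names hypothetical):

```java
public final class Vec3 {
    private final float x, y, z;

    Vec3(float x, float y, float z) { this.x = x; this.y = y; this.z = z; }

    float dot(Vec3 o) {
        // With the whole program visible ahead of time, a compiler can inline
        // this call everywhere, keep x/y/z in registers, and unroll or vectorize
        // the surrounding loops; a JIT typically only does this for code it has
        // already observed to be hot.
        return x * o.x + y * o.y + z * o.z;
    }

    static float sumDots(Vec3[] a, Vec3[] b) {
        float total = 0;
        for (int i = 0; i < a.length; i++) {
            total += a[i].dot(b[i]);
        }
        return total;
    }
}
```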
So many incorrect statements about JVMs in this article that it would take a half hour to list them all. Plus, nothing at all was said about Google's major motivator, which is that it is obvious Dalvik was stolen from Sun and the lawsuits aren't over. Finally, this is still a long way from true 64-bit and its benefits. For example, the only reason Apple can encrypt and decrypt fingerprints in real time is because encryption operations are dramatically faster in 64-bit. Way beneath AnandTech standards.
You are correct that Apple's decision to use 64-bit was partly because of the fingerprint scanner, but you are wrong that L is not fully 64-bit compatible. In fact, it is easier for Android to move to 64-bit because of the VM it runs on. The Linux kernel has always supported 64-bit, but Google's runtime and libraries have not, and consequently neither have the apps. Android L replaces the libraries and runtime with 64-bit compatible versions and "enables" 64-bit support for 85% of apps automatically, with no work from the developers. That's pretty impressive.
"they are using reference compression to avoid the usual memory bloat that comes with the switch to 64-bit. The VM retains simple 32-bit references." This feature was implemented in JDK6. Google just imported it into their new VM once ARMv8 (ARM 64-bit) became available. Still, 64-bit Android applications will use more memory if they're compiled by ART for 64 bits.
"Because ART compiles an ELF executable, the kernel is now able to handle page handling of code pages - this results in possibly much better memory management, and less memory usage too." I'm curious what effect KSM (kernel same-page merging) has on ART; it's definitely something to keep an eye on.
Also, the work won't end with this release. Like Dalvik before, ART will be improved as time goes by.
"vast increases in available storage space on today’s devices" Oh, you mean the vast increase to 16 GB on the Galaxy S5, the same as my iPhone 3GS from 2009?
"and is at the whim of the system to correctly manage things in an optimal manner"
You're showing your bias. Aside from large heaps, show me where the JVM is not handling memory in an optimal manner - and, to clarify, this should, from a cost-benefit perspective, outweigh the time it would take to implement in a lower-level language.
Millions pay the penalty thousands of times for successful programs - I think lots of development time could be justified if you looked at everyone's time. Imagine Android not needing so many tries at optimization and speed-up, and how that development time could have been spent instead.
Shortly after updating to KitKat 4.4.4 on my Nexus 5, I switched to ART. It took about 10 minutes to recompile. I really didn't notice any significant storage loss. However, I noticed a significant improvement in speed and overall responsiveness. For me, very noticeable at first, but now it's become the norm ..... as it should be. The N5 is already fast, but since ART .... it flies. Stock KitKat with ART on 4.4 on a Nexus 5 just smokes. Love it!
Now would be a very good time for those Android liars to come out and admit the old Android simply isn't up to the iPhone's standard. I would know, I believed their lies and bought a Note 3, and it lags like I am using a single-core computer from back in the 2000s.
There is nothing new in AOT and compiled-code caching techniques. Google is just copying what Sun Microsystems did long ago, not to mention the entire Android APIs are copied from Java.
Gigaplex - Tuesday, July 1, 2014 - link
"With the latest I/O conference, Google has finally publicly made public"Public, you say?
"Contrary to other mobile platforms such as iOS, Windows or Tizen, which run software compiled natively to their specific hardware architecture, the majority of Android software is based around a generic code language which is transformed from “byte-code” into native instructions for the hardware on the device itself."
Windows has a fair amount of .NET.
jeffkibuule - Tuesday, July 1, 2014 - link
I was about to make that comment but learned that at least for Windows Phone 8, it's not true. It uses a cloud compiler: http://www.reddit.com/r/programming/comments/1njas...Gigaplex - Tuesday, July 1, 2014 - link
I wasn't aware of that, thanks for the link. Windows Phone 8 isn't strictly the only Windows mobile platform though.tipoo - Thursday, July 3, 2014 - link
What I don't get is why people say the Windows Phone stores cloud back compiles to native every time someone downloads an app. Can't they just keep the native code for each device, and compile it to the full amount of optimization just once? Yes they have more devices than Apple, but they also tightly control which SoCs WP uses.skiboysteve - Wednesday, July 2, 2014 - link
Yeah C# is very similar to Java in this regard. It even has a large object heap just like ART does.However, as pointed out in other places... Its not JITted on the device. Its done in the cloud.
Flunk - Wednesday, July 2, 2014 - link
.NET on Windows (desktop_ has supported AOT compilation since at least version 2.0, possibly before (I don't recall). It also caches the JIT images so It's not 100% comparable to the way Dalvik works. Heck, even the user can generate native images for .NET programs by running the ngen.exe tool on .NET code.Most commercial .NET programs either use AOT compilation or compile the entire program on first run.
usama_ah - Tuesday, July 1, 2014 - link
The first time I booted the Nexus 7 2013 on the L preview I actually killed the boot process. It was taking so long, it had to be frozen. I must've screwed up the flashing, I thought. So I flashed again, and this time was more patient. The initial boot took quite a while, but turns out it was probably related to these underlying changes in Android.The Nexus 7 2013 never felt slow, but I didn't know it could run this fast. The browser scrolls to an almost iOS like smoothness. I say almost because there are (very) rare hiccups in the FPS but I actually believe those maybe 2/2 to the "preview" nature of this OS.
I am very happy and excited for where Android will go in the next year. I think Android can finally bring it to iOS and Windows Phone when it comes to interface/GUI smoothness.
jeffkibuule - Tuesday, July 1, 2014 - link
I had no doubt Google would eventually get there in stock Android, only question is whether OEMs will muck it all up during their "optimization" process before they ship out their phones.darwinosx - Thursday, July 3, 2014 - link
Google isn't there. Not true 64 bit and using any JIT still doesn't best compiled code which has also gotten a lot more efficient.Alexey291 - Sunday, July 6, 2014 - link
erm you actually don't know what you're talking about do you :)Dalvik is JIT. Dalvik is the OLD runtime.
ART is AOT. ART is the NEW runtime. That's precisely the precompiled code you were talking about.
And the 64bit is about as "true" as the said 64bit in ios.
/sigh
hechacker1 - Tuesday, July 1, 2014 - link
I experienced the same thing with my droid bionic. After flashing it, I think I had to wait 30 minutes before I could even login due to ART compiling apps. I just left my phone on and plugged in and walked away.But the experience after was much better after that long wait. I assume that production firmware would already have that step included in the restore.
highlnder69 - Thursday, July 3, 2014 - link
Where did you get the new firmware for the Droid Bionic that includes ART?bengildenstein - Wednesday, July 2, 2014 - link
I have the 2012 Nexus 7, and while the performance is far from perfect (it is notably slow when app loading and horrendous when anything is accessing the SSD), the latest Chrome Beta has brought nearly flawless performance and (seeming) 60fps scrolling with little, if no jank for all but very heavy sites. In fact, there are a number of apps that I use regularly that scroll flawlessly on this ageing system.There are still apps that run slower than they should, and typically apps load slower than I would like, and many exhibit jank when scrolling (in fact anything with a list of pictures), and I'm hoping that Android L improves these situations (if the 2012 Nexus 7 is updated at all). But of late, I'm pretty happy with the optimizations made. Ever since the recent 4.4.4 update, it seems that the OS is running apps much more smoothly than ever before.
If L improves the performance as you describe, then Android will attain that fluidity and swiftness that iOS and WP have been known for across most apps, which will be very welcome.
Samus - Wednesday, July 2, 2014 - link
If Google can deliver a product as polished as iOS, Apple has a lot more to be worried about that simply making phones with bigger screens.buckschris - Wednesday, July 2, 2014 - link
I agree to an extent. Google is making great strides, and the L release appears to bring some very useful user experience "fixes." The only issue at this time is the color schemes we have seen recently in Google app updates and in the demoed Material Design Gmail app. The colors remind me of the colors from the basic 12-pack of Crayola crayons. They don't quite fit the slick new interface.What has me most excited is four initiatives by Google to take back control over Android. First, is the lack of customization for 3rd parties using Android Wear, Android Auto and Android TV. Second, is the introduction of stock Android phones under the One program. If the initiative takes off, that would be a lot of phones running Google controlled and delivered stock Android. Third, is the as-of-yet unofficial Android Silver program - bringing Google Play Editions to carriers, with the software side apparently also to be controlled and delivered by Google. Putting "Silver" devices running stock Android in direct competition with the manufacturer's skinned phones should and hopefully will force the Samsung's of the world to up their game. Fourth, is an iOS system for introducing new versions of Android. This sneak peak will, hopefully, allow the manufacturers to do their appropriate skinning and get updates out in a much more timely manner.
All told, exciting times for those who appreciate technology and the advances we've seen over the last 15-20 years. I'm not sure what that next BIG product category is. I'm not sold that it's smart watches. What is the elevator speech for a smart watch? It's not an intuitive buy or justification for a lot of folks.
tacitust - Thursday, July 3, 2014 - link
For watches, once they are hard to distinguish from classic analog watches (thin design, top-quality screen tech, decent battery life) then the pitch is "high-tech fashion accessory" which you will be able to buy from Rolex and other expensive watch manufacturers.But if you're talking about wearables as a class (i.e. not necessarily watches) then I think it has to be personal health monitoring. At first, it'll be just basic stuff like heart rate, blood pressure, exercise monitoring, but eventually (years from now) as the medtech improves, it may be able to do things like warn you of an impending heart attack or stroke, or perhaps a vitamin deficiency, etc.
name99 - Wednesday, July 2, 2014 - link
If Google is willing to ship an update that effectively freezes the system on first launch for 30 minutes while providing no UI to explain what is going on, I don't think Apple has much to be worried about...(Apple is not perfect on this score; in particular there are time when OSX shuts down when one has an uncomfortably long period of watching a spinner while the system is doing god knows what. But they at least understand the principle of user feedback, ESPECIALLY during first boot.)
Devo2007 - Wednesday, July 2, 2014 - link
Right now, L is a Developer Preview. Not even really a beta. I'm sure Google understands user feedback is useful, but this is not something really designed for end users.Given discussions about this and other issues in various places though, it seems many people don't understand this concept and are expecting far too much out of the preview.
darwinosx - Thursday, July 3, 2014 - link
Everything Google does is a beta.Alexey291 - Tuesday, July 8, 2014 - link
so even alpha is a beta? kayphoenix_rizzen - Wednesday, July 2, 2014 - link
On current Android devices, when you switch the runtime to ART, you get the "Android is upgrading; X of Y" progress bar on first boot. I'm sure once the L release is finalised, it will have a similar UI.What's perplexing is why this isn't currently in place on the Dev Preview.
tacitust - Thursday, July 3, 2014 - link
It was probably not high enough on the priority list. Stability comes first, even for previews.NetMage - Tuesday, July 8, 2014 - link
So, never released then? :)darwinosx - Thursday, July 3, 2014 - link
They can't and even if they could Apple has nothing to worry about.Alexey291 - Sunday, July 6, 2014 - link
you're really mad ain't ya :)sprockkets - Tuesday, July 8, 2014 - link
He's our favorite apple troll. In reality he probably can't even afford to buy an apple ipod, so his mom got one for him.Alexey291 - Wednesday, July 9, 2014 - link
yup but his arguments are so inane... its almost as if he's paid to do this...Impulses - Wednesday, July 2, 2014 - link
Seems the rumor is the old Nexus 7 might not see L... It'll be over two years old by the time L arrives officially, and being based on an old Tegra perform that not much else is using anymore it's chances are probably on the low side. I think ART was never enabled as a dev option under KK for it either but don't hold me to that, you can check yourself tho (I've got a 2013, gave my sister a 2012 as a gift tho).uhuznaa - Wednesday, July 2, 2014 - link
Google has just released the L-sources for the Nexus 7 2012.Good update, although I find it interesting that all of a sudden Android after all WAS not as smooth as iOS (which it indeed never was, really).
What I'm still missing (and I hope L will address this at some point) are more privacy controls. If (stock) Android grows a way to manage permissions after an app is installed I would be very glad.
nathanddrews - Wednesday, July 2, 2014 - link
Yeah, I thought "butter" took care of that.darwinosx - Thursday, July 3, 2014 - link
I laughed when they said butter would do this and I'm laughing at this too.tacitust - Thursday, July 3, 2014 - link
Having fun trolling the Android threads?Alexey291 - Monday, July 7, 2014 - link
you sound too desperate to be actually laughing.Flunk - Wednesday, July 2, 2014 - link
The way permissions work on Android is enforcement is based on if the app ever uses a specific permission, and it's announced on install. You can't have after the fact permission management, the app either has all it's permissions or it doesn't run. I believe it was done like that for performance reasons. It's also a hell of a lot easier for developers because you don't need to constantly check permissions before doing things.If you don't like an apps permissions, don't run it. A system like you describe would be as horrible as the system that classic Blackberry used and that sometimes required explaining to users to go in and manually give apps X, X, X and X permissions.
edwpang - Wednesday, July 2, 2014 - link
Once rooted, you will have more choice to control app permissions. I am currently using XPrivacy(Xposed Framework)which has fine grained conttrol over what app can access. It almost becomes annoying with so many prompts for permission when an app runs.name99 - Wednesday, July 2, 2014 - link
Oh for crying out loud, give it a fscking rest. You're like the people who will excuse ANYTHING Apple does.The current Android permissions handling is a complete abortion, and it's obviously going to be changed to something more iOS-like in the future. And what are you going to do then, Mr "We have always been at war with EastAsia"? Complain that Google is making things worse with the new permissions system they introduce in Android P?
sprockkets - Wednesday, July 2, 2014 - link
Android L will have privacy controls built in. Announced in the keynote, not in the dev preview yet...darwinosx - Thursday, July 3, 2014 - link
Oh you better do some more reading. Google made privacy far worse.Alexey291 - Sunday, July 6, 2014 - link
its cool bro in ios there are no visible permissions. Your data is already being sold to the highest bidder :)sprockkets - Tuesday, July 8, 2014 - link
You need to just sod off apple trolldarkich - Wednesday, July 2, 2014 - link
The "iOS browser smoothness" you are talking about is actually nowhere near the smoothness I experience with Dolphin on Jetpack on my Note 3.End of story
Flunk - Wednesday, July 2, 2014 - link
Dolphin is a good browser but scrolling is still a lot choppier than Chrome on either my Nexus 10 or Nexus 5. Smoothness is not exactly its best feature.darkich - Wednesday, July 2, 2014 - link
Lol, that is a flat out lie!Have you used Dolphin Jetpack?
henrybravo - Wednesday, July 2, 2014 - link
@darkich As difficult as it may be for you to believe, Dolphin 11.x with Jetpack enabled is not a consistently smooth experience across all devices. My HTC One M8 is such an example. A "heavy" website such as the newly redesigned androidcentral.com is buttery smooth on Chrome and the built-in HTC Internet app, but on Dolphin it loads slower and is a bit jittery when scrolling through the page. I realize that optimizing an app such as a web browser to be smooth across a large number of devices is difficult, but when you call people flat out liars because their experience differs from yours (on a different device nonetheless) just shows you don't know what you're talking about.darkich - Wednesday, July 2, 2014 - link
I know precisely what I am talking about.He was referring to scrolling performance specifically, and in general, scrolling on Dolphin Jetpack is by far the fastest and most fluid out of any browser.
Yes, I can also conform that Dolphin has issues on some pages, but that doesn't change the overall picture when we talk performance and fluidity.
Show me a browser that handles every page flawlessly and then I will give you a point
darkich - Wednesday, July 2, 2014 - link
As for Androidcentral, well I just tested it on Chrome and Dolphin.A single swipe on Dolphin scrolls through the entire(in a typical Dolphin Jetpack fashion) front page in the desktop mode.
Chrome? Gets only about halfway through!
And Safari is even far worse.
There is just no comparison
phoenix_rizzen - Wednesday, July 2, 2014 - link
"Number of swipes to reach end of page" is not the same metric as "scrolling is buttery smooth at all times". In fact, they aren't even remotely related. The two of you are talking about completely different things, almost orthogonal to each other.darkich - Thursday, July 3, 2014 - link
That was completely nonsensical.By the most basic and obvious logic, the speed of scrolling is the very first metric of its smoothness.
If you have two wheels and spin them with the same amount of force, and one spins for twice longer than the other-which one would you regard as the "smoother" one?
sonicmerlin - Wednesday, July 2, 2014 - link
In the past I would have singled you out as being stupid, but I've seen a number of android users make the exact same utterly bizarre connection between scrolling speed and smoothness. Has it ever occurred to you that high velocity is used to hide jank and stuttering?darkich - Thursday, July 3, 2014 - link
Read the above comment.Are you saying the high scrolling velocity is unnecessary and has no practical benefit!?!?!
If that's indeed what you think, then you definitely won the stupid mark.
uhuznaa - Thursday, July 3, 2014 - link
Setting scrolling velocity is just a decision of the developers, this is just a parameter. You can make the crappiest hardware scroll like mad. Smoothness (and getting the behaviour close to a believable, consistent physical model of inertness and friction) is really hard work that requires lots of things in the system working right to even allow trying. Android was never good (or consistently good) at that. Google has improved it with every version though.darkich - Thursday, July 3, 2014 - link
"and getting the behaviour close to a believable, consistent physical model of inertness and friction".. And that is exactly where Dolphin trumps everything else, at least for me.
Sure, it's not perfect at all times and on all sides (no browser is, again) but at its best, Dolphin Jetpack is the prime example of the description you gave.. It feels like a real, oily smooth mechanism
jospoortvliet - Thursday, July 3, 2014 - link
Darkich: perhaps this helps:'smooth' is about the dropping of frames or (in)frequently stalling the drawing. It has nothing to do with how quickly you go to the bottom of a page as the browser can simply stop drawing for 1/10th of a second and show the bottom of the page and be fastest to the bottom - yet it would not be smooth at all.
So uhuznaa is right, scrolling speed has nothing to do with how smooth and fluid the UI is. It can be slow but never drop frames or fast but drop frames all the time.
sonicmerlin - Wednesday, July 2, 2014 - link
Lol no. As long as Android continues to run the UI thread on the core thread it will never be as smooth as iOS or WP.uhuznaa - Thursday, July 3, 2014 - link
I don't know if this is the reason but on my old iPhone 4 I can install 5 apps and continue to use an app at the same time without even noticing the installs going on in the background while doing the same on my Nexus 7 only leads to frustration. Same with loading lots of emails or anything else going on in the background. My Nexus always gets seizures and seems to hang for seconds when this happens. iOS seems to prioritize user interaction over everything while in Android user input seems to be treated as just another task to be handled sooner or later.jospoortvliet - Thursday, July 3, 2014 - link
It is possibly part of the reason, but for what you talk about the main reason, I think, is that the Linux kernel is not good at handling I/O while maintaining interactivity. This is actually currently being taken care off but with the linux kernel in android so far behind mainline (linux is at 3.15, android at what, 3.5?) this might take a while to get fixed.Alexey291 - Tuesday, July 8, 2014 - link
Alexey291 - Tuesday, July 8, 2014 - link
Throw in the fact that the Nexus 7 (2012) has a terrible NAND controller and some really cheap NAND chips, and you have the result that you have - aka a stuttering, slow, trash device.
dealcorn - Tuesday, July 1, 2014 - link
Does ART eliminate residual x86 compatibility issues? If so, ARM loses home field advantage.
johncuyle - Tuesday, July 1, 2014 - link
Unlikely. Java applications should be the ones that worked anyway on x86. The applications least likely to work would be native applications, which a developer may not compile and distribute for x86. Those are most likely to be games, particularly since Google (bafflingly) discourages use of the NDK.
Flunk - Wednesday, July 2, 2014 - link
Why do you find Google discouraging the use of the NDK baffling? The whole reason is the subject of your conversation. Poor compatibility with multiple ISAs.
johncuyle - Wednesday, July 2, 2014 - link
It's not a great reason to discourage the NDK. Many applications are written to be cross platform and run successfully on multiple architectures. It doesn't even increase your test load, since even if you're writing your application in Java you still do need to fully test it on every platform. Test is usually the expensive part. The exact wording of the page is just odd. It says that preferring C++ isn't a good reason to write your application in C++. That's a pretty obviously false assertion. Preferring C++ is a great reason to write your application in C++.
Exophase - Monday, July 7, 2014 - link
Not directly, since this applies to DEX, which never had a compatibility issue. But it might convince some app developers to stop using the NDK due to the improved performance of DEX binaries.
johncuyle - Tuesday, July 1, 2014 - link
The frame drop counts seem very odd with respect to the total milliseconds delayed. (Or I'm bad at math.) At 60 FPS a frame is 16 ms. A 4ms GC sweep might drop a single frame at 60fps. The output indicates it dropped 30 frames. That's 750fps. Plausible if you're running without vsync or framerate limiting on a static screen like a splash screen, but that's not really a meaningful example, nor is it especially noticeable to the end user. More interesting would be the frequency of a frame drop in an application with extensive animation running at an average of 30fps. That's going to be a situation where you notice every frame drop.
kllrnohj - Tuesday, July 1, 2014 - link
It's a mistake in the article. Those log lines have nothing to do with each other. The Choreographer is reporting a huge number of dropped frames because <unknown> took a really long time on the UI thread, *NOT* specifically because the GC took that time. This is actually pretty normal: when an application is launched, the UI loading & layout all happen on the UI thread, which the Choreographer reports as "dropped" frames even though there weren't actually any frames to draw in the first place, as the app hadn't loaded yet. So the 30 dropped frames there mean the application took about 500ms to load, which isn't fantastic but it's far from bad.
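For anyone who wants to watch those "dropped" frames themselves, the public Choreographer API (API 16+) hands the app the timestamp of every vsync-aligned frame, so you can measure how late the UI thread is running. A minimal sketch follows; the FrameWatcher class, the 17 ms threshold, and the log message are illustrative, and this is not how the system produces its own skipped-frames warning internally.

```java
import android.util.Log;
import android.view.Choreographer;

public class FrameWatcher implements Choreographer.FrameCallback {
    private static final String TAG = "FrameWatcher";
    private long lastFrameNanos = 0;

    // Call from the UI thread (Choreographer is per-Looper-thread).
    public void start() {
        Choreographer.getInstance().postFrameCallback(this);
    }

    @Override
    public void doFrame(long frameTimeNanos) {
        if (lastFrameNanos != 0) {
            long deltaMs = (frameTimeNanos - lastFrameNanos) / 1000000L;
            // At 60 Hz the budget is ~16.7 ms; 30 missed frames is roughly
            // 500 ms of the UI thread being busy, matching the log above.
            if (deltaMs > 17) {
                Log.w(TAG, "Frame took " + deltaMs + " ms (~" + (deltaMs / 17) + " frames late)");
            }
        }
        lastFrameNanos = frameTimeNanos;
        Choreographer.getInstance().postFrameCallback(this);
    }
}
```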
"Overhead such as exception checks in code are largely removed, and method and interface calls are vastly sped up"This doesn't make sense to me--are exceptions handled some other way, or do you just keep executing?
extide - Tuesday, July 1, 2014 - link
If an exception is not handled, your application crashes.
Gigaplex - Thursday, July 3, 2014 - link
If the compiler can statically prove that a given piece of code won't throw, there's no need to insert the exception handling support. Not all exception checks will be removed by the compiler, though.
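A minimal sketch of the kind of code this is talking about, where the implicit null and array-bounds checks can be statically proven to never fail; whether ART's compiler actually elides them in any particular build is up to its optimizer, and the Checks class here is just an illustration.

```java
public final class Checks {
    public static int sum(int[] values) {
        if (values == null) {
            return 0; // after this guard, 'values' is provably non-null below
        }
        int total = 0;
        // The loop bound is values.length, so every values[i] access is
        // provably in range and the per-element bounds check is redundant.
        for (int i = 0; i < values.length; i++) {
            total += values[i];
        }
        return total;
    }
}
```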
Impulses - Tuesday, July 1, 2014 - link
Hah, that PBT link was pretty funny... PBT FTW
hackbod - Wednesday, July 2, 2014 - link
Re: "Because ART compiles an ELF executable, the kernel is now able to handle page handling of code pages - this results in possibly much better memory management, and less memory usage too."There shouldn't really be any difference. Dalvik was carefully designed so that its odex format could be mmapped in to RAM, allowing the kernel to do the same page handling as with ELF executables. (Actually a little better than regular ELF executables, since odex doesn't need any relocations that cause dirty pages.)
The "ProcessStateJankPerceptible" and "ProcessStateJankImperceptible" are the process coming in and out of the foreground. When it goes out of the foreground, the GC switches to a compacting mode that takes more time to run but saves more RAM. When it switches back to the foreground, it switches to the faster GC you have been looking at. The GC pauses here won't cause any jank, because these switches are never done when the app is on screen.
tuxRoller - Wednesday, July 2, 2014 - link
KSM mostly kicks in on virt loads, but the ksmd overhead is so small that including it on memory-starved, multiprocess devices isn't a bad idea.
BTW, I believe that Android uses the dalvik-cache partition to avoid unnecessary re-JITting, but the space is limited, and therefore dynamic, so apps can be vacated.
jwcalla - Wednesday, July 2, 2014 - link
I predict that real-world performance gains are not going to be significant.
Impulses - Wednesday, July 2, 2014 - link
Based on...
peterfares - Wednesday, July 2, 2014 - link
Probably because every major release Google claims to have finally made Android smooth as "butter".
Hopefully this time it is, but probably not.
phoenix_rizzen - Wednesday, July 2, 2014 - link
I haven't noticed much improvement/difference in "UI smoothness" since enabling ART on my LG G2. What I *have* noticed is an almost 2 hour increase in my Screen-on-Time!
With Mahdi (Android 4.4.4) running Dalvik, I consistently get between 4 and 5 hours of SoT, depending on whether I'm reading books or playing games.
With Mahdi (Android 4.4.4) running ART, I now consistently get between 6 and 7.5 hours of SoT. Same apps installed. Same usage patterns. But much more screen usage between charges.
It's now at the point where I'm debating disabling things like JuiceDefender (radio management) to see how well the OS handles things by itself now.
Elrando_Horse - Wednesday, July 2, 2014 - link
I'm running it as my daily driver on my Nexus 5, and put forward that your prediction is inaccurate.
jabber - Wednesday, July 2, 2014 - link
Been using ART on my Nexus 4 since Jellybean came out. No issues. Life carries on as normal.
Krysto - Wednesday, July 2, 2014 - link
I can't believe Google hasn't also adopted F2FS in Android L. It would've been perfect. How is it that they put it in Motorola devices a year ago, and they still can't make it the default on stock Android?
uhuznaa - Wednesday, July 2, 2014 - link
Because changing the FS in an update sucks. You may see this in new devices, but not in updates for existing devices.
phoenix_rizzen - Wednesday, July 2, 2014 - link
Not really. It just depends on how the update is done.
If it's a "nuke'n pave" restore (like the Dev Preview or System Images), then it's not an issue. Backup your data to the PC/cloud, reformat all partitions, install, carry on.
If it's an in-place upgrade, then it becomes tricky. Unless, of course, you are using F2FS for the /data filesystem, which (really) is the only one that benefits from it. You don't need to make /sdcard (internal storage) F2FS, and you don't want to make /ext-sd (SDCard) F2FS as then you lose all non-Linux reader support. Nothing stopping you from using those as F2FS, though.
I'd really like to get a custom recovery for the G2 that allowed you to select which FS to use for each partition, and a ROM with a kernel that supported it, though. Just to try it out, and see how it works. :) Any takers? ;)
moh.moh - Wednesday, July 2, 2014 - link
Yeah, I am really hoping for a big push towards F2FS in the coming months. I mean, Moto has shown the significant increase in performance that we can get.
Krysto - Wednesday, July 2, 2014 - link
> but bad programming practices such as overloading the UI thread is something that Android has to deal with on a regular basis.
I believe they've also added a new UI thread in L. You should look into that. I think it's in one of Chet Haase's sessions, possibly in "What's new in Android".
I think I found it: https://www.youtube.com/watch?v=3TtVsy98ces#t=554
Krysto - Wednesday, July 2, 2014 - link
> Google claims that 85% of all current Play Store apps are immediately ready to switch over to 64 bit - which would mean that only 15% of applications have some kind of native code that needs targeted recompiling by the developer to make use of 64-bit architectures.
Does this mean that OEMs could soon use "pure" AArch64 architectures? I think you can use ARMv8 purely for the 64-bit mode, with no compatibility for 32-bit, too. I imagine that would make the chips less expensive and also more efficient for OEMs.
I'm not familiar with how Intel structures its chips, but I think it would be a lot harder for Intel to get rid of the "32-bit" parts. They are pretty much stuck with their chips being both 32-bit and 64-bit, at least for the next few years, until nobody in the world needs 32-bit anymore on any platform Intel chips run on, and then they could just redesign the architecture to be 64-bit only.
_zenith - Wednesday, July 2, 2014 - link
x86 also has a 16-bit mode AFAIK, so it's more complicated than that still. 80x86 is just a bitch of an ISA.
name99 - Wednesday, July 2, 2014 - link
I've long suggested that this is exactly what Apple will do. I don't think they'll ditch 32-bit support for the A8, but I honestly would not be surprised if the A9 comes without 32-bit support and iOS 9 has a 32-bit SW emulator to handle old apps. Then by iOS 11 or so they just ditch the 32-bit emulator.
Other vendors have the problem that they don't have tight control over the entire ecosystem. Qualcomm, for example, are not making Android chips, they're making ARM chips --- for anyone who wants an ARM chip. It's something of a gamble to just ditch 32-bit compatibility and tell anyone who wants that "Sorry, you should go buy from one of these competitors". Most companies (foolishly, IMHO) weigh the cost of backward compatibility as very low, and the cost of losing a sale (even if it's to a small and dying industry segment) as very high; so I suspect they're not even going to think about such an aggressive move until years after Apple does it.
coachingjoy - Wednesday, July 2, 2014 - link
Thanks for the info. Nice article.
moh.moh - Wednesday, July 2, 2014 - link
Can somebody confirm or deny that the ART from KitKat is the same as the ART from L? What I have read points to ART on KitKat being different from ART on L.
p3ngwin1 - Wednesday, July 2, 2014 - link
ART in the existing Preview release of "L" is already more advanced than KitKat's. The final release of ART in "L" will be even further changed from the current Preview of "L".
phoenix_rizzen - Wednesday, July 2, 2014 - link
Yeah, it's an evolutionary upgrade, not a revolutionary whole-hog replacement. Just as Dalvik in 4.4 is different from Dalvik in 2.3; it's an evolutionary upgrade.
tipoo - Thursday, July 3, 2014 - link
The current build of L is more developed and better performing with ART than KitKat, as will the final release be.
raghu.ncstate - Wednesday, July 2, 2014 - link
"Google was not happy with this and introduced a new memory allocator in the Linux kernel, replacing the currently used “malloc” allocator" - Malloc allocator is not in the kernel. I dont think there was any change to the linux kernel in this. Malloc and Rosalloc are both done in user space in the ART lib. Both probably use the sbrk() system call to get memory from the kernel. Also a quick look at Rosalloc.cc code shows it is written in C++. So definitely cannot be in the linux Kernel.jospoortvliet - Thursday, July 3, 2014 - link
On that C++ point - Linus has been coding C++ - http://liveblue.wordpress.com/2013/11/28/subsurfac... so who knows what the future holds ;-)
Haravikk - Wednesday, July 2, 2014 - link
The article mentions that startup times for devices will be worse with ART, but I don't understand why; surely if the code has already been compiled it will simply be cached somewhere, so it's just a case of executing it directly. In fact, this should mean that startup should be faster than normal.
The space requirement is another question mark; once an application has been compiled, does the byte code even need to be retained? Surely it can be discarded in that case? Though I suppose it's required to ensure that signatures don't change, it seems like the OS could enforce that differently (i.e. as long as the byte code validated pre-compilation, then the compiled code is considered signed as well)?
I dunno, it just seems to me like there are plenty of ways not only to avoid slow-downs or extra storage use, but to actually use ahead-of-time compilation to accelerate startup and reduce storage use.
Stochastic - Wednesday, July 2, 2014 - link
I think you're correct. First-time device startup and app installations will be longer, but once the compilation is done startup times shouldn't be slower.
metayoshi - Wednesday, July 2, 2014 - link
It only makes sense that the application's first startup will take a long time. That first startup is where the ahead-of-time compilation is happening. Where else would it happen? Application startups after that will be much quicker, though, since the AOT compilation was already done beforehand.
phoenix_rizzen - Wednesday, July 2, 2014 - link
AOT happens when the app is installed on the phone, or during the first boot after changing the runtime to ART.
hahmed330 - Wednesday, July 2, 2014 - link
One Stone... Three birds...
Notmyusualid - Wednesday, July 2, 2014 - link
Just switched my GS5 over to ART, from Dalvik, and my AnTuTu result dropped by 8%...
Yes, the choice is in the stock ROM; just go to developer options and select the runtime.
tuxRoller - Thursday, July 3, 2014 - link
Use ANY other benchmark. Who the hell knows how AnTuTu works?
For micro-benchmarks try Geekbench.
If you're willing to do some compiling, Linaro has a bunch of benchmarks it uses to determine progress.
Notmyusualid - Thursday, July 3, 2014 - link
Call me crazy, but I don't pay for apps... I take only the free ones.
I see no free Geekbench on the Play Store.
tuxRoller - Friday, July 4, 2014 - link
I didn't realize you had to pay for it. Regardless, AnTuTu is junk. Why? Because we don't know exactly what it does, or how it does it.
The other option I mentioned is to pick some of the Linaro benchmark tools and compile them.
I won't call you crazy for not buying apps because I don't know your situation. What I do, however, is try free versions and if they are good I buy them. They don't cost much and I don't waste battery with ads I'll ignore.
Krysto - Thursday, July 3, 2014 - link
I thought it was clear that the ART in L is NOT the one in KitKat, and has been revamped quite a bit. The final one, 5 months from now, will probably have big changes, too.
Notmyusualid - Thursday, July 3, 2014 - link
Will keep an eye out for it, but I'm expecting this to be no big deal now.
ergo98 - Wednesday, July 2, 2014 - link
Too much has been made regarding AOT and JIT. Note that Dalvik generally only JITs the DEX once, storing the result in /data/dalvik-cache.
The big difference between Dalvik and ART is simply that ART was rewritten from the ground up based upon everything they learned from the Dalvik experience.
errorr - Thursday, July 3, 2014 - link
That and because of the Oracle lawsuit over Dalvik which is nicely mooted by ART.
doubledeej - Wednesday, July 2, 2014 - link
It never ceases to amaze me how many problems that were solved decades ago in computing are problems on modern computing platforms.Real compilation of code has been around forever -- the norm, in fact, for desktop and server computing with a few notable exceptions. Yet somehow taking what effectively amounts to interpreting code (just-in-time compilation is very similar to interpretation) and switching to compiling it ahead of execution is being touted as a new idea.
The fact that Android has pretty much been completely reliant upon JIT running in a VM has always made me scratch my head. As clearly spelled out in the article, it causes huge performance issues, along with significant hits to battery life. And we're talking about mobile devices where we've got relatively low-power CPUs and GPUs, little memory, and finite battery capacity. But it has been the way that Android has worked from the beginning. Crazy that it hasn't really been addressed until now.
And the idea that operating systems and development languages should be in charge of garbage collection, and people being surprised that it causes performance hits, seems odd to me too. Managing your own memory isn't that hard to do. And it is a hell of a lot more efficient doing it yourself than making the language or OS figure out how to do it. It's a "clean up your own mess and put things back where you want them" vs. "make someone else do it and let them try to figure out where things go" situation. It might make development easier for entry-level developers, but it certainly isn't an efficient way to do things when performance and user experience are important.
Because the developers that I work with aren't accustomed to managing memory, we're constantly running into issues. We've got scripts that allocate dozens or hundreds of megabytes of RAM and don't free it when they're done. They'll go through 3, 4, or 5 more of these processes within a single script, not freeing memory they're done with along the way, so by the time the script is done running hundreds of megabytes that aren't needed are still tied up. Because the language can't be sure if data is going to be used again it hangs around until the script has finished running.
Create dozens or hundreds of instances of one of those scripts and you've got a performance nightmare. Relying on a language or OS to do garbage collection will have the same net result.
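In Java terms, the failure mode described above looks roughly like the sketch below: as long as a long-lived reference keeps data reachable, no garbage collector can reclaim it, so the fix is scoping and releasing references rather than a smarter collector. The ReportJob class and its buffer sizes are purely illustrative.

```java
import java.util.ArrayList;
import java.util.List;

public class ReportJob {
    // Anti-pattern: stashing each intermediate result in a long-lived field
    // keeps every stage's data reachable until the whole job finishes.
    private final List<byte[]> stages = new ArrayList<byte[]>();

    public void runLeaky() {
        for (int stage = 0; stage < 5; stage++) {
            stages.add(process(new byte[50 * 1024 * 1024])); // ~250 MB pinned by the end
        }
    }

    public void runScoped() {
        for (int stage = 0; stage < 5; stage++) {
            byte[] working = new byte[50 * 1024 * 1024];
            process(working);
            // 'working' is reassigned next iteration, so the previous stage's
            // buffer becomes unreachable and collectable before the next allocation.
        }
    }

    private byte[] process(byte[] input) {
        return input; // placeholder for real work
    }
}
```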
metayoshi - Wednesday, July 2, 2014 - link
Nothing you say is wrong, but I think you hit the nail on the head with this sentence when it comes to Android: "It might make development easier for entry-level developers, but it certainly isn't an efficient way to do things when performance and user experience are important."
I personally think Android didn't care that performance was so bad in the early days. The point of Android, from what I can tell, was to make things Open Source and make it easy for developers. As you said, having the OS manage memory itself is meant to make programming easy. I think that's what made it attractive to the likes of Motorola, Samsung, and HTC in the beginning. I think that's what made it popular with the OEMs, and eventually, that's what users were getting used to.
Yes, precompiling code instead of interpreting it is nothing new. But ART is changing what Android can do. It's not a new concept, I agree with you. But again, Android has had different priorities from the beginning than, say, writing purely in C and/or assembly for mission-critical or safety-critical systems where real time better be real time or else that car/plane/space shuttle will crash, or even other less critical embedded systems like HDDs and SSDs where performance and power matter more than anything. I think Android has always been about the ease of its development environment, just like Java, and that's just where they put their priorities first. Now that their development environment is pretty well established, I think they're making the right moves to improve performance, first with the JIT compiler in 2.2, then "Project Butter" in Jelly Bean, and now making the default runtime ART instead of Dalvik in Android "L". They just had different priorities, and well... look at where Android is now.
Hyper72 - Friday, July 4, 2014 - link
I think you're completely right about ease of development being the priority for Android early on; after all, they had to establish a market and needed apps quickly and easily. After Google bought the OS it suddenly got lots of developer attention, and they just ran with the setup as it was. If Google had made lots of changes at that time they might as well have rolled their own.
errorr - Thursday, July 3, 2014 - link
The answer is in the article: it was really about memory management, and once that was baked in, all the development went toward improving what already existed.
After Oracle sued them (still pending) over Dalvik and creating their own VM, it became abundantly clear that they needed to tear down the whole thing and start over.
tacitust - Thursday, July 3, 2014 - link
Google adopted Java for Android because it was a mature programming language, popular with developers, that they didn't have to create from scratch, and it had features (i.e. running in a VM) that made it easy to create secure apps that would run on a multitude of different hardware platforms. Java also had an affordable (i.e. free) development environment (Eclipse) that Google could build their development tools around.
Clearly, with the incredible growth Android has enjoyed over the last six years, the decision to go with Java was anything but a mistake.
As for compiler technology, the necessity of running the same apps on multiple hardware architectures precluded the use of traditional desktop and server compilers, and the technology behind JIT compilers certainly hasn't been standing still over the last decade. The performance and battery deficits caused by the current VM environment are certainly not as bad as you think they are, given that modern Android tablets come pretty close to matching iOS, which only has one hardware platform and architecture to worry about and where the software can be tightly integrated with that sole platform. It's not as good, no, but it's good enough for Samsung to sell millions of phones in direct competition with the iPhone.
Yes, the time has come for Google to move on, but there should be nothing amazing about their use of a Java-based platform that has served them very well over the past six years. It was the right decision at the time.
grahaman27 - Saturday, July 5, 2014 - link
Well said.
NetMage - Tuesday, July 8, 2014 - link
I think they could have produced a much better product if they had used C++ instead - native performance and battery life when it was needed in the early days, and probably faster-than-iOS performance today.
iAPX - Wednesday, July 2, 2014 - link
So why don't people upgrade, if it works so well on the Android side?
zodiacsoulmate - Thursday, July 3, 2014 - link
Very impressive
mstestzzz000 - Thursday, July 3, 2014 - link
Inaccuracy in the article:"This new allocator, “rosalloc” or Rows-of-Slots-Allocator, ..."
If you look at the source code for rosalloc (line 39 of https://android.googlesource.com/platform/art/+/ma...) they call it "A runs-of-slots memory allocator".
Milind - Thursday, July 3, 2014 - link
I think you are absolutely right there. I doubt that merely doing AOT compiling is going to produce faster results, and that's exactly what I experienced when I switched from Dalvik to ART in 4.4. Of course there are going to be more improvements in L since the code itself has improved. I mean, who was launching an app on Android and wishing it would *launch* faster? There may have been apps that took their time launching, but not too many. On the other hand, better garbage collection and other improvements will certainly help run-time performance. AOT is not doing much compared to JIT.
I always wondered why Google didn't buy Sun. Both companies have similar DNA (certainly more so than Oracle and Sun), and Android could have used all the expertise Sun had in building JVMs and Real Time Java, in Android and the rest of Google. They could have sold off the hardware division to IBM/Oracle and not have had to deal with the heartache and drama of the lawsuit.
Filiprino - Saturday, July 5, 2014 - link
You'd be amazed at how much a compiler can evolve while still in development.
Most of the performance advantage from ART comes from AOT compilation. It can take the whole program and optimize it aggressively. For example, when compiling with GCC at its most aggressive optimization levels you can end up with the whole program executing inside the main function, with loop unrolling and vectorization, with functions inlined and with references to variables and parameter passing optimized away.
A JIT can only focus on the "hot spots", improving some parts of the program, but it can't improve the program as a whole because there isn't enough room to store the profiling history needed to do that.
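A minimal sketch of the kind of code a whole-program, ahead-of-time compiler can attack: a trivial accessor it can inline and a fixed-stride loop it can unroll and hoist bounds checks out of. The Pixel class is illustrative only; whether ART performs these exact transformations in any given release is up to its optimizer.

```java
public final class Pixel {
    private final int alpha;

    public Pixel(int alpha) {
        this.alpha = alpha;
    }

    // Tiny method: an obvious inlining candidate.
    public int alpha() {
        return alpha;
    }

    public static int totalAlpha(Pixel[] row) {
        int sum = 0;
        // Fixed-stride loop over a known length: a candidate for unrolling
        // and for hoisting the array bounds check out of the loop body.
        for (int i = 0; i < row.length; i++) {
            sum += row[i].alpha();
        }
        return sum;
    }
}
```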
Then, you've got the new Garbage Collection algorithms which improve interactivity quite a lot.
seoagile - Thursday, July 3, 2014 - link
Thanks for the information.
darwinosx - Thursday, July 3, 2014 - link
So many incorrect statements about JVMs in this article it would take a half hour to list them all. Plus nothing at all was said about Google's major motivator, which is that it is obvious Dalvik was stolen from Sun and the lawsuits aren't over. Finally, this is still a long way from true 64-bit and its benefits. For example, the only reason Apple can encrypt and decrypt fingerprints in real time is because encryption operations are dramatically faster in 64-bit.
Way beneath AnandTech standards.
grahaman27 - Saturday, July 5, 2014 - link
You are correct that Apple's decision to use 64-bit was partly because of the fingerprint scanner, but you are wrong that L is not fully 64-bit compatible. In fact, it is easier for Android to move to 64-bit because of the VM it runs on. The Linux kernel has always supported 64-bit, but Google's runtime and libraries have not, and consequently neither have the apps. Android L replaces the libraries and runtime with 64-bit-compatible versions and "enables" 64-bit support for 85% of apps automatically with no work from the developers. That's pretty impressive.
Filiprino - Saturday, July 5, 2014 - link
"they are using reference compression to avoid the usual memory bloat that comes with the switch to 64-bit. The VM retains simple 32-bit references."
This feature was implemented in JDK 6. Google just imported it into their new VM now that ARMv8 (64-bit ARM) has become available.
Still, 64-bit Android applications will use more memory if they're compiled by ART for 64-bit.
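A rough illustration of what reference compression buys, using a made-up class; exact sizes depend on the VM's object layout, so the per-field numbers in the comments are only indicative.

```java
public class Contact {
    private String name;        // ~4 bytes compressed vs ~8 bytes as a raw 64-bit reference
    private String email;       // ~4 vs ~8
    private String phone;       // ~4 vs ~8
    private Contact manager;    // ~4 vs ~8
    // Reference fields alone: roughly 16 bytes vs 32 bytes per instance,
    // before the object header and any padding.
}
```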
Filiprino - Saturday, July 5, 2014 - link
But that will probably be outweighed by this: "Because ART compiles an ELF executable, the kernel is now able to handle page handling of code pages - this results in possibly much better memory management, and less memory usage too. I'm curious what the effect of KSM (Kernel same-page merging) has on ART, it's definitely something to keep an eye on."
Also, the work won't end with this release. Like Dalvik before it, ART will be improved as time goes by.
editorsorgtfo - Monday, July 7, 2014 - link
"vast increases in available storage space on today’s devices"Oh, you mean the vast increase to 16 GB on the Galaxy S5, the same as my iPhone 3GS from 2009?
LetsGo - Wednesday, July 9, 2014 - link
You do know the Samsung Galaxy S5 has a memory card slot that can handle 128 GB cards.
Thought not. You're an Apple user.
editorsorgtfo - Wednesday, July 9, 2014 - link
Google ruined that in KitKat.chadwilson - Tuesday, July 8, 2014 - link
"and is at the whim of the system to correctly manage things in an optimal manner"You're showing your bias. Aside from large heaps, show me where the JVM is not handling memory in an optimal manner, and to clarify this should from a cost benefit perspective outweigh the time it would take to implement in a lower level language.
NetMage - Tuesday, July 8, 2014 - link
Millions pay the penalty thousands of times for successful programs - I think lots of development time could be justified if you looked at everyone's time. Imagine Android not needing so many tries at optimization and speed-up, and how that development time could have been spent instead.
goobersnatcher - Saturday, July 12, 2014 - link
Shortly after updating to KitKat 4.4.4 on my Nexus 5, I switched to ART. It took about 10 minutes to recompile. I really didn't notice any significant storage loss. However, I noticed a significant improvement in speed and overall responsiveness. For me, very noticeable at first, but now it's become the norm... as it should be. The N5 is already fast, but since ART... it flies. Stock KitKat with ART on 4.4 on a Nexus 5 just smokes. Love it!
Peichen - Tuesday, July 15, 2014 - link
Now would be a very good time for those Android liars to come out and admit the old Android simply isn't up to the iPhone's standard. I would know; I believed their lies and bought a Note 3, and it lags like I am using a single-core computer back in the 2000s.
pankajdoharey - Monday, November 10, 2014 - link
Google is just making it look new, but it is simply the same strategy employed by Sun while building the JVM some 10 years ago during the HotSpot Java project.
pankajdoharey - Monday, November 10, 2014 - link
There is nothing new in AOT and compiled-code caching techniques. Google is just copying what Sun Microsystems did long ago, not to mention the entire Android APIs are copied from Java.