43 Comments

  • Phylyp - Wednesday, January 3, 2018 - link

    Coming on the heels of the Apple SoC throttling news, I wonder how these high-performance cores clocked at 2.9 GHz (up from 2.3 GHz) will perform once the batteries age, especially since it is still Samsung's 10 nm process with a small improvement.

    Samsung was quick to announce they don't throttle SoCs on old batteries, but that doesn't take away from the fact that Apple's solution was technically very sensible (they made a mess out of communications though).
  • Phylyp - Wednesday, January 3, 2018 - link

    As in - 10 nm LPP offers 10% performance increase over 10 nm LPE, but the clocks have gone up by more than 0.23 GHz.
  • Zeratul56 - Wednesday, January 3, 2018 - link

    I do wonder what the industry-wide effect of old batteries is. Many were quick to demonize Apple over this, but something rooted in the basic nature of batteries can't be limited to one vendor. I know two things affected Apple that don't affect most others: Apple keeps smaller-than-average batteries in most of its smartphones (especially the 4.7" variant), and the iPhone 6s seemed to be hit hard by this (some flaw in the SoC?)

    With all that said, I don't think other phones are unaffected, especially when considering things like clock speeds, which generally go up year to year on Android.

    I wish we could get an in-depth analysis of the battery issue from AnandTech, but that might be too much to hope for.
  • Jhlot - Friday, January 5, 2018 - link

    I recall that when my Samsung S3 got to around the 2-year mark I definitely noticed a loss in battery time, and then it started to randomly shut off with 10-20% battery left, which was very frustrating. I assume this occurred because the aging battery couldn't maintain the voltage or current when I fired up something that took more resources. So limiting the processor to stop random shutoffs from an old battery that can't keep up is interesting, and might be a fair trade for more predictable battery life, because a random shutdown when you think you have 10-20% left sucks if you were relying on it.
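
    The voltage-sag mechanism described above can be sketched with a little arithmetic. This is an editorial illustration, not data from any real device: the cutoff voltage, currents, and resistance values below are assumed round numbers chosen only to show the shape of the effect.

```python
# Toy numbers (not measurements) showing why an aged battery can shut a
# phone down at 10-20% indicated charge: internal resistance grows with
# age, so a current spike drags the terminal voltage below the cutoff
# even though usable charge remains.

V_CUTOFF = 3.0  # assumed minimum voltage before the PMIC shuts the phone down

def terminal_voltage(open_circuit_v, load_current_a, internal_resistance_ohm):
    """Terminal voltage under load: V = V_oc - I * R_internal."""
    return open_circuit_v - load_current_a * internal_resistance_ohm

v_oc = 3.6     # illustrative open-circuit voltage at ~15% charge
spike_a = 3.0  # hypothetical current spike when launching a heavy app

print(terminal_voltage(v_oc, spike_a, 0.10) >= V_CUTOFF)  # new cell (~0.1 ohm): True
print(terminal_voltage(v_oc, spike_a, 0.25) >= V_CUTOFF)  # aged cell: False -> shutdown
```

    In this toy model the shutdown isn't caused by a lack of charge but by the aged cell's higher internal resistance letting a load spike drag the terminal voltage below the cutoff, which matches the "random shutoff at 10-20%" symptom.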
  • lilmoe - Thursday, January 4, 2018 - link

    Samsung's Exynos doesn't have the efficiency problems that Apple's SoCs have had for years, the problems everyone refuses to write about. The only "blunder" I can think of was the 5410, and that was ARM's fault with the CCI-400. Neither do Qualcomm's recent Snapdragons, nor the last two Kirins, after ARM fixed its interconnect post-A57. They're much more sophisticated than what Apple is doing.

    Keep in mind that these chips are running much heavier, open software on much higher-resolution screens for apparently more time than Apple's.

    No, Apple's solution is in no way as sensible as you'd hope. There is nothing extraordinary about core CPU architecture nowadays, since we know pretty much everything there is to know about their design and their behavior in silicon. If you have the time and money, you can build whatever core you want.

    What matters more is what you do with what you got, and how much it can support, AND how much you can offload in the most efficient way and with least overhead.

    If you need to drop your clocks to well below half to prevent huge power consumption spikes, then you have a serious issue with your core architecture and/or platform as a whole. Apple knows this, hence the gradual move to the little cluster over the past 2 iterations.

    This, mind you, is giving Apple the benefit of the doubt, since lithium batteries are a pretty damn known quantity at this point, and an OEM knows beforehand how big a battery needs to be fitted to support a platform's longevity. If you still believe that Apple's SoCs have no problems, then that's even worse. Apple is either intentionally under-engineering the power delivery in their products so they don't last and perform as advertised for more than 1-2 years, or they're intentionally slowing down their devices via software. Either way, they're getting what they deserve over this.

    It's Apple's job to do the apology and fix, not yours.
  • Alistair - Thursday, January 4, 2018 - link

    You wrote too much. They have no SoC problem, just a battery current draw problem (which can be fixed with better batteries, or larger ones).
  • lilmoe - Thursday, January 4, 2018 - link

    They do. You guys need to stop trying to cover for Apple. Underclocking to 600 MHz is a problem with the SoC.
    And yes, they also have a problem with small batteries and power delivery.
  • StormyParis - Thursday, January 4, 2018 - link

    Apple (and most others, but Apple certainly was at the forefront of "courage" on that one too) have a battery problem because of a cascade of unforced errors:
    1- make the batteries small to start with (never mind that it's a known fact their capacity will shrink even further with use)
    2- make the batteries non-removable
    3- make the batteries not user-replaceable
    4- sell battery swaps ($3 battery, 5 minutes of work) for $80. Even at $30, they're making a killing, thank you.
    5- throttle old batteries, but don't give users a choice nor a warning
  • Spunjji - Thursday, January 4, 2018 - link

    Transient current draw is directly linked to the SoC and thus its design. You literally invalidated your own argument.
  • philehidiot - Thursday, January 4, 2018 - link

    Surely this is just an overall design issue, not just an SoC issue or battery issue. When designing a device like this you calculate the overall peak current draw you can take from the battery and then you budget that to different components. Other manufacturers have clearly taken the budget from what the battery can provide after a couple of years of life. Apple have taken a decision to allow better performance at the beginning of life knowing that this is not sustainable. I would argue this is misrepresenting the product as it will perform differently in stores and for reviewers than it will for the reasonable life of the product.
  • BurntMyBacon - Thursday, January 4, 2018 - link

    @spunjji

    YES!!! This is a transient current draw issue. There is more than one way to fix this problem. A few obvious(?) solutions are:
    1) Software safeguards try to prevent actions that cause larger transients (What Apple selected)
    2) Using a SoC or other hardware with less current draw or transient variability
    3) Use a larger battery (oversized batteries relative to the task have less voltage drop on transient events)
    4) Use a more advanced power regulation circuit (This is not entirely unlike how more advanced VRM circuitry on motherboards will have less voltage droop due to transients)

    Each of these solutions involves trade-offs in cost, performance, size, and weight. The software solution could have been chosen for several reasons including, but not limited to:
    1) The problem may not have been obvious until after the phones were already shipping, preventing hardware solutions for existing models. This scenario is weak, but not impossible, as Apple released the software solution prior to major issues if I recall correctly.
    2) Hardware solutions involved trade-offs that Apple was unwilling to make. (Lower performance, larger device, higher manufacturing cost, etc.)
    3) The decision makers decided that a software solution was cheaper and wouldn't be noticeable within the average usage cycle of their target audience. (U.S. carriers have a two-year "free" upgrade cycle.) Motivating people to upgrade from older phones would be a windfall in this scenario.
    4) (Conspiracy theory to which I don't subscribe) There is no major battery issue, or the issue was purposefully built into the device. The software "solution" is the latest in a long line of attempts by Apple to force their customers to upgrade.

    I won't go into why the conspiracy theory is unlikely other than to say that there are simpler explanations that seem more likely.
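
    For illustration, option 1 above (the software safeguard) can be sketched as a frequency governor that picks the fastest DVFS state whose worst-case current the battery can still supply. Every number here is hypothetical: the frequency table, the per-MHz current figure, and the resistance values are made up to show the idea, not taken from any real device or from Apple's actual implementation.

```python
# Toy model of a software safeguard (option 1 above): choose the highest
# frequency whose assumed worst-case transient current keeps the battery's
# terminal voltage (V_oc - I * R_internal) above the brown-out cutoff.

def max_safe_freq(freqs_mhz, amps_per_mhz, v_oc, r_internal, v_cutoff):
    """Highest DVFS state whose peak current the cell can supply."""
    budget_a = (v_oc - v_cutoff) / r_internal  # max current before V sags below cutoff
    safe = [f for f in freqs_mhz if f * amps_per_mhz <= budget_a]
    return max(safe) if safe else min(freqs_mhz)

FREQS = [600, 1100, 1800, 2400]  # hypothetical DVFS states, MHz
AMPS_PER_MHZ = 0.00125           # assumed peak-draw scaling, A per MHz

print(max_safe_freq(FREQS, AMPS_PER_MHZ, 3.7, 0.10, 3.0))  # new cell: 2400
print(max_safe_freq(FREQS, AMPS_PER_MHZ, 3.7, 0.25, 3.0))  # aged cell: 1800
print(max_safe_freq(FREQS, AMPS_PER_MHZ, 3.7, 0.60, 3.0))  # heavily aged: 600
```

    Options 3 and 4 attack the same inequality from the other side: a bigger battery or better regulation effectively lowers the internal resistance seen by the SoC, which raises the current budget and lets the governor keep the higher states.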
  • Spunjji - Thursday, January 4, 2018 - link

    "Technically very sensible"

    Not even slightly, it's a kludge. I know of a few Android devices in my time that have had similar premature-shutdown issues after reaching 2-3 years of age. Meanwhile nearly every iPhone 6 / 6S owner I know has had this problem starting after iOS 10 was released and between 12-18 months of age. Speaking less anecdotally, I used to work in an Apple store and the number of people coming in with battery issues was tremendous.

    Apple made out that it was a problem with a batch of batteries, which the very testing we performed in-store suggested was utterly untrue. They have been thoroughly opaque all the way through (as pretty much every manufacturer is when they make a cock-up of this magnitude, to be fair).

    What seems to be unique to Apple is the degree to which people are prepared to run defence for them on a problem that is entirely of their own making and does not affect any other manufacturer's devices to the same extent.
  • FullmetalTitan - Thursday, January 4, 2018 - link

    I'm more inclined to believe it is a systematic design issue given how tight the performance specs are for Apple devices. The acceptable tolerance window for an A8/9/10 chip was significantly narrower than for the same-generation Exynos or Snapdragon part, and yet the Apple parts are overwhelmingly more likely to need to be throttled in this way to maintain basic functionality.
  • Zingam - Saturday, January 6, 2018 - link

    But does it melt down and vomit Spectres?
  • shabby - Wednesday, January 3, 2018 - link

    When are we going to go past 29 GB/s of memory bandwidth? The SD810 had 25 GB/s and the 820/835 had 29 GB/s, what's the holdup?
  • webdoctors - Wednesday, January 3, 2018 - link

    SoCs have gone past that, but not in phones:
    https://www.anandtech.com/show/10596/hot-chips-201...

    Maybe there's no performance demand in phones to drive more BW ?

    It does indicate perf will be limited for any high-BW applications like graphics on mobile for the foreseeable future. Will need to wait for Apple to bring high-quality apps to mobile.
  • MrSpadge - Thursday, January 4, 2018 - link

    Bandwidth costs power, so it's not wise to use more than you really need. The recent ARM designs focussed on extracting better real world performance from the same maximum bandwidth, which I'd say is better than simply adding more hardware.
  • ZeDestructor - Thursday, January 4, 2018 - link

    Also pin count and memory controller die area. If you want to keep the PoP setup for space constraints, you become unable to grow the pin count. Result: we're probably stuck with a 64-bit bus on phones until HBM/HMC and interposers/EMIB get cheap enough.
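
    The bus-width point checks out with simple arithmetic: a 64-bit PoP interface at standard LPDDR4/4X transfer rates lands almost exactly on the figures quoted above. (The pairing of data rates with specific Snapdragon parts is taken from the comments here, not independently re-verified.)

```python
# Peak DRAM bandwidth = bus width (in bytes) * transfer rate (in MT/s).

def peak_bandwidth_gbps(bus_width_bits, transfer_rate_mtps):
    """Peak bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * transfer_rate_mtps / 1000

print(peak_bandwidth_gbps(64, 3732))  # 29.856 -> the ~29 GB/s figure
print(peak_bandwidth_gbps(64, 3200))  # 25.6   -> the ~25 GB/s figure
```

    So without more pins (a wider bus) or a faster signalling standard, the ~29 GB/s ceiling is exactly what the pin-count argument predicts.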
  • jjj - Wednesday, January 3, 2018 - link

    Any chance you guys are testing the fixes for Spectre and Meltdown on ARM? And if you do test, can you please also look at power and not just perf?
  • Ryan Smith - Thursday, January 4, 2018 - link

    https://www.anandtech.com/show/12214/understanding...
  • jjj - Thursday, January 4, 2018 - link

    The danger with Spectre (and this is going to sound way crazy, but nowadays it's possible) is that it might change how folks look at security, and some vendors might end up accepting higher security/privacy risks in consumer parts to gain some perf, lower costs, and faster time to market. They can even charge extra for "more secure" SKUs; to some degree they already do that.
    Modern software collects most data anyway, so some hardware folks might decide that only some data has to remain sacred while most of it can be freely available. Yeah, it does sound nuts, and it is really bad, but most of the data is already compromised by software, so it's "pointless" to secure the hardware.
  • skavi - Wednesday, January 3, 2018 - link

    Dang, they're leapfrogging Qualcomm in terms of cache. Twice the big L2 and twice the L3 compared to the 845.
  • lilmoe - Thursday, January 4, 2018 - link

    Samsung needs to completely ditch Qualcomm and go fully vertical, at least for their S and Note series. They've been held back for a while. Just do it and worry about the lawsuits later... Or just pay up for the necessary licensing in the US and China.
  • tuxRoller - Friday, January 5, 2018 - link

    Held back?
    Did you read the AT article comparing 835 & the 8895?
  • lilmoe - Friday, January 5, 2018 - link

    Ah, yes, benchmark scores, testing workstation workloads, at constant max clock speeds, on a smartphone SoC. Cool story.

    People are too lazy/stubborn to change how they think about "performance", let alone efficiency. But I guess the current methods are fine for publications. They generated adequate hits and clicks, for cheap. I'll leave it to someone who has the time and means to make practical/meaningful comparisons. Before, Anandtech had really great articles, in which the trained eye could discern the metrics that truly matter. Not anymore... Oh well.

    Anyway, Exynos has been consistently faster, more efficient and (more importantly) more "feature rich" than Snapdragons for years in real world applications. Exynos is simply superior in image, audio and video processing. The photos and videos taken using the international variant have always been better. The media encoder/decoder has been the industry's gold standard since the Hummingbird.

    As an example, Samsung's SoCs have had 4K120 HEVC encode/decode since the last iteration (8895 in the S8) and 4K60 since the 8890 (S7) almost 2 years ago (they added 10-bit this year). Snapdragons are only getting 4K60 now with the 845. Other than the FPS, this tells me that the Exynos will be significantly more efficient at recording 4K60 video compared with the Snapdragon. It wouldn't be "fair" for US/Chinese customers if the international variant had better features. The Exynos version can clock higher, but they still clock it conservatively to maintain "benchmarking parity"... So yeah, they've been held back, by both Qualcomm on the hardware side and Google on the software side. This isn't anything new, we've been saying this for years.
  • StrangerGuy - Saturday, January 6, 2018 - link

    Exynos is all around better than SD835, and shockingly even in radio performance according to some Chinese guys with an industrial RF tester.
  • Santoval - Friday, January 12, 2018 - link

    I do not necessarily subscribe to the "held back" theory, but you do realize that when they release the same phone globally with two distinct SoCs they need to roughly match their performance, right?
  • Arnulf - Thursday, January 4, 2018 - link

    There are no "ISO-power" and "ISO-performance". There are "iso-power" and "iso-performance" and neither have anything to do with the International Organization for Standardization.
  • MrSpadge - Thursday, January 4, 2018 - link

    "Iso" in this context means the latin word for "same", as often used in science. I don't think the article implies anywhere it would mean "International Organization for Standardization".
  • phoenix_rizzen - Friday, January 5, 2018 - link

    I believe the complaint is over the capitalisation of ISO- in a previous version of the article.

    In the current version of the article, there's still a type (isi-power), but at least it's all in lower-case now.
  • Santoval - Friday, January 12, 2018 - link

    "there's still a type (isi-power)"
    A typo about a typo : a metatypo!
  • SydneyBlue120d - Thursday, January 4, 2018 - link

    Please verify if there is L5 location support on par with Broadcom BCM4775X:

    • GPS L1 C/A
    • GLONASS L1
    • BeiDou (BDS) B1
    • QZSS L1
    • Galileo (GAL) E1
    • GPS L5
    • Galileo E5a
    • QZSS L5

    Thanks a lot!
  • tuxRoller - Friday, January 5, 2018 - link

    AFAICT Broadcom is still the only vendor to offer L5 signalling support.
  • smalM - Thursday, January 4, 2018 - link

    "The single-thread performance claim would be the single biggest performance jump in the industry and if we're even just talking simple GeekBench scores that would put the Exynos 9810 at the performance levels of Apple's A10 and A11."

    Single-thread performance up 100% and multi-thread performance up only 40% doesn't make sense at all.
    Try it the other way round:
    Single-thread performance up 40% and multi-thread performance up 100%.
  • MrSpadge - Thursday, January 4, 2018 - link

    Why would it not make sense? Mobile chips are power limited under all-core load. So if you make the cores twice as fast for e.g. 50% more power (better efficiency), you'd have to throttle the all-core load clocks to 2/3 of the previous value to stay at the same all-core load power.
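
    Working through those example numbers (they are illustrative figures from the comment above, not Samsung's, and assume power scales roughly linearly with clock over the throttling range, i.e. voltage held constant):

```python
# "Twice as fast for 50% more power": under a fixed all-core power budget,
# the new cores must throttle their clocks, yet still come out ahead.

old_core_perf, old_core_power = 1.0, 1.0
new_core_perf, new_core_power = 2.0, 1.5  # 2x speed for 1.5x power

clock_scale = old_core_power / new_core_power  # scale clocks so all-core power matches
all_core_perf = new_core_perf * clock_scale    # net all-core speedup

print(round(clock_scale, 3))    # 0.667 -> throttle to 2/3 of peak clock
print(round(all_core_perf, 3))  # 1.333 -> ~33% faster all-core despite 2x single-thread
```

    Which is roughly the shape of the claimed numbers: a huge single-thread jump paired with a much smaller multi-thread one.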
  • phoenix_rizzen - Friday, January 5, 2018 - link

    It makes sense if you can run a single core at 2.9 GHz, but can run two cores at only 2.7 GHz, and can run four cores at only 2.5 GHz, due to thermal/power limitations.
  • Santoval - Friday, January 12, 2018 - link

    These numbers, if they are accurate, clearly suggest 2.9 GHz single-core clocks and a full-core clock for the big core block in the ~2.2 GHz range. The "small" cores will probably stay at 1.9 GHz in both single- and multi-core, since they are very power efficient.
  • CuriosUser - Thursday, January 4, 2018 - link

    But why does Samsung still use Qualcomm if they have this chip up their sleeve?
  • shabby - Thursday, January 4, 2018 - link

    My guess would be Verizon/Sprint modem patents; maybe it's cheaper to put a Qualcomm SoC in their phones than to license those patents.
  • ZeDestructor - Thursday, January 4, 2018 - link

    As shabby said: CDMA licensing crap, plus the fact that they'd have to build a CDMA2000 stack from the ground up just for NAM (which AFAIK is only ~1/3rd of their Galaxy S/Note market), which would only be used until 2019-2025 (mostly depending on when Sprint decides to sunset their CDMA2000 network - Verizon has already stated December 2019). Cheaper to just buy some Snapdragons for the NAM folks until they catch up and go all-LTE.
  • StrangerGuy - Saturday, January 6, 2018 - link

    I agree with CuriosUser and call bullshit on this one. 2x ST over 8890 is a score of ~4000 in GB4 ST, while SD845 is tested to be only at ~2500. If Samsung had a monster SoC they wouldn't have bothered with QC at all, and even if they needed CDMA they can easily just pair it with QC's standalone baseband.
  • darkich - Thursday, January 4, 2018 - link

    That touted single core improvement is just mind boggling..
    I don't think anyone expected this.
  • tuxRoller - Friday, January 5, 2018 - link

    I also bet that figure isn't close to being generally representative of perf.
