Android has no color management, so if you use Adaptive Display the gamut is no longer constrained to sRGB like in the Basic mode, and everything looks horrifying. Not Samsung's fault, but I couldn't possibly recommend using anything other than the Basic mode to get proper sRGB rendering on the Galaxy phones.
Oh, I see. So when it's in basic mode, it doesn't adapt to the ambient color temperature? I personally use basic mode and it's fantastic until I take it outdoors and it automatically switches to the adaptive mode in order to crank up the brightness.
Android doesn't, but Samsung added the ability to do so. You can use their adaptive oversaturated colors (who knows what color space that is) or the Basic profile (sRGB).
Yes it is. The True Tone display copied the exact same tech that Samsung created 5 years ago. It uses two color sensors, one in front and one in back, to dynamically shift the white balance. As usual, the Apple-biased media suppressed coverage of it until Apple did it.
The only source willing to cover it was DisplayMate, since they're an independent resource.
I was actually thinking of sending you an email requesting that you investigate the color management with enough depth to actually answer the question of "does it actually work", but refrained because I trusted AnandTech to do it even without external advice. Well done sir, what a nice way of delivering!
I have wondered for a long time if it is possible to create a display that actually covers the full visible spectrum? Does anyone know? Assuming it is even possible, I don't expect such a thing would become common, because wouldn't it have to have a bunch more base colors, instead of just RGB, to be able to simulate the range to our eyes? Still, like current professional displays, it might have a valid niche use if tech progress allows it.
But 4 'colour' components would do it? RGB plus an illuminance 'fader' applied uniformly to all 3.
For the technology, I'm assuming this would need each RGB triplet in a display to be controlled by the 4th illuminant component (from 0 to 100%).
My understanding is that 3 colour component displays effectively have this 4th component, I think I shall call it brightness :), set to a constant value.
You can, with the CIE colour space: https://en.wikipedia.org/wiki/CIE_1931_color_space which encodes the tristimulus values, roughly analogous to the responses of the three colour cones in the human eye. However, 3 monochromatic sources such as RGB will not be able to cover the entire gamut; you basically need a tunable wideband light source.
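To make that concrete, here's a rough Python sketch (using standard CIE 1931 values, not anything from this thread): project XYZ tristimulus values down to xy chromaticity, then check whether a monochromatic colour falls inside the triangle spanned by the sRGB primaries.

```python
# Sketch: XYZ -> xy chromaticity, plus a point-in-triangle test against
# the sRGB primaries. Values are the standard published ones.

def xyz_to_xy(X, Y, Z):
    s = X + Y + Z
    return X / s, Y / s

def in_triangle(p, a, b, c):
    # Same-side sign test: p is inside if it lies on the same side of all edges.
    def cross(o, u, v):
        return (u[0] - o[0]) * (v[1] - o[1]) - (u[1] - o[1]) * (v[0] - o[0])
    d1, d2, d3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    return (d1 >= 0 and d2 >= 0 and d3 >= 0) or (d1 <= 0 and d2 <= 0 and d3 <= 0)

# sRGB primaries in xy (ITU-R BT.709)
RED, GREEN, BLUE = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06)

# D65 white point: inside the triangle, as expected
white = xyz_to_xy(0.95047, 1.0, 1.08883)

# Monochromatic 500 nm light (a saturated cyan-green, xy ~ (0.008, 0.538))
spectral_cyan = (0.0082, 0.5384)

print(in_triangle(white, RED, GREEN, BLUE))          # True
print(in_triangle(spectral_cyan, RED, GREEN, BLUE))  # False
```

The spectral 500 nm point lies well outside the triangle, which is the geometric reason no three fixed primaries can reproduce every visible colour.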
RGB certainly isn't capable but, I would think, a far violet component and a far red component, coupled with another somewhere around "green" could manage. Do you happen to have a reference?
I'd be curious to see how it fares outdoors in bright light (overcast or direct sunlight). That's when I really notice that the Adaptive mode on my Galaxy S6 looks good (though otherwise it has far oversaturated colours). The 6500K Basic mode looks extremely yellow and washed out in such cases. I'd imagine True Tone would have a similar efficacy.
It's very interesting that Apple chose this rather than Adobe RGB (1998), and modified successors to that.
When Adobe came to me, when they were about to come out with their RGB standard back in the mid-1990s, I wasn't too happy about it.
First of all, I should say that my commercial photo lab was one of the first to become an Adobe shop, which happened shortly after Photoshop first came out. We were also one of the first to do digital correction of photos, back in 1988, when I bought the first system for the lab, which was the Crosfield system. That consisted of a Crosfield drum scanner, a Mac IIci, and Crosfield's software, which was a pre-Photoshop correction and editing software package.
I also beta tested Photoshop, and later, the Creative Suite, pretty much from the beginning.
When they told me what they were planning with RGB, I was not happy. We were a shop that processed Kodachrome film, one of the very few around the world that wasn't Kodak. We had developed a professional version of Kodachrome, which shifted the color balance, among other things.
The problem was, despite all its other virtues, Kodachrome was an amateur film. As an amateur film, the color balance was shifted towards the cyan, blue, green part of the spectrum. The reason was that people on vacation would use this film mostly outdoors. They wanted the sky to look cyan (it's not really blue), the water blue, and the foliage green. Face colors weren't as important, because they were usually shot too small.
But pros really wanted Kodachrome because of the fine grain, micro contrast, and the fact that Kodachrome was much more resistant to fading than other films, due to the way it was made and processed. We developed a processing method that would move that color balance towards the neutral, close to Ektachrome Professional.
So when Adobe came to me, and I asked what this standard was based upon, and they said that it approximated Kodachrome (one of the reasons they chose it, but not the only reason), I was, to be frank, appalled. As a result, flesh tones have always been a problem in Adobe RGB.
When the film industry looked to make a new digital colorspace standard, they looked at what they shot for film. It turns out that films are much more about people than sky, grass and water. The DCI-P3 standard that resulted is therefore more biased towards yellow/red and less towards the cyan, blue, green. It actually has a bit greater gamut than Adobe RGB does, but less than ProPhoto RGB, which many pros have been moving to inside Photoshop due to Adobe RGB's limitations.
Since Macs remain the premier platform for video editing, no matter what software is being used, it makes sense for Apple to move toward the film standard rather than the still market. There are very few DCI-P3 monitors out there, and those few are also very expensive. It's only recently that moving to this standard in less expensive equipment became practical.
I believe, from talking to some contacts in the industry, that we may see, at some point, still cameras offering this standard along with the usual sRGB for cheap cameras, and the additional Adobe RGB option for better models. I hope so.
-- The problem was, despite all its other virtues, Kodachrome was an amateur film.
well, yes and no. produced by Kodak's non-professional division, I suppose. but used nearly universally by professional 35mm color photographers. I studied with Haas, and he wouldn't use any E film with us. nope. such folk might use AgfaChrome, but never Ektachrome. Kodachrome, for those who never looked into it, is just 3 layers of b/w emulsion filtered. no dye parts in the film. thus, it lasts nearly as long as a dye transfer print.
as to the color balance, the pros generally bought stock a year ahead and "aged" it, running a test roll every now and again. when it got ripe, then into the freezer.
This was my business for many years. When I say amateur, I mean specific things. Sure, pros used it. But they weren't very happy about several things. One was that color balance, and the gamma. The other was that Kodak treated it as an amateur film all the way, and that included support.
While their pro films had an emulsion disparity between manufacturing batches of one third of a stop in speed, and one third of a stop in each of the three colors, Kodachrome had a one half stop difference in all of those factors between emulsions, as well as greater variability within an emulsion batch.
This caused all sorts of problems. You shouldn't ever "age" film. That's always been a very bad idea. It causes latent image shifts and ISO changes, and it's variable from emulsion to emulsion. You need to refrigerate the film. What was hopefully done was to buy the 300-roll cartons, pull 10 or so rolls out, test them and filter if required, and keep the rest in the fridge, taking out what was needed for the day the night before.
We had a lot of top pros use our lab. Often, we would buy the film for them, as our buying power gave us very good pricing direct from Kodak.
Kodachrome was, by far, the most complex film on the market. In fact, the Kodachrome development process has been described as the most complex chemical process of any kind in the world. That was true. No made up chemicals there. We had to make everything from scratch, using Kodak's chemicals and dyes. We needed a complete qualitative analysis lab with an atomic spectrometer. Dan Jones, who had been the director of Kodachrome processing was our head chemist for several years.
-- I was, to be frank, appalled. As a result, flesh tones have always been a problem in Adobe RGB.--
I really wonder if this post is legit, as it shows a gross misconception about basic color management, and I'm surprised you claim to have worked in a photo lab.
Skin tones are well within the sRGB color space. As such, having access to a wider gamut, be it aRGB, ProPhoto, or DCI-P3, has virtually no impact on the ability to display accurate skin tones (if properly calibrated).
Going beyond that, I used to have a Dell U2711 (a wide-gamut screen) with pictures taken in aRGB from a Canon 5D II and then an A7R. But out of more than 70,000 photos, I may have 5 where some areas of the picture went very slightly beyond sRGB saturation (and only visibly so when switching from one to the other). For all practical purposes, for real-life color, there is little to no advantage to having a wider-gamut screen. So I sold it, and I now have a U2713HM (which is not wide gamut), because I was fed up with keeping two copies of each picture: one in aRGB, only viewable on my setup at home, and one in sRGB for all the other uses (sharing with friends, displaying on mobile, ...).
I do not understand this trend, especially on a mobile OS where applications are not color managed. It causes more trouble than it brings benefit.
And if this is just Apple applying what is basically a display ICC profile, then what's the point?
That's not how the implementation works. Content is rendered according to how it defines itself, and as sRGB if it fails to explicitly specify its color space. That is exactly how color management should work. Existing sRGB content renders correctly, and wide gamut content renders 'correctly' in the sense that it's transformed using a perceptual transform if required, and rendered natively if not.
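A minimal sketch of that dispatch logic, with made-up names (this is not Apple's actual ColorSync API, just the policy described above):

```python
# Hypothetical sketch of the rendering policy: tagged content keeps its own
# color space, untagged content is assumed to be sRGB, and a transform only
# happens when the source space differs from the panel's.

def output_color_space(content_tag, display_space="display-p3"):
    # Untagged content is assumed sRGB rather than stretched to the panel
    source = content_tag if content_tag is not None else "srgb"
    if source == display_space:
        return source, "native"   # already matches the panel: no transform
    return source, "convert"      # transform source gamut -> display gamut

print(output_color_space(None))          # ('srgb', 'convert')
print(output_color_space("display-p3"))  # ('display-p3', 'native')
print(output_color_space("adobe-rgb"))   # ('adobe-rgb', 'convert')
```

The key point is the default in the second line: existing sRGB content renders correctly on the wide-gamut panel instead of being blown out.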
My experience has been similar. A vast majority of my photos fall fully within the sRGB colorspace. When an exception pops up, you notice that the saturation is really strong and it looks lovely. It's just not very common. I would be much more interested in more contrast/dynamic range than greater gamut. Having both is, of course, best. This person took DCI-P3 to task when the iMacs came out and found similar results to us: http://www.astramael.com/1
I had heard that the iPad automatically turns off True Tone when you launch a photo editing app. Is that the case? If it is possible for apps to control this, it could be a feature that people could leave on for normal use without it interfering with accuracy where it matters.
Can you also test, in the future, whether the True Tone display operates on presets with predetermined ranges (i.e., at 3000-3500K make the display warmer, at 4500-5000K make it a little more neutral), or whether it's like the True Tone flash, where it's more granular?
I'd definitely like to try. Right now I'm a bit limited with my lighting setup, especially because I'm moving at the moment so setting up a bunch of lamps with different bulbs is a bit difficult. Keep an eye out for the full review.
Adobe RGB is aimed further up in the CIE chart, while DCI-P3 is aimed downwards, and more to the right. That being said, it also goes further up than sRGB does, so Adobe RGB images would look better, depending on how the color management handles that. I have the 12.9" model which doesn't have this, so I'll have to wait until later this year, when the new one comes out, to find out.
I did buy my daughter, who is a pro fashion and product photographer, a late 2015 27" iMac which also has this display. Using my calibration equipment, it seems to have about a 90% congruence with Adobe RGB, but continues the spectrum past it in the yellows and red. It nails the gamut at the blue point and stretches the red point down, and to the right, while pushing the green slightly down.
Overall, it's pretty close, but I hope Adobe supports it soon, as well as camera and scanner manufacturers. The motion picture industry is much bigger than the still industry, so we might see that.
I wonder about …“color accuracy drops significantly when using True Tone in warm ambient lighting.”
Yes, you confirm that True Tone is operating as expected, but the whole point is to make the image *look* more accurate. I gather that our “retinex” process that affects how we see color is an awfully complicated subject, but if TrueTone actually makes the colors “look” right, how can they be less accurate just because some calibration scope (which intentionally ignores the environment) says so?
Well, technical color accuracy is very different from perceptual color accuracy, as well as brightness accuracy.
While the meter says that "it isn't white", our brain, which is a pattern matching machine, thinks that "that's the whitest, and brightest, thing here, so it must be white"
It is very interesting to have this article from AnandTech, as many people still assume that a wider color gamut than sRGB is automatically a good thing, and it is good to clarify that without color management this is a catastrophe. Now, if I understand correctly, what Apple is doing is just applying a system-wide ICC display profile (like a display ICC profile on any Windows device). But the real question is: is there an API for developers to take control of that?
Because so far, as such, I do not understand the appeal of a wider color gamut screen on this iPad. But worse, what is the benefit for the user? It seems that it has zero advantage compared to the iPad Air 2, doesn't it?
That's not what's happening. Color management is not just applying a single profile to the system, as you imply. Neither OS X nor Windows does that. Color management looks at an ICC profile and then compares it to the gamut of the output device, whether a printer or monitor, and maps the colors to that device so as to keep the proper colors throughout the scale without stretching or compressing the values. So it may not use the entire gamut of the device, or it may cut the gamut of the image, depending on whether the image profile contains a larger or smaller gamut than the device it's being viewed on.
There can be settings to compress the gamut at the extremes in some cases, but it's rarely done.
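As a rough illustration of why that mapping step exists, here's a Python sketch using the standard published sRGB and Display P3 matrices (no OS implements it this literally): a fully saturated Display P3 red has no in-gamut sRGB representation, so a naive conversion lands outside [0, 1] and has to be clipped or compressed.

```python
import numpy as np

# Linear RGB -> CIE XYZ (D65 white) matrices, standard published values
M_SRGB = np.array([[0.4124, 0.3576, 0.1805],
                   [0.2126, 0.7152, 0.0722],
                   [0.0193, 0.1192, 0.9505]])
M_P3 = np.array([[0.48657, 0.26567, 0.19822],
                 [0.22897, 0.69174, 0.07929],
                 [0.00000, 0.04511, 1.04394]])

# Fully saturated Display P3 red, in linear light
p3_red_linear = np.array([1.0, 0.0, 0.0])

# P3 -> XYZ -> sRGB: the result falls outside the sRGB cube
srgb_linear = np.linalg.inv(M_SRGB) @ (M_P3 @ p3_red_linear)
print(srgb_linear)  # roughly [1.22, -0.04, -0.02]: out of gamut

# A naive renderer just clips, crushing the extra saturation; a color
# engine's gamut-mapping step compresses it more gracefully instead.
clipped = np.clip(srgb_linear, 0.0, 1.0)
```

This is the "cut the gamut" case from the comment above, seen from the numbers' side.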
--Color management looks at an ICC profile and then compares it to the gamut of the output device, whether a printer or monitor-- Which is exactly what I said regarding a display ICC profile.
So my point stands that it seems to have no benefit at all here...
I'm not sure exactly what you're saying, and I don't know what you mean by there being no advantage. Any content that targets a wider color space than sRGB will show a greater range of colors on this iPad. I'm not sure how that isn't advantageous, and I'm not really sure how you've concluded that there's no benefit at all. If the article wasn't clear enough on this point please let me know.
How can it do that? Let's say there is a photo to be displayed with an embedded aRGB color space; which iOS application knows how to read that? And how does it pass the information to the OS? Is there an iOS function for that?
Normally, in a color-managed OS, the application tells the OS to display 255 in the sRGB color space, so the OS converts, for instance, that 255 to 200 so that the red appears correct on a screen with a gamut wider than sRGB. Then if it sends 255 in aRGB, it would become 248, because it is a more saturated color. How can iOS manage that today? I do not see any mention in the article that any application has any possibility to pass this information and therefore display (on purpose) more saturated colors than sRGB.
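The arithmetic in the comment above can be made concrete (the 200/248 figures there are illustrative). Here's a sketch using the standard published sRGB and Display P3 primaries, an assumption about what the panel targets, showing what 8-bit sRGB (255, 0, 0) becomes in Display P3:

```python
import numpy as np

def srgb_decode(v):
    # sRGB and Display P3 share the same transfer curve
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def srgb_encode(v):
    return 12.92 * v if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

M_SRGB = np.array([[0.4124, 0.3576, 0.1805],
                   [0.2126, 0.7152, 0.0722],
                   [0.0193, 0.1192, 0.9505]])   # linear sRGB -> XYZ (D65)
M_P3 = np.array([[0.48657, 0.26567, 0.19822],
                 [0.22897, 0.69174, 0.07929],
                 [0.00000, 0.04511, 1.04394]])  # linear Display P3 -> XYZ (D65)

# Fully saturated 8-bit sRGB red
linear = np.array([srgb_decode(v / 255) for v in (255, 0, 0)])
p3_linear = np.linalg.inv(M_P3) @ (M_SRGB @ linear)
p3_8bit = [round(srgb_encode(v) * 255) for v in p3_linear]
print(p3_8bit)  # about [234, 51, 35]: smaller numbers, as the comment argues
```

In other words, sRGB "full red" maps to a less-than-full code value on the wide-gamut panel, which is exactly the conversion the commenter is asking whether iOS performs.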
As I stated in the article, this is managed through the frameworks that essentially all iOS applications are built on. Applications built on AppKit, Core Animation, Core Graphics, Quartz, etc. already work with ColorSync to display colors correctly. For example, you can drop an Adobe RGB image into your Dropbox and it'll display with a wider range of colors than on older iPads in the Dropbox app, without Dropbox having to change their application at all.
That's an interesting article, but it's only partly right. First of all, all software is buggy, and that's particularly true for open source. It's going to be a cold night in hell before more than a fraction of 1% of professionals use open software and some Linux distribution for professional use.
While it's true that the Windows color management is particularly buggy, that's because Microsoft doesn't consider it to be of much importance. They really just stuck it in there. Apple's also has bugs, but they aren't showstoppers.
A problem with what is happening in open software and Linux, is that there's no consistency. There are too many methods to apply this, and they aren't compatible. Maybe someday, but that seems very far off.
Well, your post was interesting but somewhat lacking in the data arena. There was a report by Coverity a few years ago that examined the "bugs"/1,000 SLOC of various open source projects and closed source codebases (obviously not identified). You'll probably be surprised by the results. http://www.coverity.com/press-releases/coverity-sc...
Judging from the rest of your comment you don't seem to have a lot of direct knowledge of color management in Linux (and, well, I'm not sure what you mean by "consistency")... or even Linux in general. Data always welcome!
While I couldn't tell you what percentage of "professionals" use Linux, I can tell you that the best animators in the industry run Linux on their workstations (not just in the render farm).
The post is a bit biased, as it's from a site targeting mainly hardcore open-source software users. It only spends a sentence or two claiming OS X has "a constant stream of bug reports" but doesn't specify what any actual issues are. The speaker isn't familiar with iOS but _guesses_ that color management probably doesn't work? And since this is an OSS event, the speaker of course praises only Linux and Android.
I mean, heck, Apple developed the standard for ICC profiles and then co-founded the International Color Consortium back in the 90s. They're basically the ones who started this whole thing in the first place.
And your comment is full of fallacies. You're assuming it's biased because it's coming from LWN (a site that, for the most part, is both highly technical and focused on development, not on beating any ideological drums; that OSS is a good thing is assumed, so there's not a great need to proselytize). You're assuming that since Apple contributed to a standard, their implementation is better than another's. If you can actually refute the claims made in the article then please do so.
I don't feel the previous commenter is necessarily saying the article is wrong; he's saying that the speaker didn't quantify anything. The speaker doesn't qualify what the stream of bugs is, but my reading of that phrase is that users report bugs against the software that may or may not be accurate, because the earlier statement on Windows has the phrase "no one ever reports bugs about it if they can't use it". This makes me wonder if it's end users who are reporting what they see as bugs, due to the fact that Apple has colour management always enabled. I must admit I also agree that the speaker's fallacy of "nobody uses it, so it must be broken" is an interesting jump.
Thanks for the thoughtful comment:) While I don't agree with your reading of ThreeDee912's comment (I interpreted him as saying that since there might be bias, that is reason enough to doubt... his tone certainly didn't help matters when it comes to alternate interpretations), your points are well taken. I think it would be useful to consider the perspective of the speaker. That is to say, his background. Here's the link to his lgm 2015 talk along with a bit of background (http://libregraphicsmeeting.org/2015/program/##chr... From his site, he seems like he mostly works with Macs. So the bugs he's speaking of, if I may attempt to interpret him, seem to be from his interactions with users as their trainer. IOW, these would be bugs he's encountered in the course of working with clients (which is his job). I completely agree that he made an assumption about the usefulness/abilities of the ios color management api. I've got to assume this was a poor choice of words that the author used to summarize him as Chris sent on to speak about a particular ios app that used the API:
"As far as he is aware, there is a single app that leverages it: a proprietary tool from X-Rite; even then, the app is largely inconsequential since it does not make any of its features accessible to other apps."
So, at around 7:50 he starts talking about the state of color management on OS X. Apparently the state is a cycle of bugs->fixes->regressions->bugs (which, to be honest, sounds like the norm for everybody, but my guess is he's especially bitter about this because he's mostly working with Macs, so he's often encountering these issues and has to explain to his customers that there's not a lot he can do). Having watched the section about Linux, I can say that he doesn't claim Linux doesn't have bugs, but it seems to have fewer regressions (an implication of greater software stability). Again, if you're interested in color standards, he talks about them a bit more for the first 3-5 minutes (along with discussing a serious upcoming problem with the slowing pace of standards updates and less new color science being made public). Lastly, his comments about iOS are in this lightning talk http://video.constantvzw.org/LGM15/day-02/19-Chris...
He says that, he's been told, ColorSync exists in iOS but it's more of a placeholder. Regardless, he says that one app uses it but, because of how it works, he doesn't consider it color management (or, at least, a good experience).
I assume it's still just much too early in terms of tech to switch to BT.2020-based displays, particularly since that really needs higher bit depth channels. It looks like this move may set a good foundation for Apple to make that switch when the time comes, which is certainly what I hope is the case, rather than it making them tempted to stay on DCI-P3.
This just seems like an anti-Apple comment and nothing else. What better 9.7-inch tablet are you going to get? If you really want the higher performance, buy a 12.9-inch one. Apple's only 'crime' here is offering you choice and the opportunity to snipe at Apple.
Even if, for emotional reasons, you would never buy an Apple product, it is still in your interest to applaud better technology like this 9.7" Pro. You never know, it might spur your favourite manufacturer to try harder.
Interesting. I am not a fan of the iPad, but for what application would you need USB 3 instead of 2? In which case would you need the higher transfer speed, and what would be the time saved?
Thanks for the interesting article. Will we have to wait for UltraHD video to see the benefits of the wider color gamut, or are there apps out now that can take advantage of it? I don't suppose the built-in photo/video app uses it?
How? If I send an aRGB picture by email, will it be displayed properly? Same in the photo app? Will two RAW pictures imported to the iPad, one in aRGB and the other in sRGB, be displayed the same or differently?
Exactly. That is what Brandon is saying. You just put wide gamut pictures with the correct profile onto the iPad in any way you can think of and "it just works".
App makers have to deliberately break color management for it to fail.
Current UHD HDR movies use PQ gamma, a Rec. 2020 container, and 10-bit color, mastered to the P3 gamut (for now) inside a Rec. 2020 container. Wondering how this new color management system will cope with all that. If you calibrate this display to the P3 gamut, it will not line up with the Rec. 2020 gamut lines. Apple needs to adopt Rec. 2020 container support, HDR, and PQ gamma ASAP.
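For reference, the "PQ gamma" mentioned here is the SMPTE ST 2084 perceptual quantizer. A sketch of its inverse EOTF (absolute luminance in nits to a normalized signal), with the constants taken from the standard:

```python
# SMPTE ST 2084 (PQ) constants, as defined in the standard
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    # PQ is absolute, referenced to a 10,000 cd/m^2 peak
    y = max(nits, 0.0) / 10000.0
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

print(pq_encode(10000))  # 1.0 (full scale)
print(pq_encode(100))    # ~0.508: SDR white sits around half the code range
```

Because PQ encodes absolute luminance, unlike the relative gamma curves of sRGB or P3, HDR content genuinely needs its own path through a color management system, which is the commenter's point.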
I don't like Apple's hardware prices and their closed software ecosystem. But I have to admit: it's impressive to see them caring even about such seemingly small things like matching the display color temperature to the ambient. On windows I'm using f.lux for that, without external sensor input. Once you get used to it, it's unbelievable how blue regular displays look in "evening light". However, the functionality is so not-integrated that I only have the choice of keeping the cursor at its regular color temperature (hardware cursor) or to have it matched but disappear in most games (software rendering) :/
Agreed the 9.7 iPad Pro is the best iPad yet. When you have the pencil, keyboard, and a good light source, like Lumiy Lightline LED task lamp, you can really work long hours on it.
Need good accessories to reduce fatigue and increase productivity.
How do I upgrade my 12.9-inch iPad Pro to the True Tone display mode? Please, I have looked for weeks all over the Internet, and the only article I find is this one. I can pay someone to help.
>> Apple's own applications interpret untagged content as sRGB, and also properly understand tagged images and videos and display them correctly.
This seems to say that videos mastered in the P3 color space and "properly tagged" will be displayed with a P3 gamut on the iPad Pro 9.7". Is this correct? Are there examples of such P3 video files available?
kurahk7 - Thursday, April 21, 2016 - link
Can you also do testing on the S7 display, which should have this same feature from the S4? http://www.anandtech.com/show/6914/samsung-galaxy-...

fanofanand - Thursday, April 21, 2016 - link
No, it's not an Apple product, therefore there will not be 17 follow-up articles about how it is the greatest thing ever.

darwinosx - Thursday, April 21, 2016 - link
Your dumb and so is your comment. Android has zero color management, so this sort of screen is not possible.

Brandon Chester - Thursday, April 21, 2016 - link
That isn't color management.

marvdmartian - Thursday, April 28, 2016 - link
"Your dumb." Says the man who cannot understand the difference between the words "your" (possessive) and "you're" (contraction of "you are")? Your comment lost any validity at that point.

Integr8d - Saturday, April 30, 2016 - link
"Your dumb" <--- Hope everyone gets the irony in that statement. My 6P has a developer option to constrain the OLED display to sRGB.
Brandon Chester - Thursday, April 21, 2016 - link
It's not possible with a 3 component color system.
But 4 'colour' components would do it? RGB plus an illuminance 'fader' applied uniformly to all 3.For the technology, I'm assuming this would need each RGB triplet in a display to be controlled by the 4th illuminant component (from 0 to 100%).
My understanding is that 3 colour component displays effectively have this 4th component, I think I shall call it brightness :), set to a constant value.
MobiusPizza - Thursday, April 21, 2016 - link
You can, with the CIE colour space: https://en.wikipedia.org/wiki/CIE_1931_color_spaceWhich encode the tristimulus values roughly analogues to responses of the three color cones in human eye. However, 3 monochromatic sources such as RGB will not be able to cover the entire gamut, you need a tunable wideband light source basically.
tuxRoller - Thursday, April 21, 2016 - link
RGB certainly isn't capable but, I would think, a far violet component and a far red component, coupled with another somewhere around "green" could manage.Do you happen to have a reference?
zodiacfml - Friday, April 22, 2016 - link
Yesdjayjp - Thursday, April 21, 2016 - link
I'd be curious to see how it fares outdoors in bright light (overcast or direct sunlight). That's when I really notice that the Adaptive mode on my Galaxy S6 looks good (though otherwise has far over saturated colours). The 6500k Basic mode looks extremely yellow and washed in such cases. I'd imagine True Tone would have a similar efficacy.melgross - Thursday, April 21, 2016 - link
It's very interesting that Apple chose this rather than Adobe RGB (1998), and modified successors to that.When Adobe came to me when they were about to come out with their RGB standard, back in the mid 1990's, I wasn't too happy about it.
First of all, I should say that my commercial photo lab was one of the first to become an Adobe shop, which happened shortly after photoshop first came out. We were also one of the first to do digital correction in photos back in 1988, when I bought the first system for the lab, which was the Crossfield system. That consisted of a Crossfield drum scanner, a Mac IICi, and Crossfield's software, which was a pre Photoshop correction and editing software package.
I also beta tested Photoshop, and later, the Creative Suite, pretty much from the beginning.
When they told me what they were planning with RGB, I was not happy. We were a shop that processed Kodachrome film, one of the very few around the world that wasn't Kodak. We had developed a professional version of Kodachrome, which shifted the color balance, among other things.
The problem was, despite all its other virtues, Kodachrome was an amateur film. As an amateur film, the color balance was shifted towards the cyan, blue, green part of the spectrum. The reason was that people on vacation would use this film mostly outdoors. They wanted the sky to look cyan (it's not really blue), the water blue, and the foliage green. Face colors weren't as important, because they were usually shot too small.
But pros really wanted Kodachrome because of the fine grain, micro contrast, and the fact that Kodachrome was much more resistant to fading than other films, due to the way it was made and processed. We developed a processing method that would move that color balance towards the neutral, close to Ektachrome Professional.
So when Adobe came to me, and I asked what this standard was based upon, and they said that it approximated Kodachrome (one of the reasons they chose it, but not the only reason), I was, to be frank, appalled. As a result, flesh tones have always been a problem in Adobe RGB.
When the film industry looked to make a new digital colorspace standard, they looked at what they shot for film. It turns out that films are much more about people than sky, grass and water. The DCI-P3 standard that resulted is therefore more biased towards yellow/red and less towards cyan, blue, and green. It actually has a bit greater gamut than Adobe RGB does, but less than ProPhoto RGB, which many pros have been moving to inside Photoshop due to Adobe RGB's limitations.
Since Macs remain the premier platform for video editing, no matter what software is being used, it makes sense for Apple to move toward the film standard rather than the still market. There are very few DCI-P3 monitors out there, and those few are also very expensive. It's only recently that moving to this standard for less expensive equipment became practical.
I believe, from talking to some contacts in the industry, that we may see, at some point, still cameras offering this standard along with the usual sRGB for cheap cameras, and the additional Adobe RGB option for better models. I hope so.
FunBunny2 - Thursday, April 21, 2016 - link
-- The problem was, despite all its other virtues, Kodachrome was an amateur film.
Well, yes and no. Produced by Kodak's non-professional division, I suppose, but used nearly universally by professional 35mm color photographers. I studied with Haas, and he wouldn't use any E film with us. Nope. Such folk might use Agfachrome, but never Ektachrome. Kodachrome, for those who never looked into it, is just 3 layers of b/w emulsion, filtered. No dye parts in the film. Thus, it lasts nearly as long as a dye transfer print.
as to the color balance, the pros generally bought stock a year ahead and "aged" it, running a test roll every now and again. when it got ripe, then into the freezer.
melgross - Thursday, April 21, 2016 - link
This was my business for many years. When I say amateur, I mean specific things. Sure pros used it. But they weren't very happy about several things. One was that color balance, and gamer. The other was that Kodak treated it as an a ether film all the way, and that included support.
While their pro films had an emulsion disparity between manufacturing batches of one third stop in speed, and one third stop in color in each of the three colors, Kodachrome had a one half stop difference in all of those factors between emulsions, as well as a greater variability within an emulsion batch.
This caused all sorts of problems. You shouldn't ever "age" film. That's always been a very bad idea. It causes latent image shifts, ISO changes, and it's variable emulsion to emulsion. You need to refrigerate the film. What was hopefully done was to buy the 300 roll cartons, pull 10, or so, rolls out, test them and filter if required, and keep the rest in the fridge, taking out what is needed for the day, the night before.
We had a lot of top pros use our lab. Often, we would buy the film for them, as our buying power gave us very good pricing direct from Kodak.
Kodachrome was, by far, the most complex film on the market. In fact, the Kodachrome development process has been described as the most complex chemical process of any kind in the world. That was true. No made up chemicals there. We had to make everything from scratch, using Kodak's chemicals and dyes. We needed a complete qualitative analysis lab with an atomic spectrometer. Dan Jones, who had been the director of Kodachrome processing was our head chemist for several years.
melgross - Thursday, April 21, 2016 - link
Oops. I didn't catch the errors. I meant gamut instead of gamer, and amateur instead of a ether.
jlabelle - Thursday, April 21, 2016 - link
-- I was, to be frank, appalled. As a result, flesh tones have always been a problem in Adobe RGB. --
I really wonder if this post is legit, as it shows gross basic misconceptions about color management; I'm surprised you claim to have worked in a photo lab.
Skin tones are well within the sRGB color space. As such, having access to a wider gamut, be it aRGB, ProPhoto, or DCI-P3, has virtually no impact on the ability to display accurate skin tones (if properly calibrated).
Going beyond that, I used to have a Dell U2711 (a wide gamut screen) with pictures taken in aRGB on a Canon 5D II and then an A7R. But out of more than 70,000 photos, maybe 5 had some areas going very slightly beyond sRGB saturation (and only visibly so when switching from one to the other).
For all practical purposes, with real-life colors, there is little to no advantage to a wider-gamut screen.
So I sold it, and I now have a U2713HM (which is not wide gamut), because I was fed up with keeping 2 copies of each picture: one in aRGB, only viewable on my setup at home, and one in sRGB for all the other uses (sharing with friends, displaying on mobile, ...).
I do not understand this trend, especially on a mobile OS where applications are not color managed. It causes more trouble than it brings benefits.
And if this is just Apple applying what is basically a display ICC profile, then what's the point?
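jlabelle's observation that real-world aRGB photos rarely exceed sRGB can actually be spot-checked per pixel. A rough pure-Python sketch: the matrices are the published Adobe RGB (1998) and sRGB ones, but the helper name and the simple 2.2 gamma approximation are my own:

```python
# Rough sketch: decode an 8-bit Adobe RGB pixel to XYZ, re-encode it as
# linear sRGB, and flag it as out of gamut if any channel leaves [0, 1].
# Matrix values carry ~4 digits, hence the small tolerance.

ADOBE_TO_XYZ = [(0.57667, 0.18556, 0.18823),
                (0.29734, 0.62736, 0.07529),
                (0.02703, 0.07069, 0.99134)]
XYZ_TO_SRGB = [(3.2406, -1.5372, -0.4986),
               (-0.9689, 1.8758, 0.0415),
               (0.0557, -0.2040, 1.0570)]

def mat_vec(m, v):
    return tuple(sum(row[i] * v[i] for i in range(3)) for row in m)

def outside_srgb(r, g, b):
    """True if an 8-bit Adobe RGB pixel has no equivalent inside sRGB."""
    linear = tuple((c / 255.0) ** 2.2 for c in (r, g, b))  # Adobe RGB gamma ~2.2
    xyz = mat_vec(ADOBE_TO_XYZ, linear)
    srgb_linear = mat_vec(XYZ_TO_SRGB, xyz)
    return any(c < -1e-3 or c > 1 + 1e-3 for c in srgb_linear)

assert outside_srgb(0, 255, 0)          # pure Adobe RGB green exceeds sRGB
assert not outside_srgb(128, 128, 128)  # neutral gray is safely inside
```

Run over a typical photo, most pixels land well inside [0, 1], which matches the "maybe 5 photos out of 70,000" experience above.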
Brandon Chester - Thursday, April 21, 2016 - link
That's not how the implementation works. Content is rendered according to how it defines itself, and as sRGB if it fails to explicitly specify its color space. That is exactly how color management should work. Existing sRGB content renders correctly, and wide gamut content renders 'correctly' in the sense that it's transformed using a perceptual transform if required, and rendered natively if not.
Spoony - Thursday, April 21, 2016 - link
My experience has been similar. A vast majority of my photos fall fully within the sRGB colorspace. When an exception pops up, you notice that the saturation is really strong and it looks lovely. It's just not very common. I would be much more interested in more contrast/dynamic range than greater gamut. Having both is, of course, best. This person took DCI-P3 to task when the iMacs came out and found similar results to us: http://www.astramael.com/1
gwcoffey - Thursday, April 21, 2016 - link
I had heard that the iPad automatically turns off True Tone when you launch a photo editing app. Is that the case? If it is possible for apps to control this, it could be a feature that people could leave on for normal use without it interfering with accuracy where it matters.
Brandon Chester - Thursday, April 21, 2016 - link
The photos app does, but from what I've seen there is no public API for it yet.melgross - Thursday, April 21, 2016 - link
You can turn it off manually.kurahk7 - Thursday, April 21, 2016 - link
Can you also test, in the future, whether the True Tone display operates on a preset with a predetermined range (e.g. 3000-3500K make the display warmer, 4500-5000K make the display a little more neutral), or is it like the True Tone flash where it's more granular?
Brandon Chester - Thursday, April 21, 2016 - link
I'd definitely like to try. Right now I'm a bit limited with my lighting setup, especially because I'm moving at the moment so setting up a bunch of lamps with different bulbs is a bit difficult. Keep an eye out for the full review.
Pinn - Thursday, April 21, 2016 - link
Colors seem to have 5 names for every concept. Confusing. Is it 10-bit or not?
zepi - Thursday, April 21, 2016 - link
Some questions:
Does the color management and screen also handle AdobeRGB images properly, in addition to sRGB and DCI-P3 images/movies?
10 bit colors?
melgross - Thursday, April 21, 2016 - link
Adobe RGB is aimed further up in the CIE chart, while DCI-P3 is aimed downwards, and more to the right. That being said, it also goes further up than sRGB does, so Adobe RGB images would look better, depending on how the color management handles that. I have the 12.9" model which doesn't have this, so I'll have to wait until later this year, when the new one comes out, to find out.
I did buy my daughter, who is a pro fashion and product photographer, a late 2015 27" iMac which also has this display. Using my calibration equipment, it seems to have about a 90% congruence with Adobe RGB, but continues the spectrum past it in the yellows and red. It nails the gamut at the blue point and stretches the red point down, and to the right, while pushing the green slightly down.
Overall, it's pretty close, but I hope Adobe supports it soon, as well as camera and scanner manufacturers. The motion picture industry is much bigger than the still industry, so we might see that.
WaltFrench - Thursday, April 21, 2016 - link
I wonder about “color accuracy drops significantly when using True Tone in warm ambient lighting.”
Yes, you confirm that True Tone is operating as expected, but the whole point is to make the image *look* more accurate. I gather that our “retinex” process that affects how we see color is an awfully complicated subject, but if True Tone actually makes the colors “look” right, how can they be less accurate just because some calibration scope (which intentionally ignores the environment) says so?
melgross - Thursday, April 21, 2016 - link
Well, technical color accuracy is very different from perceptual color accuracy, as well as brightness accuracy.
While the meter says that "it isn't white", our brain, which is a pattern matching machine, thinks that "that's the whitest, and brightest, thing here, so it must be white".
jlabelle - Thursday, April 21, 2016 - link
It is very interesting to have this article from Anandtech, as many people still assume that a wider color gamut than sRGB is automatically a good thing, and it is good to clarify that without color management this is a catastrophe.
Now, if I understand correctly, what Apple is doing is just applying a system wide color ICC display profile (like a display ICC profile on any Windows device).
But the real question is : is there API for the developer to take control of that ?
Because so far, as such, I do not understand the interest to have a wider color gamut screen on this iPad. But worse, what is the benefit for the user ? It seems that it has 0 advantage compared to the iPad Air 2, isn't it ?
melgross - Thursday, April 21, 2016 - link
That's not what's happening. Color management is not just applying a single profile to the system, as you imply. Neither OS X nor Windows does that. Color management looks at an ICC profile and then compares it to the gamut of the output device, whether a printer or monitor, and maps the colors to that device so as to keep the proper colors throughout the scale without stretching or compressing the values. So it may not use the entire gamut of the device, or it may cut the gamut of the image, depending on whether the image profile contains a larger or smaller gamut than the device it's being viewed on.
There can be settings to compress the gamut at the extremes in some cases, but it's rarely done.
jlabelle - Thursday, April 21, 2016 - link
-- Color management looks at an ICC profile and then compares it to the gamut of the output device, whether a printer or monitor --
Which is exactly what I said regarding a display ICC profile.
So my point stands that it seems to have no benefit at all here...
Brandon Chester - Thursday, April 21, 2016 - link
I'm not sure exactly what you're saying, and I don't know what you mean by there being no advantage. Any content that targets a wider color space than sRGB will show a greater range of colors on this iPad. I'm not sure how that isn't advantageous, and I'm not really sure how you've concluded that there's no benefit at all. If the article wasn't clear enough on this point please let me know.
jlabelle - Friday, April 22, 2016 - link
How can it do that?
Let's say there is a photo to be displayed with an embedded aRGB color space: which iOS application knows how to read that? And how does it pass the information to the OS? Is there an iOS function for that?
Normally, in a color managed OS, the application is sending the info to the OS to display 255 in the sRGB color space, so the OS converts, for instance, the 255 to 200 so that the red appears correct on a screen with a gamut wider than sRGB. Then if it sends 255 in aRGB, it would become 248, because it is a more saturated color.
How can iOS manage that today ? I do not see any mention in the article that any application has any possibility to pass this information and therefore display (on purpose) more saturated color than sRGB.
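For what it's worth, the kind of conversion jlabelle describes (255 in sRGB becoming a smaller number on a wider-gamut panel) can be sketched numerically. This is my own illustration using the published sRGB and Display P3 RGB-to-XYZ matrices, not Apple's actual pipeline:

```python
# Sketch: what a color-managed OS does with sRGB (255, 0, 0) on a Display P3
# panel: decode to linear light, go through CIE XYZ, solve for the P3 drive
# values, and re-encode. The panel gets smaller numbers so the red is not
# shown oversaturated.

SRGB_TO_XYZ = [[0.41239, 0.35758, 0.18048],
               [0.21264, 0.71517, 0.07219],
               [0.01933, 0.11919, 0.95053]]
P3_TO_XYZ = [[0.48657, 0.26567, 0.19822],
             [0.22897, 0.69174, 0.07929],
             [0.00000, 0.04511, 1.04394]]

def solve3(m, v):
    """Solve m @ x = v by Gauss-Jordan elimination (3x3)."""
    a = [row[:] + [v[i]] for i, row in enumerate(m)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        for r in range(3):
            if r != col:
                f = a[r][col] / a[col][col]
                a[r] = [x - f * y for x, y in zip(a[r], a[col])]
    return [a[i][3] / a[i][i] for i in range(3)]

def srgb_to_p3_8bit(r, g, b):
    def decode(c):  # sRGB transfer function
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    def encode(c):  # Display P3 uses the same transfer curve
        c = max(0.0, min(1.0, c))
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055
    linear = [decode(c) for c in (r, g, b)]
    xyz = [sum(SRGB_TO_XYZ[i][j] * linear[j] for j in range(3)) for i in range(3)]
    p3_linear = solve3(P3_TO_XYZ, xyz)
    return tuple(round(encode(c) * 255) for c in p3_linear)

print(srgb_to_p3_8bit(255, 0, 0))  # roughly (234, 51, 35)
```

So full sRGB red ends up around (234, 51, 35) in P3 coordinates, the same shape of conversion as the hypothetical 255 → 200 example above.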
Brandon Chester - Friday, April 22, 2016 - link
As I stated in the article, this is managed through the frameworks that essentially all iOS applications are built on. Applications built on UIKit, Core Animation, Core Graphics, Quartz, etc. now work with ColorSync to display colors correctly. For example, you can drop an Adobe RGB image into your Dropbox and it'll display with a wider range of colors than older iPads in the Dropbox app, without Dropbox having to change their application at all.
tuxRoller - Thursday, April 21, 2016 - link
<blockquote>OS X is by far the best color managed OS in the desktop world,</blockquote>
You might want to rethink that.
https://lwn.net/Articles/643623/
Also, from the same article, iOS apparently has had a color management API for a while... but nobody uses it.
melgross - Thursday, April 21, 2016 - link
That's an interesting article, but it's only partly right. First of all, all software is buggy, and that's particularly true for open source. It's going to be a cold night in hell before more than a fraction of 1% of professionals use open software and some Linux distribution for professional use.
While it's true that the Windows color management is particularly buggy, that's because Microsoft doesn't consider it to be of much importance. They really just stuck it in there. Apple's also has bugs, but they aren't showstoppers.
A problem with what is happening in open software and Linux, is that there's no consistency. There are too many methods to apply this, and they aren't compatible. Maybe someday, but that seems very far off.
tuxRoller - Friday, April 22, 2016 - link
Well, your post was interesting but lacking, somewhat, in the data arena.
There was a report made by Coverity a few years ago that examined the "bugs"/1000 sloc of various open source projects and closed source codebases (obviously not identified). You'll probably be surprised by the results.
http://www.coverity.com/press-releases/coverity-sc...
Judging from the rest of your comment you don't seem to have a lot of direct knowledge of color management in Linux (and, well, I'm not sure what you mean by "consistency")... or even Linux in general.
Data always welcome!
tuxRoller - Friday, April 22, 2016 - link
While I couldn't tell you what percentage of "professionals" use Linux, I can tell you that the best animators in the industry run Linux on their workstations (not just in the render farm).
Brandon Chester - Thursday, April 21, 2016 - link
I have had absolutely no luck getting consistent color management in Linux, and believe me, I've tried.
tuxRoller - Thursday, April 21, 2016 - link
Plenty of design professionals do it. Colord was introduced for this reason.
I'm honestly not sure how you managed to fail.
ThreeDee912 - Saturday, April 23, 2016 - link
The post is a bit biased, as it's from a site targeting mainly hardcore open-source software users. It only spends a sentence or two claiming OS X has "a constant stream of bug reports" but doesn't specify what any actual issues are. The speaker isn't familiar with iOS but _guesses_ that color management probably doesn't work? And since this is an OSS event, the speaker of course praises only Linux and Android.
I mean, heck, Apple developed the standard for ICC profiles and then co-founded the International Color Consortium back in the 90s. They're basically the ones that started this whole thing in the first place.
tuxRoller - Saturday, April 23, 2016 - link
And your comment is full of fallacies.
You're assuming it's biased because it's coming from LWN (a site that, for the most part, is both highly technical and focused on development, not on beating any ideological drums; that OSS is a good thing is assumed, so there's not a great need to proselytize).
You're assuming that since Apple contributed to a standard, their implementation is better than another's.
If you can actually refute the claims made in the article then please do so.
pasamio - Saturday, April 23, 2016 - link
I don't feel the previous commenter is necessarily saying the article is wrong, but that the speaker didn't quantify anything. The speaker doesn't qualify what the stream of bugs is, but my reading of that phrase is that users report bugs on the software that may or may not be accurate, because the earlier statement on Windows has the phrase "no one ever reports bugs about it if they can't use it". This makes me wonder if it's end users who are reporting what they see as bugs due to the fact that Apple has colour management always enabled. I must admit I also agree that the fallacy of "nobody uses it, it must be broken" from the speaker is an interesting jump.
tuxRoller - Saturday, April 23, 2016 - link
Thanks for the thoughtful comment :)
While I don't agree with your reading of ThreeDee912's comment (I interpreted him as saying that since there might be bias, that is reason enough to doubt... his tone certainly didn't help matters when it comes to alternate interpretations), your points are well taken.
I think it would be useful to consider the perspective of the speaker. That is to say, his background.
Here's the link to his LGM 2015 talk along with a bit of background (http://libregraphicsmeeting.org/2015/program/##chr... From his site, it seems he mostly works with Macs. So the bugs he's speaking of, if I may attempt to interpret him, seem to be from his interactions with users as their trainer. IOW, these would be bugs he's encountered in the course of working with clients (which is his job).
I completely agree that he made an assumption about the usefulness/abilities of the iOS color management API. I've got to assume this was a poor choice of words that the author used to summarize him, as Chris went on to speak about a particular iOS app that used the API:
"As far as he is aware, there is a single app that leverages it: a proprietary tool from X-Rite; even then, the app is largely inconsequential since it does not make any of its features accessible to other apps."
I have to say that it's very annoying that they (lgm) don't seem to have actually stored his video and slides ( http://activearchives.org/lgm-video-archive/index_...
tuxRoller - Saturday, April 23, 2016 - link
Ha! I just managed to grab the slides from 2015 (they dropped the .pdf extension on the URL):http://video.constantvzw.org/LGM15/day-02/12-Chris...
The video is there as well (again, they dropped the extension):
http://video.constantvzw.org/LGM15/day-02/12-Chris...
So, at around 7:50 he starts talking about the state of color management on osx. Apparently the state is a cycle of bugs->fixes->regression->bugs (which, to be honest, sounds like the norm for everybody, but my guess is he's especially bitter about this because he's mostly working with Macs so he's often encountering these issues and has to explain to his customers that there's not a lot he can do).
Having watched the section about Linux, I can say that he doesn't claim Linux doesn't have bugs, but it seems to have fewer regressions (implying greater software stability).
Again, if you're interested in color standards, he talks about them a bit more for the first 3-5m(along with discussing a serious upcoming problem with the slowing of updating standards and less new color science being made public).
Lastly, his comments about ios are in this lightning talk
http://video.constantvzw.org/LGM15/day-02/19-Chris...
He says that, he's been told, ColorSync exists in iOS but it's more of a placeholder. Regardless, he says that one app uses it but, because of how it works, he doesn't consider it color management (or, at least, a good experience).
zanon - Thursday, April 21, 2016 - link
I assume it's still just much too early in terms of tech to switch to BT.2020 based displays, particularly since that really needs higher bit depth channels. It looks like this move may set a good foundation for Apple to make that switch when the time comes, which is certainly what I hope is the case, rather than it making them tempted to stay on DCI-P3.
svan1971 - Thursday, April 21, 2016 - link
Not interested in the 9.7 Pro: half the RAM, slower CPU/GPU, USB 2 not USB 3.
hlovatt - Thursday, April 21, 2016 - link
This just seems like an anti-Apple comment and nothing else. What better 9.7 inch tablet are you going to get? If you really want the higher performance, buy a 12 inch one. Apple's only 'crime' here is offering you choice and the opportunity to snipe at Apple.
Even if you, for emotional reasons, would never buy an Apple, it is still in your interest to applaud better technology like this 9.7 Pro. You never know, it might spur on your favourite manufacturer to try harder.
jlabelle - Friday, April 22, 2016 - link
Interesting. I am not a fan of the iPad, but for what application would you need USB 3 instead of 2? In which case would you need the higher transfer speed, and how much time would it save?
omf - Thursday, April 21, 2016 - link
Thanks for the interesting article. Will we have to wait for UltraHD video to see the benefits of the wider color gamut, or are there apps out now that can take advantage of it? I don't suppose the built-in photo/video app uses it?
Brandon Chester - Thursday, April 21, 2016 - link
If you have Adobe RGB photos it will.jlabelle - Friday, April 22, 2016 - link
How? If I send an aRGB picture by email, will it be displayed properly? Same in the Photos app?
Will 2 RAW pictures imported to the iPad, one in aRGB, the other in sRGB, be displayed the same or differently?
zepi - Monday, April 25, 2016 - link
Exactly. That is what Brandon is saying. You just put wide gamut pictures with the correct profile onto the iPad in any way you can think of and "it just works".
App makers have to deliberately break color management for it to fail.
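As an aside on what "properly tagged" means at the file level for JPEGs: the ICC profile travels in APP2 segments that begin with the ASCII identifier ICC_PROFILE, and a color-managed reader looks for exactly that (falling back to sRGB if it's absent). A minimal, stdlib-only detection sketch; the helper name is my own, not an API from the article:

```python
# Sketch: walk a JPEG's marker segments and report whether it embeds an
# ICC profile (APP2 segment whose payload starts with "ICC_PROFILE\0").
# Untagged files are the ones a color-managed OS assumes to be sRGB.

def has_icc_profile(jpeg_bytes: bytes) -> bool:
    if jpeg_bytes[:2] != b"\xff\xd8":             # must start with SOI marker
        return False
    i = 2
    while i + 4 <= len(jpeg_bytes) and jpeg_bytes[i] == 0xFF:
        marker = jpeg_bytes[i + 1]
        if marker in (0xD8, 0x01) or 0xD0 <= marker <= 0xD7:
            i += 2                                # standalone markers, no length
            continue
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        if marker == 0xE2 and jpeg_bytes[i + 4:i + 16] == b"ICC_PROFILE\x00":
            return True
        if marker == 0xDA:                        # start of scan: no more metadata
            break
        i += 2 + length
    return False
```

This is why zepi's "it just works" holds: the tag rides inside the file itself, so it survives email, Dropbox, and whatever else moves it onto the iPad.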
bill44 - Thursday, April 21, 2016 - link
Current UHD HDR movies use PQ gamma, a rec2020 container, and 10 bit color, mastered to the P3 gamut (for now) inside a rec2020 container. Wondering how this new color management system will cope with all that. If you calibrate this display to the P3 gamut, it will not line up with the rec2020 gamut lines. Apple needs to adopt rec2020 container support, HDR & PQ gamma ASAP.
Cold Fussion - Friday, April 22, 2016 - link
Can you comment on the behaviour of viewing photos that identify themselves as Adobe RGB?
MrSpadge - Friday, April 22, 2016 - link
I don't like Apple's hardware prices and their closed software ecosystem. But I have to admit: it's impressive to see them caring even about such seemingly small things like matching the display color temperature to the ambient. On Windows I'm using f.lux for that, without external sensor input. Once you get used to it, it's unbelievable how blue regular displays look in "evening light". However, the functionality is so not-integrated that I only have the choice of keeping the cursor at its regular color temperature (hardware cursor) or having it matched but disappear in most games :/
Oliva - Sunday, April 24, 2016 - link
Agreed, the 9.7 iPad Pro is the best iPad yet. When you have the pencil, keyboard, and a good light source, like a Lumiy Lightline LED task lamp, you can really work long hours on it.
Need good accessories to reduce fatigue and increase productivity.
Wolfpup - Monday, May 9, 2016 - link
Dang... I'm impressed by my 13" iPad Pro's screen, and now it's outdated already LOL
azulon1 - Wednesday, May 11, 2016 - link
How do I upgrade my 12.9 inch iPad Pro to True Tone display mode? Please, I looked for weeks all over the Internet, and the only article I found is this one. I can pay someone to help.
in2046 - Sunday, May 22, 2016 - link
>> Apple's own applications interpret untagged content as sRGB, and also properly understand tagged images and videos and display them correctly.
This seems to say that videos mastered in the P3 color space and "properly tagged" will be displayed with a P3 gamut on the iPad Pro 9.7". Is this correct?
Are there example of such sample P3 video files that are available?