Analysis: Comparing the original and new DxOMark Mobile test protocols

To keep up with the rapid pace of innovation in smartphone cameras, we have overhauled and expanded our DxOMark Mobile suite of image quality tests. The most obvious change is our addition of an extensive set of tests to measure Zoom performance and Bokeh quality. These two new sub-scores join our existing Photo sub-scores to create the new overall Photo score. But that is far from the only change. We have also expanded our indoor and outdoor test suites to include low-light testing, effects of subject motion, and more exacting high-dynamic range tests. Overall, we now capture over 1500 still images and two hours of video for each device we test.

We have retested several of the top-scoring phones using our new test protocol, with some interesting results that illustrate how the new DxOMark Mobile protocol's expanded scope covers recent innovations and features and makes our results more relevant. You can get a sense of some of the changes by looking at this chart of old versus new scores for some of the leading smartphones. For example, our new test suite that measures Zoom and Bokeh helps set the dual-camera design of the iPhone 7 Plus apart from the single-lens iPhone 7. Some of the other camera scores have also changed significantly based on how they deal with low light, motion, and HDR – three other capabilities we now test more extensively.

Comparison of old vs new scores for some leading smartphones.

Highlights of our new DxOMark Mobile Score include:

  • All new Zoom sub-score based on extensive testing at multiple focal lengths
  • All new Bokeh sub-score using a scene designed to allow comparisons
  • Low-light testing down to 1 Lux
  • Additional high dynamic range testing
  • Motion added to the test suite to more accurately evaluate camera performance in real-world situations
  • Over 1500 images and two hours of video captured and analyzed per device, using expanded lab and outdoor test scenes

Our new Zoom sub-score

Until recently, almost without exception, zooming on a smartphone has meant a simple digital scaling of the initial wide-angle image. With the advent of dual-camera designs featuring a telephoto lens, many smartphones can now take advantage of true optical zoom, or even some form of hybrid zoom that combines images from multiple cameras. As smartphones increasingly replace traditional cameras, users also expect them to meet all their photographic needs – including the ability to zoom in on subjects without losing image quality. So our new DxOMark Mobile includes a Zoom sub-score that evaluates a camera’s performance at a variety of distances from a fixed-size subject. Having a second camera with a telephoto lens can be advantageous, but design trade-offs and software challenges mean that real-world performance varies greatly among devices.
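To make the distinction concrete, here is a minimal Python sketch of what a purely digital 2x zoom does: crop the center of the wide-angle frame, then upscale it back to full resolution. This illustrates the general technique only, not any phone's actual pipeline; the file name and zoom factor are assumptions:

```python
# Minimal sketch of a 2x digital zoom: crop the center of the frame and
# upscale back to full resolution. The interpolation step cannot recover
# detail the sensor never captured, which is why digital zoom loses
# quality while a true telephoto lens does not.
from PIL import Image

def digital_zoom(image: Image.Image, factor: float = 2.0) -> Image.Image:
    w, h = image.size
    crop_w, crop_h = int(w / factor), int(h / factor)
    left, top = (w - crop_w) // 2, (h - crop_h) // 2
    center = image.crop((left, top, left + crop_w, top + crop_h))
    # Bicubic upscaling only interpolates between existing pixels.
    return center.resize((w, h), Image.BICUBIC)

# Hypothetical input file:
digital_zoom(Image.open("wide_angle.jpg"), factor=2.0).save("digital_2x.jpg")
```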

It will be especially interesting to compare the zoom performance of recent optical and hybrid zoom cameras with older large-sensor, high-resolution cameras, such as the Nokia 808 PureView, that were also designed for better zoom performance.

Standard “1x” image from the iPhone 7 Plus

The same scene shot using 2x zoom on the iPhone 7 Plus

Simulated Depth Effects and Bokeh provide an artistic flair to smartphone photographs

As with Zoom, having a subject that pops from the background has until recently required a standalone camera with a large sensor. Now, though, many smartphones offer a computational bokeh effect, using dual cameras, computational imaging, or both to create a Depth Effect, and to shape the appearance of out-of-focus bright spots to simulate the Bokeh (blurred background) that is typically associated with a wide-open lens on a high-end camera. Using purpose-built indoor test scenes, and carefully designed outdoor test scenes, we now evaluate smartphone cameras on how well they are able to provide a smooth Depth Effect, as well as on the quality of their simulated Bokeh. In addition, longer focal lengths produce less subject deformation than wide-angle shots – especially noticeable with portraits – so we judge each camera on how well it is able to accurately render the proportions of a subject’s face.
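For readers curious about the mechanics, a computational Depth Effect essentially composites the sharp subject over a blurred copy of the frame, guided by a depth map. The sketch below illustrates the general idea with OpenCV; it is not any vendor's actual pipeline, the depth map is assumed to already exist, and the file names and blur sizes are illustrative:

```python
# Minimal sketch of a computational "Depth Effect": blur the frame, then
# composite the sharp original back in wherever the depth map marks the
# subject as near. Real phone pipelines estimate depth from dual cameras
# or machine learning; here the depth map is assumed to already exist.
import cv2
import numpy as np

image = cv2.imread("portrait.jpg")                         # assumed input
depth = cv2.imread("depth_map.png", cv2.IMREAD_GRAYSCALE)  # 0=far, 255=near

blurred = cv2.GaussianBlur(image, (51, 51), 0)             # simulated bokeh

# Normalize depth to a 0..1 alpha mask and feather the subject edge so the
# transition between sharp subject and blurred background is smooth.
alpha = cv2.GaussianBlur(depth, (21, 21), 0).astype(np.float32) / 255.0
alpha = alpha[..., np.newaxis]                             # broadcast over BGR

composite = (alpha * image + (1.0 - alpha) * blurred).astype(np.uint8)
cv2.imwrite("portrait_bokeh.jpg", composite)
```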

You can see from this comparison that when activating Portrait mode the iPhone 7 Plus is able to blur the background, creating a more artistic look for the image.

When in Portrait mode, the iPhone 7 Plus blurs the background

The iPhone 7 doesn’t blur the background.

Added low-light tests reflect how consumers are using their smartphones

Consumers now expect their phones to capture images in very low light, so we test smartphone cameras in lighting down to 1 Lux (essentially candlelight). This exposes differences in performance between phones that otherwise might rank very similarly. For example, the Apple iPhone 7 does a better job of calculating exposure in very low light than the Samsung Galaxy S6 Edge – even though in more typical lighting conditions they score very similarly:

Apple iPhone 7, 1 Lux

Samsung S6 Edge, 1 Lux

The S6 Edge has exposure problems in very low light. It also uses a long shutter time to reduce noise, which in turn can cause motion artifacts. Both of these issues contributed to its relatively lower score when tested with our new protocols.
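Some simple exposure arithmetic shows why 1 Lux forces this trade-off. Using the standard incident-light relation EV100 = log2(lux / C) with calibration constant C = 2.5, together with the exposure equation EV = log2(N^2 / t), even a bright f/1.8 lens at ISO 800 needs roughly a one-second shutter at 1 Lux, far too long to handhold. Here is a minimal Python sketch; the aperture and ISO values are illustrative assumptions, not measured properties of any phone:

```python
# Back-of-the-envelope sketch of why 1 Lux is so demanding.
import math

def shutter_time(lux: float, f_number: float, iso: float, c: float = 2.5) -> float:
    """Shutter time (seconds) needed for a nominally correct exposure."""
    ev100 = math.log2(lux / c)           # scene light level at ISO 100
    ev = ev100 + math.log2(iso / 100.0)  # credit for the higher ISO
    return f_number ** 2 / 2 ** ev

t = shutter_time(lux=1.0, f_number=1.8, iso=800)
print(f"needed exposure: ~{t:.2f} s")    # ~1.0 s: far too long to handhold
```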

Capturing action means testing for Subject Motion

As smartphones are used in more and more action situations, it is important to evaluate how well they handle subject motion. Motion artifacts can be caused by long shutter times or by problems with multi-frame image processing algorithms. To evaluate performance in those conditions, we have added tests that compare low-light performance while mounted on a tripod and when hand-held. This allows us to directly compare detail preservation, noise level, and other aspects of the resulting images, such as with these cropped portions of low-light test images:

Nokia 808, Handheld, 5 Lux

Nokia 808, Tripod, 5 Lux
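One simple way to put a number on crops like these is the variance of the Laplacian, a common single-figure sharpness proxy (higher means more fine detail). This is a rough illustration rather than our scoring method, and the file names are assumptions:

```python
# Minimal sketch of quantifying detail loss from handheld motion: the
# variance of the Laplacian is a common single-number sharpness proxy.
import cv2

def sharpness(path: str) -> float:
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

tripod = sharpness("tripod_5lux.png")      # assumed file names
handheld = sharpness("handheld_5lux.png")
print(f"tripod: {tripod:.1f}, handheld: {handheld:.1f}")
# A markedly lower handheld score suggests motion blur, or multi-frame
# processing that is discarding texture.
```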

We have also added timing to our Autofocus testing to better understand whether a camera can react quickly enough to accurately focus on a scene — something that is especially important when photographing sports or even family scenes with kids running around. Here you can see that in low light, the Samsung S6 Edge is often slow to focus:

Low-light Autofocus timing results for the Samsung S6 Edge
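Conceptually, time-to-focus can be estimated from a video of the focusing event by tracking per-frame sharpness until it reaches its peak. The sketch below is a simplified stand-in for such a timing measurement; the clip name and the 90%-of-peak threshold are assumptions:

```python
# Minimal sketch of timing autofocus from a clip of the defocus-to-focus
# transition: track per-frame sharpness, then report how long the camera
# takes to reach (most of) its peak.
import cv2

cap = cv2.VideoCapture("af_test_clip.mp4")   # assumed input clip
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0      # fall back if metadata missing

scores = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    scores.append(cv2.Laplacian(gray, cv2.CV_64F).var())
cap.release()

peak = max(scores)
# First frame reaching 90% of peak sharpness counts as "in focus".
focus_frame = next(i for i, s in enumerate(scores) if s >= 0.9 * peak)
print(f"time to focus: {focus_frame / fps:.2f} s")
```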

Expanded test scenes push boundaries for Dynamic Range

Many smartphone cameras automatically capture and combine multiple frames to create a single still image, thus enabling them to accurately record scenes with much higher dynamic range. To help measure this and other camera improvements, we have added new challenging indoor and outdoor test scenes. In this challenging scene shot from under a bridge, you can see that the Samsung S6 Edge is unable to keep detail in the sunlit portion of the scene, while the Apple iPhone 7 provides some detail, although not much color. Only the Google Pixel is able to render the blue color accurately.

Apple iPhone 7

Google Pixel

Samsung S6 Edge

Similarly, you can see the results of our new indoor dynamic range tests, such as the following comparison of the iPhone 7 Plus with the S6 Edge:

Samsung Galaxy S6 Edge

iPhone 7 Plus

The green vertical lines in this chart show the brightest and darkest areas that each camera can discriminate under various lighting conditions. You can easily see that the iPhone 7 Plus does a better job of rendering tonal values in the shadows than the S6 Edge under typical indoor and outdoor lighting conditions.

Comparing the dynamic range of the Samsung Galaxy S6 Edge with the Apple iPhone 7 Plus
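To illustrate what "brightest and darkest areas a camera can discriminate" means in practice, consider a toy calculation: a gray patch counts as usable if it is neither clipped nor below a minimum signal-to-noise ratio, and dynamic range is the log2 ratio of the brightest to darkest usable patch luminance. The numbers and thresholds below are invented for illustration and do not reflect our actual methodology:

```python
# Toy dynamic-range calculation from per-patch measurements.
import math

# (scene_luminance_cd_m2, mean_pixel_value_0_255, noise_std) -- invented data
patches = [
    (2000.0, 255.0, 0.5),   # clipped highlight
    (1000.0, 231.0, 2.1),
    (250.0,  180.0, 2.4),
    (60.0,   110.0, 2.9),
    (15.0,    52.0, 3.1),
    (4.0,     21.0, 3.5),
    (1.0,      8.0, 3.8),   # SNR ~2: lost in noise
    (0.25,     3.0, 3.9),   # lost in noise
]

MIN_SNR, CLIP_LEVEL = 4.0, 250.0
usable = [lum for lum, mean, noise in patches
          if mean < CLIP_LEVEL and mean / noise >= MIN_SNR]

stops = math.log2(max(usable) / min(usable))
print(f"usable dynamic range: {stops:.1f} stops")   # ~8.0 stops here
```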

Evaluating HDR (High Dynamic Range) performance

We have added tests on natural scenes that evaluate how well a camera can automatically handle and provide a quality image in high-contrast situations. For example, our “garden arch” scene features dynamic range beyond what any current smartphone sensor can capture in a single frame, illustrating the relative advantage of having multiple frames automatically combined. Here you can see that the Apple iPhone 7 Plus and the HTC U11 are able to outperform the Nokia 808’s larger sensor by using multiple images:

Apple iPhone 7 Plus

HTC U11

Nokia 808 PureView
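One widely used way to combine such a bracketed burst is exposure fusion, for example the Mertens method implemented in OpenCV. The sketch below shows that generic technique, not any phone's actual pipeline; the file names are assumptions:

```python
# Minimal sketch of multi-frame HDR via exposure fusion (Mertens method):
# combine an under-, mid-, and over-exposed frame of the same scene
# without explicit tone mapping.
import cv2

exposures = [cv2.imread(p) for p in
             ("arch_under.jpg", "arch_mid.jpg", "arch_over.jpg")]

merge = cv2.createMergeMertens()
fused = merge.process(exposures)          # float32 result, roughly 0..1
cv2.imwrite("arch_fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```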

Testing low-light Video performance

Just as with still images, low-light performance is of increasing importance to those using smartphones for Video. To respond to this need, our new DxOMark Mobile Video test includes lighting conditions down to 1 Lux (essentially candlelight). These tests expose differences between phones that otherwise perform very similarly. For example, this chart illustrates that in low light the iPhone 7 Plus under-exposes scenes, while the Google Pixel does a much better job:

In low light, the iPhone 7 Plus under-exposes video, while the Google Pixel is much more accurate.

Upgrades to Video Color, Texture, Noise, and Stabilization tests

In addition to adding low-light conditions to our Video tests, we have enhanced our tests for Video Color, Texture, Noise, and Stabilization. Our Video Color tests now include all of the tests we use for our still image evaluation, including the same method of more accurately scoring pleasing color renderings using a combination of objective and perceptual evaluation. An upgraded version of our Video Texture analysis now includes an objective set of tests to complement our perceptual tests, across a range of lighting conditions with a particular focus on indoor light levels. Our Video Noise tests have been greatly expanded and do an improved job of evaluating both spatial and temporal noise. In addition, our Video Stabilization tests are more challenging under the new protocol, and now include evaluating artifacts.
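For the curious, the distinction between spatial and temporal noise can be illustrated with a short clip of a static, uniformly lit patch: per-pixel variation across frames is temporal noise, while variation within the time-averaged frame is spatial (fixed-pattern) noise. A minimal sketch, not our analysis code, with an assumed file name:

```python
# Minimal sketch of separating spatial from temporal noise on a static,
# uniformly lit video scene.
import cv2
import numpy as np

cap = cv2.VideoCapture("flat_gray_patch.mp4")   # assumed input clip
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float64))
cap.release()

stack = np.stack(frames)                    # shape: (time, height, width)
temporal_noise = stack.std(axis=0).mean()   # avg per-pixel std over time
spatial_noise = stack.mean(axis=0).std()    # std within the mean frame
print(f"temporal: {temporal_noise:.2f}, spatial: {spatial_noise:.2f}")
```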

Our new DxOMark Mobile tests do an improved job of differentiating between top performers

While the phones that came out on top in our old rankings remain at the top after retesting with our new protocol, it is now even easier to understand their relative strengths and weaknesses, as you can see from this chart of several retested flagship phones:

Our new test protocol shows that even though they have similar overall scores, even the best phones have different strengths and weaknesses. For example, our new tests make it clear that the Pixel and U11 can in some situations capture better image detail in default mode, while the iPhone 7 Plus is better for Zoom and Portrait photography.

We have only seen the beginning of how computational imaging coupled with multi-camera and multi-frame capture will improve the utility of smartphone cameras for an increasing number of challenging situations. Our new DxOMark Mobile test protocols will enable us to evaluate these advanced capabilities, and help us to continue to be the leading source of independent image quality test results for the press, industry, and consumers.

  • Gieffe22

    I think this is, as usual, pretty bad. In the real WORLD the iPhone 7 has one of the worst algorithms on the planet, because its heavily processed images turn out full of aliasing, pixelated most of the time, and with a watercolor effect. Another thing I don’t understand is your indoor dynamic range test: basically you said the S6 is worse than the iPhone because of less brightness in the shadowed part, but it is CLEAR that the S6 exposes the scene with matrix metering and does a good job with the brightest part of the scene. So the iPhone blew out that part and the S6 darkened the shadows. Which is better? Well, you can almost always bring the shadows back up, but you can NEVER, EVER recover a blown-out part, except if you shoot in DNG mode.

    Anyway, why do you use an old S6 to compare with an iPhone 7, and not at least a Galaxy S7, which in terms of light capture smokes the iPhone 7 in photodiode size, aperture, and camera sensor size?

    Recap for all:

    - Colors: DxO says the iPhone is pretty accurate; I think it is not. Compared to DNG it is TOTALLY undersaturated, with bad vibrancy at some frequencies.
    - Texture: DxO says the iPhone is good. Is a watercolor, pixelated 100% crop good for you? For me, NO.
    - Exposure: probably spot metering on the iPhone and probably matrix on the S6, so the indoor test is basically faulty (and in fact outdoors the S6 has better dynamic range than the iPhone).
    - There’s the zoom part, but well… I think that deserves another dual camera for comparison, not the S6.

    I suggest you be more accurate and less of a fan of certain cameras. PS: in terms of image quality, an iPhone, a Pixel, an S8, or an HTC U11 can NEVER do better than a Nokia 808 or Lumia 1020 if you know how to use a camera sensor properly.

    • Lars Rehm

      Well, today’s smartphone cameras are much more about multi-frame processing and other clever software functions than about large sensors, which is why I have to disagree with your last sentence 🙂 Computational imaging is where it’s all headed; hardware specs will be less and less relevant over time.

      • Gieffe22

        Yes, but the iPhone still remains BAD in terms of color, noise reduction, and detail. It is basically impossible for this phone to get more than 82-83 points, because it doesn’t do well on any of the basics of photography. Even using the iPhone 7 and shooting DNG, you can see that the camera sensor is good but lacking in light capture, and the DNGs are still much better than the auto JPEGs in terms of correct colors, sharpness, and detail. So the problem is Apple’s image post-processing, which is VERY VERY BAD, and sites like DxO say it is good, and that is a LIE. The Pixel and HTC U11 deserve to stay on top, or almost, because the S8 does much better in low-light situations.

        • Lars Rehm

          If you look at the ranking you’ll see that Pixel and HTC ARE actually on top.

          • Gieffe22

            I know, but the iPhone doesn’t deserve to be there…

  • Tiwi

    I agree with the use of a new method, but why did you include only the HTC U11 and the Pixel? What about the S8, OnePlus 5, and G6?

    • Chris

      Where’s the Note 8?

  • beldin2

    Where are the old results? They should still be available, since there are a lot more phones there. Are they somewhere?

  • jcn2u

    Grats!
    Cell phones are part of life now.
    Image capture is a main feature which needs comparison.
    Your leading system evaluation has adapted to meet changing hardware/software.
    Keep up the great work.

  • Rami El-Far

    HTC U11 and Google Pixel are equal!!!
    How?

  • iBrick

    I’ve always hated the “scores” on DXOmark since they are weighted subjectively on objective measurements. Maybe what’s weighted as more important in a score isn’t as important to me on a particular measurement. Now, bokeh is part of the score, which I might not care about at all (on my phone) or as much as say, low light or detail resolution. So how much weight is put on that in the score? No one knows except DXO.

    • rohi bal

      Exactly. Break it down so people can sort by what is important to them: low light/ISO noise, dynamic range, etc.

      • FWIW, sub-scores are always reported, so you can make up your own way to add them up based on your preferences.

  • F.F.

    A new protocol to benefit Apple cameras…
    Strange… Paid by Apple to put the iPhone 8 and X on top of everything? Let’s see.

    • MrWalker1000

      The timing is extremely convenient, I have to say. It makes me trust them less now.

      • Chris

        Add in the fact that they haven’t reviewed the Note 8, which has been available to consumers longer than the iPhone 8 has, and I really don’t trust them anymore.

    • Ahmed Aref

      Check the iPhone 8 result now: top of the chart!!

      • disqus_qNZaphulDC

        You actually predicted this spot on. So blatant when they compared the S6 Edge to an iPhone 7 that is a year and a half newer, lol. Then when they compared the dynamic range, they said “look how well the iPhone opened up the dark shadow parts.” Yes, of course, because that’s all it picked up: look at the outside, there’s ZERO detail. The S6 just did the opposite: beautiful in the light parts, shit in the dark.

        That’s not what dynamic range is. They clearly got paid, and their credibility is now shit.

        Obviously a company that doesn’t care about helping consumers, just about the paycheck. Ffs

  • مُحمّد™

    Waiting for other devices (Note 8, LG V30, S8, etc.)

  • Diogo Reis

    S6 Edge? It proves that a more than two-year-old phone camera can be as good as a $999 2017 one, but why not the S8?

  • Nomaned

    Apple paid for the new protocol.

  • No_Underscore

    Rather interesting results; I like the comparison chart. But I would prefer if the line provided were along the 1:1 diagonal, as that would quickly show which phones fared better or worse with the new method.

  • mamma mia let me go

    Where is the old score table with plenty of phones? Why is it gone?

  • Nuuk

    The last Samsung on your list is the S6 Edge… What about the S8 & Note 8?

  • aribazsi

    You used to be the most reliable phone cam test site… WAS. #anothersameshameapplesservantsite :((((

  • Dominic Afonso

    How is the final score calculated, since it is not just the average? Maybe it would be valuable to put more focus on the highest individual test scores rather than a subjectively weighted final score. A results matrix would be really interesting to see.

  • Abhay Joshi

    The iPhone has the worst image processing algorithm, with dull colors. I also agree that it should not get a score of more than 80-82. The new system of analysis seems to just benefit one particular phone company. I have also lost trust in this site. Better to search for another one in the future.

  • paouvous

    DxOMark, please make this spider chart an accessible comparison tool for all phones! It makes looking at the differences between phones so much easier than just having a list of numbers.

  • Jack Tsui

    Wow, I think the iPhone 8 will get over 90 points, and the iPhone X even higher! It’s perfect timing for Apple, lol~
    Photo zoom is just a bonus feature; please don’t put it in the overall score!!
    BTW, how about a new score for the S8??

  • Miro

    So, the iPhone 7 jumped up here: http://prntscr.com/gmmk2u. To sell stock now that the iPhone 8 is coming to market?

  • prokakavip

    Compare the Lumia 950 with the 2017 machines.

  • sexyjon

    Since DxOMark is updating their test methods, it would have been clever to include the sound quality of video recordings in the test. That is a very important factor.

    Anyway, I am doing my own comparison with the phones of other family members, and I’m still waiting for a better one than my Nokia 1020. In real-life use it still turns out to provide the best image quality. Of course the camera is slow to use and all that, but I am not going to upgrade to anything that provides less image quality anytime soon. 😉

  • Rick_Deckard

    I just don’t get it… You retested the Google Pixel, but you haven’t retested the OnePlus 5 yet??

  • SonicPhoenix

    I like the new test protocol but why on earth would you use the Galaxy S6 Edge in comparison to the rest of the phones in this article? The Pixel, iPhone 7 Plus, and HTC U11 are all from roughly the same generation of phone but the S6 Edge is basically one cycle behind and happens to be from a cycle where there was a huge jump in camera quality across the board from the S6 and its peers to the S7 and its peers. Why wouldn’t you just reevaluate the S7 instead? Or compare it to the iPhone 6, the Nexus 6P, and other phones from that development cycle?

    • FWIW, it was tested because — in addition to some recent phones — the #1 from each year of DxOMark Mobile was re-tested, to provide something of a timeline of image quality progress over the years.

    • Never Trust Hillary

      Because Apple PAID THEM.

      • SonicPhoenix

        By that logic, Google, HTC, and Nokia also must have paid them. And wasn’t everyone shouting that Google must have paid them because the iPhone 7 plus review was delayed so much?

        Or maybe they just had a different rationale that doesn’t fit your preconceived narrative?

  • Clawz114

    Here’s an idea:
    Allow users of the site to pick the two phones they want to compare, then show them random pairs of unlabeled photo samples from both phones. The user clicks the photo that looks best in each comparison, and at the end it is shown which phone the user perceives to be better quality.

    • Never Trust Hillary

      Apple didn’t pay for that.

  • Chris

    Come on, why wouldn’t you have reviewed the Note 8 when it has been out longer than the iPhone 8, which you have already reviewed?

    • Never Trust Hillary

      Big check from Apple. DxOMark credibility score just went WAY DOWN.

  • Brian W

    Will you retest the LG G6?

  • CoolTube

    Retest the S8 and test the Note 8. It’s fishy that the day before the iPhone 8 is released you change the protocol and then say that the iPhone 8 is “the best camera we ever tested.” Also, you don’t test any Samsung cameras except the S6. You’re losing my trust.