(Source: Anshel Sag)
I have spent the better part of the last two months using the Apple iPhone 7 Plus, which I reviewed here, as well as the LG Electronics V20 and the Google Pixel. (While I did not get a Pixel XL, there's no doubt that the Pixel and Pixel XL have exactly the same camera capability.) I carried all three phones with me to places like Berlin, Hong Kong, New York, San Francisco and San Diego, using them for daily photography, night photography, conference photos, selfies and group photos.
Do note that the iPhone 7 Plus and the LG V20 both feature a dual camera array that allows for more functionality than a standard single image sensor would offer. This has been a major trend of 2016, and one I expect to accelerate in 2017, since companies like Google and Samsung Electronics have yet to adopt the technology. That said, LG and Apple went in different directions with how they used their dual camera capability.
So, what directions did the companies go with their differing camera arrangements? Each company seems to have aimed for different goals with its cameras, and how you feel about photography will likely determine which camera you like the most. Apple aimed to achieve a couple of goals with the new iPhone 7 camera; one was to increase zoom capability and fidelity by fusing the two cameras together. In my experience there is no doubt they accomplished this goal: the iPhone 7 Plus has the best and smoothest zoom of any phone I've ever used. Apple also recently introduced a DoF (depth of field) effect that creates the illusion of the bokeh that would normally be generated by a DSLR. It does this by using the two cameras to capture depth data, keeping the foreground sharp and blurring the background. However, the feature is still in beta and is extremely sensitive to low light; images can come out grainy, or the effect simply fails to work when the light is too dim, in which case the app warns you that the light is too low for the DoF effect.
Google has a similar feature in the Pixel, but it requires the user to move the camera around the subject so the app can calculate depth data and generate the image. It has to work this way because the Pixel has only one image sensor and must gather three-dimensional data over time. The user experience is much worse than Apple's DoF feature, and I don't see myself using it much at all. Speaking of the Pixel, Google's goal with this phone's camera appears to be delivering well-exposed, low-noise images in low light and very fast captures in normal light. The Pixel is without a doubt the fastest of the three cameras, and in many cases it delivers a superior image in terms of exposure, color depth and dynamic range. However, even now, a month after I got the Pixel, it still has major issues with lens flares, both day and night. Admittedly, Apple had some issues with the iPhone 7 Plus camera as well, but they were fixed within a month of launch.
On the LG G5 and V20, LG opted for a standard high-resolution f/1.8 12MP camera paired with a very wide-angle 8MP camera with a slightly narrower f/2.4 aperture. LG also opted for OIS (optical image stabilization) rather than the EIS (electronic image stabilization) found in other phones like the Pixel. The reason I'm discussing the V20 rather than the G5 in this article is that the V20 is newer and features dual front-facing selfie cameras, which the G5 lacks. Apple, on the other hand, uses a dual camera setup with identical 12MP sensors behind lenses of different focal lengths: the 'main' camera has an f/1.8 lens at 1x zoom (no zoom), while the secondary camera has an f/2.8 lens with 2x optical zoom. This design decision gives Apple an entirely different set of features to enable compared to other devices on the market. The Google Pixel, by contrast, has neither a dual camera arrangement nor OIS. That would normally result in much poorer low-light performance, but thanks to Google's extremely powerful HDR+ algorithm, the Pixel is able to mostly compensate for the lack of both. Mostly.
What makes the LG V20 superior, in my opinion, to the iPhone 7 Plus or Google Pixel? It really comes down to how LG has been refining its dual camera capabilities. Nobody questions LG's camera quality in normal shooting scenarios; it keeps up with the iPhone, Galaxy S7 and Pixel, and most of these phones shoot good-quality images overall, with some doing better than others in different areas. The real reason the LG V20 is a better camera, in my opinion, is that it lets you capture more of the scene than any of the other cameras do. The V20 comes closest to recreating what you actually see of your surroundings, whether with the main camera or the front-facing camera. In both cases, LG repeatedly proves itself the best at helping you recreate memories and share them, and ultimately that's what makes a camera good. When you are somewhere and want to share that experience, you want to include as much of what you are seeing as possible, and that ultimately tips the scales in favor of the LG V20.