The combination of updated camera hardware and great software should give the Google Pixel 6 series another leap forward in imaging.
Google’s Pixel 6 camera is finally getting competitive, and I couldn’t be more excited about it.
I firmly believe that cameras are the backbone of smartphone upgrade cycles for most users. Raw specs are no longer such a critical buying factor, even on midrange hardware. Images, on the other hand, offer the most visible improvement year after year. Since the introduction of the first Pixel, Google has focused on photography.
Ironically, while the popularity of its smartphones relies heavily on imaging capabilities, Google’s hardware development on the camera front has been surprisingly slow.
Did you know that the Pixel series has been using the same primary camera sensor since the Pixel 3 launched in 2018? And that sensor wasn’t much different from the Pixel 2’s. Or take the Pixel 5, which finally added an ultra-wide sensor but skipped a telephoto lens. Instead, Google insisted on its software-based Super Res Zoom, which worked reasonably well but couldn’t match real optical zoom. The year before, the company made the opposite trade-off: the Pixel 4 got a telephoto lens but no ultra-wide sensor, a perspective that certainly can’t be replicated in software.
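To see why Google leaned on software zoom for so long, it helps to understand the core idea behind multi-frame super resolution: several low-resolution exposures, each offset by a tiny sub-pixel shift (in practice, from natural hand tremor), can be merged onto a finer grid to recover detail no single frame contains. The sketch below is a deliberately idealized toy illustration of that principle, not Google’s actual Super Res Zoom pipeline, which additionally has to align frames robustly and suppress noise:

```python
import numpy as np

# Toy illustration of multi-frame super resolution (NOT Google's pipeline):
# low-res frames taken at known sub-pixel offsets are interleaved back onto
# a fine grid, recovering detail that any single frame lacks.
rng = np.random.default_rng(1)
high_res = rng.integers(0, 256, size=(128, 128)).astype(float)  # ground-truth scene

# Each "exposure" samples the scene on a 2x coarser grid at a different
# (dy, dx) offset, mimicking the tiny shifts between handheld frames.
frames = {(dy, dx): high_res[dy::2, dx::2] for dy in (0, 1) for dx in (0, 1)}

# Merge: place each frame's samples back at their offset on the fine grid.
merged = np.zeros_like(high_res)
for (dy, dx), frame in frames.items():
    merged[dy::2, dx::2] = frame

print(np.array_equal(merged, high_res))  # True: the shifts recover full detail
```

In this idealized setting the reconstruction is exact; the real difficulty, and the reason software zoom still trails optical zoom, is that actual shifts are unknown, non-integer, and noisy.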
Google’s strategy for imaging, and for smartphones in general, was diametrically opposed to what almost every other OEM in the Android space champions: specs. You could chalk up the aging Pixel camera sensor to an engineer’s mindset of squeezing the most out of existing hardware in a consumer product. But even Apple chooses to adopt better hardware instead of reinventing the wheel in software.
Why Google has fallen behind on the camera curve
Let’s start with the obvious: it is clear that Google has pushed the IMX363 sensor to its limits. Our own tests have shown how far the Pixel 5 lags behind the competition. From HDR noise and zoom performance to the lackluster ultra-wide camera, there are shortcomings that even software can’t overcome.
Former Pixel camera lead Marc Levoy could be partly blamed for this reluctance to change. In an interview around the Pixel 5 launch, Levoy said he was not convinced that pixel binning, and the resulting increase in signal-to-noise ratio from a high-resolution sensor, led to a noticeable improvement in images. That may have been true in 2019, but the myriad phones that have since put these sensors to good use have proven him wrong.
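For context on the claim Levoy was skeptical of: pixel binning averages neighboring photosites into one output pixel, and averaging four samples roughly halves the noise standard deviation, so the signal-to-noise ratio roughly doubles. A minimal simulation of that arithmetic (synthetic shot noise, not measurements from any real sensor):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate a flat gray scene on a high-resolution sensor: a signal of
# 100 electrons per photosite plus Poisson shot noise.
signal = 100.0
raw = rng.poisson(signal, size=(1000, 1000)).astype(float)

def bin2x2(img):
    """Average each 2x2 block of photosites into one output pixel."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def snr(img):
    """Signal-to-noise ratio: mean signal over noise standard deviation."""
    return img.mean() / img.std()

binned = bin2x2(raw)

# Averaging 4 samples cuts the noise std by ~2x, so SNR roughly doubles.
print(round(snr(binned) / snr(raw), 2))  # ~2.0
```

This is exactly the trade high-resolution sensors make: full resolution in good light, binned output with better SNR in low light.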
Although few of them have matched the capabilities of Google’s software, improvements in sensors have allowed competitors to overcome many hardware limitations. Huawei pioneered the use of RYYB sensors to enable remarkable low-light capabilities, while Sony leans on its camera division’s expertise to enhance its color science. Others, like OnePlus, have chosen to partner with traditional camera makers like Hasselblad to up their game.
Elsewhere, the BBK Group has invested heavily in imaging, and phones like the Oppo Find X3 carry a variety of camera sensors to cover every conceivable use case. Xiaomi has also jumped into the ring, and the Mi 11 Ultra is one of the best-equipped camera flagships, not only because of its hardware but also because of its excellent camera software.
Where Google once led by a mile, it is now at best on par with the competition, and behind in more ways than one.
New sensor gives Google’s software the hardware it needs to shine
If Google’s thought process behind the Pixel series has shown us one thing, it’s that the company isn’t interested in competing a quarter mile at a time. It prefers to take big steps forward and then perfect its formula. With Levoy no longer in charge, it seems Google has recognized the flaw in its earlier thinking.
An improved camera sensor is exactly what the Pixel 6 series needs to up its game, and that’s exactly the direction camera phones are heading. Google’s software has so far wrung the most out of aging hardware, but we already know that its imaging algorithms shine on high-end hardware.
Google Camera app ports already exist for phones with next-gen sensors, and the results are instructive. With an updated sensor, Google’s already excellent software can finally reap the benefits of years of hardware advancement. Artificial intelligence and machine learning on the upcoming Tensor chipset should only amplify this. And the payoff goes far beyond a tangible but expected improvement in image quality.