Google’s latest flagship smartphones, the Pixel 2 and Pixel 2 XL, are equipped with the highest-rated smartphone camera sensors on the market right now, but both devices will soon get even better.
It’s all thanks to a new image processing chip housed inside the second-generation Pixels, one that wasn’t even mentioned in Google’s keynote at the Pixel 2 launch event.
The dedicated image processing chip inside the Pixel 2 and Pixel 2 XL, the Pixel Visual Core system-on-chip (SoC), is Google’s first custom-designed co-processor for a consumer product, built to deliver maximum image processing performance while drawing minimal power from the device’s battery. The new chip will also let third-party apps use the HDR+ feature of the 2017 Pixels, something currently possible only through Google’s own camera app.
HDR+ is the technology giant’s name for the image processing magic that enables the Pixel 2 and Pixel 2 XL to take photographs of much better quality than would conventionally be possible with equivalent hardware.
However, HDR+ is currently available only in the company’s own camera app, which means popular third-party camera apps, which often offer greater manual control and more features, can’t capture photos of the same quality.
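At a high level, HDR+ works by capturing a burst of short, underexposed frames, aligning them, and merging them, which cuts noise while preserving highlight detail. The sketch below is a deliberately simplified stand-in for that merge step, not Google’s actual pipeline: it just average-merges an already-aligned burst to show why merging many noisy frames beats a single exposure.

```python
import numpy as np

def merge_burst(frames):
    """Toy stand-in for HDR+'s merge step: averaging an aligned burst of
    noisy frames reduces noise roughly by sqrt(N). The real HDR+ pipeline
    also aligns image tiles, weights them, and tone-maps the result."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    return stack.mean(axis=0)

# Simulate a burst of 8 noisy captures of the same flat gray scene.
rng = np.random.default_rng(0)
scene = np.full((64, 64), 100.0)          # "true" brightness of the scene
burst = [scene + rng.normal(0.0, 10.0, scene.shape) for _ in range(8)]

merged = merge_burst(burst)
# Noise (standard deviation) drops by roughly sqrt(8) versus one frame.
print(np.std(burst[0]) > 2.5 * np.std(merged))
```

The same principle is why burst merging needs so much compute, and hence a dedicated chip: every shutter press means aligning and combining many full-resolution frames.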
Google says the Pixel Visual Core chip will be activated “in the coming months” through a software update, enabling more apps to access and use the Pixel 2’s camera in HDR+ mode.
Pixel Visual Core is a powerful eight-core image processing unit capable of delivering more than 3 trillion operations per second. According to Google, this means the chip can process images five times faster than the Pixel 2’s main Qualcomm Snapdragon 835 SoC while using only one-tenth of the energy.
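Those two quoted figures can be combined with the basic relation energy = power × time. If the chip finishes an image five times sooner and spends one-tenth the energy on it, it is drawing about half the instantaneous power while it works. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the quoted Pixel Visual Core figures.
speedup = 5.0        # processes images 5x faster (quoted)
energy_ratio = 0.10  # uses 1/10 the energy per image (quoted)

# energy = power * time, and time shrinks by 1/speedup, so the
# relative power draw while processing is:
power_ratio = energy_ratio * speedup
print(power_ratio)  # 0.5, i.e. about half the power of the main SoC
```

In other words, the co-processor both sips less power and finishes sooner, which is exactly the trade-off a dedicated image chip is meant to win.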
The Pixel Visual Core requires specially written code, but this has been made much easier for app developers thanks to support for the open-source languages Halide, for image processing, and TensorFlow, for machine learning.
Pixel Visual Core is slated to be switched on for the first time in the upcoming Android 8.1 Oreo developer preview, which will be available in the coming weeks.
Moreover, Google says the Pixel Visual Core will ultimately be accessible to all third-party app developers through the Android Camera API.