Sunday, November 18, 2018

Google Pixel 3 Night Mode Explained: Tech behind the incredible camera



Google recently announced the Pixel 3 smartphone and astonished the world with its incredible camera and night mode, which could beat the dual cameras of the iPhone XS and Note 9, and even the triple-camera setups of the Galaxy A7 and P20 Pro. The team at DPReview managed to interview the brains behind this path-breaking technology.


Google Pixel 3 camera

The interview with Isaac Reynolds, product manager for camera on Pixel, and Marc Levoy, engineer and computational photography lead at Google, revealed plenty of details about the machine learning mechanisms and other rather unusual techniques used to deliver better image quality than the underlying sensor alone is capable of.


Synthetic Fill Flash
While the presenters on stage barely shed any light on this due to time constraints, Synthetic Fill Flash works the same way Google creates its Portrait imagery: using a mix of edge detection and machine learning (ML).
Once the photo is taken, the camera selectively raises the exposure on the person and their face, giving the subject a lively glow. In traditional photography this usually requires reflectors and other equipment to add light to a subject shot against a brighter sunset or in dim lighting. The Pixel's processing lets you capture the details of both the subject (or subjects) and the background, making for a vibrant, lively and well-exposed photograph.
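As a rough illustration (not Google's actual pipeline), selectively brightening a subject can be sketched as blending between the original and a gained-up exposure using a soft segmentation mask, such as one an ML person segmenter might produce. The function name and gain value below are illustrative assumptions:

```python
import numpy as np

def synthetic_fill_flash(image, subject_mask, gain=1.6):
    """Brighten only the masked subject, leaving the background exposure intact.

    image: float32 array in [0, 1], shape (H, W, 3)
    subject_mask: float32 array in [0, 1], shape (H, W) -- a soft
        subject segmentation mask (assumed to come from an ML model)
    gain: exposure multiplier applied where the mask is 1
    """
    boost = image * gain
    mask = subject_mask[..., None]  # broadcast the mask over RGB channels
    # Per-pixel blend: original where mask=0, boosted where mask=1.
    return np.clip(image * (1 - mask) + boost * mask, 0.0, 1.0)
```

Because the mask is soft, the exposure lift fades out gradually at the subject's edges instead of producing a hard cut-out.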

iPhone XS vs Pixel 3
Image by Google


Night Sight
According to the camera team, the Pixel 3 takes better pictures in the standard low-light mode than the Pixel or Pixel 2 do.
If there is almost no light (say, the nearest street light is 40 feet away) or only very dim sources of light (like a sunset), the Night Sight mode comes to your rescue. You get a bit of shutter delay (up to 4-5 seconds), but the results are reportedly worth the wait. Wide-angle selfies also get Night Sight, so colours stay accurate even in low light, thanks to machine learning.
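The core idea behind merging a burst in low light is that averaging N aligned noisy frames cuts random sensor noise by roughly a factor of √N, which is why a few seconds of capture buys so much quality. A minimal NumPy sketch of that principle (Night Sight's real merge is far more sophisticated):

```python
import numpy as np

def merge_frames(frames):
    """Average a burst of already-aligned frames.

    With N frames of independent noise, the noise standard deviation
    of the result shrinks by about 1/sqrt(N).
    """
    stack = np.stack(frames).astype(np.float32)
    return stack.mean(axis=0)
```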

Raw Capture
According to the team, the DNG files produced by aligning and merging 10-15 frames are far from your typical smartphone's RAW capture. And thanks to this accurate, smart frame-alignment algorithm, there is no ghosting even if the subject happens to be moving.
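To give a feel for what "aligning frames" means, here is a toy alignment based on global phase correlation, which recovers a whole-pixel shift between two frames before averaging. This is only an illustrative stand-in: Google's alignment is far more robust and handles local subject motion, which a single global shift cannot:

```python
import numpy as np

def estimate_shift(ref, frame):
    """Estimate the integer (dy, dx) translation that maps frame onto ref,
    using phase correlation (peak of the normalized cross-power spectrum)."""
    F = np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-9)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    # Unwrap shifts larger than half the image into negative offsets.
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

def align_and_merge(frames):
    """Align every frame to the first one, then average the stack."""
    ref = frames[0]
    aligned = [ref]
    for f in frames[1:]:
        dy, dx = estimate_shift(ref, f)
        aligned.append(np.roll(f, (dy, dx), axis=(0, 1)))
    return np.mean(aligned, axis=0)
```

Because the frames are aligned before averaging, a moving camera does not smear the result the way a single long exposure would.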



Super Res Zoom
The new digital zoom mode, branded Super Res Zoom, uses a new kind of burst photography that genuinely increases the detail in photographs. The technique relies entirely on software algorithms to deliver image quality far better than the underlying sensor alone is capable of.

Shots by Google Pixel 3


The system relies on small micro-shifts between burst frames to gather this data. It will even deliberately move the OIS system to create those shifts (if the user's hands are too steady), and then align the photos in software to sub-pixel precision.

Hope you liked the post. Post your questions in the comments.
