Google Engineers Explain & Show Off the Pixel’s Night Sight Astrophotography Mode
Several months ago we covered a leak of some photos from the new Google Pixel 4, and what got me the most excited from that leak was the incredible detail in its astrophotography mode, “Night Sight.” It almost seemed faked… I mean, it couldn’t be real, right? Not that much detail from a smartphone? Well, it looks like it may just be all that we hoped for and more.
In a recent blog post, Google themselves do a deep dive into the Night Sight mode, specifically its astrophotography capabilities, to share all the details on this incredible new technology. Now, Night Sight isn’t exactly new; it’s been available on the Pixel 3 series for a while, but the new version seems to step the game up and then some!
To take full advantage of this new feature for astrophotography, you’ll have to treat it much the same as if you were attempting the shot with a DSLR, meaning you’ll still want to use something like a tripod to help stabilize your shot.
Night Sight, by design, takes several frames in a burst that are aligned automatically to compensate for any camera shake and motion from shooting handheld, effectively letting you get incredibly detailed and bright shots in very low light instead of taking a single long exposure like we would with a DSLR. The images captured in the burst are then aligned and averaged to reduce noise and increase the sharpness and detail in the frame.
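The align-and-average idea is simple enough to sketch. Here’s a minimal illustration (not Google’s actual pipeline, which also handles alignment and motion) of how averaging an already-aligned burst beats any single noisy frame:

```python
import numpy as np

def average_burst(frames):
    """Average a burst of aligned, same-shaped frames.

    Averaging N frames cuts random sensor noise by roughly
    a factor of sqrt(N), which is why the merged result is
    so much cleaner than a single shot.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Simulate a burst: the same dim scene plus fresh random noise per frame.
rng = np.random.default_rng(0)
scene = np.full((4, 4), 100.0)          # the "true" brightness
burst = [scene + rng.normal(0, 10, scene.shape) for _ in range(15)]

merged = average_burst(burst)
single_err = np.abs(burst[0] - scene).mean()
merged_err = np.abs(merged - scene).mean()
# merged_err comes out far smaller than single_err
```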
The team of engineers at Google decided the exposure time for each frame shouldn’t be any longer than 16 seconds, so the stars look like points of light rather than streaks, ensuring a sharper and more detailed image. The astrophotography mode uses up to a maximum of 15 frames, with up to a 16 second exposure per frame, and then merges and color corrects all of those layers in camera! If you do the math, at the maximum exposure and frame count, you’d get a 4 minute long exposure worth of detail. That’s kind of insane.
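The back-of-the-envelope math works out like this (the sqrt(N) noise figure is the standard rule of thumb for averaging equally noisy frames, not a number from Google’s post):

```python
import math

# Night Sight astrophotography budget, per the post:
max_frames = 15      # frames in the burst
max_exposure_s = 16  # seconds per frame

total_s = max_frames * max_exposure_s  # total light gathered, in seconds
total_min = total_s / 60

# Averaging N equally noisy frames improves signal-to-noise by ~sqrt(N).
snr_gain = math.sqrt(max_frames)  # roughly 3.9x over a single frame
```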
So if you’re like me and you’ve taken some super long exposures, you’re bound to notice some hot or blown pixels in your shot. The Pixel’s astro mode is no different there, but where it does stand apart is that the system will identify those hot pixels by comparing them frame by frame as well as with their immediate neighboring pixels, replacing them wherever an “outlier” is discovered (see above image). To further the magic, the Pixel uses its AI to identify the sky in the image and then selectively darkens it, so that you get an image that reflects something closer to the real scene, while applying a sky-specific noise reduction to increase the contrast, making features like clouds, color gradients, or stars pop out more.
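The hot-pixel trick can be sketched in a few lines. This simplified single-frame version only checks each pixel against its immediate neighbors (Google’s pipeline also compares values across the frames of the burst), and the `threshold` value is a made-up tuning parameter, not anything from the Pixel:

```python
import numpy as np

def fix_hot_pixels(frame, threshold=50.0):
    """Replace pixels that stand far out from their immediate
    neighbors with the neighborhood median."""
    fixed = frame.astype(np.float64).copy()
    h, w = frame.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = fixed[y - 1:y + 2, x - 1:x + 2]
            # Median of the 8 surrounding pixels, excluding the center.
            neighbors = np.delete(patch.flatten(), 4)
            med = np.median(neighbors)
            if abs(fixed[y, x] - med) > threshold:  # an "outlier"
                fixed[y, x] = med
    return fixed

# A dark sky patch with one hot pixel stuck near full brightness.
sky = np.full((5, 5), 20.0)
sky[2, 2] = 255.0
cleaned = fix_hot_pixels(sky)
# The stuck pixel is pulled back to its neighbors' value of 20.
```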
As the conclusion of the blog post details, with the phone on a tripod the Night Sight feature produces sharp images of starry skies, and as long as you have a little bit of moonlight or ambient lighting, the landscapes will be clear and impressively vibrant! The engineers also point out that even though this is a massive step forward, there’s still, and always will be, room for improvement. That only leaves me wondering, and kind of drooling, over what enhancements will come in the next generation of the Pixel’s cameras!
I’ve been an iPhone guy for more years than I care to think about, but after hearing the praises from friends with the Pixel, and now reading these details, I’m enticed to make the jump when it comes time for my next upgrade. At the very least, I’ll be reaching out to a friend with this phone so that I can head out on an adventure with them to do a mini review!
So tell me, have you tried the Night Sight mode on the Pixel phones yet? Do you have any comparison images you can share? Would you like to see us do a review and comparison against the iPhone 11 or another smartphone? Let us know in the comments below!