Google has been a computational photography pioneer over the past few years, and its latest trick, called ‘Photo Unblur’, could be its most impressive yet. A Google Photos feature that’ll initially be exclusive to the Pixel 7 and Pixel 7 Pro, it promises to rescue your new and old snaps from blurry oblivion.
Photo Unblur is an expansion of ‘Face Unblur’, which arrived last year on the Pixel 6 and Pixel 6 Pro. Face Unblur has quickly become one of the most popular computational photography features since Google unveiled ‘Night Sight’ on the Pixel 3 back in 2018. But it’s also quite different from Photo Unblur, which means the two will act as complementary modes for varying situations.
Both features use machine learning to improve your pictures, but Photo Unblur is designed to improve the shots you’ve already taken on any camera. Face Unblur, meanwhile, is a pre-emptive mode that uses the power of Google’s Tensor chip to detect when someone is moving too quickly in your scene. It then automatically takes two photos, which are then combined to give you a well-exposed, sharp snap.
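As a rough illustration of that two-frame idea, here’s a toy Python sketch (the function names and the simple box blur are our own stand-ins, and frame alignment is skipped entirely): it keeps the exposure, or low frequencies, of the bright-but-blurry frame, and borrows the fine detail, or high frequencies, from the fast, sharp frame.

```python
import numpy as np

def box_blur(img, radius=2):
    """Simple separable box blur, used here to split an image into
    low-frequency (brightness) and high-frequency (detail) parts."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, out)

def merge_frames(bright_blurry, dark_sharp):
    """Toy two-frame merge: take the overall exposure from the
    long-shutter frame and the fine detail from the fast-shutter one.
    (Real pipelines also align the frames and focus on faces.)"""
    detail = dark_sharp - box_blur(dark_sharp)  # high frequencies of the sharp frame
    return box_blur(bright_blurry) + detail     # low frequencies of the bright frame
```

The real feature is, of course, far more sophisticated, but this is the basic intuition behind combining a well-exposed frame with a sharp one.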
So how exactly does Google’s new Photo Unblur mode work without being fed multiple snaps of the same scene? Google hasn’t fully expanded on its inner workings yet, but we can get a good idea by looking at where it’s come from.
How does Photo Unblur work?
Photo Unblur hasn’t arrived completely out of the blue – while Google hasn’t yet expanded on its inner workings, it’s likely built on some existing features we’ve seen in the Google Photos app. And that means it could ultimately be available on devices beyond the Pixel 7 and Pixel 7 Pro.
In 2021, the Google AI Blog described the tech behind two new Google Photos features called ‘Denoise’ and ‘Sharpen’. These arrived to help you boost photos that were shot in tricky conditions, or with older phones that had noisy sensors or ancient optics. And these likely form the basis of Photo Unblur.
Photo editors have long had sliders to help you adjust noise and sharpening, but Google’s new tech is much smarter than those. For starters, it analyzes your whole image to work out the levels of noise and blur down to a pixel level, regardless of which camera the photos were taken on.
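To make that analysis step a little more concrete, here’s a deliberately crude Python sketch; the local variance and gradient measures below are our own simplistic proxies for the per-pixel noise and blur maps that Google actually computes with machine learning.

```python
import numpy as np

def local_stats(img, radius=1):
    """Estimate noise and blur cues for every pixel from its immediate
    neighbourhood (a crude stand-in for Google's learned maps).
    Returns (variance_map, gradient_map)."""
    h, w = img.shape
    var_map = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            patch = img[max(0, y - radius):y + radius + 1,
                        max(0, x - radius):x + radius + 1]
            var_map[y, x] = patch.var()  # local variance: noise proxy
    gy, gx = np.gradient(img)
    grad_map = np.hypot(gx, gy)          # gradient strength: sharpness proxy
    return var_map, grad_map
```

Roughly speaking, high variance in an area with weak gradients (a flat region) points to noise, while weak gradients where you’d expect a crisp edge point to blur.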
This crucial step allows the noise reduction and de-blurring to occur at a more granular level than older techniques, which makes them less processor-intensive and therefore ideal for running on-device or in the cloud. Once Google’s analyzed your image, it can then apply its slightly counter-intuitive methods for reducing blur and noise.
These are counter-intuitive because they involve pushing your photo in the seemingly ‘wrong’ direction before bringing it back to an improvement on the original. To reduce noise, Google combines groups of noisy pixels (effectively downsampling the image), then restores the resolution while regenerating finer detail. The sharpening works in a similar fashion, with Google’s algorithms re-blurring the image several times in an efficient, phone-friendly process.
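In very rough terms, those two ‘wrong direction’ tricks could be sketched as follows; the 2x2 block averaging and the 3, -3, 1 re-blurring weights below are illustrative guesses, not Google’s actual pipeline.

```python
import numpy as np

def gaussian_blur(img, sigma=1.0):
    """Separable Gaussian blur built from a normalized 1-D kernel."""
    radius = int(3 * sigma)
    xs = np.arange(-radius, radius + 1)
    k = np.exp(-xs**2 / (2 * sigma**2))
    k /= k.sum()
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, out)

def denoise(img):
    """'Push the wrong way' denoise: average 2x2 blocks (downsample),
    then blow the result back up, suppressing per-pixel noise.
    (Odd-sized images are cropped by a pixel in this toy version.)"""
    h, w = img.shape
    small = img[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)

def polynomial_sharpen(img):
    """Sharpen by re-blurring: combine the image with one and two
    extra blur passes. The 3, -3, 1 weights are illustrative only."""
    b1 = gaussian_blur(img)
    b2 = gaussian_blur(b1)
    return 3 * img - 3 * b1 + b2
```

The sharpening weights aren’t arbitrary: 3*img - 3*blur(img) + blur(blur(img)) is a cheap polynomial approximation of undoing a blur using nothing but extra blur passes, which is what makes this kind of approach phone-friendly.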
So how does Photo Unblur build on these techniques? Right now, we don’t know the specifics, but a year is a long time in machine learning – and some of Google’s examples during the Pixel 7 launch certainly looked impressive.
The image below, for example, has been impressively cleaned up from its almost unusable origins, which appear to have been caused by slight movement and an excessively slow shutter speed.
Because Photo Unblur doesn’t have two images of the same scene to work with, as Face Unblur does, it may not be quite as powerful as that older feature, particularly for issues caused by movement. But we’re looking forward to taking it for a spin on our old snaps when the Pixel 7 and Pixel 7 Pro launch.
How do you use Photo Unblur?
Google again hasn’t revealed the specifics of how you’ll use Photo Unblur on the Pixel 7 and Pixel 7 Pro yet. But it has said that in “just a few taps” you’ll be able to remove blur and visual noise, in a process that sounds just as straightforward as last year’s Magic Eraser (for removing unwanted objects).
This process will take place in the Google Photos app, with Photo Unblur initially only available on the Pixel 7 and Pixel 7 Pro. But we’re expecting the tech to eventually roll out to all devices running the Google Photos app.
While Photo Unblur isn’t quite as automated as Face Unblur, which works during the photo-taking process on phones from the Pixel 6 series onwards, it does look like another very simple example of computational photography improving our snaps, including the old ones we’d written off.
It looks likely that the two modes will be complementary, with Face Unblur kicking in (on supported devices) before you take a photo, and Photo Unblur being useful for old snaps taken on any camera. We’ll be taking Photo Unblur for a spin very soon and will update this article with all of our findings.