When Google introduced Night Sight on the Pixel 3, it was a revelation.
It was as if someone had turned on the lights in your low-light photos. Previously impossible shots became possible – no tripod or deer-in-the-headlights flash required.
Five years later, taking photos in the dark is old hat – phones up and down the price spectrum come with some sort of night mode. Video, however, is a different story. Night modes for still photos work by capturing multiple frames and merging them into one bright image, and by its very nature that trick can't simply be copied and pasted over to video, which is already a series of pictures. The answer, as it stands lately, is to call in AI.
When the Pixel 8 Pro launched this fall, Google announced a feature called Video Boost with Night Sight that would arrive in a future software update. It uses AI to process your videos, enhancing detail and color, which is especially helpful for low-light clips. There's just one catch: that processing doesn't happen on your phone – it happens in the cloud on Google's servers.
As promised, Video Boost started rolling out a few weeks ago with December's Pixel update, and it arrived on my Pixel 8 Pro review unit. And it's good! But it isn't quite the watershed moment that the original Night Sight was. That speaks both to how impressive Night Sight was when it debuted and to the particular challenges that video presents to a smartphone camera system.
Here's how Video Boost works: first, and most importantly, you need a Pixel 8 Pro, not the regular Pixel 8 – Google didn't answer my question as to why. When you want to use it, you turn it on in your camera settings and start recording your video. When you're done, the video needs to be backed up to your Google Photos account, either automatically or manually. Then you wait. And wait. And sometimes wait some more – Video Boost works on videos up to ten minutes long, but even a two-minute clip can take hours to process.
Depending on the kind of video you're recording, that wait may or may not be worth it. Google's support documentation says it's designed to let you create “videos on your Pixel phone in high quality and with great lighting, colors and detail” in any lighting. But the main thing Video Boost is for is making low-light video better, group product manager Isaac Reynolds tells me. “Think of it as Night Sight Video, because all of the changes to the other algorithms are in pursuit of Night Sight.”
All of the processing that makes our videos look good in good light – stabilization, tone mapping – stops working when you try to record video in very low light. Reynolds explains that the blur you get in low-light video is different, too. “OIS [optical image stabilization] can stabilize a frame, but only up to a certain length.” Low-light video requires longer frames, and that poses a much bigger challenge for stabilization. “When you start walking around in low light with those long frames, you can get a particular kind of intra-frame blur, which is the leftover that OIS can't compensate for.” In other words, it's a lot more complicated.
All of this helps explain what I see in my own Video Boost clips. In good light, I don't see much of a difference. Some colors pop a little more, but I don't see anything that would compel me to use it regularly when there's plenty of light. In very dark conditions, Video Boost can recover some of the color and detail that would be completely lost in a standard video clip. But it's nowhere near as dramatic as the difference between a regular photo and a Night Sight photo of the same scene.
There's a real sweet spot between those extremes, though, where I can see Video Boost coming in genuinely handy. In one clip where I walk up to a dark pergola at dusk, there's a significant improvement in shadow detail and stabilization after boosting. The more I used Video Boost in ordinary, medium-low indoor lighting, the more I saw the case for it. You start to notice how standard videos wash out in those conditions – like my son playing with trucks on the dining room floor. Turning on Video Boost brought back some of the vibrancy I was missing.
Video Boost is limited to the Pixel 8 Pro's main rear camera and records in 4K (the default) or 1080p at 30fps. Using it results in two clips – an initial “preview” file that's unenhanced and available to share right away, and, eventually, the second “boosted” file. Under the hood, though, there's a lot more going on.
Reynolds explained to me that Video Boost uses a completely different processing pipeline that holds on to much of the captured image data that would normally be discarded when you record a standard video file – similar to the relationship between RAW and JPEG files. A temporary file keeps that information on your device until it has been uploaded to the cloud; after that, it's deleted. That's a good thing, because the temporary files can be huge – several gigabytes for longer clips. The final boosted videos, however, are much more reasonably sized: 513MB for a three-minute clip I recorded that generated a 6GB temporary file.
My initial reaction to Video Boost was that it seemed like a stopgap – a demo of a feature that needs the cloud to work now but will move onto the device in the future. Qualcomm showed off an on-device version of something similar this fall, so that has to be the endgame, right? Reynolds says he doesn't think of it that way. “The things you can do in the cloud are always more interesting than the things you can do on the phone.”
Case in point: right now, Pixel phones run smaller, optimized versions of Google's HDR Plus model on-device, he says. But the full “parent” HDR Plus model that Google has been building for its Pixel phones over the past decade is too big for any phone to realistically run. On-device AI capabilities will improve over time, so it's possible that some of the things that can only be done in the cloud today will move onto our devices. But equally, what's possible in the cloud keeps changing, too. Reynolds says he sees the cloud as “another element” of Tensor's capabilities.
Seen that way, Video Boost is a glimpse of the future – a future where AI on your phone works hand in hand with AI in the cloud. More and more functions will be handled by a combination of on-device and off-device AI, and the difference between what your phone can do and what a cloud server can do will fade into the background. It's not quite the “wow” moment that Night Sight was, but it will be a significant shift in how we think about our phones' capabilities.