Video Boost on the Google Pixel 8 Pro is a handy tool for low-light video

When Google introduced Night Sight on the Pixel 3, it was a real revelation.

It was as if someone had turned on the lights in your low-light photos. Shots that were previously impossible suddenly became possible – no tripod or blinding flash required.

Five years later, taking photos in the dark is old news — almost every phone across the price spectrum comes with some sort of night mode. But video is a different story. Night modes for still images capture multiple frames to create one brighter image, and you can't just copy and paste the mechanics of that feature over to video, which by its nature is already a series of pictures. The answer, it seems, is artificial intelligence.
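To make that contrast concrete, here is a minimal, hypothetical sketch — plain NumPy, not Google's actual pipeline — of the multi-frame averaging idea behind still-photo night modes. It shows why the trick doesn't map cleanly onto video: a night mode can spend a whole burst of frames producing one cleaner photo, while video has to deliver a fresh frame every 1/30th of a second.

```python
# A toy illustration (not Google's pipeline) of the multi-frame merge idea
# behind still-photo night modes: average many noisy exposures into ONE
# brighter, cleaner image. Video can't spend a burst on a single result,
# because it must output a new frame 30 times per second.
import numpy as np

def merge_night_shot(frames: list) -> np.ndarray:
    """Naive merge: average N noisy exposures into one image.

    Averaging N frames reduces random noise by roughly sqrt(N); real night
    modes also align frames and handle motion, which this sketch skips.
    """
    stack = np.stack([f.astype(np.float32) for f in frames])
    return np.clip(stack.mean(axis=0), 0, 255).astype(np.uint8)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scene = np.full((4, 4), 40, dtype=np.float32)            # a dim, flat scene
    noisy = [scene + rng.normal(0, 20, scene.shape) for _ in range(15)]
    merged = merge_night_shot(noisy)
    print("single-frame noise:", round(float(np.std(noisy[0] - scene)), 1))
    print("merged-frame noise:", round(float(np.std(merged - scene)), 1))
```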

When the Pixel 8 Pro launched this fall, Google announced a feature called Video Boost with Night Sight that would arrive in a future software update. It uses AI to process your videos – bringing out more detail and improving colors, which is especially useful for low-light clips. There's just one problem: the processing happens in the cloud on Google's servers, not on your phone.

As promised, Video Boost started arriving on devices a couple of weeks ago with the Pixel's December update, including on our Pixel 8 Pro review unit. That's good! But it's not the defining moment that the original Night Sight was. Instead, it underscores just how remarkable Night Sight was when it first came out, as well as the particular challenges video presents to a smartphone camera system.

Video Boost works like this: first, and most important, you need a Pixel 8 Pro, not a regular Pixel 8 — Google didn't answer my question about why that is. You turn it on in your camera settings when you want to use it and then start recording your video. Once you're done, the video needs to back up to your Google Photos account, either automatically or manually. Then you wait. And wait. And in some cases, keep waiting — Video Boost works on videos up to ten minutes long, but even a clip just a few minutes long can take hours to process.


Depending on the type of video you're recording, that wait may or may not be worth it. Google's support documentation says it's designed to let you “create videos on your Pixel phone at higher quality and with better lighting, colors, and detail” in any light. But the main thing Video Boost does is serve up better low-light video — that's what group product manager Isaac Reynolds told me. “Think of it as Night Sight video, because all the tweaks to the other algorithms are all about achieving Night Sight.”

All of the processing that makes our videos look better in good light — stabilization and tone mapping — stops working when you try to record video in very low light. Even the kind of blur you get in low-light video is different, Reynolds explains. “OIS [optical image stabilization] can stabilize the frame, but only up to a certain frame length.” Low-light video requires longer frames, and that's a big challenge for stabilization. “When you start walking in low light, with those very long frames, you can get a certain kind of blur within the frame that's just beyond what OIS can compensate for.” In other words, it's very complicated.

All of this helps explain what I'm seeing in my Video Boost videos. In good lighting, I don't see much difference. Some colors pop a little more, but there's nothing that would compel me to use it regularly when available light is plentiful. In extremely low light, Video Boost can recover some of the color and detail that's completely lost in a standard video. But it's not as dramatic as the difference between a regular photo and a Night Sight photo under the same conditions.


There's a sweet spot between those two extremes where I can see Video Boost becoming genuinely useful. In one clip where I'm walking down a path at dusk under a dark pergola, there's a noticeable improvement in shadow detail and stability after boosting. Ordinary medium-to-low indoor lighting turns out to be the right condition for it. You start to notice how washed out standard videos look in those conditions — like one of my son playing with trucks on the dining room floor. Turning on Video Boost restored some of the vibrancy I'd forgotten was missing.

Video Boost is limited to the Pixel 8 Pro's main rear camera, and it records at either 4K (the default) or 1080p at 30fps. Using Video Boost creates two clips – an initial “preview” file that isn't boosted and is immediately available to share, and, eventually, a second “boosted” file. There's a lot more going on under the hood, though.

Reynolds explained to me that Video Boost uses a completely different processing pipeline that holds on to a lot of the captured image data that would normally be discarded when you record a standard video file — a bit like the relationship between RAW and JPEG files. A temporary file keeps that information on your device until it's sent to the cloud, and then it's deleted. That's a good thing, because the temporary files can be huge – several gigabytes for longer clips. The final boosted videos, though, are a much more reasonable size – 513MB for a three-minute clip I recorded, versus the 6GB temporary file.

My initial reaction to Video Boost was that it seemed like a stopgap — a demo of a feature that needs the cloud to work right now but will move onto the device in the future. Qualcomm showed off a similar on-device feature this fall, so that has to be the endgame, right? Reynolds says that's not how he thinks about it. “The things you can do in the cloud will always be more impressive than the things you can do on a phone.”



Case in point: right now, he says, Pixel phones run several smaller, optimized versions of Google's HDR Plus model on the device. But the full “native” HDR Plus model that Google has been developing over the past decade for its Pixel phones is too big to realistically run on any phone. On-device AI capabilities will improve over time, so it's likely that some things that can only be done in the cloud today will move onto our devices. But likewise, what's possible in the cloud will change, too. Reynolds says he thinks of the cloud as just “another component” of Tensor's capabilities.

In that sense, Video Boost is a glimpse of the future – just one where the AI on your phone works hand in hand with AI in the cloud. More functions will be handled by a mix of on- and off-device AI, and the distinction between what your phone can do and what a cloud server can do will fade into the background. It's hardly the “aha” moment that the original Night Sight was, but it will mark a meaningful shift in how we think about our phones' capabilities nonetheless.
