Google shipped a phone camera whose zoom function uses generative AI. That's right: the Pixel 10 Pro comes with AI built into the camera app that cleans up otherwise-rough digital zoom shots at up to 100x. It's a photography nightmare, but it's also kind of good – at least it seems to be. Then again, it's hard to be completely sure what your photos should look like when your subject is miles away. So I brought a ringer for a side-by-side comparison: the Nikon Coolpix P1100.
For those unfamiliar, the P1100 is a massive superzoom camera with an equivalent range of 24-3000mm. When you have optics like that, you don't need to do the kind of upscaling the Pixel 10 Pro does. Sure, the camera still applies noise reduction, sharpening, and color adjustments. But it doesn't have to guess from scratch what any given pixel should look like, because it starts with some real information.
Digital zoom, like the Pixel 10 Pro's, is another matter. Upscaling an image 10 or 20 or 100 times without the benefit of optical magnification means filling in a lot of gaps. Algorithms can make good guesses, but that's all they do: guess. The Pixel 10 Pro's Pro Res Zoom makes those guesses with the help of generative AI. And if we're going to take AI zoom photos, what better subject than the moon?
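To see why digital zoom has to invent detail, here's a minimal sketch of classic upscaling (nearest-neighbor, the simplest case). Every output pixel is just a copy of an input pixel, so no new information is created; sharper-looking results require guessing, which is exactly where generative AI comes in. This is purely illustrative and is not how Google's Pro Res Zoom works internally.

```python
def upscale_nearest(image, factor):
    """Upscale a 2D grid of pixel values by an integer factor.

    Each source pixel is simply repeated factor x factor times --
    the enlarged image contains no detail the original lacked.
    """
    out = []
    for row in image:
        stretched = [px for px in row for _ in range(factor)]
        for _ in range(factor):
            out.append(list(stretched))
    return out

tiny = [[10, 20],
        [30, 40]]

big = upscale_nearest(tiny, 2)
for row in big:
    print(row)
# [10, 10, 20, 20]
# [10, 10, 20, 20]
# [30, 30, 40, 40]
# [30, 30, 40, 40]
```

Smarter filters (bilinear, bicubic) blend neighboring pixels instead of repeating them, but they still only average what's already there; at 100x, the gaps between real pixels are so large that anything resembling crisp detail has to be hallucinated.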
Taking a photo of the moon is a big ask for any smartphone camera, and Google isn't the first phone maker to bring AI to the fight. The Pro Res Zoom version certainly looks like the moon, but the AI gives it a strange spongy texture that doesn't look quite right, especially next to the P1100's version.
The images above overlook Lumen Field, about a mile away, from a spot in downtown Seattle near Pike Place Market. It was a hazy, gray day, so apologies for the drab imagery, but the photos give a good sense of where Pro Res Zoom shines and where it falls down. The AI model makes the numbers on the signage readable and cleans up edges nicely, but it essentially erases the metal cladding on the sides of the building, like overly aggressive noise reduction. And once again, AI doesn't quite know what to do with lettering.
These photos of Starbucks headquarters were taken from the same vantage point, also about a mile away. On a small screen, the AI version looks pretty good, but look closely and you'll see where things go sideways around the windows, and the clock on the tower gets a bit of a Salvador Dalí treatment.
On a sunnier day, I pointed both cameras at another Seattle landmark. I was about three miles from the Space Needle and ran into another enemy of long-distance photography: heat haze. The AI doesn't quite know what to do with the wavy lines and creates a Tim Burton version of the Space Needle instead. But as you can see, the P1100 doesn't fare much better, what with all the hot atmosphere between the lens and the subject.
Heat haze is obviously a problem in this case, too. In the image above, I'm not all that far from the plane at Boeing Field, but there's a lot of hot asphalt between me and my subject. This, though, is clearly where the AI shines. In fact, it might be your only option if you want to correct something as tricky as heat haze.
Here's where it all gets complicated
Generative AI has been part of photo editing tools for years, and it's genuinely useful for things like removing noise from photos taken on older DSLRs. Heat haze is an even trickier problem: it's nearly impossible for traditional digital editing tools to correct its random warps and ripples. Landscape and wildlife photographers are already embracing AI editing tools that do things your regular Lightroom sliders simply can't.
Is AI living in the camera app different from a professional image editor applied after the fact? Absolutely. Does Pro Res Zoom often get a lot of things wrong? Also yes. But it's a fascinating exercise, and I don't think this is the last time we'll see generative AI built into the image capture tool itself.

