
Samsung’s shots of the moon force us to wonder how much artificial intelligence is too much


The moon is a predictable subject: the same face always points toward Earth, and unlike, say, the Eiffel Tower, its appearance won’t change drastically with lighting. Moon photography usually happens only at night, and Samsung’s processing falls apart if the moon is partially obscured by clouds.

The most obvious way Samsung tinkers is with midtone contrast, processing the image to make the moon’s topography more noticeable. But it is also clearly able to render the appearance of texture and detail that was not present in the initial image.
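Samsung hasn’t published its pipeline, so purely as an illustration of what a midtone-contrast boost does, here is a minimal sketch in Python; the tone curve and its parameters are assumptions for demonstration, not anything Samsung has disclosed:

    import numpy as np

    def midtone_contrast(img, strength=0.8):
        """Apply an S-shaped tone curve to a grayscale image in [0, 1].

        Strength near 0 approaches the identity; larger values steepen
        midtones while leaving pure black and white pinned in place.
        """
        x = img - 0.5                        # pivot around the midpoint
        curved = np.tanh(4 * strength * x) / np.tanh(2 * strength)
        return np.clip(0.5 * curved + 0.5, 0.0, 1.0)

    # A flat, hazy frame gains visible contrast in its midtones:
    hazy = np.random.default_rng(0).uniform(0.4, 0.6, size=(64, 64))
    punchy = midtone_contrast(hazy)
    print(hazy.std(), punchy.std())          # the spread rises: more midtone contrast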

Samsung is doing this because 100x zoom photos from the Galaxy S21, S22, and S23 Ultra suck. Of course they do: they involve cropping heavily into a tiny 10MP sensor. Periscope zoom in phones is cool, but it’s not magic.
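Some back-of-the-envelope arithmetic shows why. Assuming the 10MP sensor cited above sits behind a 10x optical periscope lens (so the remaining 10x needed to reach 100x is a digital crop), only a tiny fraction of real pixels survives:

    # Back-of-the-envelope: how many real pixels survive 100x "Space Zoom"?
    sensor_pixels = 10_000_000         # ~10MP periscope sensor (per the article)
    optical_zoom = 10                  # handled by the lens, full sensor used
    digital_zoom = 100 / optical_zoom  # the remaining 10x is a crop

    # A 10x digital crop keeps 1/10 of the width and 1/10 of the height,
    # so only 1/100 of the pixels remain before upscaling.
    surviving = sensor_pixels / digital_zoom**2
    print(f"{surviving:,.0f} real pixels")  # 100,000 -> ~0.1MP, upscaled back up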

Credible theories

Huawei is the other big company to have been accused of faking its moon photos, with the otherwise excellent Huawei P30 Pro back in 2019. It was the last flagship the company released before being blacklisted in the US, which effectively destroyed the appeal of its phones in the West.

Android Authority claimed the phone pasted a stored image of the moon into your photos. Here’s how the company responded: “Moon Mode works on the same principle as other major AI modes, in that it recognizes and enhances details within a photo to help people take better photos. It does not in any way replace the image; that would require an unrealistic amount of storage space, as AI mode recognizes over 1,300 scenarios. Based on machine learning principles, the camera recognizes a scenario and helps optimize focus and exposure to improve details such as shapes, colors, and highlights/low-lights.”

Familiar, isn’t it?

You won’t see these techniques from many other brands, but not for any lofty reason. If a phone doesn’t have at least a 5x telephoto zoom, a moon mode is pretty much useless.

Trying to photograph the moon with an iPhone is rough. Even the iPhone 14 Pro Max doesn’t have the zoom range for it, and the phone’s auto exposure turns the moon into a burning white blob. From a photographer’s perspective, the S23’s exposure control alone is excellent. But how “fake” are the S23’s moon photos, really?

The most generous explanation is that Samsung starts with real image data from the camera and only applies its machine-learning smarts to massage the processing. That could help it, for example, trace the outlines of the Sea of Serenity and the Sea of Tranquility when trying to bring out a greater sense of detail from a blurry source.

However, that explanation is stretched by the way the final image renders the positions of the craters Kepler, Aristarchus, and Copernicus with seemingly uncanny accuracy when those small features are not perceptible in the source. You can infer where the moon’s features are from a hazy original, but this is next-level stuff.
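To make that distinction concrete, here is a hypothetical sketch in Python, emphatically not Samsung’s actual code: classic sharpening only amplifies detail the camera recorded, while “reference transplanting” blends in high-frequency detail from a stored moon image, producing structure that was never captured. The function names and parameters below are invented for illustration.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def unsharp_mask(capture, amount=1.5, sigma=2.0):
        """Boost detail the camera actually recorded (classic sharpening)."""
        blurred = gaussian_filter(capture, sigma)
        return np.clip(capture + amount * (capture - blurred), 0.0, 1.0)

    def reference_transplant(capture, reference, mix=0.7, sigma=2.0):
        """Blend high-frequency detail from a stored reference into the capture.

        The output contains structure the camera never recorded, which is
        what critics mean when they call the result "fake".
        """
        ref_detail = reference - gaussian_filter(reference, sigma)
        return np.clip(capture + mix * ref_detail, 0.0, 1.0)

    # Both expect aligned grayscale arrays with values in [0, 1]:
    # unsharp_mask(blurry_moon) amplifies real data;
    # reference_transplant(blurry_moon, stored_moon) invents it.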

With that said, it’s easy to overestimate how much of a lead the Samsung Galaxy S23 gets here. Its moon pictures might look good at a glance, but they’re still bad. A recent comparison video featuring the S23 Ultra and the Nikon P1000 shows what a decent consumer superzoom camera can do.

A question of trust

The outrage over this moon issue is understandable. Samsung uses moon imagery to tout its 100x camera mode, and the images are, to an extent, manufactured. But it has really only poked a toe outside the ever-expanding AI Overton window that has shaped phone photography innovation for the past decade.

Each of these tech tricks, whether you call them AI or not, is designed to do what would have been impossible with the raw basics of a phone camera’s hardware. One of the first, and arguably the most important, was HDR (High Dynamic Range). Apple added HDR to its Camera app in iOS 4.1, released in 2010, the year of the iPhone 4.
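The idea behind HDR is to capture a bracketed burst of the same scene at different exposures and merge them so shadows and highlights both keep detail. A minimal sketch of that merge using OpenCV’s Mertens exposure fusion follows; the file names are placeholders, and real phone pipelines are far more elaborate:

    import cv2
    import numpy as np

    # A bracketed burst: the same scene captured at several exposures.
    # File names here are placeholders.
    img_list = [cv2.imread(f) for f in ("under.jpg", "mid.jpg", "over.jpg")]

    # Mertens exposure fusion weights each pixel by contrast, saturation,
    # and well-exposedness across the stack, so highlights from the dark
    # frame and shadows from the bright frame both survive the merge.
    merge_mertens = cv2.createMergeMertens()
    fused = merge_mertens.process(img_list)

    # The fused result is floating point, roughly in [0, 1].
    cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype(np.uint8))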


