Samsung first introduced Space Zoom in 2020 with the Galaxy S20 Ultra, and the feature has appeared in every successor since, including the new Galaxy S23 Ultra. It is a neat feature that leverages the periscope telephoto camera in high-end Samsung Galaxy devices to achieve an incredible zoom range, making it possible to capture clear images of the moon, or so Samsung claimed.
Over the weekend, a Reddit user, u/ibreakphotos, made a thread detailing how Samsung is not being entirely honest about how its cameras capture and process these moon shots.
This is not the first time the authenticity of Samsung's moon shots has been called into question, but until now, no one had proven them fake. Samsung explains that moon shots are produced by combining up to 20 frames captured by the lens, which AI then merges into a final, more detailed image.
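Multi-frame merging of this kind is a well-established technique, and it genuinely does recover detail that is present in the frames. A minimal sketch of the idea, using a synthetic NumPy "scene" (entirely hypothetical, not Samsung's pipeline), shows why: averaging a stack of noisy frames suppresses random noise by roughly the square root of the frame count, but it cannot conjure detail that no frame contains.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: a clean 8-bit "scene" plus per-frame sensor noise.
scene = rng.integers(0, 256, size=(64, 64)).astype(np.float64)
frames = [scene + rng.normal(0, 25, scene.shape) for _ in range(20)]

# Naive multi-frame merge: averaging the stack reduces random noise
# by roughly sqrt(N). This is how stacking recovers real detail --
# it cannot invent detail that is absent from every frame.
stacked = np.mean(frames, axis=0)

noise_single = np.abs(frames[0] - scene).mean()
noise_stacked = np.abs(stacked - scene).mean()
```

That last point is exactly what the Reddit experiment tests: if the detail is not in any frame, stacking alone cannot produce it.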
However, according to the Reddit user, this is not entirely true. To test his hypothesis, the user downloaded a high-resolution image of the moon, downsized it to 170 x 170 pixels, and then applied a Gaussian blur to hide any details of its surface.
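The preparation steps can be sketched in a few lines with Pillow. This is an illustrative reconstruction, not the user's actual script; the synthetic noise image stands in for the downloaded moon photo, and the blur radius is an assumed value.

```python
import numpy as np
from PIL import Image, ImageFilter

# Hypothetical stand-in for the downloaded high-res moon photo:
# a synthetic grayscale image with fine, crater-like texture.
rng = np.random.default_rng(0)
hires = Image.fromarray(rng.integers(0, 256, (1024, 1024), dtype=np.uint8), "L")

# Step 1: downscale to 170 x 170, discarding most fine detail.
small = hires.resize((170, 170), Image.LANCZOS)

# Step 2: Gaussian blur to wipe out any remaining surface texture
# (radius chosen arbitrarily for this sketch).
blurred = small.filter(ImageFilter.GaussianBlur(radius=4))
```

The result keeps the moon's overall shape and brightness but contains no recoverable crater detail for a camera to "enhance", which is what makes the experiment a clean test.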
The user then displayed the edited low-quality picture full-screen on his monitor, walked to the other end of the room, switched off the lights, zoomed in on the monitor, and took a photo.
The image he captured had more detail than the blurry photo on his monitor, a clear indicator that Samsung is using an AI model to add details such as craters and colour to its moon shots. That contradicts Samsung's explanation, which claimed the camera merely combines multiple frames captured by the lens into one more detailed final shot.
In the past, people had tried tricking Samsung's Space Zoom with a clove of garlic on a black background and even a table tennis ball, but without success. A low-resolution 170 x 170 image of the moon, however, seems to have done the trick, giving the AI just enough information to think it was looking at the real thing.
"Samsung is using AI/ML (neural network trained on 100s of images of the moon) to recover/add the texture of the moon on your moon pictures, and while some think that's your camera's capability, it's actually not… None of the frames has the craters etc. because they're intentionally blurred, yet the camera somehow miraculously knows that they are there," explains the Reddit user.
Samsung has yet to issue a statement on the controversy. The company has always been open about using AI to deliver high-quality moon shots. What has angered people, however, is that it never mentioned adding details and textures where none existed before.