Samsung wants to put an end to the controversy over the processing technology its cameras use when capturing photos of the Moon. The company has published an explanation on its website of the Moon photo detection system it has used since the Galaxy S21. If Scene Optimizer is turned on, the AI detects when you’re taking a clear shot of the Moon at 25x zoom or higher. The technology then reduces glare, captures multiple frames to produce a bright, low-noise image, and uses a neural network to enhance detail by comparing against a high-resolution reference image (this is possible because the Moon is tidally locked: you will always see the same lunar surface unless you go into space).
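Samsung has not published its actual pipeline, but the multi-frame step it describes is a standard technique: averaging several exposures of the same scene reduces random sensor noise by roughly the square root of the frame count. A minimal sketch of that idea, with a synthetic "moon" disc and made-up noise levels:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground-truth scene: a bright disc on a black sky.
truth = np.zeros((64, 64))
yy, xx = np.mgrid[:64, :64]
truth[(yy - 32) ** 2 + (xx - 32) ** 2 < 20 ** 2] = 0.8

# Simulate 16 noisy exposures of the same scene.
frames = [truth + rng.normal(0.0, 0.1, truth.shape) for _ in range(16)]

# Multi-frame stacking: averaging N frames cuts the noise
# standard deviation by about a factor of sqrt(N).
stacked = np.mean(frames, axis=0)

noise_single = np.std(frames[0] - truth)   # ~0.1
noise_stacked = np.std(stacked - truth)    # ~0.025
print(noise_single, noise_stacked)
```

This only illustrates the noise-reduction stage; the detail-enhancement step Samsung describes is a learned neural network, not a simple average.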
The article comes after a Reddit user reported that Samsung was faking images of the Moon by adding details that were not present in the raw scene. To prove it, he photographed deliberately blurred, low-resolution images of the Moon displayed on a computer screen, so there was no detail the phone could have recovered from the shot. Nevertheless, the device appeared to add information that simply wasn’t there.
In fairness, Samsung does use the real capture as its input. However, its algorithms are tuned to produce photos that do not represent what actually comes through the lens. The company seems to be aware of this, as it says it is refining Scene Optimizer to “reduce any potential confusion” between photographing the actual Moon and photographing existing images of it.