South Korean electronics giant Samsung has finally gone public following fresh claims that its Galaxy phones are artificially enhancing photos of the moon.
Redditor ibreakphotos resurfaced an old controversy last week by outlining a clever experiment designed to expose how Samsung’s AI tricks people into thinking its phones can take better photos than they really can.
The experiment was relatively straightforward: take a high-resolution photo of the moon from the internet, downsize it and apply a blur, full-screen that image on a monitor, turn the lights off, then go to the other end of the room and take the photo.
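The image-preparation half of that experiment is easy to reproduce. The sketch below, using the Pillow library, is one plausible way to do it; the file names and the exact downsize and blur figures are assumptions, not values ibreakphotos published.

```python
# Sketch of the experiment's image prep, assuming Pillow is installed and a
# high-resolution moon photo is saved locally as "moon_hires.jpg". The target
# size (170 px) and blur radius (4 px) are illustrative guesses.
from PIL import Image, ImageFilter

def prepare_test_image(src_path: str, dst_path: str,
                       size: int = 170, blur_radius: int = 4) -> None:
    """Downsize a moon photo, then blur it so no crater detail survives."""
    img = Image.open(src_path).convert("L")   # moon shots are near-greyscale
    img = img.resize((size, size))            # first throw away resolution
    img = img.filter(ImageFilter.GaussianBlur(blur_radius))  # then smear the rest
    img.save(dst_path)                        # full-screen this on a monitor
```

After that, per the Redditor's write-up, you display the result full-screen in a dark room and photograph it from across the room.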
If the Samsung phone was being honest, it would deliver a blurry photo of a blurry photo of the moon. But that’s not what happened.
Instead, ibreakphotos found in their camera roll a nicely detailed photo of the moon, complete with craters and seas.
“The moon pictures from Samsung are fake,” the Redditor said. “Samsung's marketing is deceptive. It is adding detail where there is none.”
The phones can’t be using software to sharpen the image or add details taken from multiple frames, the Redditor said, because in their experiment “all the frames contain the same amount of detail”.
“None of the frames have the craters etc because they're intentionally blurred, yet the camera somehow miraculously knows that they are there.”
Left: the blurred moon photo. Right: the photo the phone produced. Images: Reddit/ibreakphotos
On Wednesday, Samsung finally responded to the controversy with a blog post explaining how the technology works.
“Since the introduction of the Galaxy S21 series, Scene Optimiser has been able to recognise the moon as a specific object during the photo-taking process, and applies the feature’s detail enhancement engine to the shot,” Samsung said.
“When you’re taking a photo of the moon, your Galaxy device’s camera system will harness this deep learning-based AI technology, as well as multi-frame processing in order to further enhance details.”
Samsung’s explanation seems to confirm the Redditor’s theory: the phone uses AI to recognise the moon in the frame and then, via a neural network trained on moon images, generates an “enhanced” version of the photo.
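In structure, what Samsung describes is a detect-then-enhance pipeline. The sketch below is purely conceptual: Samsung has not published Scene Optimiser’s code, and the classifier and enhancer here are crude stand-ins meant only to show the shape of the logic, i.e. that the detail in the output is generated by the model rather than recovered from the sensor.

```python
import random

def looks_like_moon(frame) -> bool:
    # Hypothetical stand-in for Samsung's scene-recognition model: treat a
    # small bright disc on a mostly dark background as "moon". A frame here
    # is a 2-D list of 0-255 greyscale pixel values.
    flat = [p for row in frame for p in row]
    bright = sum(p > 200 for p in flat) / len(flat)
    dark = sum(p < 30 for p in flat) / len(flat)
    return 0.02 < bright < 0.5 and dark > 0.4

def moon_enhancer(frame):
    # Hypothetical stand-in for the trained network: it injects synthetic
    # texture into the bright region. The point is that this "detail" is
    # generated, not captured.
    rng = random.Random(0)  # deterministic for the sketch
    return [[max(0, min(255, p + rng.randint(-40, 40))) if p > 200 else p
             for p in row] for row in frame]

def scene_optimiser(frame):
    """The detect-then-enhance flow Samsung's blog post appears to describe."""
    if looks_like_moon(frame):
        return moon_enhancer(frame)  # enhancement only fires on "moon" scenes
    return frame                     # everything else passes through unchanged
```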
In practice, it’s no different to the AI camera Google introduced with its Pixel 6 phones. Google advertised its ‘Night Sight’ mode as a way of getting crisp, bright images in low-light settings that “previously would have required a professional camera, a tripod, and quite a bit of photography know-how”.
Samsung has said it will continue working on its ‘Scene Optimiser’ to “reduce any potential confusion that may occur between the act of taking a picture of the real moon and an image of the moon”.