Samsung caught faking zoom photos of the moon

• I don't think this has ever been done (at least not by any reputable company anyway; what Huawei or some random Chinese company does/did is irrelevant). Bringing AI into this is comically misleading.
I think the term "AI" is being thrown around in this discussion because of the "auto scene detection" setting in the Samsung camera app. IMO this is completely fine, since that is what cameras have been doing for many years now under the name "scene processing" or the more generic "AI scene detection", which detects the type of scene being shot (grass, landscape, sky, etc.), and this list has only been growing to accommodate more and more scenarios, with better-suited processing for each scene.
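For the curious, here's a minimal sketch of how such a per-scene pipeline is commonly structured. All the labels, presets, and detection heuristics are illustrative assumptions on my part, not Samsung's actual code:

```python
import numpy as np

# Hypothetical per-scene presets (NOT Samsung's actual values): each
# detected scene label maps to its own exposure / shadow tweaks.
SCENE_PRESETS = {
    "landscape": {"exposure": 1.00, "shadow_lift": 0.03},
    "sky":       {"exposure": 0.95, "shadow_lift": 0.00},
    "moon":      {"exposure": 0.40, "shadow_lift": 0.10},  # cut exposure hard
}

def detect_scene(frame: np.ndarray) -> str:
    """Toy stand-in for an on-device classifier: a mostly black frame
    containing a small bright disc gets labelled 'moon'."""
    bright = (frame > 0.8).mean()
    if (frame < 0.1).mean() > 0.9 and 0 < bright < 0.05:
        return "moon"
    return "landscape"

def process(frame: np.ndarray) -> np.ndarray:
    """Apply the preset for whichever scene was detected.
    `frame` is a float image with values in [0, 1]."""
    p = SCENE_PRESETS[detect_scene(frame)]
    out = frame * p["exposure"]                  # scale exposure
    out = out + p["shadow_lift"] * (1.0 - out)   # lift the darkest tones
    return np.clip(out, 0.0, 1.0)
```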

From a processing standpoint, I think the testing methodology is a bit flawed, since all I see Samsung doing is detecting the moon, shortening the exposure as much as possible, turning the shadows up, sharpness up, etc. I say the test is wrong because, when using one and a half moons, the software probably only detects the "fuller" moon and applies processing to it. I think a better test would be to blur half of the moon and then take a photo, since it should still be detected as a moon, but there should then be a clear difference between the amount of processing on the top half and the bottom half (also, the model presumably wasn't trained on anything like this because, well, it's unnatural). If anyone has an Ultra they can test with, please try.

• Most people preferring this behaviour would be Gen Z, I hope. If not, they are clueless about photography. Drawing board, anyone? Lol.
People want the best point and shoot when they talk about phone cameras, so phone companies tend to target the 95% of the market (also the reason the iPhone 14 won MKBHD's best camera system award). And for the rest, the photographers who know how to operate a camera, the pro mode in all the camera apps provides granularity and control over the image. I see this as a win-win.
 
List of phones using the same sensor hardware as the Pixel 6A. Google used the same sensor for most of its phones up to the Pixel 6A. Are we suggesting the quality has stayed the same?

The same sensor is also used in the Redmi 8A, one of the cheapest phones on the market, which used to cost around 6k. If it's not the software, then the Redmi 8A should take photos just as fine as the Pixels. I have a Poco F1, which also has this sensor. Yeah, it won a photography competition on MKBHD, but I don't think it's as good as the Pixels.

Purebred photographers shouldn't do smartphone photography, because today smartphone photography is 80% computational and 20% what the sensor sees. Yes, Samsung did go a little overboard, but I don't think it's a mistake.

I say the test is wrong because, when using one and a half moons, the software probably only detects the "fuller" moon and applies processing to it.
I think MKBHD did check the moon at a different phase, and the phone's processing seems to respect the phase.
 
Is smartphone photography even photography when details are added instead of captured?

Hey man pulling phone out of the pocket is too much work. Let me just move my mouse.


In 10 years nobody will be discussing the purity of photography anymore. AI will create all kinds of photos we want on command.
 
Well, even the likes of Photoshop enable the use of AI for one-click post-processing akin to what happens on your mobile. As long as the user has the option to manually control how much and what they really want the AI to do, and to disable AI when needed, I am fine. I would definitely object to allowing AI to do everything independently of the user. If that is where the world is heading, just get rid of cameras and call upon DALL-E to create the photo you need. Fortunately, I know there are folks who still shoot B&W and on film.
 
Just for pondering:

• I don't think this has ever been done (at least not by any reputable company anyway; what Huawei or some random Chinese company does/did is irrelevant). Bringing AI into this is comically misleading.
At this point, whatever happens in the background, companies keep saying "AI" for the sake of marketing. They could even come up with "advanced AI", "superior AI" or even "Ultra AI" just to one-up the competition. If we are talking about Huawei, they too make some excellent phones, but they got caught up in weird shit.

• Most people preferring this behaviour would be Gen Z, I hope. If not, they are clueless about photography. Drawing board, anyone? Lol.

Not just Gen Z; the majority of people prefer that, at least from my observation.
The main requirement for most people is point and shoot, that's all; nobody has the patience to edit images unless that editing transforms the photos like they are on a different planet doing Tron-like racing.
People want usable pictures to share on social media, and companies are trying to manage whatever they can with those little sensors.
I am only worried about companies forcing the "AI processed" pictures on us instead of giving the unprocessed pictures when necessary.

In any case, if this indeed continues, then RIP photography as we have always known it.

I don't think photography will ever reach the RIP stage. The need for professional gear won't vanish at any point. There will always be people needing or wanting that, be it professionals or enthusiasts.

Some people are mentioning that some of us are confused about using the word AI; see, companies are using that word for marketing irrespective of whatever programming is happening in the background. Hence "AI".

In 10 years nobody will be discussing the purity of photography anymore. AI will create all kinds of photos we want on command.

AI definitely can't draw your wedding photos or your kids' birthday events. You still need photography right there.
 
Plot twist: it's not the sensor's megapixels that allow you to take a detailed photo of the moon, but the optical telephoto lens. There is simply no way something as tiny as a phone's lens can resolve that amount of detail. You can certainly magnify the moon with a long enough lens (more focal length), but without increasing the aperture (lens diameter) the image will become fuzzy, and there's no software that can compensate for this.
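For a rough sense of why aperture is the hard limit, the Rayleigh criterion puts the smallest resolvable angle at about 1.22λ/D. A back-of-the-envelope sketch (the ~5 mm aperture is my assumption for a typical phone tele lens, not a measured value):

```python
import math

wavelength = 550e-9   # green light, metres
aperture = 5e-3       # assumed phone lens diameter, ~5 mm

# Rayleigh criterion: smallest resolvable angle, in radians
theta = 1.22 * wavelength / aperture             # ~1.3e-4 rad
theta_arcsec = math.degrees(theta) * 3600        # ~28 arcseconds

moon_arcsec = 0.5 * 3600                         # the moon spans ~0.5 degrees
resolvable = moon_arcsec / theta_arcsec

print(f"Resolution limit: {theta_arcsec:.1f} arcsec")
print(f"Resolvable elements across the moon: ~{resolvable:.0f}")
# ~65 elements: roughly a 65-pixel-wide moon at best, no matter how many
# megapixels the sensor has. Any extra detail has to come from software.
```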

So if they can't make it, they fake it.

Here's a cropped JPG shot from my old OnePlus 3 through a telescope at ~120x "zoom":
IMG_20201001_202154.jpg


And here's the same photo that I post-processed for 5-10 secs in FastStone:
IMG_20201001_202154Edit.jpg


All I did was adjust the Shadows/Highlights/Contrast sliders. Not even sharpened! This clearly shows how powerful software is at bringing out details, even from a heavily compressed JPEG. However, those details were available thanks to a big telescope acting as a lens for the phone. Without aperture, there is NO WAY you can magnify that much without losing details.
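That kind of tonal recovery is just simple pixel math on data that is already there. A toy numpy sketch (the curve shapes and slider values are my own illustration, not what any particular editor uses):

```python
import numpy as np

def tonal_adjust(img, shadows=0.0, highlights=0.0, contrast=1.0):
    """Toy shadows/highlights/contrast adjustment on a float image in [0, 1].
    No detail is invented: each output pixel depends only on the same
    input pixel; the tones already recorded are just redistributed."""
    out = img.astype(np.float64)
    out = out + shadows * (1.0 - out) ** 2   # lift dark tones the most
    out = out - highlights * out ** 2        # pull bright tones down
    out = (out - 0.5) * contrast + 0.5       # contrast around mid-grey
    return np.clip(out, 0.0, 1.0)

# e.g. recover shadow detail and add punch, like dragging sliders:
# adjusted = tonal_adjust(moon_jpg, shadows=0.25, highlights=0.15, contrast=1.3)
```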
 
S23 Ultra user here. From my experience, it is AI adding details and sharpening, but it's not as bad as the Huawei one, which blindly added a moon overlay to get extremely good moon pics. You can call it cheating, but I think it falls more into the realm of computational photography, like the Pixel's tricks: skin tone enhancements, face edits, etc. I have more pics (taken while I was on a flight), and it is an oversharpened mess at 100x, which is expected, I believe.
moonshot.jpg

This one is a bit clearer:
moon blur.jpg

This one is blurry, so is it at least AI upscaling? Or sharpening intelligently, adding details despite the blur?
mountain 30x.jpg

30x, the Himalayas I think

mountain 100x.jpg

100x on one of the peaks
lotus temple.jpg

and a bonus add-on, the Lotus Temple
 
This isn't just with moon photos, and this is not fake. This is just lazy journalism, with no investigation of its own, taking a random person's claim on Reddit as gospel. This post-processing happens with every photo taken in auto mode, and it is just the AI and algorithms doing their work. It is pretty impressive how far phone cameras have come in the last 5 years, so much so that I prefer my S22 Ultra over my Fujifilm X-T20, because it is simply that good. Specifically, I don't have to stack photos, take multiple photos, or edit a photo to get close to my ideal result; the phone just does all of that for me in auto mode itself. I also have GCam, and there is simply no need for my Fujifilm X-T20 except for portrait photoshoots and non-daily stuff.
The basic question is more nuanced: to what extent does the photograph you are taking reflect your creativity and decision-making? You could simply use Midjourney to generate any photo in any context without actually being there or capturing the moment, and it would look better than any photo you take in imperfect conditions while travelling. We could get into the entire Ship of Theseus argument here.

Real-life photography will become more of a niche, but also more valuable on account of that. The layman will of course be super impressed by this, as they have been with filters: whatever makes it good for vanity on social media.
I think this isn't the first time Samsung has been accused of doing such things. Many companies have done similar things and got caught; I remember even Apple and Nokia being accused.
Nokia got caught using a DSLR while advertising a phone camera, with an "images simulated" disclaimer, which was at least more explicit.
 
S23 Ultra user here
Could you try photographing this image from far away and see how it performs?

Then repeat the same for a different image that has been made smaller and blurry:

1small.png

[Both images are 170x170, not sure why the first looks bigger]

If it doesn't perform similarly for any other subject, we can put this matter to rest.
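Generating such a degraded test image is trivial with Pillow. A quick sketch (the source filename is a placeholder; the 170x170 size and output name just mirror the post):

```python
from PIL import Image, ImageFilter

# Downscale the source to 170x170 and blur it, so the fine detail is
# genuinely destroyed before the phone ever sees it.
img = Image.open("moon_source.png").convert("L")
small = img.resize((170, 170), Image.LANCZOS)
blurred = small.filter(ImageFilter.GaussianBlur(radius=4))
blurred.save("1small.png")
# Display this full-screen on a monitor, photograph it from across the
# room, and any crisp crater detail in the result was added, not captured.
```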

Photography is about capturing what there is. Not inferring what should be there.

If you are not interested in the real, physical truth, save your money and just Google images of "moon". Want to go to the Taj Mahal? I will Photoshop a pic of you in front of it for a fraction of your travel expenses.

Unfortunately we live in a time when there are filters that make you look like you're wearing a mask.
 
The devices are made to cater to the common masses, not the tech-savvy or purist photographers.

Things should work well, should look pretty, should look superior. NO ONE CARES what is going on under the hood. Tech companies have been taking advantage of the common masses and will continue to do so.
 
I think MKBHD did check the moon at a different phase, and the phone's processing seems to respect the phase.
That's exactly why I suggested blurring only half of the moon. Even better if the blur pattern is in quadrants.
1678887584861.png

(I was previously suggesting a top-half/bottom-half blur, resulting in either:
1) The AI recognises a half-moon and applies its processing only to the lower half, leaving the upper half untouched.
2) The AI tries to overlay the entire moon, and the difference in detail would be the same between the top and bottom halves.
If the phone goes with the second case, we can easily deduce the answer, but the first case gets tricky, as the processing method (effects vs overlay) cannot be distinguished.)
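For anyone with an Ultra who wants to try this, a quick Pillow sketch to produce the quadrant-blurred moon (filenames and the blur radius are my own placeholders):

```python
from PIL import Image, ImageFilter

img = Image.open("moon_source.png")
w, h = img.size
blurred = img.filter(ImageFilter.GaussianBlur(radius=6))

# Blur two diagonal quadrants, keep the other two sharp.
for box in [(0, 0, w // 2, h // 2), (w // 2, h // 2, w, h)]:
    img.paste(blurred.crop(box), box)
img.save("moon_quadrant_blur.png")
# If the photographed result shows uniform crater detail across all four
# quadrants, the phone overlaid/synthesised detail rather than enhancing it.
```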

Hey man pulling phone out of the pocket is too much work. Let me just move my mouse.


In 10 years nobody will be discussing the purity of photography anymore. AI will create all kinds of photos we want on command.

if(drawing similar to database) -> print database.image
What about everything that lies outside the database? What about pictures of books, fingerprints, or any other kind of finer detail?

Photography is about capturing what there is. Not inferring what should be there.
"The art or practice of taking and processing photographs" -according to google but to each to their own.
Modern-day consumer-level photography has become... well... what people want to see. Otherwise things like night mode wouldn't exist, portrait mode wouldn't exist; so why didn't people bother objecting to those? Again, the answer lies in the first statement: to each their own.
 
if(drawing similar to database) -> print database.image
What about everything that lies outside the database? What about pictures of books, fingerprints, or any other kind of finer detail?
AFAIK, those details in the image are artificially generated. The tree isn't real; there's no real photo of that tree. That mountain doesn't exist either. Finer details like synthetic fingerprints should also be possible. That tool runs on most RTX graphics cards. Give it a spin if you have one.
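The tool above isn't named, but for a concrete taste of the same class of fully synthetic imagery, the open-source diffusers library will produce a "photo" of a tree that has never existed (the checkpoint name here is just the common public one, not whatever the poster used):

```python
# NOT the specific tool mentioned above, just the same class of
# generative model. Needs a CUDA GPU, e.g. an RTX card.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

image = pipe("photorealistic lone pine tree on a mountain ridge").images[0]
image.save("fake_tree.png")  # no camera, no light, no real tree
```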

AI definitely can't draw your wedding photos or your kids' birthday events. You still need photography right there.
My statement was a little sarcastic, but only a little. Current AI can do some of that already: you can give it your selfie photos and ask it to "make" a wedding photo with Elon Musk as guest of honour.
 
My statement was a little sarcastic, but only a little. Current AI can do some of that already: you can give it your selfie photos and ask it to "make" a wedding photo with Elon Musk as guest of honour.
This kind of AI would be very useful for storyboarding, memes, and wallpapers, and would also satisfy some porn fantasies to a great extent.
 
Modern-day consumer-level photography has become... well... what people want to see. Otherwise things like night mode wouldn't exist, portrait mode wouldn't exist; so why didn't people bother objecting to those? Again, the answer lies in the first statement: to each their own.
While supply does cater to demand, as you rightfully said, this is not photography by any stretch of the definition. Photo + graphy = light that is recorded. It can be analog film, digital sensors, or whatever, but the light is coming from the source, and that is what is being recorded. You can't fake the record and claim it to be a true photograph.

Night mode on digital cameras is there to combat the digital noise the sensor creates when trying to amplify low-light conditions (high ISO). There is also thermal noise, which occurs when the sensor gets hot. Noise reduction in this case addresses the shortcomings of the sensor tech: basically, it is removing false data that the sensor is creating.
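Phone night modes typically do this by stacking many short exposures; averaging N frames cuts random noise by roughly √N. A toy numpy demonstration (the scene, noise level, and frame count are made-up illustration values):

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 0.2, size=(64, 64))           # a dim "true" scene

def noisy_frame(scene):
    """One simulated short exposure with random sensor read noise."""
    return scene + rng.normal(0.0, 0.05, scene.shape)

single = noisy_frame(scene)
stacked = np.mean([noisy_frame(scene) for _ in range(16)], axis=0)

print("noise, 1 frame  :", np.std(single - scene))     # ~0.05
print("noise, 16 frames:", np.std(stacked - scene))    # ~0.0125 = 0.05/sqrt(16)
# Stacking removes false data the sensor added; it does not invent detail.
```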

Samsung, however, is adding fake data to make the images look better. So you are paying extra for a phone to spit out a moon image when you could have just Googled it for free :D

In tech terms, this would be akin to flashing the VBIOS of a GTX 650 to report itself as an RTX 4090 Ti. If you don't see a problem with that, I have a few 4090 Tis available for 1L each ;). You wanted a 4090, I gave you a GPU that says it is a 4090. Fair deal, right?
 
I made a video for y'all explaining what I have said on this thread.


/s
I guess this testing methodology also works, and there's the answer: it doesn't overlay a high-res moon shot, but instead dials in textures and other photo-editing techniques to create a better-looking moon.
Whether people want it is like asking whether people agree with the face manipulation phones do (bigger eyes, smaller nose, that kind of thing). Everybody has their own opinion on it.

While supply does cater to demand, as you rightfully said, this is not photography by any stretch of the definition. Photo + graphy = light that is recorded. It can be analog film, digital sensors, or whatever, but the light is coming from the source, and that is what is being recorded. You can't fake the record and claim it to be a true photograph.
I guess the term 'smartphone photography' is more suited here, since it's more "smart"? Also, the Google definition is pretty vague about what counts as "processing" the photo.

Night mode on digital cameras is there to combat the digital noise the sensor creates when trying to amplify low-light conditions (high ISO). There is also thermal noise, which occurs when the sensor gets hot. Noise reduction in this case addresses the shortcomings of the sensor tech: basically, it is removing false data that the sensor is creating.
I mean, it still isn't "true photography", is it? It isn't what your eyes would have seen anyway. Also, by that logic, enhancing the moon could be written off as "compensating for the aperture difference", as stated in one of your previous comments.

Samsung, however, is adding fake data to make the images look better. So you are paying extra for a phone to spit out a moon image when you could have just Googled it for free :D
The video lockhrt999 shared clearly shows the processing being done on the moon. No "fake" details are added; the ones that are present are just being better represented. Really great vid, I must say; it summarises the issue quite well. lockhrt999 should become a full-time YouTuber, could go big on the platform /s.

In tech terms, this would be akin to flashing the VBIOS of a GTX 650 to report itself as an RTX 4090 Ti. If you don't see a problem with that, I have a few 4090 Tis available for 1L each ;). You wanted a 4090, I gave you a GPU that says it is a 4090. Fair deal, right?
I don't completely agree with this analogy, as it would be more suited to a megapixel comparison than this particular scenario. I think a better analogy would be: you're paying for a "low-end" sensor and getting a better image, i.e. paying for a GTX 650 and getting the performance of a 4090 Ti, similar to how DLSS and FSR boost framerates using "AI". But the ethics and the realism of the generated frames are still highly subjective.
 