I think the term "AI" is being thrown around in this discussion because of the "auto scene detection" setting in the Samsung camera app. IMO this is completely fine, since it's what cameras have been doing for many years now under the name "scene processing" or, more generically, "AI scene detection": detect the type of photo being taken (grass, landscape, sky, etc.) and apply better-suited processing for each scene, with the list of scenes only growing to accommodate more and more scenarios.

What I don't think has ever been done, at least not by any reputable company (whatever Huawei or some random Chinese company does/did is irrelevant), is fabricating detail the way Samsung is being accused of here. Bringing "AI" into this is comically misleading.
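To make the distinction concrete, here's a minimal sketch of what scene detection usually amounts to: classify the frame, then pick a tuning preset. Everything below (the scene labels, the parameter names, the numbers) is hypothetical, not Samsung's actual pipeline:

```python
from dataclasses import dataclass

@dataclass
class TuningProfile:
    saturation: float
    sharpness: float
    shadow_lift: float

# Each detected scene maps to a preset that biases the existing pipeline;
# nothing here invents content, it just re-tunes the shot.
SCENE_PROFILES = {
    "grass":     TuningProfile(saturation=1.3, sharpness=1.0, shadow_lift=0.1),
    "landscape": TuningProfile(saturation=1.2, sharpness=1.1, shadow_lift=0.2),
    "sky":       TuningProfile(saturation=1.4, sharpness=0.9, shadow_lift=0.0),
    "moon":      TuningProfile(saturation=1.0, sharpness=1.8, shadow_lift=0.4),
}

DEFAULT_PROFILE = TuningProfile(saturation=1.0, sharpness=1.0, shadow_lift=0.0)

def pick_profile(scene_label: str) -> TuningProfile:
    """Map the classifier's scene label to a processing preset."""
    return SCENE_PROFILES.get(scene_label, DEFAULT_PROFILE)

# e.g. the classifier reports "moon", so the pipeline gets the moon preset:
print(pick_profile("moon"))
```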
From a processing standpoint, I think the testing methodology is a bit flawed, since all I see Samsung doing is detecting the moon, decreasing the shutter speed as much as possible, turning the shadows up, the sharpness up, etc. I say the test is wrong because, with the one-and-a-half-moons image, the software probably only detects the "fuller" moon and applies processing to it. A better test would be to blur half of the moon and then take the photo: the frame should still be detected as a moon, but there should be a clear difference in the amount of processing between the top half and the bottom half. (A model also shouldn't have been trained on images like this, because a half-blurred moon is unnatural.) If anyone has an Ultra they can test with, please try; there's a sketch for building the test image below.
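For anyone who wants to try it, here's a quick way to build that test image with Pillow. The filename and blur radius are just placeholders; any real moon photo will do:

```python
from PIL import Image, ImageFilter

# Load a real moon photo (placeholder filename).
img = Image.open("moon.png").convert("RGB")
w, h = img.size

# Blur only the bottom half heavily so it carries almost no real detail.
bottom = img.crop((0, h // 2, w, h)).filter(ImageFilter.GaussianBlur(radius=12))
img.paste(bottom, (0, h // 2))

img.save("half_blurred_moon.png")
```

Display the result full-screen, shoot it with the phone from across the room, and compare halves: if crater detail reappears in the blurred half of the output, that detail came from software, not from the scene.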
People want the best point-and-shoot when they talk about phone cameras, so phone companies tend to target that 95% of the market (also the reason the iPhone 14 won MKBHD's best-camera-system test). And for the rest, the photographers who know how to operate a camera, the pro mode in all the camera apps provides the granularity and control over the image. I see this as a win-win.

I'd hope most people preferring this behaviour are Gen Z; if not, they're clueless about photography. Drawing board, anyone? Lol.