After a week of an old controversy rearing its head once again, Samsung has formally detailed how its smartphones take such high-fidelity photos of the moon. While most of what's in the post isn't anything we haven't heard before, Samsung's response to this week's uproar over its moon-shooting capabilities is a solid reminder that much of the illusion of smartphone photography is due to software enhancement on the backend.

It all kicked off about a week ago on Reddit, where most good arguments get stoked. A user called ibreakphotos went viral with their post about how Samsung's space zoom moon shots are faked, providing proof with screenshots. They concluded as such:

The moon pictures from Samsung are fake. Samsung's marketing is deceptive. It is adding detail where there is none (in this experiment, it was intentionally removed). In this article, they mention multi-frames, multi-exposures, but the reality is, it's AI doing most of the work, not the optics; the optics aren't capable of resolving the level of detail that you see. Since the moon is tidally locked to the Earth, it's very easy to train your model on other moon images and just slap that texture on when a moon-like thing is detected.
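The Reddit experiment hinges on a simple idea: if you deliberately destroy the fine detail in a moon image before photographing it off a monitor, any detail in the phone's output must have been synthesized, not resolved. A minimal sketch of that test, using a Laplacian-variance sharpness score as a rough proxy for fine detail (the random-noise image here is just a stand-in for a detailed moon photo):

```python
import numpy as np
from PIL import Image, ImageFilter

def sharpness(img):
    """Variance of a discrete Laplacian -- a rough proxy for fine detail."""
    g = np.asarray(img.convert("L"), dtype=float)
    lap = (-4 * g
           + np.roll(g, 1, axis=0) + np.roll(g, -1, axis=0)
           + np.roll(g, 1, axis=1) + np.roll(g, -1, axis=1))
    return float(lap.var())

# Stand-in for a detailed source image (random texture = lots of detail).
rng = np.random.default_rng(0)
source = Image.fromarray(rng.integers(0, 256, (256, 256), dtype=np.uint8))

# Step 1 of the test: destroy the fine detail before displaying the image
# on a monitor and photographing it with the phone.
blurred = source.filter(ImageFilter.GaussianBlur(radius=8))

# Step 2: shoot the blurred image with the phone, then compare scores.
# If the phone's output scores well above the blurred input it saw,
# the extra detail was synthesized by software, not resolved by optics.
print(sharpness(source) > sharpness(blurred))  # blur removes detail
```

The exact metric is an assumption on my part; the Reddit post compared images visually, but any high-frequency measure makes the same point.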


The Galaxy S23 Ultra can take detailed shots of the moon—or can it? Photo: Florence Ion / Gizmodo

To anyone who knows how smartphone photos work, hearing that much of the heavy lifting is done by AI shouldn't be too surprising. Indeed, Samsung's response to the findings doubles down on the fact that AI is working behind the scenes to improve the quality of certain shots. The company explains that its Scene Optimizer feature has supported moon photography since the Galaxy S21 series (before it was bewilderingly marketed as "space zoom"), and that the company has since improved the algorithms associated with this kind of scene so that the feature knows there's a moon in the frame that needs so-called optimization.

Samsung writes:

The engine for recognizing the moon was built based on a variety of moon shapes and details, from full through to crescent moons, and is based on images taken from our view from the Earth.


The photo of the moon the author took for the Galaxy S22 Ultra review. Image: Florence Ion / Gizmodo

It uses an AI deep learning model to detect the presence of the moon and identify the area it occupies – as denoted by the square box – in the relevant image. Once the AI model has completed its learning, it can detect the area occupied by the moon even in images that were not used in training.
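The detection step Samsung describes (find the moon, box it, then apply the scene-specific processing) can be illustrated with a toy stand-in. This is not Samsung's model — theirs is a trained deep-learning classifier — just a brightness-threshold sketch showing what "identify the area it occupies, as denoted by the square box" means in practice:

```python
import numpy as np

def detect_moon_box(gray, threshold=200):
    """Toy stand-in for a learned moon detector: return the bounding box
    of the bright region in a dark frame (the real model is a trained
    classifier, not a threshold)."""
    ys, xs = np.nonzero(gray > threshold)
    if ys.size == 0:
        return None  # no moon-like region; scene processing stays idle
    # (left, top, right, bottom)
    return tuple(int(v) for v in (xs.min(), ys.min(), xs.max(), ys.max()))

# A dark frame with one bright disc, like a night-sky moon shot:
# a circle of radius 15 centered at row 50, column 60.
frame = np.zeros((100, 100), dtype=np.uint8)
yy, xx = np.ogrid[:100, :100]
frame[(yy - 50) ** 2 + (xx - 60) ** 2 <= 15 ** 2] = 255

print(detect_moon_box(frame))  # → (45, 35, 75, 65)
```

Once a box like this is found, the pipeline knows which pixels to hand to the moon-specific enhancement stage — and, per Samsung, the trained model generalizes to moon images it never saw in training.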

I have tested Samsung's "space zoom" mode on two Samsung smartphones. You can see a photo of the moon I took with the Galaxy S22 Ultra in our original review for that phone. I remember remarking on the cool factor of being able to produce such an image and share it on social media.

It's worth reading through Samsung's entire post, especially if you are curious about how smartphones manage to produce the images they do. However, it doesn't absolve Samsung of the fact that it pushed marketing to make it sound like its cameras were zooming 100x natively, like a DSLR or mirrorless camera with a capable telescopic lens. A smartphone and its glass are no match for a full-blown camera setup. The sensor size and rear-facing glass need to be much bigger to capture the detail you need from the moon.


This is also not your typical postprocessing, and is instead more akin to a live photoshop. Samsung's moon photos, as discovered by the original Reddit post, are not always improving what you've already shot but are instead using your photo alongside multiple others (taken by your camera at the same time without your noticing) and AI deep learning of existing moon details to partially synthesize what the phone thinks the moon's features should look like in your shot. This means things like color will be preserved (which wouldn't be the case if the phone simply copied and pasted other moon photos over yours), but also that your phone will still be able to take advantage of the moon's tidally locked position to know what it should look like at any given time and adjust your photo to match.
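The two-part pipeline described above — multi-frame capture for noise reduction, then injection of learned moon detail while keeping the shot's own tones — can be sketched in a few lines. This is a hypothetical illustration, not Samsung's actual algorithm: the `enhance` function, the `alpha` blend weight, and the `learned` texture are all my own stand-ins. Adding only the zero-mean high frequencies of the reference is what lets the photo's own brightness and color survive:

```python
import numpy as np

def enhance(frames, reference_detail, alpha=0.5):
    """Hypothetical sketch: average a burst of frames for noise
    reduction, then blend in detail from a learned moon texture
    while preserving the shot's own average tone."""
    base = np.mean(frames, axis=0)  # multi-frame noise reduction
    # Inject only the *detail* (zero-mean component), not the
    # reference's brightness, so the photo's tones are kept.
    detail = reference_detail - reference_detail.mean()
    return np.clip(base + alpha * detail, 0, 255)

rng = np.random.default_rng(1)
# A burst of six noisy captures of the same flat gray patch.
burst = [np.full((8, 8), 120.0) + rng.normal(0, 5, (8, 8)) for _ in range(6)]
# Stand-in for the model's learned moon texture.
learned = rng.uniform(0, 255, (8, 8))
out = enhance(burst, learned)
```

The key property: `out` carries texture from `learned`, but its average brightness matches the burst — which is why a copy-paste of someone else's moon photo would look wrong, while this kind of blend doesn't.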

At the same time, if it quacks like a moon photo, it might just be a moon photo. I'm reticent to take the side of a giant conglomerate, but I also don't fault Samsung for pushing the narrative on its camera capabilities. All most of us want our smartphones to do is capture the world so we can look back upon our gigabytes of memories with fondness rather than wonder why a photo appears the way it does.

Apple and Google employ similar marketing strategies with their respective smartphone cameras, with Google leaning into the idea that its machine learning makes its photos the best of the competition. If you want documented photos of the moon phases as they're actually happening outside your window, consider ditching the $1,200 starting price of the Samsung Galaxy S23 Ultra and putting some of that cash toward a telescopic camera setup instead.
