Samsung Exec Envisions Future Where Photos Are Customized to the Shooter

Have you ever wondered why different manufacturers’ smartphone images have a certain “look” to them? In an interview with Engadget, Joshua Sungdae Cho, Vice President and head of visual software R&D at Samsung Mobile, explains how images are rendered today and how that will change going forward.

According to Engadget, it’s hard to find a review of Samsung’s latest flagship, the Galaxy S21 Ultra, that doesn’t call out the way the company chooses to render images: fine details are heavily sharpened, to a degree many consider overkill; colors are pumped; and auto smoothing computationally “airbrushes” faces. For hobbyists and professionals in the photography space, each of these choices is bad on its own, but near sacrilegious when they all happen at the same time.

So why does Samsung do this? According to the interview, Cho says the company isn’t the one responsible for this kind of processing: the users are.

Samsung apparently doesn’t have an internal team that determines what images should look like; rather, it uses crowdsourced data to determine what the average person believes looks best in an image. The company does this by holding focus groups with users from around the world and asking them what they like about images and what could be improved. The goal is to pinpoint which aspects of an image the company can focus on to appeal to the most users most of the time.

The conversations can get pretty granular, though. In trying to suss out what people like about their favorite images, Cho says discussions can veer toward subjects like color tone and saturation, noise levels, sharpness of details, overall brightness and beyond, all so Samsung can tune its HDR models and its smart scene optimizer to deliver what he calls “perfectly trendy” photos.
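To make those attributes concrete, here is a minimal, purely illustrative Python sketch of what applying one crowd-tuned set of rendering preferences could look like. The RenderingPreferences fields, their values, and the render function are hypothetical, invented for this example; Samsung’s actual pipeline is proprietary and far more sophisticated:

```python
# Illustrative sketch only: a toy "scene optimizer" that applies a tuned
# set of rendering preferences to a photo. Parameter names and values
# here are hypothetical, not Samsung's actual tuning.
from dataclasses import dataclass

from PIL import Image, ImageEnhance


@dataclass
class RenderingPreferences:
    saturation: float = 1.2   # >1.0 pumps colors, <1.0 mutes them
    sharpness: float = 1.5    # >1.0 sharpens fine detail
    brightness: float = 1.05  # slight overall exposure lift
    contrast: float = 1.1


def render(photo: Image.Image, prefs: RenderingPreferences) -> Image.Image:
    """Apply each tuned attribute in sequence, as a crowd-tuned model might."""
    photo = ImageEnhance.Color(photo).enhance(prefs.saturation)
    photo = ImageEnhance.Sharpness(photo).enhance(prefs.sharpness)
    photo = ImageEnhance.Brightness(photo).enhance(prefs.brightness)
    return ImageEnhance.Contrast(photo).enhance(prefs.contrast)


# In the crowdsourced model, a single set of preferences is tuned to
# please the most users most of the time:
# trendy = RenderingPreferences(saturation=1.3, sharpness=1.8)
# result = render(Image.open("shot.jpg"), trendy)
```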

Trying to please everyone all of the time is an impossible task, but Samsung seems intent on trying to do it.

That said, Cho seems to understand this method isn’t sustainable, and he envisions a future where the same photo taken by multiple people will look different for each of them, with AI that knows exactly how best to please each individual.

“When there are ten people taking a picture of the same object, I want the camera to provide ten different pictures for each individual based on their preference,” Cho says.

The limiting factor to this future is, unfortunately, neural processing unit (NPU) technology. Right now, chip makers like Qualcomm are only on their third iteration of NPU technology, which is still relatively young; for Cho, this is the “initial stage” of development. But with the right investment in NPU research and development, it’s very likely that your Samsung phone will one day take photos that look nothing like the ones your friend takes on theirs.

Cho explains that the right neural engine could look at a person’s album to determine what they saved and what they deleted, examine the filters they use, and track trends in how they edit their images.

“Those are some of the things that we could look at in order to ensure the system learns about the user,” he says.
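As a rough sketch of how those signals might be distilled into a per-user profile, consider the following Python example. The PhotoRecord fields and learn_profile function are invented purely for illustration; nothing here reflects an actual Samsung implementation:

```python
# Hypothetical sketch of the signals Cho describes: kept vs. deleted
# photos, filter choices, and editing habits. Not a real Samsung API.
from dataclasses import dataclass
from statistics import mean
from typing import Optional


@dataclass
class PhotoRecord:
    kept: bool                   # saved vs. deleted from the album
    filter_name: Optional[str]   # filter applied, if any
    saturation_edit: float       # 1.0 = untouched, >1.0 = user boosted color


def learn_profile(history: list) -> dict:
    """Distill album behavior into a per-user rendering preference."""
    kept = [p for p in history if p.kept]
    filters = [p.filter_name for p in kept if p.filter_name]
    return {
        # How much color the user tends to dial in when editing
        "preferred_saturation": mean(p.saturation_edit for p in kept) if kept else 1.0,
        # Which filters survive the delete pass most often
        "favorite_filters": sorted(set(filters), key=filters.count, reverse=True),
        # Share of shots the user keeps: a proxy for how well the
        # current rendering already matches their taste
        "keep_rate": len(kept) / len(history) if history else 0.0,
    }


# profile = learn_profile(album_history)
# profile["preferred_saturation"] could then seed a per-user renderer.
```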

For now, it’s unclear how close or far away that future is, and Cho wouldn’t say. What is clear is that chip makers have a way to go to keep up with the dreams and ambitions of people like Cho.

Engadget’s full interview is definitely worth your time if you’re interested in computational photography and how one company envisions the future of smartphone image-making. You can read it here.


Image credits: Header photo by Dan Smedley.