Jo Seong-dae, head of the Visual Solution Team (Executive Vice President) at the MX Business Division of Samsung Electronics, answers questions during a press briefing at the JW Marriott Union Square in San Francisco on the 25th (local time)./Courtesy of Sim Min-gwan

The core of the Galaxy S26 series camera is making the entire process, from shooting to editing and sharing, easier and more convenient.

Jo Seong-dae, head of the Visual Solution Team (Executive Vice President) at the MX Business Division of Samsung Electronics, said as much at a press briefing held on the 25th (local time) at the JW Marriott Union Square in San Francisco.

Jo oversees the development of Samsung Electronics' visual technologies, including camera software and image quality, AI solutions such as nightography, the Gallery and Editor apps, and generative AI editing. He summarized Samsung's camera philosophy into five pillars: the essence of light, portrait expression, every moment including the night, creativity for everyone, and AI that breaks limits. Under the slogan "every moment of life in the language of technology," he said, the user experience has continued to evolve.

Jo singled out improved low-light shooting as a key performance gain of the Galaxy S26 Ultra. "The wide-angle camera's aperture is F1.4, and the 5x telephoto's is F2.9, designed to take in 50% more light than the previous model," he said. "In dark environments, AI finely controls noise and detail on top of that hardware foundation, a structure that further elevates nighttime video quality." The Galaxy S26 Ultra carries a four-lens array: a 200-megapixel wide, a 50-megapixel ultrawide, a 50-megapixel 5x telephoto, and a 10-megapixel 3x telephoto.
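The roughly 50% figure is consistent with basic aperture math: the light a lens gathers scales with the inverse square of its f-number. A minimal sketch, assuming (hypothetically) that the previous wide lens opened at F1.7:

```python
def relative_light(f_old: float, f_new: float) -> float:
    """How much light the new aperture gathers relative to the old one.

    Light gathered is proportional to 1 / N^2, where N is the f-number,
    so the ratio between two apertures is (N_old / N_new) ** 2.
    """
    return (f_old / f_new) ** 2

# Hypothetical previous-generation wide aperture of F1.7 vs. the new F1.4:
gain = relative_light(1.7, 1.4)
print(f"F1.7 -> F1.4 gathers {gain:.0%} of the previous light")  # ~147%
```

The same formula applied to the telephoto would require the previous telephoto's f-number, which the briefing did not specify.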

Jo also emphasized advances in the Pro Visual Engine, Samsung's AI-based image processing engine. "Each sensor has different noise characteristics, so processing them all the same way causes either detail loss or residual noise," he said. "We added a dedicated block to the mobile application processor (AP) that removes fine noise tailored to each sensor's characteristics before the image signal processor (ISP) takes over."
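The idea of a sensor-specific pre-pass before a shared ISP stage can be sketched as follows. This is an illustrative toy, not Samsung's implementation: the sensor names, profiles, and the moving-average "denoise" stand in for whatever the dedicated AP block actually does.

```python
from statistics import mean

# Hypothetical per-sensor noise profiles: each sensor gets its own denoise
# window, reflecting the point that identical processing would over- or
# under-smooth sensors with different noise characteristics.
SENSOR_PROFILES = {
    "wide_200mp": {"window": 3},
    "ultrawide_50mp": {"window": 5},
    "tele5x_50mp": {"window": 7},
}

def pre_denoise(samples: list[float], sensor: str) -> list[float]:
    """Moving-average denoise tuned per sensor, run before the ISP stage."""
    half = SENSOR_PROFILES[sensor]["window"] // 2
    return [
        mean(samples[max(0, i - half): i + half + 1])
        for i in range(len(samples))
    ]

def isp_pipeline(samples: list[float], sensor: str) -> list[float]:
    cleaned = pre_denoise(samples, sensor)   # sensor-specific pre-pass
    return [round(v, 3) for v in cleaned]    # stand-in for the shared ISP stages
```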

He added, "For exposure stability in nighttime video, we built an AI-based exposure system trained on diverse exposure scenes and combined it with gyroscope data that detects shake, aiming for natural brightness control even in sudden scene transitions."

In editing, Jo highlighted the evolution of "Photo Assist." "With precise selection and image understanding, it accurately grasps the user's intent to build prompts, and the result is then refined through post-processing," he explained. Multimodal inputs, not only text but also voice and images of objects to be composited, let users get editing and generation that match their intent with more concise input, he said.
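Assembling such multimodal inputs into a single editing prompt might look like the sketch below. All field and function names here are assumptions for illustration, not Samsung's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class EditRequest:
    """One user editing request; any subset of modalities may be supplied."""
    text: str = ""
    voice_transcript: str = ""
    object_images: list[str] = field(default_factory=list)

def build_prompt(req: EditRequest) -> dict:
    """Merge whatever modalities the user supplied into one prompt payload."""
    parts = [p for p in (req.text, req.voice_transcript) if p]
    return {
        "instruction": " ".join(parts),
        "references": list(req.object_images),
        "post_process": True,  # result is refined after generation
    }

# A voice command plus a reference image, with no typed text:
req = EditRequest(voice_transcript="put this hat on the dog",
                  object_images=["hat.png"])
prompt = build_prompt(req)
```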

At the same time, he acknowledged, "The system still has difficulty understanding compound commands given at once," adding, "We added an edit history function so users can review step-by-step logs and jump back to the point they want, and we are also researching compound-command support." Generative editing runs in the cloud, and he explained, "Even if the same cloud is used, results differ by manufacturer because of differences in pre- and post-processing technologies."

Jo also addressed the Galaxy S26's protruding camera-ring design. "From the initial development stage, we set the direction by weighing design elements, brightness, and image quality goals together," he said. "It is hard to call it a simple design change."

Jo said, "The future of cameras has moved beyond simple recording into a new stage where, at the moment of capture, they understand the user's intent and connect it to actions," adding, "We are preparing a (new) camera experience combined with agent AI."
