
Until just a few years ago, robots were seen as a money-losing business for Google. When it sold Boston Dynamics, then a symbol of the robotics industry, to SoftBank in 2017, many in the market said it was a move to shed a low-margin business. The military image and the high barriers to commercialization were also burdens.

For a while, robots were viewed as an area removed from Google's core strategy. But Google's direction is changing. This time, rather than building the "body" of robots itself, the strategy is to control the "brain" that goes inside. Instead of competing in hardware manufacturing, the goal is to secure artificial intelligence (AI) software that will be embedded in all robots—a so-called "Android for robots" plan.

The most symbolic move was the decision, announced on the 25th (local time), to fold Intrinsic into Google from under Alphabet. Intrinsic grew out of Alphabet's research unit X and was spun out in 2021; with this merger, it has been reorganized as a core strategic pillar of Google, directly drawing on Google DeepMind's Gemini model and Google Cloud infrastructure. Through the Flowstate platform, which lets even non-robotics experts design automation processes, Google aims to stake out the factory operating system (OS) space. Teaming up with Foxconn on AI-based manufacturing automation is in the same vein.

This strategy took more concrete shape through collaboration with Boston Dynamics. At CES 2026 in January, the two companies agreed to equip the latest "Atlas" humanoid with Google's "Gemini Robotics" model. The arrangement pairs the robot hardware capabilities of Boston Dynamics, owned by Hyundai Motor Group, with Google's AI foundation model: rather than manufacturing robots itself, Google extends its influence through core software. The recruitment of former Boston Dynamics Chief Technology Officer (CTO) Aaron Saunders as vice president of hardware is also read as a move to strengthen the internal capabilities that connect software and hardware.

This approach is not unfamiliar to Google. In the mobile era, instead of mass-producing smartphones itself, Google led with the Android operating system and formed a structure of collaborating with manufacturers such as Samsung Electronics. Partners handled the hardware, while Google controlled the software and ecosystem. As a result, Android became the standard operating system in the global smartphone market.

The landscape of the robotics industry is also changing quickly. As the generative AI race expands beyond chatbots into the physical world, "physical AI" has emerged as a key strategic area. Nvidia is expanding its ecosystem with the world model "Cosmos," which understands the laws of physics, and a chip platform for robots. Tesla is converting its Fremont factory into a robot production hub, preparing an annual capacity of 1 million units alongside mass production of Optimus Gen 3 in the first quarter of 2026. OpenAI has made large-scale investments in physical intelligence to secure general-purpose AI for robots, and Alibaba has released its robot AI "Linbrain" as open source, moving to expand the ecosystem in China.

The reason Big Tech is focusing on robots is clear. Beyond training centered on text and images, physical data collected in the real world is increasingly likely to become the key resource of next-generation AI. In digital environments, errors can stay on the screen, but with robots, the moment a physical task fails, the consequences are immediately visible. The physical world is both a proving ground for AI's limits and an arena where new commercial opportunities open up.

In the end, Google's choice lies not in hardware competition but in securing a "general-purpose brain." The strategy is to make AI software that can be installed in any robot into the standard. In the era of physical AI, leadership may depend less on who makes the more sophisticated robot and more on who controls the brains of more robots. The market is watching to see whether Google can, this time, use robots to design the next 10 years of AI.

※ This article has been translated by AI.