Rami Rahim, executive vice president and general manager of networking at Hewlett Packard Enterprise (HPE)./Courtesy of HPE

The network in the age of artificial intelligence (AI) is no longer supporting infrastructure but mission-critical: a failure can cripple an entire company or service. Beyond providing mere connectivity, a self-driving network that autonomously identifies, diagnoses, and resolves issues before users are affected will decide the market's winners.

Rami Rahim, executive vice president and general manager of networking at Hewlett Packard Enterprise (HPE), met with ChosunBiz at the Grand Hyatt Seoul in Yongsan-gu on the 20th of last month and offered this assessment of network competitiveness in the AI era.

Rahim, who studied electrical engineering at the University of Toronto in Canada, joined AMD in 1994 and worked for a year as an ASIC (application-specific integrated circuit) engineer. He then earned a master's degree in electrical engineering at Stanford University and in 1997 joined Juniper Networks as an ASIC engineer. He served as Juniper's chief executive officer (CEO) for 10 years before joining HPE through its acquisition of Juniper Networks in 2025. He now oversees a networking business that spans HPE Juniper Networking and HPE Aruba Networking. Juniper Networks makes network equipment and software for corporations, carriers, and cloud providers, with particular strengths in AI-based network operations, data centers, and security. Through the acquisition, HPE strengthened its network competitiveness for the AI and hybrid cloud era.

Rahim explained HPE's AI network strategy as two pillars: AI for the network and the network for AI. "One is to use AIOps (technology that automates IT system operations with AI and predicts and manages failures) to reduce operational complexity and improve the user experience, and the other is to target the data center network and routing markets for AI training and inference infrastructure," he said, adding, "HPE sees both opportunities at the same time."

The differentiator he emphasized for HPE is the self-driving network. "It is no longer enough to have a structure where people respond only after a failure occurs," Rahim said. "We need a network that detects anomalies first, analyzes the cause, and optimizes itself." He continued, "What we aim for is not simply attaching AI to network management tools. It is AI-native networking that provides a consistent autonomous operations experience across campuses, branches, and data centers."

Just five months after completing the Juniper acquisition, in December last year, HPE unveiled an AI-native networking portfolio that combines HPE Aruba Networking and HPE Juniper Networking. "The core of the Juniper integration is not organizational consolidation but an evolution in operations that customers can tangibly feel," Rahim said, adding, "We are strengthening a common AI operations experience that spans Aruba Central and Mist."

He assessed that the importance of networks in the AI era becomes clearest in data centers. "Even if you invest billions of dollars in an AI data center, if there is latency or there are bottlenecks in the network, you cannot use the graphics processing units (GPUs) optimally," Rahim said. "High-performance computing alone does not complete AI infrastructure." He continued, "In an AI factory, you have to view everything as one design: the internal connectivity that links GPU to GPU, the long-distance interconnection between data centers, the on-ramps where AI workloads flow in from the edge, and the routing that supports it all. Without high-performance networks, the return on GPU investments is halved."

In response, HPE is designing AI infrastructure that bundles high-performance switches for GPU connectivity inside data centers, long-distance data center interconnect, edge on-ramps, and routing. The new QFX5250 switch unveiled in December last year was designed to support Ultra Ethernet transport. "AI workloads are far more sensitive to latency and congestion," Rahim said. "It is important to ensure stable connectivity not only inside data centers but also across multi-cloud and long-distance distributed cluster environments."

Collaboration with Nvidia is another pillar of this strategy. In December last year, HPE expanded its AI factory portfolio with Nvidia, adding edge on-ramp and data center interconnect capabilities based on HPE Juniper Networking. "AI is no longer a workload that stays within a single data center," Rahim said. "You have to ensure stable connectivity across multi-cloud environments, distributed clusters, and the edge (a computing environment that processes data close to users or devices rather than at a central cloud data center) for AI to work properly in real customer environments."

This strategy is reflected in the results. According to HPE's reported results for the first quarter of fiscal 2026 (November 2025–January 2026), total revenue was $9.3 billion (about 13.9054 trillion won), up 18% from a year earlier. Of that, networking revenue surged 151.5% to $2.7 billion (about 4.037 trillion won), with an operating margin of 23.7%. Year over year, campus and branch revenue rose 42% to $1.2 billion (about 1.7942 trillion won), data center networking revenue rose 382.6% to $444 million (about 663.9 billion won), and security revenue rose 114.3% to $255 million (about 381.3 billion won).

"The effects of the Juniper integration are being proven not by messaging but by the numbers," Rahim said. "Customers now see networks not as simple connectivity equipment but as core infrastructure that determines competitiveness in the AI era." He added, "In both areas, AI for the network, which improves operational efficiency, and the network for AI, which supports AI infrastructure itself, demand is growing simultaneously among corporations, service providers, and cloud providers."

Reflecting this trend, HPE raised its networking segment revenue growth outlook for fiscal 2026 (November 2025–October 2026) to 68%–73%. "The battle for networks in the AI era ultimately comes down to autonomy," Rahim said. "The market landscape will be determined by who first properly implements a network that does not respond after a problem occurs but instead identifies and resolves issues on its own before users are affected."

※ This article has been translated by AI.