Dell Technologies said on the 14th that it will launch the Dell Pro Max with GB10, an AI-dedicated desktop that can run large language models (LLMs) with up to 200 billion parameters without a network connection, on the 16th.
The new product features Nvidia's latest GB10 Grace Blackwell superchip, enabling data center‑level AI computing performance in a personal environment.
The GB10 combines a 20-core Arm-based Grace CPU with a Blackwell GPU to deliver up to 1 petaflop (1,000 trillion operations per second) of compute performance. This allows users to handle everything from prototyping and fine-tuning to inference of an LLM with up to 200 billion parameters entirely in a local environment.
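To see why 200 billion parameters fit within a single desktop, it helps to compare weight-storage requirements against the machine's 128GB of unified memory. The sketch below is my own back-of-envelope arithmetic, not a Dell or Nvidia specification; it assumes the 200-billion-parameter figure presumes low-precision (e.g. 4-bit quantized) weights, which is a common practice for local inference:

```python
# Back-of-envelope memory check: weight storage for an LLM at several
# numeric precisions, compared against the 128 GB of unified system
# memory the article cites for one Pro Max with GB10.
# These precision choices are illustrative assumptions, not product specs.

def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Memory needed just for model weights, in GB (1 GB = 1e9 bytes)."""
    return num_params * bytes_per_param / 1e9

MEMORY_GB = 128  # unified LPDDR5x memory of a single unit

models = [("Llama 3.3 70B", 70e9), ("200B-parameter model", 200e9)]
precisions = [("FP16", 2.0), ("FP8/INT8", 1.0), ("4-bit", 0.5)]

for name, params in models:
    for prec, nbytes in precisions:
        need = weight_memory_gb(params, nbytes)
        verdict = "fits" if need <= MEMORY_GB else "does not fit"
        print(f"{name} @ {prec}: {need:.0f} GB -> {verdict} in {MEMORY_GB} GB")
```

Under these assumptions, a 200-billion-parameter model needs roughly 100GB at 4-bit precision, which fits in 128GB, while full FP16 weights (about 400GB) would not; the same arithmetic, doubled to roughly 256GB across two linked units, is consistent with the 400-billion-parameter figure the article cites for a two-unit configuration.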
The product offers 128GB of LPDDR5x system memory and a choice of 2TB or 4TB NVMe SSDs, allowing it to handle large datasets and complex workloads reliably. TPM 2.0-based hardware security features and a sandboxed environment protect corporate sensitive data by keeping it isolated from external networks.
The system runs Nvidia DGX OS, based on Ubuntu Linux, with Nvidia CUDA, AI Workbench, JupyterLab, and Docker preinstalled, so an AI development environment is ready to use without separate installation. In addition, two Pro Max with GB10 units can be linked via Nvidia ConnectX-7, which supports ultra-low-latency networking, to process models with up to 400 billion parameters in parallel.
Dell expects the product to meet growing demand for on-premises (self-hosted) AI. Research institutions can run large language models such as Llama 3.3 70B locally to improve research efficiency, and startups can carry out AI prototyping and inference quickly without separate infrastructure. In security-critical industries such as healthcare and finance, AI training can be conducted without sending data off-site.