Qualcomm Technologies, Inc. has announced the launch of its latest AI-optimised solutions for data centres: the AI200 and AI250 accelerator cards and racks. Building on its NPU technology leadership, these offerings bring unrivalled rack-scale performance and memory capacity, setting a new standard for generative AI inference.
The Qualcomm AI200 is tailored to deliver low total cost of ownership and optimised performance for large language models and other AI workloads. With support for 768 GB of LPDDR per card, it addresses high memory needs with an emphasis on scalability and flexibility for AI tasks.
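To put 768 GB per card in context, a quick back-of-envelope calculation shows why that capacity matters for large language models. The parameter counts and precisions below are illustrative assumptions, not figures from Qualcomm's announcement:

```python
# Back-of-envelope: how much model weight memory fits in 768 GB of LPDDR.
# Parameter counts and precisions are assumed examples, not Qualcomm figures.

def weight_footprint_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GB (1 GB = 1e9 bytes)."""
    return num_params * bytes_per_param / 1e9

CARD_MEMORY_GB = 768  # stated LPDDR capacity per AI200 card

for params, precision, nbytes in [
    (70e9, "FP16", 2),
    (70e9, "INT8", 1),
    (405e9, "FP16", 2),
    (405e9, "INT8", 1),
]:
    gb = weight_footprint_gb(params, nbytes)
    fits = "fits" if gb <= CARD_MEMORY_GB else "exceeds card memory"
    print(f"{params/1e9:.0f}B params @ {precision}: ~{gb:.0f} GB ({fits})")
```

On these assumptions, a 70B-parameter model at FP16 needs roughly 140 GB of weights alone, while a 405B-parameter model at FP16 (about 810 GB) would exceed a single card; note this ignores KV cache and activation memory, which add further headroom requirements.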
Meanwhile, the Qualcomm AI250 introduces a pioneering memory architecture based on near-memory computing. This delivers a more than 10x improvement in effective memory bandwidth at significantly lower power consumption, enabling more efficient and flexible AI inference across diverse customer workloads.
Both solutions feature direct liquid cooling for thermal efficiency, PCIe for scale-up, and Ethernet for scale-out. They also incorporate confidential computing to secure AI workloads, all within a rack-level power envelope of 160 kW. This allows Qualcomm's offerings to meet the demands of modern data centre environments.
According to Durga Malladi, SVP & GM at Qualcomm, "Our rich software stack and open ecosystem support make it easier than ever for developers and enterprises to integrate, manage, and scale already trained AI models on our optimised AI inference solutions. With seamless compatibility for leading AI frameworks and one-click model deployment, Qualcomm AI200 and AI250 are designed for frictionless adoption and rapid innovation."
The combined software stack supports leading machine learning frameworks and inference engines, with optimisations for generative AI tasks such as disaggregated serving. Developers benefit from streamlined model onboarding and ready access to AI applications, libraries, and tools.
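Disaggregated serving splits the two phases of LLM inference, prefill (processing the full prompt at once, which is compute-bound) and decode (generating tokens one at a time, which is memory-bandwidth-bound), onto separate worker pools so each can be scaled independently. The toy sketch below illustrates only the scheduling pattern; all names are hypothetical and no Qualcomm API is implied:

```python
from dataclasses import dataclass, field
from queue import Queue

# Toy sketch of disaggregated serving: prefill workers process prompts and
# hand a (stubbed) KV cache to decode workers, which generate tokens.
# All names are hypothetical; this shows the scheduling pattern only.

@dataclass
class Request:
    prompt: str
    max_new_tokens: int
    kv_cache: list = field(default_factory=list)  # stub for attention state
    output: list = field(default_factory=list)

def prefill_worker(req: Request) -> Request:
    # Compute-bound phase: process the whole prompt in one pass.
    req.kv_cache = req.prompt.split()  # stand-in for per-token KV entries
    return req

def decode_worker(req: Request) -> Request:
    # Bandwidth-bound phase: one token per step, reusing the KV cache.
    for step in range(req.max_new_tokens):
        req.output.append(f"tok{step}")
        req.kv_cache.append(req.output[-1])
    return req

# Separate queues let the two pools be sized and scaled independently.
prefill_q, decode_q = Queue(), Queue()
prefill_q.put(Request(prompt="the quick brown fox", max_new_tokens=3))

while not prefill_q.empty():
    decode_q.put(prefill_worker(prefill_q.get()))
while not decode_q.empty():
    done = decode_worker(decode_q.get())
    print(" ".join(done.output))  # prints "tok0 tok1 tok2"
```

The design point is that prefill and decode have very different hardware profiles, so routing them through separate queues lets an operator provision compute-heavy and memory-heavy capacity independently.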
Expected to be commercially available in 2026 and 2027 respectively, the AI200 and AI250 embody Qualcomm's commitment to an annual product cadence in data centre AI, focused on performance, energy efficiency, and cost-effectiveness.