AI back-end networks to drive data centre switch spending

According to the new AI Networks for AI Workloads report by Dell’Oro Group, spending on switches deployed in AI back-end networks, which connect accelerated servers, is forecast to approach $80 billion over the next five years, nearly doubling the total data center switch market opportunity.

Current data center switch spending is concentrated in front-end networks, used primarily to connect general-purpose servers, but AI workloads will require a new back-end infrastructure buildout. While InfiniBand currently dominates the AI back-end network market, Ethernet is poised to overtake it soon.

“As predicted, growth in AI workloads and the associated data center infrastructure continues to track well ahead of market expectations. We have therefore raised our forecast for AI back-end networks relative to our December 2023 forecast, even though that earlier forecast was initially considered very aggressive by a number of industry players,” said Sameh Boujelbene, Vice President at Dell’Oro Group. “The upward adjustment was broad-based across Ethernet and InfiniBand. We are, however, now much more optimistic about Ethernet’s ability to eclipse InfiniBand within the next few years. This optimism stems from significant improvements on the technology side as well as from market demand.

“On the technology side, we are observing significant enhancements to Ethernet at multiple layers of the stack: network chips, network operating systems, network interface cards, and optics. In terms of demand, we are tracking significant new Ethernet wins, such as the latest 100K GPU cluster win announced by NVIDIA, among many other potential wins by other switch vendors including Celestica, Cisco, Arista, Juniper, and Nokia,” continued Boujelbene.

Additional highlights from the AI Networks for AI Workloads Report:

AI networks will accelerate the transition to higher speeds. The majority of the switch ports deployed in AI back-end networks are expected to be 800 Gbps by 2025 and 1600 Gbps by 2027.

While most of the market demand will come from Tier 1 cloud service providers, we have significantly raised our forecast for Tier 2/3 providers and large enterprises in light of their accelerated pace of AI workload adoption and infrastructure deployment.
