Surge in AI server demand from cloud service providers: TrendForce – InfotechLead.com

TrendForce's latest industry report reveals sustained high demand for advanced AI servers from major cloud service providers (CSPs) and brand clients, a trend projected to continue into 2024.

Expanded production by TSMC, SK Hynix, Samsung, and Micron has alleviated shortages in the second quarter of 2024, significantly reducing the lead time for NVIDIA's flagship H100 solution from 40–50 weeks to less than 16 weeks.

Key Insights:

AI Server Shipments: AI server shipments in Q2 are estimated to rise by nearly 20 percent quarter-over-quarter, with an annual forecast now at 1.67 million units, representing a 41.5 percent year-over-year growth.

Budget Priorities: Major CSPs are prioritizing budgets towards AI server procurement, overshadowing the growth of general servers. The annual growth rate for general server shipments is a mere 1.9 percent, with AI servers expected to account for 12.2 percent of total server shipments, a 3.4 percentage point increase from 2023.

Market Value: AI servers are significantly boosting revenue growth, with their market value projected to exceed $187 billion in 2024, a 69 percent growth rate, comprising 65 percent of the total server market value.

Regional Developments:

North America and China: North American CSPs such as AWS and Meta are expanding their proprietary ASICs, while Chinese companies Alibaba, Baidu, and Huawei are enhancing their own ASIC AI solutions. This trend is expected to raise the share of ASIC servers in the AI server market to 26 percent in 2024, with GPU-equipped AI servers holding about 71 percent.

Market Dynamics:

AI Chip Suppliers: NVIDIA dominates the GPU-equipped AI server market with a nearly 90 percent share, whereas AMD holds about 8 percent. When considering all AI chips used in AI servers (GPU, ASIC, FPGA), NVIDIA's market share is around 64 percent for the year.

Future Outlook: Demand for advanced AI servers is anticipated to remain robust through 2025, driven by NVIDIA's next-generation Blackwell platform (GB200, B100/B200), which will replace the Hopper platform. This shift is expected to boost demand for CoWoS and HBM technologies, with TSMC's CoWoS production capacity estimated to reach 550–600K units by the end of 2025, growing by nearly 80 percent.

Memory Advancements: Mainstream AI servers in 2024 will feature 80 GB of HBM3, with future chips like NVIDIA's Blackwell Ultra and AMD's MI350 expected to incorporate up to 288 GB of HBM3e by 2025. The overall HBM supply is projected to double by 2025, fueled by sustained demand in the AI server market.

Conclusion:

The AI server market is experiencing unprecedented growth, with significant contributions to revenue and technological advancements. As major CSPs and tech giants continue to invest heavily in AI infrastructure, the industry is set for transformative developments through 2025.

