Supply Chain Shortages for NVIDIA AI GPUs Improve, AI Server Shipments Expected to Surge in Second Half

kyojuro Sunday, July 14, 2024

Since last year, NVIDIA's AI GPUs have been in short supply, leading to extended delivery cycles. For instance, lead times for servers built on H100 components can range from 36 to 52 weeks. TSMC, which manufactures and packages these chips, has faced significant capacity constraints, predominantly due to limited CoWoS packaging availability.


According to DigiTimes, industry insiders report a significant improvement in the supply shortage of H100 compute cards, with shipments strengthening. In the first half of 2024, AI server makers faced the predicament of holding orders without the components to fill them. By the end of the second quarter, however, the situation had begun to change. AI server shipments are expected to continue growing in the second half of 2024, bolstered by shipments of the new-generation Blackwell-architecture B100/B200.

During the second quarter, many long-standing orders were finally fulfilled, and the gap between supply and demand narrowed to single-digit percentages. For perspective, the supply-demand gap in 2023 ranged between 30% and 40%, a considerable disparity. Original Design Manufacturers (ODMs) have taken note of these changes, posting notable revenue growth in June. With the addition of the B100/B200, ODMs anticipate AI servers accounting for a higher share of revenue.

Component vendors have also benefited from the resolution of supply chain material issues. Reports indicate that successive AI GPU generations have raised the value of cooling fans per server, from roughly $350 for servers equipped with A100 and H100 GPUs to $550 for newer models. The exact increase for the upcoming B100/B200 GPUs remains undetermined, as they have yet to enter mass production, but the figure is expected to be even higher.


© 2025 - TopCPU.net