ELAU C200/10/1/1/1/00 C400/10/1/1/1/00



By jonson
28 December 2023


With the rapid development of artificial intelligence, the demand for computing power is also rising. Nvidia’s H100 AI processor is currently in high demand, but each H100 has a peak power consumption of up to 700 watts, exceeding the average power draw of an ordinary American household. With large numbers of H100s being deployed, experts predict that their total power consumption will be on par with that of a large US city and may even exceed that of some small European countries.
IT Home notes that Schneider Electric, a French company, estimated as early as October last year that the total power consumption of data centers used for AI applications was equivalent to that of the entire country of Cyprus. So what is the power consumption of one of the hottest AI processors, NVIDIA’s H100?
Paul Churnock, Chief Electrical Engineer of Microsoft’s Data Center Technology Governance and Strategy Division, predicts that by the end of 2024, when millions of H100s have been deployed, their total power consumption will exceed that of all households in Phoenix, Arizona, but will still be lower than that of larger cities such as Houston, Texas.
“Nvidia’s H100 GPU has a peak power consumption of 700 watts. At an annualized utilization rate of 61%, each H100 consumes roughly as much electricity as an average American household (assuming 2.51 people per household). Nvidia predicts that H100 sales in 2024 will range from 1.5 million to 2 million units. Ranked against cities by residential electricity consumption, the H100 fleet’s total power consumption would place fifth in the United States, behind Houston and ahead of Phoenix.”
At an annualized utilization rate of 61%, each H100 will consume approximately 3,740 kilowatt-hours (kWh) of electricity per year (700 W × 8,760 h × 0.61). If Nvidia sells 1.5 million H100 chips in 2023 and 2 million in 2024, 3.5 million H100 chips will be deployed by the end of 2024, and their total annual power consumption will reach an astonishing 13.091 billion kilowatt-hours, or 13,091.82 gigawatt-hours (GWh).
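A quick back-of-the-envelope check of those figures, written as a short Python sketch using only the assumptions stated above (700 W peak draw, 61% annualized utilization, 3.5 million units deployed):

```python
# Illustrative arithmetic only, based on the figures quoted in the article.
PEAK_POWER_W = 700         # H100 peak power draw (watts)
UTILIZATION = 0.61         # assumed annualized utilization rate
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

# Energy consumed by one card per year, in kilowatt-hours
kwh_per_card = PEAK_POWER_W * UTILIZATION * HOURS_PER_YEAR / 1000
print(f"Per card: {kwh_per_card:,.0f} kWh/year")   # ~3,741 kWh

# Fleet-wide total for 3.5 million cards, in gigawatt-hours
UNITS_DEPLOYED = 3_500_000
total_gwh = kwh_per_card * UNITS_DEPLOYED / 1_000_000
print(f"Fleet total: {total_gwh:,.0f} GWh/year")   # ~13,092 GWh
```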
For comparison, countries such as Georgia, Lithuania, and Guatemala each consume roughly 13,092 gigawatt-hours of electricity per year. Although the H100’s power consumption is striking, it is worth noting that the efficiency of AI and high-performance computing (HPC) GPUs keeps improving. So although Nvidia’s next-generation Blackwell-architecture B100 chip may consume more power than the H100, it will also deliver higher performance, completing more work per unit of energy consumed.
As AI technology develops, balancing computing power against energy consumption will become an increasingly urgent problem to solve.
