AI chips' power demands are forcing data centers to pay for efficiency

The computer chips that answer ChatGPT queries draw far more power than the chips that dominated data centers just a few years ago. As artificial intelligence pushes individual chips to consume more electricity, data center operators are racing to squeeze more computing power out of every watt, and the physics of keeping silicon cool may determine whether AI is sustainable.
With grid constraints making new power connections increasingly difficult to obtain, data center operators and researchers are scrambling to find efficiency gains wherever they can. But incremental improvements, such as liquid-cooled power systems, smarter maintenance schedules, and higher-voltage power distribution, run up against basic physics and economic incentives that prioritize performance over sustainability. The question is whether these incremental wins can keep pace with AI's appetite for electricity.
The stakes rise with every new server rack. Data centers already consume about 4% of U.S. grid electricity, a figure expected to reach 9% over the next decade. In hot markets such as Virginia and Texas, utilities are so swamped with requests for new data center connections that they charge millions of dollars just to study whether the grid can handle the load.
This has created new urgency around an old metric: power usage effectiveness, or PUE, which measures how much electricity actually reaches the computers versus how much is lost to cooling and other overhead. The math is simple, but the margins are tight. A data center operating at a PUE of 1.5 delivers only 67% of its incoming electricity to actual computing; the rest disappears into cooling systems and power conversion losses. Improving that number adds up to real energy and cost savings, according to Ryan Mallory, president and chief operating officer of data center operator Flexential.
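A back-of-the-envelope sketch of that arithmetic (the code and function name here are illustrative, not from the article's sources): PUE is total facility power divided by power delivered to IT equipment, so the share of electricity that reaches the computers is simply 1 / PUE.

```python
def it_power_fraction(pue: float) -> float:
    """Fraction of incoming electricity that reaches IT equipment."""
    return 1.0 / pue

for pue in (1.5, 1.4, 1.3, 1.1):
    print(f"PUE {pue}: {it_power_fraction(pue):.0%} of power reaches computing")
# PUE 1.5 -> 67%, matching the figure above; PUE 1.1 -> 91%
```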
“We're talking tenths of a point,” Mallory said. “But it's very impactful on operating costs. If you drop PUE a tenth of a point, say you go from 1.4 to 1.3, you can get $50,000 per month per megawatt of power consumption in savings.”
For a single large facility, that adds up quickly. One Mallory customer runs a 27-megawatt AI facility, making a 0.1 PUE improvement worth $1.35 million per month, or more than $16 million a year. Just as important, the same efficiency means the facility can pack more computing power into the same grid connection, which matters enormously when new power hookups can take years to approve and cost millions just to study.
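Those figures check out with simple arithmetic, sketched here using only the numbers quoted above:

```python
# $50,000 per MW per month for a 0.1 PUE improvement (Mallory's figure),
# applied to a 27-megawatt facility.
savings_per_mw_month = 50_000  # dollars
facility_mw = 27

monthly = savings_per_mw_month * facility_mw
print(f"${monthly:,} per month")       # $1,350,000 per month
print(f"${monthly * 12:,} per year")   # $16,200,000 per year
```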
At the scale of today's construction boom, those gains become even more significant. There are now roughly 7,000 megawatts of data center capacity in the North American market, with more than 6,000 megawatts under construction, according to real estate firm CBRE. Across that footprint, even modest efficiency improvements could translate into substantially more AI computing capacity without putting additional strain on an already stressed electrical grid.
Cooling crisis
The path to PUE gains often runs through physics and planning. Hyperscale operators such as Google and Meta can achieve PUE ratings as low as 1.1 or 1.2 because their server farms use identical equipment arranged in predictable patterns, creating consistent airflow. But most data centers house a mix of customers with differing hardware, creating what Mallory called “chaotic airflow patterns and hot spots” that make efficient cooling harder to achieve.
No matter how perfectly arranged, though, every data center fights the same battle against heat.
Operators get creative in managing that heat. Mallory schedules equipment maintenance for the cool morning hours to avoid the power penalty of running tests during peak temperatures. In hot climates such as Las Vegas and Phoenix, facilities use evaporative systems that pre-cool air before it enters the main cooling plant, much like the misters at outdoor restaurants. Some can even take advantage of “free air cooling” during the winter months, opening vents to use cold outside air directly.
Handling enormous power loads more efficiently has also forced data centers to upgrade their electrical systems. Traditional data centers used low-voltage power distribution, but AI racks now demand high-voltage systems, with some operators jumping to 400 or even 800 volts.
Higher voltage lets the same power flow at lower current, which cuts the resistive losses that turn precious electricity into unwanted heat. It is a two-for-one efficiency gain: less energy wasted and less heat generated. But even these improvements cannot solve the fundamental problem of racks that put out as much heat as space heaters crammed into a cabinet-sized space.
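A rough sketch of that physics, with made-up load and cable-resistance numbers purely for illustration: for a fixed load P and cable resistance R, the current is I = P / V and the loss is I² × R, so doubling the voltage quarters the heat lost in the wiring.

```python
def resistive_loss_watts(load_w: float, volts: float, cable_ohms: float) -> float:
    current_a = load_w / volts          # I = P / V
    return current_a ** 2 * cable_ohms  # P_loss = I^2 * R

LOAD_W = 100_000   # hypothetical 100 kW worth of racks
CABLE_OHMS = 0.01  # hypothetical distribution-cable resistance

for volts in (208, 400, 800):
    loss = resistive_loss_watts(LOAD_W, volts, CABLE_OHMS)
    print(f"{volts:>4} V: {loss:,.0f} W lost as heat in the cables")
# 208 V: ~2,311 W; 400 V: 625 W; 800 V: ~156 W
```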
Truly tackling the heat problem requires more radical solutions. That is why TE Connectivity and other companies have developed liquid-cooled power distribution systems, essentially electrical cables chilled by water, that can handle more power in the same footprint as traditional systems while removing heat more effectively.
About 30% of new data centers are being built with liquid cooling systems, a share expected to reach 50% within two to three years, according to Gunish Srenivasan, vice president of business development for digital data networks at TE.
But liquid cooling creates its own sustainability challenge: data centers can consume millions of gallons of water a year for cooling, straining local water supplies. Some facilities are experimenting with immersion cooling, literally submerging servers in mineral oil, which eliminates water use entirely, though the logistics make it impractical for most applications so far.
The unintended consequences of efficiency
Beyond infrastructure improvements, the chips themselves are delivering efficiency gains. Companies like AMD are betting on rack-scale systems that could improve energy efficiency 20-fold by 2030, while newer chip designs support lower-precision calculations that can significantly cut computational load. Nvidia's next-generation Blackwell graphics processing units promise their own efficiency improvements, and Nvidia CEO Jensen Huang has said GPUs are typically 20 times more energy efficient than traditional central processing units for some AI workloads.
But there is a fundamental paradox in running newer chips. Energy bills have doubled with the upgrade to the latest Nvidia chips, according to Dan Alistarh, a professor at the Institute of Science and Technology Austria who researches algorithmic efficiency. “It's a strange comparison, because you run things faster, but you also use more energy,” Alistarh said.
The algorithms that power AI leave room for efficiency gains as well. Researchers like Alistarh are working on techniques that can cut the energy consumption of generative AI, such as using simpler, lower-precision math that requires less computing power. Other groups are exploring entirely different architectures that could replace transformers altogether.
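As a concrete illustration of the simpler-math idea, here is a toy sketch of weight quantization, one common technique in this research area (a generic example, not a specific method from Alistarh's group): weights stored as 8-bit integers take a quarter of the memory of 32-bit floats, and low-precision arithmetic costs less energy per operation on hardware that supports it.

```python
import numpy as np

def quantize_int8(weights: np.ndarray) -> tuple[np.ndarray, float]:
    """Symmetric int8 quantization: map floats onto [-127, 127]."""
    scale = float(np.abs(weights).max()) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

weights = np.random.randn(4).astype(np.float32)
q, scale = quantize_int8(weights)
print(weights)               # original 32-bit weights
print(dequantize(q, scale))  # close approximation at 1/4 the storage
```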
But such innovations have struggled to gain traction, because AI companies are judged largely on how their models perform on standardized benchmarks measuring capabilities like reasoning, math, and language understanding, results that directly affect funding and market visibility.
Companies would rather build powerful models that top those benchmarks than efficient models that might trail competitors, even slightly. The result is an industry that optimizes for leaderboards rather than sustainability, making efficiency improvements, whatever their cost savings, a secondary concern at best.
“Anything that falls behind in the benchmark rat race is a clear loss,” Alistarh said. “Nobody can afford that.”
Reporting for this article was conducted as part of a press fellowship funded by the Institute of Science and Technology Austria.



