Artificial intelligence (AI) is hot right now. Also hot: the data centers that power the technology. And keeping those centers cool requires a tremendous amount of energy.
The problem is only going to grow as high-powered AI-based computers and devices become commonplace. That’s why University of Missouri researcher Chanwoo Park is devising a new type of cooling system that promises to dramatically reduce energy demands.
“Cooling and chip manufacturing go hand-in-hand,” said Park, a professor of mechanical and aerospace engineering in the Mizzou College of Engineering. “Without proper cooling, components overheat and fail. Energy-efficient data centers will be key to the future of AI computing.”
Solving tomorrow’s problem
Data centers are large facilities full of servers whose chips store and process data. They're essentially giant computing hubs that run websites, mobile applications and services delivered from the cloud.
They’re also power-hungry. In 2022, data centers used more than 4% of all electricity in the U.S., and about 40% of that energy went to keeping equipment cool, meaning roughly 1.6% of the nation’s electricity was spent on cooling alone. As demand on data centers increases, even more energy will be required.
To mitigate that, the U.S. Department of Energy has awarded more than $40 million to researchers to find new ways to cool data centers. Park recently received nearly $1.65 million from that initiative, known as COOLERCHIPS.
Currently, data centers are cooled either by fans that move air or by liquid coolant that carries heat away from computer racks.
Park and his team are developing a two-phase cooling system designed to efficiently dissipate heat from server chips through phase change, boiling a liquid into vapor within a thin, porous layer. When less cooling is needed, the system operates passively without consuming any energy; even in active mode, when a pump is used, it draws only a negligible amount of power.
“The liquid goes in different directions and evaporates on a thin metal surface,” Park said. “Using this boiling surface, we’re able to achieve very efficient heat transfer with low thermal resistance.”
The system also includes a mechanical pump that is switched on only when more heat needs to be absorbed.
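To make the passive/active idea concrete, here is a minimal sketch of how such a pump might be controlled, assuming a simple temperature threshold with hysteresis. The threshold values, sensor reading and pump interface are illustrative assumptions, not details of Park's actual design.

```python
# Illustrative sketch only: a hysteresis controller for a hypothetical
# two-phase cooling loop. Thresholds and names are assumptions made for
# illustration, not specifications of the Mizzou team's system.

PASSIVE_MAX_C = 70.0   # assumed chip temperature above which passive boiling alone is not enough
ACTIVE_MIN_C = 60.0    # assumed temperature below which the pump can switch back off

def update_pump(chip_temp_c: float, pump_on: bool) -> bool:
    """Return the new pump state for the current chip temperature.

    Passive mode (pump off) handles normal loads; the pump is switched on
    only when the chip runs hot, and off again once it has cooled down.
    """
    if not pump_on and chip_temp_c > PASSIVE_MAX_C:
        return True   # active mode: pump drives more liquid to the boiling surface
    if pump_on and chip_temp_c < ACTIVE_MIN_C:
        return False  # margin regained: fall back to passive, zero-energy mode
    return pump_on

if __name__ == "__main__":
    # Example: temperature rises under load, then falls back off.
    state = False
    for temp in [55, 65, 72, 68, 63, 58]:
        state = update_pump(temp, state)
        print(f"{temp} degC -> pump {'ON' if state else 'off'}")
```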
Early tests show that two-phase cooling techniques drastically reduce the amount of energy needed to keep equipment cool.
The team is now fabricating the cooling system, which is designed to connect and disconnect easily within server racks. Park hopes it will be in use within the coming decade, just as AI-powered computers become mainstream.
“Eventually there will be limitations under current cooling systems, and that’s a problem,” Park said. “We’re trying to get ahead of the curve and have something ready and available for the future of AI computing. This is a futuristic cooling system.”
Park’s work aligns with the goals of the Center for Energy Innovation, a building under construction on campus that will let interdisciplinary researchers tackle the challenges posed by rising energy demands and the rapid growth of AI. The idea is to leverage advanced technology to optimize energy production, storage and efficiency.
“The center will allow us to explore additional ideas and innovations around energy-efficient processes,” Park said. “These are complex problems that require different areas of expertise. I look forward to future collaborations.”