New US design uses heat for computing with superior energy efficiency

Source: interestingengineering
Author: @IntEngineering
Published: 3/6/2026
Researchers at Lawrence Berkeley National Laboratory have developed a novel "thermodynamic computing" design that harnesses thermal noise—the random vibrations of electrons typically seen as detrimental—to power computations. Unlike classical and quantum computers that expend significant energy to suppress heat-induced noise or require extreme cooling, this approach leverages thermal fluctuations at room temperature to perform complex, nonlinear machine learning tasks akin to neural networks. The key innovation lies in programming physical devices with energy scales comparable to thermal energy to evolve states driven by these fluctuations, effectively turning noise from a problem into a resource.
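The idea of a device whose state is driven by thermal fluctuations within a programmed energy landscape can be illustrated with overdamped Langevin dynamics. The following is a minimal sketch, not the Berkeley Lab design: the potential `U`, the double-well example, and all parameter values are illustrative assumptions. The point is that the noise term is what moves the state, while the programmed potential shapes where it goes.

```python
import numpy as np

def langevin_step(x, grad_U, dt=1e-3, kT=0.5, rng=None):
    """One Euler-Maruyama step of overdamped Langevin dynamics.

    The thermal-noise term (scaled by sqrt(2*kT*dt)) drives the state;
    the programmed potential U only steers where fluctuations carry it.
    """
    rng = rng if rng is not None else np.random.default_rng()
    noise = rng.normal(size=x.shape)
    return x - grad_U(x) * dt + np.sqrt(2.0 * kT * dt) * noise

# Illustrative "programmed" energy landscape: a double well
# U(x) = (x^2 - 1)^2, whose gradient is 4*x*(x^2 - 1).
grad_U = lambda x: 4.0 * x * (x**2 - 1.0)

rng = np.random.default_rng(0)
x = np.zeros(1)
trajectory = []
for _ in range(5000):
    x = langevin_step(x, grad_U, rng=rng)
    trajectory.append(float(x[0]))
```

With no noise (`kT = 0`) the state would simply settle into the nearest well and stop; with thermal noise it keeps exploring the landscape, which is the resource the article describes.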
The team overcame two major challenges that previously limited thermodynamic computing: the slow equilibration times required for calculations and the restriction to simple linear operations. By using nonlinear components and digital simulations, they demonstrated that thermodynamic computers can be trained to perform calculations at specific times without waiting for equilibrium, enabling faster and more predictable processing with much lower power consumption. To address the stochastic nature of these systems, which makes standard AI training ineffective, researchers employed evolutionary simulations on the
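Because each run of a stochastic device returns a noisy result, gradient-based training is unreliable, which motivates the evolutionary approach mentioned above. Below is a minimal, hypothetical sketch of that idea using a simple keep-the-best evolution strategy against a noisy loss; the toy loss function stands in for a stochastic physical computer and is not from the paper.

```python
import numpy as np

def evolve(loss_fn, theta, sigma=0.1, pop=32, iters=100, repeats=8, rng=None):
    """Keep-the-best evolution strategy for a noisy objective.

    Instead of gradients, we perturb the parameters, average the noisy
    loss over several repeats to suppress stochasticity, and keep any
    candidate that improves on the best score seen so far.
    """
    rng = rng if rng is not None else np.random.default_rng()
    best = np.mean([loss_fn(theta) for _ in range(repeats)])
    for _ in range(iters):
        for _ in range(pop):
            cand = theta + sigma * rng.normal(size=theta.shape)
            score = np.mean([loss_fn(cand) for _ in range(repeats)])
            if score < best:
                theta, best = cand, score
    return theta, best

# Toy stand-in for a stochastic device: a quadratic loss plus readout noise.
rng = np.random.default_rng(1)
target = np.array([0.5, -0.3])
noisy_loss = lambda th: float(np.sum((th - target) ** 2) + 0.01 * rng.normal())

theta, best = evolve(noisy_loss, np.zeros(2), rng=rng)
```

Averaging over `repeats` evaluations is the key concession to stochastic hardware: a single noisy reading cannot be trusted to rank two candidates, but an averaged one can.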
Tags
energy, thermodynamic-computing, neural-networks, machine-learning, thermal-noise, energy-efficiency, computing-hardware