Thermodynamic Computing and Statistical Thermodynamics: FAQ

What is thermodynamic computing?

Thermodynamic computing is a new computing paradigm that uses the principles of thermodynamics to perform computations. In simple terms, traditional digital computers work hard to eliminate randomness and thermal noise, keeping operations deterministic. Thermodynamic computing does the opposite – it embraces randomness and harnesses the natural fluctuations (noise) of physical systems to compute. Instead of using perfectly stable bits that are either 0 or 1, a thermodynamic computer uses elements that randomly fluctuate between states due to thermal energy. By controlling the conditions of these fluctuations, useful computations can be carried out. This approach essentially treats entropy (disorder) as a resource rather than a nuisance, allowing the computer to leverage the same statistical behavior that physical systems exhibit in nature.

Why explore a new computing paradigm like this?

Modern computing faces significant challenges that thermodynamic computing aims to address. First, traditional transistor-based computing is reaching its limits. For decades, increasing the density of transistors (following Moore’s Law) improved performance, but transistors are now so tiny (approaching atomic scales) that effects like thermal noise are fundamentally disrupting their reliable operation. This means we can’t keep making conventional chips much faster or more efficient by simply shrinking them. Second, the computational demand – especially from AI – is skyrocketing. Training advanced AI models and performing huge simulations consumes enormous energy; tech leaders have even contemplated building data centers next to nuclear reactors just to power AI workloads. Continuing on the current path would require unprecedented increases in energy and infrastructure, which isn’t sustainable.

Thermodynamic computing offers a potential way out of this dilemma. By leveraging naturally occurring thermal noise, it promises computers that scale in performance without an exponential rise in energy cost. Rather than hitting a wall where random noise ruins our computations, this paradigm uses that very noise to compute. In summary, we need new paradigms like thermodynamic computing because we’re running up against the physical and energy limits of traditional computing, and the relentless demand for more computing power (especially for AI) calls for fundamentally more efficient hardware.

How is thermodynamic computing different from traditional computing?

Thermodynamic computing fundamentally flips the script on how we handle randomness in hardware. In a standard digital computer (like the one in your phone or laptop), every effort is made to isolate the system from outside influences and noise. Bits are kept stable, and any thermal fluctuations are regarded as errors to be mitigated. In contrast, a thermodynamic computer is an open system that engages with noise and fluctuations, using them to its advantage. One way to think about it: a conventional computer is like a tightrope walker who tries to minimize any wobbles, whereas a thermodynamic computer is more like a surfer who rides the natural waves. The “waves” in this analogy are the random thermal motions always present in physical systems.

Practically, this means thermodynamic computing often involves analog or probabilistic hardware elements rather than deterministic logic gates. Traditional chips use transistors as on/off switches, while thermodynamic computing might use devices that output random 0s and 1s (or continuous analog values) according to some probability. Instead of suppressing entropy, a thermodynamic computer finds a way to live in symbiosis with entropy, leveraging it as a computational asset. This is a profound shift: computation is no longer a step-by-step deterministic procedure but rather an evolving physical process influenced by thermal energy. The outcome is still controlled – we bias the random processes toward solving the problem we care about – but we don’t try to eliminate the randomness entirely. This approach draws inspiration from nature’s own “computers”: for instance, biological cells compute with chemical reactions that are inherently random, yet extremely efficient (e.g. gene regulatory networks rely on random molecular interactions). Thermodynamic computing seeks to imitate such natural information processing, using noise to help explore possibilities instead of treating noise as purely a problem.

What principles of statistical thermodynamics are involved here?

Statistical thermodynamics (or statistical mechanics) is the branch of physics that deals with large ensembles of particles and how they distribute among states according to energy and temperature. This is highly relevant to thermodynamic computing. In statistical thermodynamics, a system will naturally tend to occupy states with probabilities related to their energy – for example, following a Boltzmann distribution where $P(\text{state}) \propto e^{-E/(kT)}$, with $E$ the state's energy, $k$ Boltzmann's constant, and $T$ the temperature. A thermodynamic computer exploits this principle by mapping computational problems onto energy landscapes. In effect, you design a physical system (the hardware) such that its most probable states correspond to desired solutions or useful computations. The hardware then samples from that probability distribution on its own, courtesy of thermal fluctuations.
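
To make this concrete, here is a minimal Python sketch (with made-up energy values and temperatures) of how state probabilities follow $e^{-E/(kT)}$ and how temperature controls exploration versus settling. Real thermodynamic hardware would produce such samples physically, not by calling a random-number generator:

```python
import numpy as np

def boltzmann_probabilities(energies, temperature):
    """Return P(state) proportional to exp(-E / (k*T)), with k folded into the temperature units."""
    weights = np.exp(-np.asarray(energies) / temperature)
    return weights / weights.sum()

# A made-up energy landscape with four states; the lowest-energy state is state 0.
energies = [0.0, 1.0, 2.0, 5.0]

for T in (0.5, 5.0):
    print(f"T = {T}:", np.round(boltzmann_probabilities(energies, T), 3))
# At T = 0.5 the distribution is sharply peaked on the ground state ("settling");
# at T = 5.0 higher-energy states are visited much more often ("exploring").

# Drawing samples mimics what the hardware does by equilibrating with its thermal environment.
samples = np.random.default_rng(0).choice(4, size=10_000, p=boltzmann_probabilities(energies, 0.5))
print("empirical frequencies:", np.bincount(samples, minlength=4) / 10_000)
```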

A classic example is the concept of a Boltzmann machine in machine learning, which was inspired by statistical physics. It’s a network of stochastic binary units that finds low-energy states corresponding to good solutions by flipping bits at random with Boltzmann-weighted probabilities, sometimes while gradually “cooling” the system (as in simulated annealing). Thermodynamic computing can implement something like a Boltzmann machine directly in hardware – meaning the physics of the device naturally does what the algorithm would otherwise simulate. In fact, one application of thermodynamic hardware is accelerating energy-based models (EBMs), which are AI models that learn by shaping an energy landscape. By physically embodying probability distributions in hardware, thermodynamic computers can speed up tasks like sampling from complex distributions or finding optimal combinations.
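
As a rough software analogue of what such hardware would do natively, the following Python sketch Gibbs-samples a tiny fully visible Boltzmann machine; the weights and biases are arbitrary illustration values, not taken from any real device:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny fully visible Boltzmann machine: E(s) = -0.5 * s^T W s - b^T s, with each s_i in {0, 1}.
# W and b are arbitrary: the first two units "like" to be on together,
# and the third unit is penalised for joining them.
W = np.array([[ 0.0,  2.0, -1.0],
              [ 2.0,  0.0, -1.0],
              [-1.0, -1.0,  0.0]])
b = np.array([0.5, 0.5, -0.5])

def gibbs_sample(n_steps=20_000, T=1.0):
    s = rng.integers(0, 2, size=3)
    counts = {}
    for _ in range(n_steps):
        i = rng.integers(0, 3)                    # pick one unit to update
        field = W[i] @ s + b[i]                   # influence of the other units plus the bias
        p_on = 1.0 / (1.0 + np.exp(-field / T))   # Boltzmann-consistent probability of s_i = 1
        s[i] = 1 if rng.random() < p_on else 0
        key = tuple(int(v) for v in s)
        counts[key] = counts.get(key, 0) + 1
    return counts

# The lowest-energy configuration (1, 1, 0) should dominate the visit counts.
for state, n in sorted(gibbs_sample().items(), key=lambda kv: -kv[1]):
    print(state, n)
```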

To put it simply, statistical thermodynamics provides the theoretical foundation for why this works: it tells us how systems behave when there’s randomness and energy in play. Thermodynamic computing takes those same equations and principles – like entropy maximization, equilibrium distributions, and thermal fluctuations – and uses them as computing operations. This means concepts like temperature, free energy, and entropy aren’t just abstract physics terms; in this paradigm, they become part of the computing vocabulary (for example, “raising the temperature” in a device might let it explore more possibilities, whereas “lowering the temperature” makes it settle into an optimal state).

How does a thermodynamic computer actually work?

It’s understandable to be curious about what these machines physically look like and do. While implementations can vary, the core idea is to have hardware elements that randomly fluctuate but in a controlled, programmable way. One simple model is the probabilistic bit or “p-bit.” Unlike a normal bit that is definitively 0 or 1, a p-bit rapidly flips between 0 and 1 due to thermal noise. Importantly, one can bias a p-bit – for instance, set it so that it spends 70% of the time in state 1 and 30% in state 0 on average. By wiring many such p-bits together in clever ways (with certain probabilities influencing each other), the system can compute. It’s similar to how a bunch of random voters might collectively “decide” an outcome if you nudge their probabilities appropriately.
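
A minimal Python sketch of this idea, assuming the commonly used sigmoid-bias model of a p-bit (the 70/30 split from the example above is recreated with a hand-picked bias value):

```python
import numpy as np

rng = np.random.default_rng(1)

def p_bit(bias, n_samples=10_000, T=1.0):
    """Simulate a p-bit: each observation is 1 with probability sigmoid(bias / T)."""
    p_one = 1.0 / (1.0 + np.exp(-bias / T))
    return (rng.random(n_samples) < p_one).astype(int)

# Choose the bias so the p-bit spends about 70% of its time in state 1 (the 70/30 example above).
bias_70 = np.log(0.7 / 0.3)                  # inverse of the sigmoid at 0.7
samples = p_bit(bias_70)
print("fraction of time in state 1:", samples.mean())   # close to 0.70
```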

In practice, p-bits can be implemented with nanoscale electronics. For example, researchers have built p-bits using tiny magnetic devices where thermal agitation causes them to flip states, or using simple transistor circuits designed to generate random bits. Extropic, a company at the forefront of this technology, demonstrated a p-bit on an oscilloscope: the signal displayed a bit flipping between 0 and 1 at random, and by adjusting control parameters, they could tune the probability of the bit being 0 or 1 at any given time. By engineering interactions between multiple such bits, one can perform complex probabilistic computations. Essentially, the random flips provide a source of “exploration”, and the coupling between bits steers that randomness toward solving a problem.
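
To illustrate how coupling steers the randomness, here is a small software stand-in (with an arbitrary coupling strength) for two ±1 p-bits whose positive coupling makes them agree far more often than chance:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two p-bits taking values in {-1, +1}, coupled with strength J > 0 so they prefer to agree.
# J and the temperature are illustrative values, not taken from any real device.
J, T = 1.5, 1.0
s = np.array([1, -1])
agree = 0
n_steps = 50_000

for _ in range(n_steps):
    i = rng.integers(0, 2)                            # pick one p-bit to update
    field = J * s[1 - i]                              # influence of the other p-bit
    p_up = 1.0 / (1.0 + np.exp(-2.0 * field / T))     # Boltzmann flip rule for +/-1 variables
    s[i] = 1 if rng.random() < p_up else -1
    agree += int(s[0] == s[1])

# Each bit is still random, but the coupling biases the pair strongly toward agreement (~95%).
print("fraction of time the two p-bits agree:", agree / n_steps)
```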

Another way to understand how thermodynamic computers work is through a physical analogy. Imagine a small box of gas or a bunch of particles in liquid – they’re constantly jostling around (random motion due to thermal energy). Now imagine placing a bunch of tiny springs or constraints in that system so that the particles tend to settle into some preferred configuration (say, more particles on one side of the box than the other). The particles will still move randomly, but they’ll statistically spend more time in certain states because the springs pull them that way. If you observe the system over time, you could infer things like “state A happens 80% of the time and state B 20%” due to those constraints. That is a form of computation – the physical system is effectively encoding those probabilities for you.

Extropic uses a very similar principle in their hardware. They describe an analogy of Brownian motion: imagine microscopic particles bouncing around in a fluid, but connected by springs that bias where they tend to cluster. If you sample their positions over time after letting the system equilibrate, you get a specific probability distribution (heavier in some areas than others based on spring stiffness). In Extropic’s actual device, electrons play the role of those particles, and electronic components act like springs or barriers that shape the “energy landscape” the electrons explore. The natural thermal jostling of electrons (which normally just creates electrical noise) is utilized here: the electrons randomly hop around in circuits, but the circuit is designed so that the statistics of their hopping solve a computational problem or generate random samples from a desired distribution. In short, a thermodynamic computer works by setting up a playground for random motion to occur, and that playground is built such that the random motion yields useful answers.
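
The spring-and-particle picture can be mimicked numerically with an overdamped Langevin simulation; the potential and constants below are invented for illustration, and the real device does this with electrons in circuits rather than with a numerical integrator:

```python
import numpy as np

rng = np.random.default_rng(3)

# Tilted double-well potential U(x) = x^4 - 2x^2 + 0.35x: two "preferred" regions,
# with the left well slightly deeper. All constants here are invented for illustration.
def grad_U(x):
    return 4 * x**3 - 4 * x + 0.35

dt, T, n_steps = 1e-3, 0.5, 500_000
x, left_count = 0.0, 0

for _ in range(n_steps):
    # Overdamped Langevin step: drift down the energy gradient plus a thermal kick.
    x += -grad_U(x) * dt + np.sqrt(2 * T * dt) * rng.normal()
    left_count += int(x < 0)

# The particle keeps jostling between wells, but spends more time in the deeper (left) one,
# roughly 80% with these constants, though the exact figure varies from run to run.
print("fraction of time in the left well:", left_count / n_steps)
```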

What are the potential advantages of thermodynamic computing?

Thermodynamic computing holds several exciting potential benefits:

- Energy efficiency: because the hardware computes with thermal fluctuations that are happening anyway, it could be dramatically more efficient than conventional chips – Extropic, for instance, claims its accelerators could be three to four orders of magnitude more efficient than today’s hardware.
- A natural fit for probabilistic AI: sampling from complex distributions, running energy-based models, and searching for good combinations map directly onto the physics instead of being simulated step by step on deterministic logic.
- Scaling past the noise wall: rather than failing as devices shrink and thermal noise grows, performance comes from the noise itself, sidestepping the limits now constraining transistor scaling.
- A more practical path than quantum computing: it relies on ordinary classical thermal effects rather than fragile quantum states, so it may be easier to build and scale with near-term technology.

What are the challenges or downsides of thermodynamic computing?

As promising as thermodynamic computing is, there are important challenges and open questions:

- Device engineering: building nanoscale elements whose fluctuations are strong, fast, and precisely tunable – and manufacturing them reliably at scale – remains difficult.
- Controlling the randomness: the same noise that powers the computation introduces statistical error, so answers are probabilistic and may require error-mitigation techniques or repeated sampling to reach a desired accuracy.
- Programmability: mapping real-world problems onto energy landscapes is not straightforward, and the field still lacks mature algorithms, compilers, and user-friendly software tools.
- Limited generality: the approach excels at sampling, optimization, and AI-style workloads, but it is not a drop-in replacement for deterministic, general-purpose computing.
- Unproven scaling: demonstrations so far involve small prototypes, and the large claimed efficiency gains still have to be shown on full-scale systems.

In summary, while thermodynamic computing holds great promise, it’s at an early stage. Researchers and engineers are actively working on these challenges. The coming years will reveal whether solutions (like better nanodevices, clever error-mitigation techniques, and user-friendly software) can push the technology into the mainstream.

Who is leading the development of thermodynamic computing?

One of the leading players in this nascent field is a company called Extropic. Extropic is a startup (founded in 2022) that has made headlines for its ambitious effort to build a full-stack thermodynamic computing platform. The company’s mission is often described as “merging generative AI with the physics of the world.” In practical terms, Extropic is building an AI supercomputer that harnesses the first principles of thermodynamics and information theory – essentially using the physical world’s own entropy to power AI computations. Their vision is to create a new computing substrate where AI algorithms run on physics-based hardware rather than traditional digital chips. By doing so, they aim to achieve the absolute limits of efficiency dictated by physics, far beyond what normal chips can do.

Extropic’s approach is unique. The team is composed of experts in physics and AI (including former quantum computing researchers), and they deliberately chose a path not reliant on quantum mechanics. They see useful noise as an asset instead of a liability – a clear departure from the mindset in quantum computing where noise is a big problem. In fact, the company’s founder, Guillaume Verdon, was a quantum computing specialist at Google who left to pursue this alternative because quantum timelines were slow and required near-miraculous breakthroughs to scale. Extropic’s bet is that by using “classical” thermodynamic effects (which are easier to handle than quantum states), they can achieve revolutionary computing power much sooner and more scalably than quantum computers.

So, what exactly is Extropic building? Here are some key aspects of their work and technology:

- A full-stack platform: custom probabilistic hardware together with the software and algorithms needed to program it, aimed at generative AI and energy-based models.
- Probabilistic circuit elements: the company has demonstrated controllable p-bit signals (the oscilloscope demo described earlier), where adjusting control parameters tunes the probability of the bit being 0 or 1.
- Early chip prototypes: initial test devices have been superconducting chips containing a handful of probabilistic neurons, with more advanced prototypes in development.
- A deliberately non-quantum route: the design relies on classical thermal effects rather than delicate quantum states, which the team argues makes it far easier to scale.
- A near-term rollout: an alpha program for early partners and a platform teased for summer 2025.

Extropic’s work has garnered a lot of attention. They secured a substantial seed funding round (\$14.1 million) in late 2023 to pursue this vision, and they have been featured in the media discussing their progress. For example, WIRED magazine ran a story on how Extropic’s radical chip could potentially challenge Nvidia’s dominance in AI chips by offering a completely different way to compute. The company has demonstrated early results – like that controlled p-bit signal – indicating they’re on track. They claim their approach could yield accelerators three to four orders of magnitude more efficient than today’s hardware, which, if achieved, would indeed be game-changing.

As of now (2025), Extropic is still in the development phase. Their public site teases “Thermodynamic Intelligence – Coming Summer 2025”. They are running an alpha program for early partners or users to test out their technology (likely either via cloud access or early hardware). This suggests that we may soon see the first real thermodynamic computing platform accessible outside of the lab. It’s an exciting time – essentially, Extropic is assembling a first-of-its-kind thermodynamic AI supercomputer, piece by piece. If they succeed, it could inaugurate a new era in which the laws of thermodynamics are not obstacles to computation but its very medium. And for students and researchers, that means many of the concepts from statistical thermodynamics and physics might become directly relevant in how we design and think about computers.

Is thermodynamic computing being used today, or is it still just theoretical?

Thermodynamic computing is still in its early stages, but it’s moving from theory into practice. We don’t yet have large-scale thermodynamic processors running in everyday devices, but proof-of-concept systems have been built. For instance, researchers recently demonstrated a small-scale thermodynamic computing device built from electronic circuits (resistors, inductors, capacitors) on a circuit board that could do tasks like Gaussian random sampling and even solve a simple linear algebra problem by physical means. This shows that the concept works: real hardware can perform computations by harnessing thermal fluctuations. There have also been experimental devices based on nanomagnets, transistors, and other tech that realize p-bits and probabilistic logic.
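
The linear-algebra demonstration can be sketched in software as an Ornstein–Uhlenbeck-style simulation: noisy dynamics whose long-run average is the solution of $Ax = b$. The matrix, temperature, and step size below are made up, and the actual experiment used analog circuits rather than this numerical integration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Solve A x = b for a small symmetric positive-definite A (values are illustrative).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

dt, T, n_steps, burn_in = 1e-3, 0.1, 300_000, 30_000
x = np.zeros(2)
running_sum = np.zeros(2)

for step in range(n_steps):
    # Noisy relaxation in the quadratic potential U(x) = 0.5 x^T A x - b^T x.
    # Its equilibrium (Boltzmann) distribution is a Gaussian centred on A^{-1} b,
    # so the time average of the fluctuating trajectory estimates the solution.
    drift = -(A @ x - b)
    x = x + drift * dt + np.sqrt(2 * T * dt) * rng.normal(size=2)
    if step >= burn_in:
        running_sum += x

print("time-averaged estimate:", running_sum / (n_steps - burn_in))   # close to [0.2, 0.4]
print("direct solve:          ", np.linalg.solve(A, b))
```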

In the commercial realm, as we discussed, Extropic is a key player pushing to create practical thermodynamic computing hardware. They have reported testing early versions of their superconducting chips (with a handful of probabilistic neurons) and are actively developing more advanced prototypes. The timeline hinted by Extropic suggests we might see initial products or cloud-accessible systems in the very near future (within a year or two). It’s worth noting that while you cannot buy a “thermodynamic computer” off the shelf today, the progress in this field is reminiscent of the early days of quantum computing or AI accelerators – the first small examples are proving the principle, and significant investments (both intellectual and financial) are being made to scale it up.

Another point is that the ideas of thermodynamic computing are influencing how people think about computation broadly. Even before full hardware arrives, some algorithms are borrowing from these principles (for example, simulated annealing in software, or stochastic neural networks that intentionally inject noise during training). In a sense, the boundary between traditional computing and thermodynamic computing is blurring as we find ways to introduce randomness and thermodynamic principles into computing to solve tough problems.
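
For instance, simulated annealing is a purely software technique that borrows the cooling idea directly; a toy version (with an arbitrary test function and cooling schedule) looks like this:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated annealing on a bumpy 1-D function: uphill moves are accepted with a
# Boltzmann probability that shrinks as the temperature is lowered. All constants are illustrative.
def f(x):
    return 0.1 * x**2 + np.sin(3 * x)        # global minimum near x ≈ -0.5

x, T = 8.0, 2.0                              # start far from the minimum, at a high temperature
for _ in range(20_000):
    candidate = x + rng.normal(scale=0.5)
    delta = f(candidate) - f(x)
    if delta < 0 or rng.random() < np.exp(-delta / T):
        x = candidate                        # always accept downhill moves, sometimes uphill ones
    T = max(0.01, T * 0.9995)                # gradually "cool" the system

print("found x ≈", round(x, 2), "with f(x) ≈", round(float(f(x)), 2))
```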

In summary, thermodynamic computing is transitioning from theoretical research to applied research and early development. We’re not quite there yet in terms of having it widely deployed, but given the rapid progress by companies like Extropic and academic groups around the world, it wouldn’t be surprising to see the first real thermodynamic computing accelerators used in specialized settings within the next few years. For undergraduate students today, by the time you are moving into industry or grad school, thermodynamic computing could very well be a tangible technology you might work with, especially in fields like AI, optimization, or computational physics. It’s an emerging area where a strong foundation in physics (thermodynamics, statistical mechanics) and computer science intersects – a great example of interdisciplinary innovation in action.

Further Reading