1

This was an idea brought up by a friend.

Since the CPU produces heat, one would think it would be more energy-efficient to reuse that heat energy.

By what method could heat from the CPU be converted into electrical energy and fed back into the computer?

How much electricity could be produced, and would it be significant enough to go through the trouble of harnessing it?

4
  • Are you thinking of a perpetual motion machine? That's not possible.
    – Biswapriyo
    Commented Aug 25, 2017 at 3:52
  • @Biswa not a perpetual motion machine, but a contraption that greatly increases energy efficiency. Commented Aug 25, 2017 at 6:09
  • No, it won't. In general, waste heat is very darn difficult to get any usable energy out of. Although there may seem to be a lot of it, its temperature is usually not high enough above ambient to be useful. See "Carnot efficiency". And remember that in the calculation of Carnot efficiency you must use an absolute-zero-based temperature scale such as Kelvin or Rankine. Commented Aug 25, 2017 at 22:20
  • see e.g. electronics.stackexchange.com/a/309743
    – djvg
    Commented Apr 1, 2021 at 8:03

3 Answers

1

Best bet would be a thermoelectric generator:

...a solid state device that converts heat flux (temperature differences) directly into electrical energy through a phenomenon called the Seebeck effect (a form of thermoelectric effect).

It's been studied and folks have actually done what you're talking about.

I can't see any reason why you couldn't get a TEG for yourself and play around, perhaps with an old PC. The tricky part is using the small amount of electricity generated for something useful. I would suggest that trying to get the energy back into your computer is far too complicated, and you won't recoup the expense of doing so.

Instead, why not regulate the output and use it to charge a phone or rechargeable batteries or something? Obviously all this is outside the scope of SU but I wanted to answer anyway ;-)
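For a rough sense of the numbers, here is a hedged back-of-envelope sketch in Python; the Seebeck coefficient, internal resistance and temperature difference below are assumed values for a generic bismuth-telluride module, not measurements of any real part:

```python
# Back-of-envelope estimate of TEG output from CPU waste heat.
# All figures are assumptions for a generic Bi2Te3 module,
# not specifications of any particular product.

seebeck_v_per_k = 0.05   # effective module Seebeck coefficient, ~50 mV/K (assumed)
internal_ohms = 2.0      # module internal resistance (assumed)
delta_t = 30.0           # hot-side minus cold-side temperature, in kelvin (assumed)

open_circuit_v = seebeck_v_per_k * delta_t                       # V = S * dT
matched_load_power = open_circuit_v ** 2 / (4 * internal_ohms)   # max power transfer

print(f"Open-circuit voltage: {open_circuit_v:.2f} V")
print(f"Power into a matched load: {matched_load_power:.2f} W")
# Roughly 1.5 V and ~0.3 W here -- enough to trickle-charge a battery,
# nowhere near enough to matter to a PC drawing 100+ W.
```

Even with a fairly generous temperature difference, the matched-load power works out to a fraction of a watt, which is why regulating it down to a small charging load is about the best use for it.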

EDIT

The other tricky thing is how to prevent the CPU from overheating. Sure, a properly installed TEG will draw heat away from the CPU but then what? If the TEG gets too hot (and it will) then so will the CPU. Using a fan to cool down the TEG will reduce the TEG's effectiveness in generating electricity. So yeah it would be a challenge to make this work practically.
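One way to see why is to treat the cooling path as thermal resistances in series. The sketch below uses assumed resistance values, not figures for any real TEG or heatsink:

```python
# Series thermal-resistance model: CPU -> (optional TEG) -> heatsink -> ambient.
# T_cpu = T_ambient + P * (sum of thermal resistances). All values are assumed.

cpu_power_w = 95.0    # heat the CPU dumps (assumed)
t_ambient_c = 25.0
r_heatsink = 0.3      # K/W for a decent heatsink + fan (assumed)
r_teg = 1.5           # K/W added by a TEG sandwiched under the heatsink (assumed)

t_without_teg = t_ambient_c + cpu_power_w * r_heatsink
t_with_teg = t_ambient_c + cpu_power_w * (r_heatsink + r_teg)

print(f"CPU temperature without TEG: {t_without_teg:.0f} C")   # ~54 C
print(f"CPU temperature with TEG:    {t_with_teg:.0f} C")      # ~196 C -- thermal shutdown territory
```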

1
  • It is not that the TEG will "draw heat away from the CPU". The problem is the opposite, at least compared to what a good heatsink/fan (or liquid cooler, or etc... I'll just say "HSF") will do by itself. A TEG placed on top of a CPU, with the HSF on top of it, will obviously be far less effective at moving heat from the CPU to the HSF than a thin layer of heatsink grease or TIM; the CPU will overheat. Glue the TEG to one of the HSF's fins and you will impede cooling very little... and you will also pick up very very little heat so you will make very very little electric energy. etc. Commented Aug 25, 2017 at 21:00
2

I'm sorry to break this to you, but this is a very bad idea.

A major design problem in modern computers is cooling the CPU, no? Well, anything you add to the CPU that attempts to convert the heat energy to useful work will reduce the effectiveness of the heat sink/fan assembly.

This is true even if you have a liquid cooling system and try to put the TEG in the "hot air output" path from its radiator fan. You are blocking the air flow through the radiator, therefore making the radiator less effective.

"But I'm only blocking it a little bit!" Then you will only be able to recover a corresponding little bit of the energy.

Regardless of the setup, the amount of energy you recover will be tiny, and probably not even worth the price of the TEG.
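For concreteness, here is a hedged sketch of the Carnot limit the comments above refer to, using assumed temperatures (a warm CPU heat spreader and room-temperature ambient):

```python
# Carnot limit on recovering electrical energy from CPU waste heat.
# Temperatures are assumptions, not measurements.

t_hot_k = 70.0 + 273.15   # CPU heat-spreader temperature (assumed)
t_cold_k = 25.0 + 273.15  # ambient (assumed)

carnot_eff = 1.0 - t_cold_k / t_hot_k           # ideal upper bound
print(f"Carnot efficiency: {carnot_eff:.1%}")   # about 13%

# Real thermoelectric modules achieve only a small fraction of the Carnot
# limit, so out of ~100 W of waste heat you might hope to recover on the
# order of a watt or two -- before counting the fan power spent keeping
# the cold side cold.
```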

0

Electricity used to power computers is dissipated almost entirely as heat, so it is an appealing idea to use that heat for something instead of dumping it into the air through CPU heatsinks. I think it is technically possible to use CPU heat to produce electricity, but obviously not in a home setting.

Current power stations use steam to spin a turbine and generate electricity. A CPU cannot heat water to a temperature high enough to produce steam, so CPU warmth might instead be used for auxiliary water heating alongside a power station's main boiler. This could work if datacentres were located next to power stations; at least some of the datacentre energy that is now usually lost could then be put to use.
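As a hedged sense of scale (the 10 MW facility draw below is an assumed round number, not a real site), the arithmetic looks like this:

```python
# Rough scale of datacentre waste heat available for water pre-heating.
# The 10 MW facility draw is an assumed round number.

facility_power_w = 10e6          # assumed average electrical draw
hours = 24.0
heat_joules = facility_power_w * hours * 3600   # nearly all of the draw ends up as heat

c_water = 4186.0                 # specific heat of water, J/(kg*K)
delta_t = 40.0                   # warm the feedwater by 40 K (assumed)

water_kg = heat_joules / (c_water * delta_t)
print(f"Heat rejected per day: {heat_joules / 1e9:.0f} GJ")                    # ~864 GJ
print(f"Water that could be warmed by 40 K: {water_kg / 1000:.0f} tonnes/day") # ~5,000 tonnes
```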
