
I've been running:

  • 5 identical systems (all running Windows XP Embedded)
  • identical loads on each system (and nothing else)
  • Coretemp to monitor temperatures

For the 5 systems, all of which have been running for ~24 hours in close proximity with the same cooling setup, I get the following mean temperatures (a quick spread check follows the list):

  • 12°C
  • 21°C
  • 19°C
  • 13°C
  • 15°C
  • 15°C
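
For scale, a minimal sketch (Python) that computes the spread of those readings; the numbers are copied straight from the list above:

    # Quick spread check on the reported mean temperatures (values from the list above).
    from statistics import mean, pstdev

    temps = [12, 21, 19, 13, 15, 15]  # degrees C
    print(f"mean = {mean(temps):.1f} C, "
          f"range = {max(temps) - min(temps)} C, "
          f"stdev = {pstdev(temps):.1f} C")
    # -> mean = 15.8 C, range = 9 C, stdev = 3.2 C

A 9°C spread is well beyond the ±1°C you'd expect from a calibrated thermometer, which is what prompts the question.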

I want to know how much the reported temperature of Atom processors is likely to vary. Or is this more likely to be differences in the way heatsinks are mounted to the processors?

  • Most digital thermometers are accurate to about ±1°C, and I don't see any reason why on-die sensors would need to be more accurate, or would be less accurate. Maybe you should monitor over time; they might all slowly oscillate between 12°C and 20°C due to hysteresis in the cooling system (a logging sketch follows these comments). Commented Aug 22, 2012 at 9:48
  • There are three reasons why on-die sensors are less accurate. First, they're not fully calibrated or trimmed (only the offset is adjusted; the slope is not calibrated), since they only need to be accurate enough to detect temperatures above a trip point. Second, they're made on processes optimized for making CPUs, not for making temperature-measuring devices. Third, acceptance criteria are very loose: you might throw out a $0.05 digital temperature sensor if it's a bit inaccurate, but are you going to throw out a $45 CPU if its on-die temperature sensor is a bit inaccurate? Commented Aug 22, 2012 at 9:54
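
Following up on the monitor-over-time suggestion: below is a minimal logging sketch in Python. It assumes a Linux machine with the coretemp driver loaded (the sysfs paths are generic and will vary per machine); on the Windows XP systems in question, Coretemp's own logging feature would play the same role.

    # Poll the coretemp hwmon sensors once a minute and append readings to a
    # CSV, so any slow oscillation (e.g. cooling-system hysteresis) shows up.
    # Assumes Linux with the coretemp kernel driver loaded.
    import csv, glob, time

    def read_temps():
        temps = {}
        for path in glob.glob("/sys/class/hwmon/hwmon*/temp*_input"):
            with open(path) as f:
                temps[path] = int(f.read()) / 1000.0  # millidegrees -> degrees C
        return temps

    with open("temps.csv", "a", newline="") as out:
        writer = csv.writer(out)
        while True:
            now = time.strftime("%Y-%m-%d %H:%M:%S")
            for sensor, celsius in sorted(read_temps().items()):
                writer.writerow([now, sensor, celsius])
            out.flush()
            time.sleep(60)

Plotting the resulting CSV per sensor makes a slow 12-20°C oscillation easy to spot.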

1 Answer


The on-die temperature sensor on the Atom is only designed to be accurate over the range of temperatures needed to control a fan and shut down the CPU if it overheats, and it is very prone to "bottoming out" at low temperatures. Most likely you are simply reading the lowest temperature each on-die sensor can report: a temperature too low to trigger the fan control or overheat logic, and thus outside the sensor's designed accuracy range.

Because each on-die temperature sensor is offset at the factory to ensure it triggers thermal throttling at the correct temperature, the bottoming-out temperature can differ from CPU to CPU. Each chip is probably reading the lowest raw value it can, but each adds a different calibration offset to that value and therefore reports a different measured temperature.
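
To make the offset-plus-saturation mechanism concrete, here is a minimal sketch (Python, assuming Linux with the msr kernel module loaded, run as root) of how Coretemp-style tools derive a temperature per the Intel SDM: IA32_THERM_STATUS (0x19C) reports degrees below TjMax in bits 22:16, and MSR_TEMPERATURE_TARGET (0x1A2) carries the factory-set TjMax in bits 23:16. Some early Atoms don't expose MSR_TEMPERATURE_TARGET at all, in which case tools typically fall back to an assumed TjMax.

    # Sketch of a Coretemp-style reading: the DTS reports degrees *below*
    # TjMax, so the delta field saturates ("bottoms out") on a cold die.
    import struct

    IA32_THERM_STATUS = 0x19C        # bits 22:16 = degrees below TjMax
    MSR_TEMPERATURE_TARGET = 0x1A2   # bits 23:16 = factory TjMax in degrees C

    def rdmsr(cpu, reg):
        with open(f"/dev/cpu/{cpu}/msr", "rb") as f:
            f.seek(reg)                       # MSR address is the file offset
            return struct.unpack("<Q", f.read(8))[0]

    def core_temp(cpu=0):
        tjmax = (rdmsr(cpu, MSR_TEMPERATURE_TARGET) >> 16) & 0xFF
        status = rdmsr(cpu, IA32_THERM_STATUS)
        delta = (status >> 16) & 0x7F         # saturates on a cold die
        valid = bool(status & (1 << 31))      # "reading valid" flag
        # Two cold chips saturate delta at the same ceiling but subtract it
        # from different factory TjMax values, so they report different temps.
        return tjmax - delta, valid

    print(core_temp())

This is why the reported floor differs per chip even though the raw sensor behaviour is the same.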

Some Intel CPUs have a more accurate thermistor-based temperature sensor bonded to the heat spreader. This is much better for measuring CPU temperature than the on-die diode sensor (though it responds too slowly to serve as a thermal safeguard). I don't think the Atoms have one, though.

