
I understand that the three ways Geiger counters can be calibrated are electronic calibration and energy calibration, which both use a pulse generator, and radiological calibration which uses a check source.

Electronic calibration simply involves sending pulses to the Geiger counter, which registers them as counts. But why is this necessary? I thought every ionization in the Geiger–Müller tube registered as a count anyway, without the need for calibration using electrical pulses beforehand. So what is there to calibrate? One ionization always equals one count.

Radiological calibration involves exposing the Geiger counter to a radioactive check source and setting the Geiger counter's measurement of the radioactivity to match the actual known radioactivity of the check source. But instead, why not simply program the Geiger counter to know that x cpm from a particular source equals y µSv/h as detected by a particular model of Geiger–Müller tube?
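For concreteness, something like this is the kind of fixed, programmed-in mapping I mean (a hypothetical sketch; the conversion factor is made up, and a real one would come from a specific tube's datasheet for a specific isotope):

```python
# Hypothetical sketch of the fixed cpm-to-dose mapping described above.
# The conversion factor is made up, for illustration only.

TUBE_FACTOR_USVH_PER_CPM = 0.0057  # hypothetical value

def dose_rate_usv_per_h(counts: int, minutes: float) -> float:
    """Convert raw counts over a time window to an estimated dose rate."""
    cpm = counts / minutes
    return cpm * TUBE_FACTOR_USVH_PER_CPM
```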

I sort of understand the usefulness of energy calibration: it determines the voltage threshold for registering each ionization. But I cannot see the point of the other two methods.

  • "One ionization always equals one count." That's what you expect, but without periodic calibration it's not something you can assume. – winny, Jun 26, 2018 at 10:33
  • Actually, the OP kind of has a point. Energy calibration takes care of equipment aging, manufacturing differences, etc. Basically, it takes care of the count being right. But then the translation of a count into a radiation level should be straightforward math not requiring calibration. – Maple, Jun 26, 2018 at 11:01
  • Unless "one count" is electrically defined as something like "a time frame in which the number of ionizations passes a certain threshold". Then calibration with a generator makes perfect sense. – Maple, Jun 26, 2018 at 11:06
  • Consider that (at least on analog counters) you are not reading a count directly (clicks per second). You are reading a voltage generated by low-pass filtering the clicks; the DC output is proportional to the number of clicks per second (or whatever time period), as in the sketch after these comments. – JRE, Jun 26, 2018 at 11:12
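To make that last comment concrete, here is a rough numerical sketch (all component values and rates are assumptions, not taken from any real instrument) of a one-pole RC rate meter whose filtered DC output tracks the average click rate:

```python
import random

# Assumed toy model: unit-amplitude clicks arriving at random (Poisson-like),
# smoothed by a first-order RC low-pass. The filter's DC output tracks the
# average click rate, so any error in the analog chain becomes a dose error.

FS = 10_000               # samples per second (arbitrary)
RC = 0.5                  # filter time constant in seconds (arbitrary)
ALPHA = 1 / (1 + RC * FS) # discrete-time one-pole filter coefficient

def ratemeter_output(rate_hz: float, seconds: float = 10.0) -> float:
    """Simulate random clicks through an RC low-pass; return the settled output."""
    y = 0.0
    for _ in range(int(FS * seconds)):
        click = 1.0 if random.random() < rate_hz / FS else 0.0
        y += ALPHA * (click - y)   # one-pole low-pass update
    return y * FS                  # rescale: output ~ clicks per second

# Doubling the click rate roughly doubles the filtered output:
# ratemeter_output(100) ≈ 100, ratemeter_output(200) ≈ 200 (plus noise)
```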

4 Answers


First of all, the answer lies in the very principle of measurement. When you measure something (anything: weight, length, etc.), you actually compare the measured parameter of the object against another object (a standard) with known parameters. The same applies to radioactivity (counts per second) or any other parameter or measurement, for that matter. This also means that it's impossible to take an "absolute" measurement of anything.

The other problems you seem to neglect are noise and the aging/change of your measuring instrument.

Noise is something that always has to be accounted for; otherwise you'll end up "measuring" values that bear no meaning whatsoever (because they contain random "information", which is the noise itself), or end up with glitches in your measurements (and might even draw wrong conclusions from them). To translate this into your question: one count might not always be one count, because of the noise present in the instrument (both in the physical part and in the electronics).
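As a hypothetical illustration (the pulse amplitudes, noise level, and thresholds below are all made up), a counter that registers every excursion above zero happily counts noise; a discriminator threshold is what makes "one ionization, one count" approximately true:

```python
import random

# Made-up illustration: 5 genuine pulses (amplitude ~1.0) riding on noise.
# With threshold 0.0 the counter registers many noise crossings; with a
# discriminator threshold of 0.5 it counts only the 5 real pulses.

def count_pulses(samples: list[float], threshold: float) -> int:
    """Count rising edges that cross the threshold."""
    count, above = 0, False
    for v in samples:
        if v > threshold and not above:
            count += 1
        above = v > threshold
    return count

signal = [random.gauss(0, 0.05) for _ in range(1000)]
for i in range(0, 1000, 200):
    signal[i] = 1.0   # inject 5 genuine pulses

print(count_pulses(signal, threshold=0.0))   # many spurious noise counts
print(count_pulses(signal, threshold=0.5))   # 5, the real pulses only
```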

Aging is a particular concern for equipment dealing with ionizing radiation: the very exposure to such a source kicks electrons out of the instrument's materials, which sometimes leads to permanent changes in the material structure.

The other cause of aging is the very environment we live in. Just consider this: about 21% of the atmosphere we live in consists of a very reactive gas called oxygen. On top of that, literally everything is covered in a thin layer of water (a layer so thin that it can't be seen with the naked eye, yet it causes problems in atomic force microscopy). Since most elements on Earth are at least somewhat reactive (gold and platinum being notable exceptions), they enter into chemical reactions with the water, the oxygen, or other substances in the air (CO2, the sulfides that tarnish silver, etc.), and the materials change as a result of these reactions. This causes instruments to "go out of tune", their signal-to-noise ratio to decrease, and other effects that push instruments out of calibration.

Moreover, Geiger counters are very sensitive instruments, some of the most sensitive devices out there. This means that even really small changes will have a noticeable effect on their calibration.

  • @CoolKoon Can you discuss how these counters achieve their precision behavior? Stabilizing the high voltage? Or the comparator trip points? Windowing the pulse duration? Or what? (Jun 26, 2018 at 15:31)

Electronic calibration simply involves sending pulses to the Geiger counter, which registers them as counts. But why is this necessary? I thought every ionization in the Geiger–Müller tube registered as a count anyway, without the need for calibration using electrical pulses beforehand. So what is there to calibrate? One ionization always equals one count.

True enough, but the Geiger counter doesn't concern itself with single counts. Instead, it converts count frequency to radiation level. Electronic calibration assumes the tube itself is working correctly, but then provides known pulse frequencies which allow the calibration of rate vs dose to be checked.
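In sketch form (the conversion factor, test rates, and tolerance below are assumptions for illustration, not from any real instrument), the check amounts to injecting known pulse rates and comparing the reading against the expected conversion:

```python
# Sketch of an electronic calibration check, under assumed numbers:
# inject known pulse rates from a generator into the counting electronics
# and verify the displayed dose rate against the expected conversion.

CAL_FACTOR_USVH_PER_CPM = 0.0057   # hypothetical tube conversion factor

def expected_dose_rate(cpm: float) -> float:
    return cpm * CAL_FACTOR_USVH_PER_CPM

def check_electronics(read_display, tolerance: float = 0.05) -> bool:
    """Inject test rates; `read_display(cpm)` returns the instrument reading."""
    for cpm in (60, 600, 6000, 60000):        # generator test frequencies
        expected = expected_dose_rate(cpm)
        if abs(read_display(cpm) - expected) > tolerance * expected:
            return False                      # rate-to-dose chain is off
    return True

# Example: a fake instrument whose electronics read 10% high fails the check.
print(check_electronics(lambda cpm: expected_dose_rate(cpm) * 1.10))  # False
```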

Radiological calibration involves exposing the Geiger counter to a radioactive check source and setting the Geiger counter's measurement of the radioactivity to match the actual known radioactivity of the check source. But instead, why not simply program the Geiger counter to know that x cpm from a particular source equals y µSv/h as detected by a particular model of Geiger–Müller tube?

That is essentially what electronic calibration does. But it makes assumptions about the response of the GM tube itself. Exposing the unit to a known source of radiation is (once you deal with the precautions needed) simple, fast, and by its nature comprehensive. Why mess around with simulations when you can check directly against reality?

  • (+1) In particular, the classroom demonstration where you hear a GM tube produce discrete clicks that you can count by ear is rather far from the usual working conditions. A single particle is nothing as far as radiation safety goes; you want to be able to measure so many that some of the clicks may arrive too close together to register separately with a naive digital counter. So you want at least to know how your input circuits respond in the analog domain to such clumping, so you can correct for it later. (Jun 26, 2018 at 15:09)

Just a guess, I have no idea how Geiger counters are made.

A count by itself has no meaning without a time reference: what you're actually measuring is counts per time period. And the timing circuitry definitely needs calibration, which is best done with a precise pulse generator.
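To illustrate (a toy model with made-up numbers, not a description of any real instrument), an error in the instrument's timebase maps directly onto a rate error:

```python
# Toy model: if the internal clock runs fast, the instrument closes its
# "one minute" counting window early and accumulates fewer true counts,
# so the displayed cpm is low by the same factor. Numbers are made up.

def measured_cpm(true_cpm: float, clock_speed: float) -> float:
    """clock_speed = 1.02 means the internal clock runs 2% fast, so the
    'one minute' window lasts only 60/1.02 real seconds."""
    return true_cpm / clock_speed

print(measured_cpm(300, 1.02))  # ≈ 294 cpm displayed instead of 300
```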


The point of Geiger counter calibration is to confirm the correct detector response and linearity at high and low count rates. At much higher count rates, the dead time of the detector circuitry comes into play, and the result has to be compensated for the counts that are missed.
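One common way to express that compensation is the non-paralyzable dead-time model, n = m / (1 - m·τ), where m is the measured rate and τ the dead time. Here is a minimal sketch (the τ value is a made-up placeholder; real values come from calibration):

```python
# Non-paralyzable dead-time correction: if the detector is blind for tau
# seconds after each count, the true rate n relates to the measured rate m
# by n = m / (1 - m*tau). The tau below is a hypothetical placeholder.

def true_rate(measured_cps: float, dead_time_s: float = 100e-6) -> float:
    """Correct a measured count rate (counts/s) for detector dead time."""
    lost_fraction = measured_cps * dead_time_s
    if lost_fraction >= 1.0:
        raise ValueError("measured rate saturates the detector")
    return measured_cps / (1.0 - lost_fraction)

# At 1000 cps with 100 µs dead time, ~10% of counts are missed:
print(true_rate(1000))  # ≈ 1111 cps
```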

