87

An atomic bomb was dropped on Hiroshima, yet today Hiroshima has residents. In Chernobyl, however, where a nuclear reactor melted down, there are no (or very few) residents living today. What made the difference?

  • Actually Chernobyl is already kinda safe now. Typical radiation levels are only 1 µSv/hour or lower, peaking at 10 µSv/hour in the area near the reactor itself. There are populated cities with higher ambient radiation levels. So it's not inhabited only because of inertia, fear and bureaucracy. BTW I lived for 15 years 100 km from Chernobyl. – Commented Sep 28, 2014 at 8:56
  • @user14154 The amount of radioactive material in Chernobyl is equal to that of 100 atomic bombs! – AminM, Commented Nov 18, 2015 at 6:47
  • @BarsMonster Is that the reason for your online name??? :) You look fairly normal from your profile. – Commented Mar 13, 2017 at 23:55

3 Answers

55

While they work on the same principles, the detonation of an atomic bomb and the meltdown of a nuclear plant are two very different processes.

An atomic bomb is based on the idea of releasing as much energy from a runaway nuclear fission chain reaction as possible in the shortest amount of time. The aim is to create as much devastating damage as possible immediately, so as to nullify enemy forces or intimidate the opposing side into surrender; both effectively ensure the conflict ends quickly. Thus it would also be important that the bombed area does not remain uninhabitable long after the two sides make peace (OK, that's my own speculation, but I think it's a nice ideal to work with).

A nuclear reactor is based on the idea of producing power at a modest, steady rate using a controlled and sustained nuclear fission chain reaction. The point is that it does not release all of its energy at once; slower reaction processes are used to ensure the maximum lifetime of the nuclear fuel.

Moving beyond the ideas behind each, the radioactive isotopes created in an atomic blast are relatively short-lived, due to the nature of the blast and the fact that bombs are normally detonated above the ground to increase the destructive power of the blast wave. Most of the radioactive material from an atomic blast has a half-life of at most about 50 years.
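
To get a feel for what "a half-life of at most about 50 years" means in practice, here is a minimal sketch of the standard exponential-decay law; the 50-year half-life is just the upper bound quoted above, used purely for illustration:

```python
def fraction_remaining(half_life_years: float, elapsed_years: float) -> float:
    """Fraction of a radioactive isotope still present after `elapsed_years`,
    from the standard decay law N(t) = N0 * 0.5**(t / half_life)."""
    return 0.5 ** (elapsed_years / half_life_years)

# Illustration only: an isotope at the long end of bomb fallout (~50-year half-life).
for years in (10, 50, 100, 200):
    print(f"after {years:3d} years: {fraction_remaining(50, years):.3f} of the activity remains")
```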

However, in the Chernobyl meltdown, most of the actual exploding was due to containment failure and steam build-up. Chunks of fuel rods and irradiated graphite remained intact. Furthermore, the reactor had, both initially and over its operating life, produced a far larger amount of radioactive material. This is partly due to the nature of the reaction, the existence of intact fuel to this date, and the fact that the explosion happened at ground level; a fission explosion at ground level creates more radioactive isotopes through neutron activation of the soil. Furthermore, the half-lives of the isotopes produced in the Chernobyl accident are, because of the nature of the process, considerably longer. It is estimated that the area will not be habitable for humans for another 20,000 years (Edit: to prevent further debate I rechecked this number. That is the time before the area within the cement sarcophagus, the exact location of the blast, becomes safe. For the surrounding area it varies between 20 years and several hundred due to uneven contamination).

Long story short, an atomic bomb is, like other bombs, designed to achieve the most destructive force possible over a short amount of time. The reaction process that accomplishes this ends up creating short-lived radioactive isotopes, which means the initial radiation burst is extremely high but falls off rapidly. A nuclear reactor, by contrast, is designed to utilize the full extent of fission to produce power from a slow, sustained reaction. This results in the creation of nuclear waste materials that are relatively long-lived, which means that the initial radiation release from a meltdown may be much lower than that of a bomb, but it lasts much longer.

In the global perspective: an atomic bomb may be hazardous to the health of those nearby, but a meltdown spreads radiation across the planet for years. At this point, everyone on Earth has, on average, received the equivalent of an extra 21 days of background radiation exposure because of Chernobyl. This is one of the reasons Chernobyl was rated a level 7 nuclear event.
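
For a rough sense of scale, that 21-day figure can be turned into an approximate dose. A minimal sketch, assuming the commonly quoted worldwide-average natural background of about 2.4 mSv/year (the 21-day value comes from this answer; the 2.4 mSv/year is my assumed figure):

```python
# Convert "21 extra days of background radiation" into an approximate dose.
# Assumption: worldwide-average natural background of ~2.4 mSv/year.
background_msv_per_year = 2.4
extra_days = 21

extra_dose_msv = background_msv_per_year * extra_days / 365.25
print(f"~{extra_dose_msv:.2f} mSv extra per person")  # roughly 0.14 mSv
```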

All of this contributes to why, even though Hiroshima had an atomic bomb detonate over it, it is Chernobyl (and Fukushima too, I'll wager) that remains uninhabitable.

Most of the relevant information for this can be found on Wikipedia.

One further thing:
As pointed out, one thing I forgot to mention is that the amount of fissionable material in an atomic bomb is usually considerably less than the amount housed in a nuclear reactor. A standard nuclear reactor can consume $50\,000\ \mathrm{lb}$ ($\sim 22\,700\ \mathrm{kg}$) of fuel in a year, whereas Little Boy held significantly less (around $100$–$150\ \mathrm{lb}$, or $45$–$70\ \mathrm{kg}$). Obviously, having more fissionable material drastically increases the amount of radiation that can be emitted as well as the quantity of radioactive isotopes. For example, the meltdown at Chernobyl released 25 times more of the iodine-129 isotope than the Hiroshima bomb (an isotope that is relatively long-lived and dangerous to humans) and 890 times more cesium-137 (not as long-lived, but still a danger while it is present).
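
A quick comparison of those two quantities of fissionable material; a rough sketch using the figures quoted above (the 141 lb value for Little Boy's uranium load, within the 100–150 lb range, is my assumption for the illustration):

```python
LB_PER_KG = 2.20462  # pounds per kilogram

reactor_fuel_lb_per_year = 50_000  # annual fuel consumption quoted above for a standard reactor
little_boy_uranium_lb = 141        # assumed uranium load for Little Boy (within the 100-150 lb range above)

print(f"reactor fuel: {reactor_fuel_lb_per_year / LB_PER_KG:,.0f} kg per year")
print(f"Little Boy:   {little_boy_uranium_lb / LB_PER_KG:.0f} kg of uranium, total")
print(f"ratio:        ~{reactor_fuel_lb_per_year / little_boy_uranium_lb:.0f}x more fuel per year in the reactor")
```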

  • @swdev The iodine isotope I referenced is I-129, not I-131. I-131 is made in abundance in nuclear reactors, but in nuclear fission contamination events (especially Chernobyl), I-129 is created at sufficiently dangerous levels. It has a half-life of 15.7 million years. – Jim, Commented Apr 30, 2015 at 14:07
  • You can, in fact, look it up: I-129 is more biophilic than some of the other iodine isotopes, which means it is more dangerous even in smaller amounts. So when I said the iodine isotope is relatively long-lived and dangerous to humans, I was not incorrect. And Cs-137 isn't as long-lived (30 years vs 15.7 million years). – Jim, Commented Apr 30, 2015 at 14:10
  • I-131 is literally a billion times more radioactive than I-129. Wouldn't you need a billion times more of it to be equally dangerous? – swdev, Commented May 1, 2015 at 8:08
  • @swdev I never said it was more dangerous than I-131. It is more dangerous than other iodine isotopes, specifically I-123, I-124, I-125, and I-128. It is less dangerous than I-131 and I-135, but I-131 has a half-life of 8 days and I-135 has a half-life of under 7 hours, so they aren't a danger for very long. I-129 is persistent and penetrates into the ecosystem easily. It is the primary tracer for nuclear fission contamination of an environment. – Jim, Commented May 6, 2015 at 13:59
  • @swdev I gave examples of isotope levels and how they were much greater from Chernobyl than from Hiroshima merely to indicate that all isotopes were produced in greater amounts. Why are we nitpicking the choice of examples I selected? – Jim, Commented May 7, 2015 at 17:05
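
As an aside, the billion-fold activity ratio debated in the comments above does check out to order of magnitude; a minimal sketch using the standard specific-activity formula and the half-lives quoted in the thread:

```python
import math

AVOGADRO = 6.022e23
YEAR_S = 365.25 * 24 * 3600

def specific_activity_bq_per_g(half_life_s: float, molar_mass_g: float) -> float:
    """Decays per second per gram of a pure isotope: A = ln(2) / t_half * N_A / M."""
    return math.log(2) / half_life_s * AVOGADRO / molar_mass_g

a_i131 = specific_activity_bq_per_g(8.02 * 24 * 3600, 131)  # I-131: ~8-day half-life
a_i129 = specific_activity_bq_per_g(15.7e6 * YEAR_S, 129)   # I-129: ~15.7-million-year half-life

print(f"I-131 / I-129 specific-activity ratio: ~{a_i131 / a_i129:.1e}")  # ~7e8, order of a billion
```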
16

A quick calculation brings some of the points in the other answers into clear focus.

Consider a big power station, like Fukushima before its demise. Its output was at a whopping rate of $5\ \mathrm{GW}$.

From here I get the conversion factor that 1 kiloton of TNT equivalent is taken to be $4.184\times 10^{12}$ joules. Assuming the Nagasaki bomb released 20 kilotons of TNT equivalent, this is about $8\times10^{13}\ \mathrm{J}$.

Now do the calculation: how long does it take a (working) Fukushima to output this much energy? Answer: $8\times10^{13}\ \mathrm{J} \,/\, 5\times10^{9}\ \mathrm{W} \approx 16\,000\ \mathrm{s}$. That is, about four and a half hours. Less than one afternoon's output!
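
Repeating that arithmetic explicitly; a minimal sketch using this answer's own figures (20 kt yield, 5 GW plant output) and the standard TNT-equivalent conversion:

```python
KILOTON_TNT_J = 4.184e12  # standard conversion: 1 kiloton of TNT equivalent in joules

bomb_yield_j = 20 * KILOTON_TNT_J  # ~20 kt assumed for the Nagasaki bomb, as above
plant_power_w = 5e9                # ~5 GW output figure used above

seconds = bomb_yield_j / plant_power_w
print(f"{bomb_yield_j:.2e} J / {plant_power_w:.0e} W = {seconds:.0f} s  (~{seconds / 3600:.1f} hours)")
# about 1.7e4 s, i.e. roughly four and a half hours, matching the figure above
```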

Now I hasten to add that I am in no way trivialising what was suffered by those at Hiroshima or Nagasaki. But in these terms, the amount of energy, and the consequent waste, output by even a fearsome several-megaton bomb is rather trivial compared to the lifetime output of a power station. And the main contamination from a bomb tends to consist of lethal but very short-lived isotopes begotten by the irradiation of dirt and other matter sucked into the updraught.

15

Short answer: a nuclear power plant contains a lot more nuclear material than an atomic bomb. The "Little Boy" bomb was detonated at 1968 feet (600 m) over Hiroshima, with its nuclear material quickly dispersed in the air; the Chernobyl meltdown contaminated its environment for decades.

Long answer:

http://en.wikipedia.org/wiki/Background_radiation

Total doses from the Chernobyl accident ranged from 10 to 50 mSv over 20 years for the inhabitants of the affected areas, with most of the dose received in the first years after the disaster, and over 100 mSv for liquidators. There were 28 deaths from acute radiation syndrome.[30]

Total doses from the Fukushima I accidents were between 1 and 15 mSv for the inhabitants of the affected areas. Thyroid doses for children were below 50 mSv. 167 cleanup workers received doses above 100 mSv, with 6 of them receiving more than 250 mSv (the Japanese exposure limit for emergency response workers).[31]

The average dose from the Three Mile Island accident was 0.01 mSv.[32]

http://www.huffingtonpost.com/patrick-takahashi/why-worry-about-fukushima_b_847250.html

Today, the background radiation in Hiroshima and Nagasaki is the same as the average amount of natural radiation present anywhere on Earth. It is not enough to affect human health.

There was a slight increase in leukemia in the Nagasaki region, but no additional incidence of cancers anywhere in and around Hiroshima. Thus, contrary to any kind of logical sense, while the nuclear explosions, detonated at high altitude (1968 feet over Hiroshima and 1800 feet over Nagasaki), immediately killed some 200,000 people, these cities soon became safe and are thriving today. I'm actually still wondering why.

But with respect to the relative long-term danger of nuclear power plants versus atomic bombs, another article mentioned that there is a lot more fissionable material in the former than in the latter. For example, a 1000 MW reactor uses 50,000 pounds of enriched uranium per year and produces 54,000 pounds of waste, which keeps accumulating, so over a 20-year period there should be more than a million pounds of radioactive material on site. Little Boy had only 141 pounds of U-235, while Fat Man used 14 pounds of Pu-239.
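
That "more than a million pounds" is just the quoted annual waste figure accumulated over 20 years; a minimal sketch of the arithmetic:

```python
LB_PER_KG = 2.20462  # pounds per kilogram

waste_lb_per_year = 54_000  # annual waste figure quoted above for a 1000 MW reactor
years = 20

total_lb = waste_lb_per_year * years
print(f"{total_lb:,} lb accumulated (~{total_lb / LB_PER_KG:,.0f} kg)")
# 1,080,000 lb -> "more than a million pounds", as stated above
```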

Chernobyl released 200 times more radiation than the Hiroshima and Nagasaki bombs combined. As far away as Scotland, radiation levels rose to 10,000 times the norm. Frighteningly, the Fukushima reactors are said to be more dangerous than Chernobyl (which ran on uranium-235) for two reasons: more enriched uranium, and Fukushima #3 also held plutonium.

  • About the low incidence of cancers in Hiroshima / Nagasaki: radiation doesn't really cause mutations in biology unless it is at lowish, chronic levels. Instead of mutating biology, radiation tends to destroy it outright. – Commented Sep 6, 2015 at 23:12
