Thursday, 25 August 2016

Nuclear power plants cause climate change by emitting Krypton-85, not

"Nuclear news"

Is not a nuclear news site. It's a site full of junk written by keen antinuclear power activist Christina MacPherson / Noel Wanchope (they are the same person). Any lie or fabrication about nuclear power is repeated by nuclear news and sent out to her followers. The criterion for publication there is simple: antinuclear good, pronuclear ignored. To some people nuclear news is a nutty site. To others it's a hero site. It's remarkable how similar its choice of stories is to mainstream newspapers like The Guardian: pronuclear all but banned, almost any antinuclear story published.

Let's look at one claim Christina makes on her site, for example the myth that

krypton-85 is responsible for climate change
Google gives 97 hits for this meme on Christina MacPherson's website:

Krypton-85 is a very rare radioactive gas found in Earth's atmosphere, mostly originating from nuclear power. In particular it leaks out when spent fuel is reprocessed. Krypton is an inert gas and does not react chemically with other atoms. It occurs in monatomic form, i.e. as single atoms. Because it forms no bonds, it has no dipole to speak of and therefore no radiative forcing effect.

The basis of this anti-nuke claim (that krypton-85 made by nuclear power causes climate change) originated decades ago in a speculative academic paper. The researchers said that if a lot more krypton-85 were made (tens, perhaps hundreds of times as much), then it might have an effect. Krypton-85 decays with a ½-life of 10¾ years. When it decays it produces a positively charged ion and a negatively charged electron (beta ray). The anti-nuke theory goes that this atmospheric ionization causes climate change. Let's look at the numbers.

  1. There's a tiny amount of Kr-85. In 2009, the total amount of Kr-85 in the atmosphere from human sources was estimated at 5500 PBq. That is 0.38 tonnes of Kr-85 in the 5,140 trillion tonnes of air making up Earth's entire atmosphere: 0.000000000074 ppm, or 0.000074 ppt (parts per trillion), by mass.
  2. The atmosphere is positively charged overall, while Earth's surface carries an overall negative charge. This is why lightning happens.
  3. An average bolt of lightning carries a negative electric current of 40 kiloamperes (kA) (although some bolts can be up to 120 kA), and transfers a charge of five coulombs and energy of 500 MJ, or enough energy to power a 100-watt lightbulb for just under two months. There are estimated to be around 2,000 lightning storms active around the globe at one time creating over 100 strikes per second. i.e. 500 coulombs discharged per second on earth.
  4. 5500 PBq means 5.5 × 10¹⁸ Kr-85 decays each second, each producing one positive ion and one electron (the beta ray). One coulomb is 6.241 × 10¹⁸ elementary charges, so that is about 5.5 ÷ 6.241 ≈ 0.88 coulombs of each sign per second. No net charge is added to the atmosphere: a beta ray (electron) travels only about 2 metres in air before it collides with an air molecule, so every electron is captured and the negative charge is not lost.
  5. Krypton-85 makes about 0.88 coulomb of positive and negative charge (per second). Each second, about 500 coulombs of charge in the atmosphere is neutralized by lightning.
  6. There would have to be far more Kr-85 in the atmosphere for it to affect climate in any measurable way. Even then, climate scientists dispute the effect of ionization, which is said to promote cloud formation; clouds either cool or warm depending on whether they form in the upper or lower atmosphere.
  7. I almost forgot. Unlimited buildup of Kr-85 cannot happen, because of its 10¾ year ½-life: at a constant emission rate (mostly from nuclear fuel reprocessing plants) the atmospheric inventory saturates at an equilibrium level, the emission rate divided by the decay constant, rather than accumulating indefinitely.
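The numbers in the list above can be checked with a short script. This is only a sketch using the figures quoted above (the 5500 PBq inventory estimate and roughly 100 lightning strikes per second at 5 coulombs each):

```python
import math

# Figures quoted above
activity_bq = 5500e15                        # 5500 PBq of Kr-85 (2009 estimate)
half_life_s = 10.75 * 365.25 * 24 * 3600     # 10.75-year half-life, in seconds
decay_const = math.log(2) / half_life_s      # decay constant lambda (per second)

# Point 1: mass of Kr-85 implied by the activity (N = A / lambda)
atoms = activity_bq / decay_const
mass_tonnes = atoms / 6.022e23 * 85 / 1e6    # moles -> grams (85 g/mol) -> tonnes
ppt_by_mass = mass_tonnes / 5.14e15 * 1e12   # in 5,140 trillion tonnes of air
print(f"Kr-85 mass: {mass_tonnes:.2f} tonnes = {ppt_by_mass:.6f} ppt by mass")

# Points 4-5: charge separated per second by Kr-85 decay, versus lightning
charge_per_s = activity_bq / 6.241e18        # 1 C = 6.241e18 elementary charges
lightning_per_s = 100 * 5                    # ~100 strikes/s x 5 C per strike
print(f"Kr-85: {charge_per_s:.2f} C/s of each sign; lightning: {lightning_per_s} C/s")

# Point 7: at a constant release rate R the inventory saturates at R / lambda,
# rather than growing without limit; after t seconds it is (R/lambda)(1 - e^(-lambda*t)).
```

Run as-is it prints roughly 0.38 tonnes, 0.000074 ppt, and 0.88 C/s, matching the figures in the list.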

Alert observers may notice I already covered this once. I was completely baffled when I discovered this antinuclear power meme. It holds a kind of ghoulish fascination for me. As in: just how daft can hard-core anti-nukes get?


RationalWiki: Krypton-85 and climate change

Saturday, 6 August 2016

Concerns about National Academy of Sciences and Scientific Dissent

I found this. I can't resist posting it here.

Concerns about National Academy of Sciences and Scientific Dissent
Dec 15, 2015
Peter Wood

Introductory note: National Association of Scholars president Peter Wood sent the following letter by email on December 9, 2015 to California members of the National Academy of Sciences.

Dear Members of the National Academy of Sciences,

This is an NAS to NAS letter—which requires some “disambiguation.” I am president of the National Association of Scholars, founded in 1987, and whose organizers apparently didn’t give much thought to the space already occupied by those initials by the National Academy of Sciences, founded 124 years earlier. I’ll defer to the Academy’s seniority by reserving NAS in what follows for the body of scientists who incorporated during President Lincoln’s tenure. The National Association of Scholars is a broad-based group of academics that includes professors in the humanities and social sciences (I’m an anthropologist) as well as the natural sciences.

The occasion for this letter is Dr. Marcia K. McNutt, Editor-in-Chief of Science. We are concerned that she is the only official candidate to be the next NAS president. To be clear, the National Association of Scholars does not oppose Dr. McNutt’s candidacy. We simply believe that members of an important national organization like NAS should have at least two candidates to consider when voting for your next president. Indeed, the American Association for the Advancement of Science (AAAS), which publishes Science, always has two candidates for president and its other elected positions. Other scientific organizations also have two candidates for their elected positions.

Also, we want to bring to your attention our serious concerns about the current state of discourse in the sciences. Dr. McNutt has played a significant role in three active controversies involving national regulatory policy that deserve attention in themselves and that are also part of a larger problem. The larger problem is how the scientific establishment, particularly Science and NAS, should evaluate and respond to serious dissent from legitimate scientists. This is an especially important consideration for NAS, which was established to provide “independent, objective advice on issues that affect people's lives worldwide.”

The three controversies are:

  1. The status of the linear no-threshold (LNT) dose-response model for the biological effects of nuclear radiation. The prominence of the model stems from the June 29, 1956 Science paper, “Genetic Effects of Atomic Radiation,” authored by the NAS Committee on the Biological Effects of Atomic Radiation. This paper is now widely questioned and has been seriously critiqued in many peer-reviewed publications, including two detailed 2015 papers. These criticisms are being taken seriously around the world, as summarized in a December 2, 2015 Wall Street Journal commentary. In August 2015 four distinguished critics of LNT made a formal request to Dr. McNutt to examine the evidence of fundamental flaws in the 1956 paper and retract it. However, on August 11, 2015 Dr. McNutt rejected this request without even reviewing the detailed evidence. Furthermore, Dr. McNutt did not even consider recusing herself and having independent reviewers examine evidence that challenges the validity of both a Science paper and an NAS Committee Report.

    This is a consequential matter that bears on a great deal of national public policy, as the LNT model has served as the basis for risk assessment and risk management of radiation and chemical carcinogens for decades, but now needs to be seriously reassessed. This reassessment could profoundly alter many regulations from the Nuclear Regulatory Commission, Environmental Protection Agency, and other government agencies. The relevant documents regarding the 1956 Science paper and Dr. McNutt can be examined at

  2. Extensive evidence of scientific misconduct in the epidemiology of fine particulate air pollution (PM2.5) and its relationship to mortality. Since 1997 EPA has claimed that lifetime inhalation of about a teaspoon of particles with diameter less than 2.5 microns causes premature death in the United States, and it established a national regulation based on this claim. Science has provided extensive news coverage of this issue and its regulatory significance, but has never published any scientific criticism of this questionable claim, which is largely based on nontransparent research.

    Earlier this year, nine accomplished scientists and academics submitted to Science well-documented evidence of misconduct by several of the PM2.5 researchers relied upon by EPA. The evidence of misconduct was first submitted to Dr. McNutt in a detailed June 4, 2015 email letter, then in a detailed July 20, 2015 Policy Forum manuscript “Transparent Science is Necessary for EPA Regulations,” and finally in an August 17, 2015 Perspective manuscript “Particulate Matter Does Not Cause Premature Deaths.” Dr. McNutt and two Science editors immediately rejected the letter and the manuscripts and never conducted any internal or external review of the evidence. This is a consequential matter because many multi-billion dollar EPA air pollution regulations, such as the Clean Power Plan, are primarily justified by the claim that PM2.5 is killing Americans. The relevant documents regarding this controversy can be examined at

  3. Science promotes the so-called consensus model of climate change and excludes any contrary views. This issue has become so polarized and polarizing that it is difficult to bring up, but at some point the scientific community will have to reckon with the dramatic discrepancies between current climate models and substantial parts of the empirical record. Recent evidence of Science bias on this issue is the June 26, 2015 article by Dr. Thomas R. Karl, “Possible artifacts of data biases in the recent global surface warming hiatus”; the July 3, 2015 McNutt editorial, “The beyond-two-degree inferno”; the November 13, 2015 McNutt editorial, “Climate warning, 50 years later”; and the November 25, 2015 AAAS News Release, “AAAS Leads Coalition to Protest Climate Science Inquiry.”

    Dr. McNutt’s position is, of course, consistent with the official position of the AAAS. But the attempt to declare that the “pause” in global warming was an illusion has not been accepted by several respected and well-informed scientists. One would not know this, however, from reading Science, which has declined to publish any dissenting views. One can be a strong supporter of the consensus model and yet be disturbed by the role which Science has played in this controversy. Dr. McNutt and the journal have acted more like partisan activists than like responsible stewards of scientific standards confronted with contentious claims and ambiguous evidence. The relevant documents and commentary regarding the Karl paper and McNutt editorials can be examined at

All three of these controversies have arisen on issues in which a strong degree of scientific consensus became intertwined with public policy and institutional self-interest. That intertwining can create selective blindness.

Dr. McNutt has in her career found herself faced more than once with the challenge of what to do when an entrenched orthodoxy meets a substantial scientific challenge. The challenge in each case could itself prove to be mistaken, but it met what most scientists would concede to be the threshold criteria to deserve a serious hearing. Yet in each case Dr. McNutt chose to reinforce the orthodoxy by shutting the door on the challenge.

The three areas that I sketched above, however, seem to have such prominence in public policy that they would warrant an even greater investment in time, care, and attention than would be normally the case. In that light, Dr. McNutt’s dismissive treatment of scientific criticisms is disturbing.

I bring these matters to your attention in the hope of accomplishing two things: raise awareness that the three issues represent threats to the integrity of science arising from the all-too-human tendency to turn ideas into orthodoxies; and suggest that it might be wise for NAS to nominate as a second candidate for president someone who has a reputation for scientific objectivity and fairness and who does not enforce orthodoxy.

I welcome your responses. The National Association of Scholars will present an open forum on these matters with a section reserved specifically for NAS members. Furthermore, I will put you in contact with NAS members who are concerned about Dr. McNutt becoming the next NAS president.

Thank you for your consideration.

Yours sincerely,

Peter Wood
National Association of Scholars
8 W. 38th Street, Suite 503
New York, NY 10018
(917) 551-6770

Thursday, 21 July 2016

Who killed nuclear power and why?

  1. In trying to answer the question, we look at who is most opposed to it today: the Green movement. Look back to the period when the green movement moved against nuclear power (late 1960s/early 1970s). The 'Club of Rome' began in 1968. Friends of the Earth, FotE, in 1969. The term 'Renewable Energy' first appeared in print in 1971, in Scientific American. FotE's first employee was Amory Lovins, who became the renewable energy guru. Before he was ever a renewable energy guru, he tried to become the anti-nuclear power guru: five of his books have 'nuclear' in the title. In Australia, FotE established CANE, the Campaign Against Nuclear Energy, in 1976. It went on to have a major effect in Australia - one of two countries in the world to ban nuclear power early on (the other was Austria, in 1978). Their motive is population control: by limiting energy use, they would stop population growth. Not an argument you'll often hear them make. Also a wrong argument: promoting poverty, and energy poverty, has the exact opposite effect. Poor people have more children because they can, and perhaps because they view children as an economic resource.

    This explanation is promoted by ecomodernists at Environmental Progress, The Breakthrough Institute, etc. Many are ex-greens. Other ex-greens also support this premise. PS: by 'ex-green' I mean ex-mainstream green. The story is most convincingly told by Michael Shellenberger (himself once an anti-nuclear activist working to 'constipate' nuclear power by promoting arguments against nuclear waste). The weakness of this argument is that it does not give due credence to the funding that gave greens so much influence (see: Steve Malanga, and Donald Gibson), nor to other political factors at work (see Marsha Freeman), nor to Malthusian ideas already widespread in society, nor to why the late 1960s/early 1970s were pivotal. I agree the Green Movement became intellectual victims of Malthusian ideas (see Gibson, page 78). But why? What is the connection between greens and Malthus? Why is it so pronounced? Is something more fundamental at work here - something more akin to a generalized anti-humanism, which finds its own expression within the green movement as environmentalism? There is certainly a huge schism between generally people-loving green nukes and anti-human green anti-nukes.

  2. US neo-cons under the influence of Albert Wohlstetter. These were a small number, perhaps only a dozen, of ex-liberals who turned to conservatism in the late 1960s/early 1970s and later occupied influential positions in government. Their argument was anti-proliferation. Anti-proliferation was cited by Carter Democrats in 1977 for stopping breeder reactors, and by Clinton Democrats in 1994 when they stopped all US government research into nuclear power. It became a major plank of US foreign policy. All nuclear vendors had to buy into the notion of the 'cradle to grave' nuclear fuel cycle, which limited enrichment, breeder reactors, and reprocessing technologies. In politics, bomb proliferation was portrayed as the major threat to world peace and security.
  3. Fossil fuel lobbyists. The Atomic Energy Commission, AEC, lost control of nuclear regulation in 1974 when the US created the Nuclear Regulatory Commission, NRC, in response to fossil fuel lobbying. The NRC had the sole responsibility of making nuclear power as safe as possible. Previously, the AEC had a dual mandate: to promote nuclear power and to regulate its safety. In response to the NRC's creation, investment in new US nuclear plants vanished overnight. The NRC licensed no new nuclear plants for decades. This argument is popularized by blogger Rod Adams, AKA atomic rod.

But surely, the reader objects, it was public fear in response to the 'disasters' of Windscale, Three Mile Island, Chernobyl and Fukushima Daiichi? No. Public fear in response to the Banqiao Dam disaster of 1975, which eventually killed 171,000 people, did not kill hydroelectricity. The 'disasters' above were responsible for fewer than 60 deaths caused by radiation.

A lot of people are very confused about what stopped nuclear power. In citing multiple points above it looks like I'm just adding to the confusion. The failure of nuclear power had nothing at all to do with Three Mile Island, Chernobyl or Fukushima. In 1978, an Austrian referendum on nuclear power saw 49.5% vote for and 50.5% against nuclear power in Austria. This stopped nuclear power in Austria before the Three Mile Island incident of 1979. What's so special about Austria? It was Adolf Hitler's birthplace. His green political movement, the Nazis, began the Second World War under the influence of Malthusian ideas. See "Black Earth: The Holocaust as History and Warning", 2015, by Timothy Snyder.

What have all these anti-nuke ideas in common? Under whose banner do they rally? The common factor is Malthus. I use Malthus in the wide sense, as an obsession with economic limits. None of the arguments above excludes the others. The promotion of Malthus, beginning in earnest in 1968 with the 'Club of Rome', does not exclude anti-nuke contributions from the fossil fuel industry, neo-cons, ex-peace-movement anti-nukes, and deep greens. It welcomes such efforts, and funds them. There were anti-nuclear power people before 1968, but FotE (1969) were the first to dedicate themselves to the task. The late 1960s were the pivotal point, leading to an early-1970s tipping point which got us where we are today.

Thursday, 19 May 2016

Unreported massive release of radiation 36 years ago!

Unreported Release

by Karl Johanson

36 years ago today, a major release of radioactive material took place in Washington State. The explosive event which led to the release killed 61 people and was covered extensively by the news media. However, the release of radioactive material was never mentioned in this coverage. The amount of radioactive material released can only be roughly approximated, but the following is a fair estimate.

  • Actinium-227 13 grams
  • Thorium-228 18 grams
  • Radium-228 60 grams
  • Lead-210 300 grams
  • Protactinium-231 20 kilograms
  • Radium-226 22 kilograms
  • Thorium-230 1,000 kilograms
  • Uranium-234 32,000 kilograms
  • Uranium-235 420,000 kilograms
  • Uranium-238 60,000,000 kilograms
  • Thorium-232 170,000,000 kilograms
  • Potassium-40 300,000,000 kilograms
  • Rubidium-87 337,000,000 kilograms

In addition to the above, the following radioactive isotopes were released in trace amounts:

  • Astatine 215, 216, 218 & 219
  • Bismuth 210, 211, 212, 214 & 215
  • Francium 223
  • Lead 211, 212 & 214
  • Plutonium 239 & 244
  • Polonium 210, 211, 212, 214, 215, 216 & 218
  • Radon 219, 220, & 222
  • Thallium 206, 207, 208 & 210
  • Thorium 227

Many of the listed isotopes are the daughter isotopes of Uranium 238, Uranium 235 and Thorium 232.

This amount of radioactive material is (very roughly) what one would expect to find in any 4 cubic kilometres of the Earth's crust. The event which expelled this material involved a release of energy roughly 500 times that of the nuclear bomb used on Nagasaki. 500 hectares of land were devastated and shockwaves shook houses more than 30 kilometres away. Around 25% of the material was emitted as dust, which remained in the atmosphere for some time. The remainder precipitated out fairly rapidly over the nearby countryside.

To add some interesting perspectives on this amount of material, consider the following.

  • 420,000 kilograms of Uranium 235 is enough to make more than 100,000 nuclear weapons.
  • The US government is studying Yucca Mountain in Nevada with intent to store around 70,000 tonnes of spent nuclear fuel there.
  • Some claimed that roughly 40,000 kilograms of depleted uranium (almost pure Uranium 238) was used in the Gulf war and that it represented 500,000 "potential deaths". (I make no commentary here on the ethics of weapon use in general, nor of the ethics of the use of this specific weapon.) I'm curious what those people would estimate the number of "potential deaths" would be from the emission of around 1,000 times as much Uranium 238.
  • If you've heard and believed the mistaken claim that releasing 1 pound of plutonium would kill every human on Earth, consider the following. The 22 kilograms of Radium 226 (just one isotope from the above list) would have a specific level of alpha radiation equivalent to just over 350 kilograms (770 pounds) of Plutonium 239. Radium is also far more likely to form readily aspiratable particles and is more readily absorbed by the human body when ingested. Somehow, that powdered radium left more than 7 billion humans alive.
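The radium/plutonium comparison in the last bullet can be checked from standard half-lives (about 1600 years for Ra-226 and about 24,100 years for Pu-239, values taken from standard references rather than from the article itself); specific alpha activity scales inversely with half-life times atomic mass:

```python
# Relative specific alpha activity: proportional to 1 / (half-life x atomic mass),
# since the shared physical constants cancel when taking a ratio.
RA226_HALF_LIFE_Y, RA226_MASS = 1600.0, 226    # standard textbook values
PU239_HALF_LIFE_Y, PU239_MASS = 24100.0, 239

ratio = (PU239_HALF_LIFE_Y * PU239_MASS) / (RA226_HALF_LIFE_Y * RA226_MASS)
print(f"Ra-226 is {ratio:.1f}x more alpha-active per kilogram than Pu-239")

pu_equiv_kg = 22 * ratio   # the 22 kg of Ra-226 from the estimate above
print(f"22 kg Ra-226 is roughly {pu_equiv_kg:.0f} kg of Pu-239 in alpha activity")
```

The ratio works out to about 16, so 22 kg of Ra-226 is indeed equivalent to just over 350 kg of Pu-239, as the article states.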

If you're at all curious, the event which released somewhere around the above estimated amount of material, happened on May 18, 1980 and was, of course, the eruption of Mount Saint Helens. Eleven years later, on June 13, 1991, Mt. Pinatubo released roughly one and a quarter times as much material (radioactive and otherwise) as Mount Saint Helens.

I would never suggest that a natural event emitting any given material should be used as ‘justification’ of humans releasing large amounts of similar materials. Nor does Saint Helens’ and Pinatubo’s emission of Lead and Uranium ‘justify’ the use of such things as Lead or Uranium bullets in any given situation. I do suggest however, that the Saint Helens and the Pinatubo examples (and dozens of other recent eruptions) are useful data points on the road to understanding the complex issue of the possible effects of releases of radioactive material.

Thursday, 12 May 2016

Nuclear Winter?

The nuclear winter meme3 originated in 1982 as a development of the climate models of the day. Carl Sagan (an anti-nuclear-war advocate and science popularizer) was an advocate and contributed to a book subtitled "Nuclear Winter and the End of the Arms Race".1 I find it very interesting that there are two climate-change criticisms of nuclear:

  1. Nuclear winter - resulting from atomic war
  2. Climate change caused by atomic power

I already refuted the second idea, nuclear-power-caused climate change, but that's something most scientists and technically educated people can do at a glance. What about the first? Will a small nuclear war really exterminate humanity via climate change, as the nuclear critics say? Obviously not. What about a massive nuclear war where every bomb is exploded and war is global, all over the planet? This, at least, is almost a scientific debate. British government scientists considered nuclear winter scaremongering.5

The rest of this post is an extract from "Nuclear War Survival Skills" by Cresson H. Kearny,2 but the best available refutation of the meme is now 30 years old: "Nuclear Winter Reappraised" in Foreign Affairs,4 which led to a debate with Carl Sagan.6


Myth: Unsurvivable "nuclear winter" surely will follow a nuclear war. The world will be frozen if only 100 megatons (less than one percent of all nuclear weapons) are used to ignite cities. World-enveloping smoke from fires and the dust from surface bursts will prevent almost all sunlight and solar heat from reaching the earth's surface. Universal darkness for weeks! Sub-zero temperatures, even in summertime! Frozen crops, even in the jungles of South America! Worldwide famine! Whole species of animals and plants exterminated! The survival of mankind in doubt!


Facts: Unsurvivable "nuclear winter" is a discredited theory that, since its conception in 1982, has been used to frighten additional millions into believing that trying to survive a nuclear war is a waste of effort and resources, and that only by ridding the world of almost all nuclear weapons do we have a chance of surviving.

Non-propagandizing scientists recently have calculated that the climatic and other environmental effects of even an all-out nuclear war would be much less severe than the catastrophic effects repeatedly publicized by popular astronomer Carl Sagan and his fellow activist scientists, and by all the involved Soviet scientists. Conclusions reached from these recent, realistic calculations are summarized in an article, "Nuclear Winter Reappraised", featured in the 1986 summer issue of Foreign Affairs, the prestigious quarterly of the Council on Foreign Relations. The authors, Starley L. Thompson and Stephen H. Schneider, are atmospheric scientists with the National Center for Atmospheric Research. They showed "that on scientific grounds the global apocalyptic conclusions of the initial nuclear winter hypothesis can now be relegated to a vanishingly low level of probability."

Their models indicate that in July (when the greatest temperature reductions would result) the average temperature in the United States would be reduced for a few days from about 70 degrees Fahrenheit to approximately 50 degrees. (In contrast, under the same conditions Carl Sagan, his associates, and the Russian scientists predicted a resulting average temperature of about 10 degrees below zero Fahrenheit, lasting for many weeks!)

Persons who want to learn more about possible post-attack climatic effects also should read the Fall 1986 issue of Foreign Affairs. This issue contains a long letter from Thompson and Schneider which further demolishes the theory of catastrophic "nuclear winter." Continuing studies indicate there will be even smaller reductions in temperature than those calculated by Thompson and Schneider.

Soviet propagandists promptly exploited belief in unsurvivable "nuclear winter" to increase fear of nuclear weapons and war, and to demoralize their enemies. Because raging city firestorms are needed to inject huge amounts of smoke into the stratosphere and thus, according to one discredited theory, prevent almost all solar heat from reaching the ground, the Soviets changed their descriptions of how a modern city will burn if blasted by a nuclear explosion.

Figure 1.6 pictures how Russian scientists and civil defense officials realistically described - before the invention of "nuclear winter" - the burning of a city hit by a nuclear weapon. Buildings in the blasted area for miles around ground zero will be reduced to scattered rubble - mostly of concrete, steel, and other nonflammable materials - that will not burn in blazing fires. Thus in the Oak Ridge National Laboratory translation (ORNL-TR-2793) of Civil Defense, Second Edition (500,000 copies), Moscow, 1970, by Egorov, Shlyakhov, and Alabin, we read: "Fires do not occur in zones of complete destruction . . . that are characterized by an overpressure exceeding 0.5 kg/cm² [≈ 7 psi], because rubble is scattered and covers the burning structures. As a result the rubble only smolders, and fires as such do not occur."

Fig. 1.6. Drawing with Caption in a Russian Civil Defense Training Film Strip. The blazing fires ignited by a surface burst are shown in standing buildings outside the miles-wide "zone of complete destruction," where the blast-hurled "rubble only smolders."

Translation: [Radioactive] contamination occurs in the area of the explosion and also along the trajectory of the cloud which forms a radioactive track.

Firestorms destroyed the centers of Hamburg, Dresden, and Tokyo. The old-fashioned buildings of those cities contained large amounts of flammable materials, were ignited by many thousands of small incendiaries, and burned quickly as standing structures well supplied with air. No firestorm has ever injected smoke into the stratosphere, or caused appreciable cooling below its smoke cloud.

The theory that smoke from burning cities and forests and dust from nuclear explosions would cause worldwide freezing temperatures was conceived in 1982 by the German atmospheric chemist and environmentalist Paul Crutzen, and continues to be promoted by a worldwide propaganda campaign. This well funded campaign began in 1983 with televised scientific-political meetings in Cambridge and Washington featuring American and Russian scientists. A barrage of newspaper and magazine articles followed, including a scaremongering article by Carl Sagan in the October 30, 1983 issue of Parade, the Sunday tabloid read by millions. The most influential article was featured in the December 23, 1983 issue of Science (the weekly magazine of the American Association for the Advancement of Science): "Nuclear winter, global consequences of multiple nuclear explosions," by five scientists, R. P. Turco, O. B. Toon, T. P. Ackerman, J. B. Pollack, and C. Sagan. Significantly, these activists listed their names to spell TTAPS, pronounced "taps," the bugle call proclaiming "lights out" or the end of a military funeral.

Not until 1985 did non-propagandizing scientists begin to effectively refute the numerous errors, unrealistic assumptions, and computer modeling weaknesses of the TTAPS and related "nuclear winter" hypotheses. A principal reason is that government organizations, private corporations, and most scientists generally avoid getting involved in political controversies, or making statements likely to enable antinuclear activists to accuse them of minimizing nuclear war dangers, thus undermining hopes for peace. Stephen Schneider has been called a fascist by some disarmament supporters for having written "Nuclear Winter Reappraised," according to the Rocky Mountain News of July 6, 1986. Three days later, this paper, which until recently featured accounts of unsurvivable "nuclear winter," criticized Carl Sagan and defended Thompson and Schneider in its lead editorial, "In Study of Nuclear Winter, Let Scientists Be Scientists." In a free country, truth will out - although sometimes too late to effectively counter fast-hitting propaganda.

Effective refutation of "nuclear winter" also was delayed by the prestige of politicians and of politically motivated scientists and scientific organizations endorsing the TTAPS forecast of worldwide doom. Furthermore, the weaknesses in the TTAPS hypothesis could not be effectively explored until adequate Government funding was made available to cover costs of lengthy, expensive studies, including improved computer modeling of interrelated, poorly understood meteorological phenomena.

Serious climatic effects from a Soviet-U.S. nuclear war cannot be completely ruled out. However, possible deaths from uncertain climatic effects are a small danger compared to the incalculable millions in many countries likely to die from starvation caused by disastrous shortages of essentials of modern agriculture sure to result from a Soviet-American nuclear war, and by the cessation of most international food shipments.

  1. A Path Where No Man Thought: Nuclear Winter and the End of the Arms Race, by Carl Sagan, Richard Turco;
  2. Nuclear War Survival Skills, by Cresson H. Kearny
  3. Nuclear winter (wikipedia)
  4. Thompson, Starley L. & Schneider, Stephen H., "Nuclear Winter Reappraised", Foreign Affairs, Vol. 64, No. 5 (Summer, 1986), pp. 981-1005. doi:10.2307/20042777.
  5. "Home Office dismissed nuclear winter threat as scaremongering, files show", Guardian 30 Nov 2014.
  6. The Nuclear Winter Debate, by Carl Sagan, Richard Turco, George W. Rathjens, Ronald H. Siegel, Starley L. Thompson and Stephen H. Schneider; Foreign Affairs, Vol. 65, No. 1 (Fall, 1986), pp. 163-178, DOI: 10.2307/20042868

Monday, 2 May 2016

Don’t criticize what you can’t understand

Your old road is rapidly agin’
Please get out of the new one if you can’t lend your hand
For the times they are a-changin’
If only Jim Green (national anti-nuclear campaigner for Friends of the Earth, Australia) had listened to Bob Dylan. Poor Jim hasn't much of a clue about science, so he does not know what he's talking about. He attempts to criticise thorium-powered nuclear reactors here, but does a bad job because he does not know much about the technology pathways being promoted. For instance, let's just look at his WNA quotation. There's so much wrong with it:
"A great deal of testing, analysis and licensing and qualification work is required before any thorium fuel can enter into service. This is expensive"
-- The anti-nuclear movement is the reason for this. We could have zero-carbon, cheaper energy from thorium reactors within a few years were it not for the catalogue of regulations and agencies (official and unofficial) blocking the development of better nuclear power.
"Other impediments to the development of thorium fuel cycle are the higher cost of fuel fabrication"
-- Liquid fuels like molten salts could be far cheaper; such fuels require no fabrication. Liquid molten salt fuels are proposed for nearly every thorium reactor.
"the cost of reprocessing to provide the fissile plutonium driver material"
-- Thorium reactors do not need plutonium to start. They could start with uranium-235, which only requires enrichment - exactly what happens today.
"the high cost of fuel fabrication (for solid fuel)"
-- Nearly all the thorium reactor plans are for molten salt reactors - not solid fuel reactors. Such liquid fuels have no 'fabrication'.
"Separated U-233 is always contaminated with traces of U-232"
-- Were it true, that would be a good thing: it would make the U-233 made in thorium reactors proof against weapons proliferation. According to weapons experts, as little as 50 ppm of U-232 makes uranium-233 unsuitable for weapons. Reprocessing can be run either entirely robotically or manually behind safe screens, as was demonstrated for the IFR over 2 decades ago, and as is done at some reprocessing plants today. Contaminating U-232 poses no problem. Au contraire, several reactor designers want to make sure their fuel will contain such U-232!

Note: Thorium reactors work by breeding thorium-232 to uranium-233.

   Th-232 + n -> Th-233
   Th-233 -> Pa-233 + e-   (Th-233 ½-life: 22 min)
   Pa-233 -> U-233 + e-    (Pa-233 ½-life: 27 days)
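
A minimal numerical sketch of the ingrowth these half-lives imply (my own illustration, not from the original post; it treats the fast Th-233 step as effectively instantaneous):

```python
def remaining(half_life, t):
    """Fraction of an initial amount remaining after time t (same units as half_life)."""
    return 0.5 ** (t / half_life)

# Half-lives from the chain above, in days
TH233_HALFLIFE = 22.0 / (60 * 24)   # 22 minutes
PA233_HALFLIFE = 27.0               # 27 days

# Th-233 is essentially gone within a few hours, so to a good approximation
# 1.0 unit of neutron-capturing Th-232 becomes 1.0 unit of Pa-233 almost
# immediately, and U-233 then grows in as the Pa-233 decays over weeks.
print(f"Th-233 left after 1 day: {remaining(TH233_HALFLIFE, 1):.1e}")  # essentially zero
for days in (1, 27, 90, 270):
    pa233 = remaining(PA233_HALFLIFE, days)
    u233 = 1.0 - pa233
    print(f"day {days:3d}: Pa-233 ~ {pa233:.3f}, U-233 ~ {u233:.3f}")
```

The weeks-long Pa-233 holdup is one reason reactor designers care about protactinium: it can capture a neutron before it has a chance to become U-233.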

Uranium-233 is the fissionable material, the product 'bred' from thorium (Th-232). The other materials, Th-233 and Pa-233, are just steps along the way. U-233 behaves differently from most heavy nuclides. When its nucleus is hit by a neutron, the neutron is generally not captured to increase the atomic weight by transmutation [i.e. U-233 + n -> U-234]. Instead, U-233 usually fissions: it splits in two when hit by a neutron. One atom becomes two smaller, energetic atoms, plus 2 or 3 neutrons, plus electromagnetic rays. This is where the energy of nuclear fission comes from: the release of nuclear binding energy. This fission is the reaction we want. We do not want fissionable materials to be wasted by capturing a neutron and increasing in atomic mass. With such neutron capture both the potentially fissionable atom (e.g. U-233) and a neutron are wasted, because U-234 is not fissionable. There are only 3 isotopes available to us which are fissionable by 'thermal' (moderated) neutrons: U-233, U-235, and Pu-239. U-233 is the best of them. In the thermal neutron spectrum U-233 has the best neutronics of all fissionable materials:

Notice the final two columns in this table. With U-233 only 7.7% of neutrons are wasted by capture ( U-233 + n -> U-234 ); the U-234 made does not fission. Proportionally, U-235 wastes almost twice as many neutrons by capture, and plutonium more than 3 times as many. Uranium-233 also shows a much better neutron economy over a wider neutron energy range than either U-235 or Pu-239 (see chart below). The average eta value (the number of neutrons produced for every neutron absorbed) for U-233 is 2.27 in a standard PWR, compared to 2.06 for U-235 and 1.84 for Pu-239. Eta must be at least 2 for breeding (to sustain the reaction) because 1 neutron is needed to cause another fission and 1 to breed another U-233 (from Th-232). In practice a breeder reactor needs eta significantly larger than 2, because many neutrons are lost (absorbed by the reactor structure or the moderator, or captured by U-234, or even by the intermediate Pa-233).
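
These capture fractions can be roughly reproduced from published thermal (2200 m/s) cross sections. The sketch below uses approximate textbook values of my own choosing, not numbers from the post; note that the single-energy eta it yields differs somewhat from the PWR spectrum-averaged figures quoted above, especially for Pu-239:

```python
# Approximate 2200 m/s (room-temperature thermal) cross sections in barns,
# and nu (average neutrons released per fission), from standard data tables.
# These are illustrative values; spectrum-averaged numbers in a real PWR differ.
NUCLIDES = {
    #          sigma_fission, sigma_capture, nu
    "U-233":  (531.0,  45.0, 2.49),
    "U-235":  (585.0,  99.0, 2.44),
    "Pu-239": (748.0, 271.0, 2.88),
}

for name, (sf, sc, nu) in NUCLIDES.items():
    absorbed = sf + sc
    capture_fraction = sc / absorbed   # fraction of absorptions wasted as capture
    eta = nu * sf / absorbed           # neutrons produced per neutron absorbed
    print(f"{name}: capture fraction = {capture_fraction:.1%}, eta = {eta:.2f}")
```

With these inputs U-233 wastes about 8% of absorptions to capture, U-235 roughly twice that, and Pu-239 more than three times as much, matching the proportions claimed in the text.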

Thorium has far less toxic waste

There is another thorium advantage: any U-234 made can eventually absorb another neutron to make U-235, so it gets a 2nd chance to fission.

The consequence of this is that the thorium/uranium-233 fuel cycle has a waste stream which is far less radioactive after the fission products have decayed. It contains very little long-lived radioactive transuranic material. The chart below has logarithmic scales. The green curve shows thorium-cycle transuranics; the blue curve shows fission products (common to all kinds of fission). The point where the blue curve crosses the dotted orange line comes after 300 years. After this point, thorium waste will be as safe as natural uranium ore, which is generally considered safe. In contrast, the radioactivity of uranium/plutonium waste from a conventional reactor such as a PWR or BWR will not fall below uranium-ore radioactivity levels for hundreds of thousands of years at best (red curve).
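
The ~300-year figure is consistent with simple half-life arithmetic: the longest-lived major fission products, Sr-90 and Cs-137, have half-lives near 30 years, so 300 years is roughly ten half-lives, a reduction of about a thousand-fold in their activity. A sketch (half-life values are textbook approximations):

```python
# After n half-lives, activity falls by a factor of 2**n.
# Sr-90 (t1/2 ~28.8 y) and Cs-137 (t1/2 ~30.2 y) dominate fission-product
# activity once the short-lived isotopes have died away.
def remaining_fraction(t_years, half_life_years):
    return 0.5 ** (t_years / half_life_years)

for half_life in (28.8, 30.2):
    frac = remaining_fraction(300, half_life)
    print(f"t1/2 = {half_life} y: after 300 y, {frac:.2e} of the activity remains")
```

Both come out near one part in a thousand, which is why a waste stream without long-lived transuranics drops to ore-like radioactivity on a centuries scale rather than a hundred-thousand-year one.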

Saturday, 9 April 2016

The LNT fraud

This is an interview between Atomi Per la Pace and Dr Edward Calabrese on the history of dose response - in particular, how 'no safe dose' came to be assumed as the Word Of Science for all carcinogenic substances (for example: radiation). The recording quality is not brilliant, which is why I copied the transcript over from youtube. It's also long, at 1 hour 26 minutes. Why is it here? Probably because I like to keep references which can be easily linked to. It's easy to set IDs inside the text below, and to search it, so that I can link to a precise point in the interview. It's much harder to do those searches and make links with youtube. Lots of additional information and links are at the video description.

The interview transcript below differs slightly from the youtube transcript. The main difference is that I've tried to pull paragraphs and sentences together better, so there are fewer timestamps interrupting the flow. Neither is a perfect, word-for-word representation. Both (this and the original youtube transcript) are honest in that they put over the original meaning well:

You may not care to know it, but this is a subject affecting our lives in many ways. 'No safe dose' hasn't only been used to kill nuclear power. It's a general regulation for all substances.

APLP: Hi everyone I'm Carlo Pettirossi and I'd like to welcome you to an interview offered by Atomi per la pace. This is actually our first interview, and I'm particularly proud to introduce on behalf of all our staff our exceptional guest, Dr. Edward Calabrese, Professor of Toxicology at the University of Massachusetts. Prof. Calabrese, thanks very much indeed for being with us.
Thank you very much. I appreciate it.
APLP: Before letting you introduce yourself and your work, I'd like to give a very brief introduction to the viewers. Atomi per la pace knew about part of your work thanks to a junkscience article found on Rod Adams' blog atomicinsights, and the link of which (as well as detailed information about you and other links) can be found in the description box. By the way, Rod Adams wrote several other articles about your work - the latest of which is dated March 28 and talks about the presentation given at Cato Institute on March 21. The viewers will find the link to that article too in the description box. This junkscience article - which contains links to two of your papers (both issued in 2011) - talks about the LNT model, which is used by every nuclear regulatory commission worldwide as the radiation dose-response model to follow in order to determine the upper limits within which the human body can be exposed during a certain period of time without permanent consequences.
APLP: Well...according to your studies, the LNT is nothing else but the result of a scientific fraud. But it gets worse than this. In fact, the main character in this fraud - Hermann Muller - got a Nobel Prize for it. I guess it's time now to leave the talking to our guest. Prof. Calabrese, please introduce yourself and your area of expertise;
My name is Edward Calabrese and I'm a professor of toxicology at the University of Massachusetts, school of public health, in the environmental health sciences division. My area of expertise is toxicology, in which I'm certified. I've been at the University of Massachusetts since 1976. I'm a traditional faculty member: I teach three courses per year and I have an active research program. I have been very active in the area of dose-response since 1985, and prior to that I spent ten to twelve years with a particular focus on studying inter-individual variation in susceptibility to pollutant toxicity and carcinogenicity, trying to find out why some people get sick and others don't.
That led me to looking at animal models [...] things about nature and eventually it all led to the better understanding of the dose-response relationship.
APLP: I'd like to invite the viewer to consult your page on the UMASS site for further details about you. First question: can you please tell us how and why you came up with the idea of criticizing the author of the LNT and his model?
Well it happened by accident. It really happened as a result of the fact that I had written a manuscript - which I was submitting for publication - concerning the historical development of the LNT, and I was going to submit it to a leading toxicology journal.
As a standard mode of operation for me, when I get to a certain point in manuscript development, I will often send it out to a number of people (friendly critics) who evaluate it and give me their honest criticism before I submit it for publication consideration to a journal. I sent my manuscript to a colleague who was extremely well known in the area of toxicology. This particular person came back with a specific but somewhat general criticism: he indicated that I needed to address the role of Dr. Hermann Muller more insightfully, especially in the area of the history of the linear dose-response, and that that draft of my manuscript was not sufficiently informative. I agreed with his criticism and realized that even though I had read some about Muller, I hadn't dug into his life well enough. I then got many of the papers that Muller wrote - which were a considerable amount over his career - and also re-read his biography and studied his Nobel Prize lecture. That led me to wanting to know more about Muller and the details of his involvement with other leading radiation geneticists of the time. I was able to purchase copies from the University of Indiana, where Muller's papers are held: hundreds of letters he wrote and other types of correspondence.
Then I went to the American Philosophical Society - I think it is in Philadelphia - and purchased other kinds of communications that other people had, and that weren't contained in Muller's own files. I had a substantial quantity of information.
In the course of this, I learned a lot about Muller, and I was really trying to understand the switch from a threshold model, which was very dominant up until the mid 1950s, to a linear model for cancer risk assessment. I learned that Muller was very strongly advocating linearity during the 1930s and 1940s, but he wasn't having a lot of success in convincing the regulatory community and the governmental agencies to adopt his views, even though other people within his field - radiation genetics - agreed in general with his views of a linear dose-response relationship. These studies were not particularly convincing or clarifying going into WW-II, during which the US created a program called the Manhattan Project, which was aimed at producing the atomic bomb.
One aspect of the Manhattan Project was better understanding the nature of the dose-response in the low dose zone - essentially for X-rays, gamma rays and radionuclides. So what happened is that the US government and the Atomic Energy Commission gave a grant to the University of Rochester. One of the recipients was a well known geneticist, Curt Stern, who was conducting studies on dose-response using the fruit fly model to try to answer the question of the nature of the dose-response in the low dose zone.
At that time, Muller was a professor at Amherst College (a mile away from where I'm sitting today). He was retained by the Manhattan Project at the University of Rochester, through the actions of Curt Stern, to be a consultant to that project.
Muller provided lots of information. He provided one of the strains of fruit flies; he was very helpful in telling the scientists what kind of study design they should have; he was extremely important in attempting to resolve questions on the control group and spontaneous mutation rate; he reviewed manuscripts for publication... Muller was very involved in this research activity. Sorry for the long story, but this leads to an answer, and that is that in the course of this research two large studies were to be done:
one was an acute exposure to X-rays to the fruit flies in a very structured period of time. In that particular study, with Curt Stern and Warren Spencer (a well known drosophila geneticist), they showed that there appeared to be a linear dose-response relationship.
However this dose-response question was to be resolved with a chronic exposure study conducted by Ernst Caspari, another well known researcher working in Stern's team. The chronic study consisted in exposing the fruit flies to a dose rate which was 1/13000 of the dose rate administered by Warren Spencer in the acute study - a very different type of study, which also incorporated a number of methodological improvements compared to what was done by Spencer. When Caspari got his findings, he went and shared them with his superior (Stern), and his data didn't support a linear dose-response relationship. They actually supported a threshold dose-response. Stern didn't want to accept Caspari's findings, and challenged him by claiming that his control group was aberrantly high - which would explain why his findings led to a threshold interpretation rather than a linear one. Caspari decided to dig into the literature and found a number of studies which did not support the position of his superior Curt Stern, but supported the reliability of his own controls. Then Curt Stern contacted Hermann Muller and asked "can you share with us your data, because you have been looking at the question of the spontaneous control group mutation rate in the same model that we've been using?". Muller provided copious amounts of data to Stern. It turned out that Muller's data supported the interpretation of Caspari. The letters between Caspari and Stern (and between Muller and Stern) show Stern gradually backing down towards Caspari's original interpretation (the data supported a threshold). When I was going through this ... I'm jumping ahead in the story a little bit ... at that point I wasn't sure just how much Muller really knew of the conversations between Caspari and Stern (I was to learn later how much he actually knew).
However, what I was wondering about is that Caspari showed in his excellent study that the data supported a threshold - and this was the strongest study, by far, that had been conducted to that point. I also recalled... this was August-September-October 1946. In December 1946, Dr Muller receives his Nobel Prize and gives his acceptance speech over in Stockholm. What he says in his speech is that one can no longer accept the use of a threshold model; it needed to be replaced by a linear model. He really lays this out in very strong and definitive terms: the threshold is intellectually and scientifically dead and it needs to be replaced by this linear model. Having read and studied his comments, I then said to myself:
"I know he was consultant to the Manhattan Project; I know he communicated with Stern and the investigators. But had he really read Caspari's manuscripts that showed that data supported a threshold - in fact Caspari was advocating for a tolerance dose in the manuscripts".
And I said: "I have consulted on a lot of studies in the past, and sometimes people share information with you, and sometimes they don't show everything". So I didn't really know what he knew. I said: "did he really see the paper before he gave his Nobel Prize speech?".
So... I had some doubts. In the course of going through all the letters that were sent back and forth between Muller and Stern and others, I found out that on November 6, 1946 Stern sent Caspari's manuscript to Muller, asking him to review it.
Muller receives it and answers Stern on November 12, saying: "I've received the manuscript of Caspari. I have pretty much skimmed it over, gone over it quickly. I can see that this is a serious challenge to the linear dose-response model. This study needs to be replicated.
I don't have any reason to doubt Caspari's capabilities to do a proper study. I'll get back to you with detailed comments before my trip to Europe [to receive his Nobel Prize]". So, at that point, as far as I was concerned, he was aware of it, he knew what the implications were, he had some sense as to the credibility of the people he was consultant for. But I was waiting for his "major" review. Then he goes to Stockholm and, as I mentioned before, there he basically goes headlong in the opposite direction, asserting that there is no possibility of a threshold! None whatsoever! And yet he had actually seen that there was a very strong challenge to a threshold and that it had to be replicated. Now... replication of this kind of study is not trivial!
It involves at least a year - maybe more - lots of money, big sets of expertise. And you don't want to waste anybody's time, money and everything if it's not significant! But it was significant enough for him to say that. What he should have said, in my opinion, is:
"there is still some uncertainty; we have to do more research to resolve the question of the dose-response relationship...". There wasn't enough data to say you can no longer consider the possibility of a threshold.
He basically - in my opinion - was misleading the audience!
Now people could say to me: "well, maybe he changed his mind between November 12 and December 12". But I was lucky enough to get a January 14, 1947 letter from Muller to Stern in which he gave his detailed evaluation of Caspari's results.
In it - it's almost an opening statement - it says: "there's almost nothing I can add to this. It's a very well conducted paper". He reasserted that the linearity was challenged. The research was so significant that it needed to be replicated as soon as possible.
He didn't necessarily believe at all that the threshold was the correct interpretation. But that's why you do the additional studies! The studies had to be done to resolve these questions! He didn't have any technical issues. Basically the issue was:
"let's get this replicated". So, as far as I was concerned, what happened was that Muller's opinions had not changed. He reaffirmed his original opinion. And it raised further questions for me: "how could he have made his statement to the Nobel Prize committee?
And knowing what he knew - and his opinions didn't change - in private he was revealing his true feelings to Stern, but in public he was giving a different story". He was really, in my opinion, misleading, deceiving and being dishonest to the public.
While in private he was being, you know?, a good scientist! He really can't have it both ways. So I didn't go into criticizing Muller because I had any axe to grind with Muller, because I had any issues with him or an historical problem with him. I actually ended up criticizing Muller more or less as an unexpected by-product of trying to produce a better paper. So...sorry for the long story, but that's how I uncovered, so to speak, the beginnings of this controversy over how we transformed from a threshold to an LNT. And I think it really started with this major act of deception and this major public display of scientific aggrandisement, and...where we put one of our major achievers on display and try to learn from him. You're expecting that he's telling you the truth! And he wasn't!
APLP: Ok! Next question: can you give us some more details about Muller's and Caspari's experiments?
Yes I can tell you about it. There were two experiments done: the acute and the chronic study. I think this is important; then I'll get back to your question. The reason why Curt Stern challenged Caspari is that he and his radiation genetics community really believed, or wanted to believe, very strongly in a linear dose-response relationship. And Caspari's data did not support it. First there was the challenge that Caspari's control group was wrong - and Caspari answered that question. But there is a very interesting thing here: if you read the Caspari paper (with Stern as a co-author) on this topic, published in 1948, almost the entire six page discussion was a disclaimer such as "even though our data is what it is, please don't accept and use it until you or we can explain why our findings differ from the earlier acute studies done by Stern and Spencer". It's interesting that Stern forced the chronic study to explain why it differed from the acute one - not the other way around ("you can believe Spencer's data but not Caspari's"). Beyond that, about 25 methodological differences between the acute and the chronic studies were identified. For instance, in the acute study they used x-rays and in the chronic study they used gamma rays.
In the acute study they gave a direct exposure to the males, while in the chronic study the copulation had taken place therefore the exposure was actually given to the sperm as it was stored in the females. The organisms receiving the exposure were different.
The diets were totally different: the females used by Caspari received a diet which would prevent the laying of eggs, and a totally different diet was used in the other case. There were, then, 25 differences between these studies, such that you couldn't ever go back and figure out exactly why one study differed from the other, because you had too many simultaneously differing circumstances to resolve. Usually in experiments you keep everything constant except one variable. In this case, there were 25 differences between the studies! Stern is a very significant gentleman; he's got a lot of experience. Muller had as much experience - if not more than Stern - and Caspari is very talented himself! They all knew that you could not go back in and resolve these differences. But nonetheless, in this paper that's what they are telling the reader to do. So it made no sense to me, even looking back after so many years, that these people - who were really outstanding individuals, great intellects, all of them - could ever have written this, and that anybody else would have believed them. The interesting thing is that, after Muller read the paper - Muller's name was added to the paper as a consultant - the only other change that happened in the manuscript was due to his influence: Caspari and Stern removed every reference to this being a threshold phenomenon. This was like a minor change in a sentence, but it removed the key word - tolerance THRESHOLD dose-response. The interesting thing that happens along this line, in my opinion, is that you know that at some point in time this is going to be revealed (Caspari's data being published and supporting a threshold, while Muller was telling the world's audience that there was no possibility that a threshold could ever exist).
Him having seen Caspari's data - and having been a consultant on the study - had the potential to challenge his credibility as a scientist and a Nobel prize winner.
Muller's recommendation that Caspari's study would have to be replicated actually took place - at the University of Rochester. Caspari was applying for a new job. Spencer was going back to his old job. So they needed a new person to take over and do the actual work.
They got a young lady, Delta Uphoff, who was a new graduate student. She came in to work with Stern and replicated some of Caspari's work. In her first experiment, she shared her data with Stern, and her control data was reading significantly below what was expected (about 40% below what was reported in the literature, and more or less 40% below what Muller's data had shown). Stern and Delta Uphoff decided then that their findings didn't have credibility. They wrote their manuscript based on this experiment.
In it, they claimed that their data were not interpretable because her control group was aberrantly low. They cited Caspari's work and the literature. They also cited Muller and thanked him for submitting his data for them to review.
Muller was specifically acknowledged and thanked for allowing them to use his data, so we have positive affirmation that he knew what was going on. By the way, in the discussion of their manuscript, the writers (I suspect it was Stern, but I had never seen this before) asserted that one reason why they might have had the aberrant findings was in fact investigator bias. I suspect he was referring not to himself but to Delta Uphoff. That was never made clear, but there were two names on the paper.
She then went ahead trying to do a second study and she had again the same problem with the control group - another aberrantly low value - and they couldn't use this data either (it was still the "uninterpretable zone").
She then did a third experiment, where the control group values appeared normal to me. But at the low dose she was studying, if you looked at this from the point of view of a linear dose-response, the values were 3 to 4-fold greater than what would have occurred with a linear prediction. And so, when one looked at this, her low dose-response would have appeared aberrantly high. It appeared that Delta needed more experience in doing this kind of research.
And essentially the findings were, in my opinion anyway, that not many of these experiments had a lot of credibility. We're in the situation now where Stern is trying to get this work published. He takes Spencer's paper on the acute study, already published in the journal Genetics, as well as Caspari's paper. He then adds the three Delta Uphoff experiments. He rolls them all together in what I call "a modern day meta-analysis" and tries to make sense of them. He presents this in only a single table, in a one page paper in the journal Science, saying, more or less, that the two studies with aberrantly low control groups are normal! He now can interpret them, but he doesn't share with the Science audience that less than a year before they were uninterpretable (that their results were aberrantly low). He doesn't go back and say why the data changed and became normal - because the database had not changed - and in every study that came afterwards, he'd actually reassert the aberrantly low nature of the control group in those findings.
In this Science paper he also claims that the control group of Caspari's work had irregularities and was therefore not reliable. He backed into his original position from before Caspari had essentially refuted him; he now claims that his data were reliable.
The Delta Uphoff experiments he had claimed were not reliable, he now made reliable. He added them all together and came up with a linear dose-response relationship that supported the foundations of the LNT. He got this into Science and promised the readers, the scientific community, that he would follow through with a detailed report on all the methodologies and all the data that he couldn't show - where you could actually see the variability, methods, strengths, weaknesses... however he worked with his experimental system.
He never followed through with this publication. So what happened is that two papers became very significant in the radiation genetics research community: Spencer's work, which showed linearity for the acute exposure, and then this meta-analysis that Stern did with Uphoff in Science, in which they had all these slight changes... making something appear real that one year before was not real, and basically not sharing that with anybody else. Most of this was hidden in the - until some point in time - classified literature of the Atomic Energy Commission that was never broadly available. Essentially it becomes a story that needs to be discovered. And that's what I actually discovered when I went back in my studies, digging through all this, and found the manuscripts that other people hadn't read or seen, and the correspondence about this... The interesting thing here is that there were two things going on: this paper in Science is very significant because it really reaffirmed the belief in linearity. And other people of that time started citing it and said: "Stern worked with 50 million fruit flies... how can 50 million fruit flies be wrong? Everything points to linearity! Caspari's study had an unusually high control group. He couldn't believe its findings. Therefore it has no credibility".
So things that were wrong, or misinformation, became accepted: day became night; night became day; false became true; true became false. And nobody was trying to follow the data - you know, with the person who was showing you the trick, you had to follow the data.
There was a very strong appeal to authority within the community. People liked Stern and Muller. Part of this was going on because they really tried to affirm linearity, but they also had to protect Muller's reputation. Muller really had misled the scientific community at the Nobel Prize. If Muller's reputation were damaged, what would really be hurt were the arguments for the LNT. Both had to be protected at the same time - in my opinion.
Now... I was interested in how Muller responded to all this in 1950. He published two significant papers in 1950... this is really hard to believe... but in one of these papers Muller says: "Caspari published his paper, which deviated from Stern. But its control group was aberrantly high". So... Muller, whose data had been used to support the reliability of Caspari's findings, now concludes - falsely - that Caspari's data were aberrantly high, when Muller had actually come to Caspari's defense!
And nobody challenges Muller! Not even Caspari challenges Muller on this!
It was a blatant misrepresentation of the record by a Nobel prize winner, in the aftermath of this deception during the Nobel prize...cause it actually gets even worse: a lie is piled upon another lie! And nobody challenges this!
And I have tracked every single communication, in letter and cable, between Muller and Stern during these time periods, and seen how they went through and how they tabulated... however they communicated, these are contained in manuscripts submitted for publication.
Stern also tries to take the lowest value in Caspari's study and bounces the value down, so that he can extend the range over which he claims linearity occurs. So, even if linearity were true - which the data really did not support - you could have claimed a range of, let's say, 250 thousand-fold in dose. While Muller rounded the range down incorrectly, Stern extends it to 400 thousand-fold. He does different things in which he is either wrong or dishonest, and other people actually cite him as THE authority!
It was the goal of Muller, Stern and their colleagues to really change the risk-assessment paradigm: to have ionizing radiation seen not as a threshold phenomenon but as a linear dose-response phenomenon for risk-assessment purposes.
In 1955 the Rockefeller Foundation provided funding for the US National Academy of Sciences to put together a very distinguished, broad-based group called the Biological Effects of Atomic Radiation (BEAR I) Committee. It preceded the BEIR Committee that we currently have in the US National Academy of Sciences: there was BEAR I and BEAR II, and then it shifted to BEIR I, where they just changed "Atomic" to "Ionizing". That's the only difference. But here is the interesting thing. I expected there had to have been a battle between those who supported a threshold and those who supported linearity. As it turns out, if you get the transcripts of that first BEAR I Committee, which I have obtained, and you read them from cover to cover, backward, inside and out, you find that there is no debate on the nature of the dose-response in the low-dose zone. It is accepted from the moment they walk in that the dose-response is linear. And when you look at the comments of Muller and of other members of that committee in their writings prior to the committee, Muller claims that in the early 1950s the decision had already been made within the radiation genetics group that it was no longer a threshold: it was really a linear dose-response. So there was no debate! The committee came [...] with a very large proportion of radiation geneticists who had grown up of the same mind. And because they were of the same mind, the decision was automatic: straight to linearity.
When you go back and look at the literature, the papers they all go back to and cite are the Spencer study and the Uphoff and Stern study in Science. These are the two critical references upon which the switch in that BEAR I Committee was based. In my opinion the key one was the chronic study, the Uphoff and Stern one. This was the one that, in my opinion, was essentially fraudulent in the ways I have described.
Within a year after the BEAR I Committee came out with its recommendation, the NCRP decided to recommend that the findings for germ-cell mutation (linearity) be generalized to somatic cells, and that opened up the application to cancer risk assessment.
And ever since then, it has just followed a linear dose-response relationship. This not only had its impact on ionizing radiation; years later the US EPA took the same rationale and the same bases and applied them to [...] carcinogens, generalizing it even further. And that's the regulatory history for cancer risk assessment in the US and essentially most countries throughout the world! It is actually a very terrible history.
Its foundation rests upon misrepresentation at the highest possible level, by the very people you are depending upon.
Are the original letters between Muller, Stern and Caspari available somewhere on the internet?
They are not available on the internet. I can certainly send you my copies. They are publicly available from the same sources I got them from: Muller's papers from Indiana University, and Stern's communications at the American Philosophical Society.
However, I have published various articles concerning these letters as part of my papers. And it's not uncommon for the editors to require me to show proof of the letters, so I have had to provide copies to the editors or to the reviewers, because these are very specific pieces of information I am claiming. For my work to pass the peer-review process, I have to provide documentation and proof to the editorial judgement of these independent journals.
They have to have that as backup for when my work is criticized. It's part of what is called due diligence in the peer-review process. But I can certainly send you copies of the letters I have obtained. The journals know this.
I'm required to provide them. If what I hold is not generally available (and these letters weren't considered generally available, because you have to purchase them), then I have to provide it to the journals.
Why do you think the Rockefeller Foundation set up the BEAR Committee?
I actually don't know the answer to that question. I know that the Rockefeller Foundation was a strong leader in many aspects of biological science, and it had a strong social and political conscience. I suspect that, since in 1955 (I was a child during that period) there was a lot of Cold War tension, and that was the time of atmospheric testing of the atomic bomb, they wanted to better understand what the public health risks might be from atmospheric testing, perhaps from water, you know...
the new development of nuclear medicine, nuclear energy. It was a new world they were entering in 1954-1955. So I think it may have been far-sighted insight on their part. But I haven't dug into exactly what was truly motivating them. I'm only guessing right now.
OK, thanks. Next question: do you think that testing ionizing radiation on fruit flies, as Muller did, can provide a good estimate of its effects on humans?
I think in qualitative terms it can. In quantitative terms it probably would not be a particularly good idea. It's very difficult to extrapolate, in a quantitative sense, from one species to another. There's a lot of uncertainty in that.
It's very difficult in the world of toxicology to extrapolate even from a mouse model to a rat model, let alone from a mouse to a human. Even after Muller's work was done, there was research at Oak Ridge with mice that showed roughly a 15-fold difference between mutation rates in mice and in fruit flies. Muller used just one species of fruit fly; there are many you could have studied. The same goes for mouse strains: some are more susceptible, some are less susceptible.
This is a very difficult area in terms of providing quantitative extrapolations. Qualitatively, I think these models are very useful: they can tell you whether a mutation is or isn't occurring.
If something happens qualitatively, it suggests that you should look more carefully at the species of interest.
APLP: What's your opinion on why the regulation of radiation came to be based upon a scientific fraud?
Most of it came out of the recommendation of the BEAR I Committee back in 1956, in which the fundamental decision was made to switch from threshold to linearity. And everything followed from that. This was, I'd have to say, a political-philosophical decision by the radiation genetics community. They believed, in my opinion, that they wanted to save the world and future generations from mutations. From what I can read, they may have been well-intentioned people.
However, they basically used up and gave away their scientific credibility to make decisions based upon their philosophy. What they owed the country and the world was not their philosophy: they owed the world their scientific judgement!
Then society and its political leaders could have weighed how the science fit into the political judgements. But I believe that the radiation genetics community in 1956 essentially gave up its authority to Muller and Stern.
It's a very difficult situation to figure out exactly, because all those members went to that committee with their minds made up. They all believed in linearity. I have gone back and looked in detail at the publications of all the members of the BEAR I Committee, and only three or four had significant experience with low-dose studies. Most of the others were men of achievement, but of another kind of achievement! They did not have experience with how to design and conduct low-dose studies, with the experimental nuances you would have to know, and with the problems that arise when you do that kind of study. So in effect they appealed to the authority of people like Muller and a couple of others on the committee who shared Muller's views.
And essentially they allowed those decisions to pass through. Muller and two or three others became the key people who decided the policies for the rest of the 20th century. It all came back to the deception in Stern's studies. It's actually quite amazing!
It's hard to believe that it all turns on a dime, but it does turn at a very narrow point. It reminds me of when the US Challenger fell out of the sky back in 1986, all because some O-rings that didn't function properly should have been replaced.
And all these very great engineers at NASA, all these talented people: everybody thought everybody else was doing their job. But somebody missed the O-rings, and people died; a whole disaster. In this case, people didn't follow what was going on with Stern and Caspari and Uphoff and Muller. They appealed to authority. And now we have 60 or 70 years of LNT regulation based upon what I think was fraud, deception, misleading, withholding of information, and the substitution of philosophy for science by people who were actually outstanding scientists of their day, people we looked to for guidance and trusted.
Can you explain, and give us an example of, how a threshold value is determined?
It's interesting that you raise this question. It should be pretty easy, because a threshold resonates with our common experience. People watching this video might have a sense of a threshold when they drink wine. You might have half a glass of wine and enjoy the taste without feeling any sensory effect, like your head spinning. But if you have two or three glasses you may begin to feel the psychological effects of the wine. At some point you pass a threshold and something happens.
Below that level, there is no detectable biological effect. This is pretty much the common person's view of what a threshold is. In a statistical sense, below the threshold we expect what we call variability, or noise, within the system: random bouncing around, but no significant deviation from the unexposed control group. Both perspectives should agree with each other. The perspective of the common person with the glass of wine shouldn't be any different from that of the biostatistician looking at the random bounces below an estimated threshold. They're both telling you the same thing: there shouldn't be any detectable biological effect below a threshold. From the radiation/mutation point of view, it should be safe below the threshold.
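The statistical picture described here, random bouncing below a threshold with no systematic deviation from controls, can be sketched with a toy simulation. Everything below (the rates, batch sizes, and sampling scheme) is an invented illustration, not data from any radiation study:

```python
import random

random.seed(42)

# Toy illustration of below-threshold behavior: an exposed group whose
# true mutation rate equals the control rate should just bounce around
# the control counts, with no systematic deviation. All numbers invented.
RATE = 0.05       # per-cell mutation probability (hypothetical)
CELLS = 100       # cells scored per batch
N_BATCHES = 1000  # batches per group

def batch_counts(rate, n_batches):
    # Binomial sampling: count mutated cells in each batch.
    return [sum(random.random() < rate for _ in range(CELLS))
            for _ in range(n_batches)]

control = batch_counts(RATE, N_BATCHES)
below_threshold = batch_counts(RATE, N_BATCHES)  # same true rate: no effect

mean_control = sum(control) / N_BATCHES
mean_exposed = sum(below_threshold) / N_BATCHES
# Individual batches vary (noise), but the group means agree closely.
print(mean_control, mean_exposed)
```

Individual batches fluctuate by a few counts either way, which is the "random bouncing" a biostatistician would expect; only a systematic shift of the exposed mean away from the control mean would indicate a real effect.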
Is there a threshold in the hormetic model?
Well, the hormetic dose-response is a biphasic dose-response. In the case of mutation, cancer and radiation we're looking at a J-shaped dose-response. When doses are high you see a dose-response relationship in which mutations increase proportionally with dose (and the same happens with cancer risk). But at lower and lower levels, according to the hormetic model, you reach a point, really the equivalent of a threshold, at which there is no treatment-related effect compared to the control group.
And you might think that there is no effect as you lower the dose further. But actually, in the hormetic model, if you lower the dose further, we observe that the risk dips down below that of the control group. It therefore shows a J-shaped dose-response, as opposed to the threshold response, which is a flat line that then kicks upward. The hormetic dose-response is what I spend a lot of time studying, and I find it to occur for essentially most chemical and physical stressor agents, like ionizing radiation. It is independent of the biological model and the level of biological organization: it occurs in the cell, the organ and the organism. And it occurs independently of the biological mechanism as well. It's a very general phenomenon, and it's getting more attention today in the pharmaceutical and chemical industries and in the non-ionizing radiation area as well, not just from a regulatory point of view but for therapeutic applications too. The question is how to make these new insights into the nature of the dose-response more helpful for public health and for therapeutic applications.
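The J-shaped curve described above can be sketched with a simple toy function. The functional form and parameter values are hypothetical choices made only to reproduce the qualitative shape (a dip below the control group at low doses, a rise above it at high doses); they are not from any study discussed here:

```python
import math

def hormetic_risk(dose, background=1.0, benefit=0.3, slope=0.05):
    """Toy J-shaped (hormetic) dose-response: relative risk vs. controls.

    background: control-group risk at dose 0
    benefit:    depth of the hypothetical low-dose protective dip
    slope:      linear increase in risk at high doses
    """
    # The exponential term produces the low-dose dip; the linear term
    # dominates at high doses, pulling risk above the control level.
    return background - benefit * dose * math.exp(-dose) + slope * dose

print(hormetic_risk(0.0))   # control level: 1.0
print(hormetic_risk(1.0))   # dips below the control group's risk
print(hormetic_risk(10.0))  # rises above it: net harm at high dose
```

A threshold model, by contrast, would stay flat at the background level up to the threshold dose and only then rise; the LNT model would rise from dose zero.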
APLP: Why do you think the majority of the general public doesn't, or wouldn't, believe in the existence of a threshold?
Well, it's the way toxicology is communicated to the general public. For example, if you look at all the chemicals regulated by the international community and by individual countries, the standards are very similar. Take drinking-water contaminants, carcinogens and non-carcinogens alike. When you look at the number of molecules it takes, even for strong carcinogens, before you see a response taking place, an individual would have to be exposed on a daily basis for roughly 70 years to somewhere between ten billion and ten trillion molecules per day. Every day, for 70 years, for powerful carcinogens to begin to show carcinogenic effects. Over 70 years, we're talking about 10^22 to 10^24 molecules to show the beginnings of an effect.
Just consider what this says about how the LNT model lacks credibility when applied to anything! It's how the risk-communication message has been framed: it has been held captive by regulatory agencies whose mission, in many ways, has been to preserve their regulatory positions and jobs and to frighten the public into concerns that weren't justified. I don't mind being challenged by those whom I'm challenging. I'd say: "show me where my interpretations are incorrect". I have gone back through all the regulatory estimates;
I have done the calculations and I was surprised. I thought that carcinogenic agents would be much more active at low doses than non-carcinogenic agents, but actually it's about the same range: roughly 10 billion to maybe 10 quadrillion molecules per day before you see any change in the biology! When you hold up a glass of water that contains a contaminant at the drinking-water standard, you might think it's perfectly safe. And that "perfectly safe" glass might have 100 billion molecules of a toxic substance in it.
The glass looks nice and clear. You can't see that there are 100 billion molecules of a toxic substance in it. You drink it and you think it's nice and safe! And it probably is nice and safe. But it contains maybe 100 billion molecules of a regulated toxic substance! And this tells me: how toxic is the substance really, if a drinking standard approved by a regulatory agency allows 100 billion molecules in the glass? Yes, it probably needs to be regulated. But set this against the LNT, which says that there is no safe level and that a single molecule can initiate the pathological process, when in fact there would be 100 billion molecules and you would need to be exposed to them for your entire 70- or 80-year life span.
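The molecule counts in this passage are straightforward order-of-magnitude arithmetic. Here is a minimal check using the US EPA drinking-water standard for benzene (5 µg/L) as a worked example; the 2 L/day intake and 70-year exposure are round assumptions, and other contaminants will give different (but similarly enormous) counts:

```python
# Order-of-magnitude check: molecules of a contaminant ingested daily
# at exactly the drinking-water standard. Worked example: benzene,
# US EPA maximum contaminant level = 5 µg/L; intake and duration assumed.

AVOGADRO = 6.022e23      # molecules per mole
MCL_G_PER_L = 5e-6       # benzene standard: 5 µg/L expressed in g/L
MOLAR_MASS = 78.11       # g/mol for benzene (C6H6)
INTAKE_L_PER_DAY = 2.0   # assumed daily drinking-water intake
YEARS = 70               # assumed exposure duration

molecules_per_day = MCL_G_PER_L * INTAKE_L_PER_DAY / MOLAR_MASS * AVOGADRO
lifetime_molecules = molecules_per_day * 365 * YEARS

print(f"per day:  {molecules_per_day:.1e}")   # on the order of 10^16
print(f"lifetime: {lifetime_molecules:.1e}")  # on the order of 10^21
```

With these assumed values the daily and lifetime counts land within a couple of orders of magnitude of the 10-billion-to-10-quadrillion-per-day and 10^22 to 10^24 lifetime figures quoted above; the exact count depends on which contaminant and standard one picks.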
You have to see that there is a discontinuity between what regulatory policy is and what the actual scientific issues and understanding are. They really need to be addressed. The LNT concept was adopted with no scrutiny.
It was essentially pressed forward on the basis of ignorance, fear, philosophy and, as far as I'm concerned, misrepresentation of the scientific record by the leaders we're talking about today.
APLP: Can you talk about the process that would enable the human body to protect itself against radiation by means of radiation preconditioning?
The concept of preconditioning is very significant, and radiation preconditioning is in fact a subset of it. Preconditioning, for the audience, is when a very low dose of an agent is given prior to exposure to an overwhelmingly high dose.
And you look at whether that initial low dose, given maybe a day or two before, affects the toxicity of the high dose. There can be a dose that kills an animal or makes it [...]. In many cases, the prior low dose can profoundly protect against the effect of the massive dose given subsequently. I know from work that we have done with a chemical called carbon tetrachloride that you can give a very, very low dose of it, a dose that causes almost no discernible changes in the organism, and then, one day later, give a dose that would kill 95-100% of the animals, and none of the animals dies! The low dose protects them from the subsequent insult. You see something comparable happening with low doses of radiation.
You see this happening, and people listening to this might find it really odd. Good portions of us are going to die from a heart attack or something related to it; 40-50% of deaths in the US are due to heart-related conditions. Researchers at Duke University in 1986 applied a relatively modest hypoxic stress to dogs. A day or two later, they induced a massive myocardial infarction, a heart attack, in these dogs. And they found that the dogs that had received the mild hypoxic stress a day or two before had essentially 70-80% less damage than the dogs that hadn't! The investigators coined the term "preconditioning", and it was then applied to many other systems: you could protect the brain by preconditioning; you could protect the heart; the liver, the skin, and many others after that. People found that you could even protect the body after the damage, by post-conditioning! And now these concepts are being implemented in medicine. How does this relate to the dose-response? If you run the preconditioning dose from very low up to much higher, each followed by the large dose, you find that the dose-response is an inverted U, just like a hormetic dose-response. So preconditioning is a manifestation of the hormetic dose-response concept.
That's why I'm particularly interested in studying this phenomenon. It relates to the world of radiation: radiation biology, radiation therapy... there are so many things that could happen in medicine. You could use a low dose of radiation before giving the patient a massive one, and protect him from subsequent damage. People are now finding ways to use preconditioning in patients before major surgery, so that they have enhanced recovery through the surgical process. This is a wonderful new series of opportunities emerging in the health-care system. I can give you another, regulatory example that we published from our lab: the US EPA and many other regulatory agencies say that if you take a kidney toxin and then a second kidney toxin, the responses are additive.
Two bad things: one plus one equals two. But we took inorganic mercury, which is a kidney toxin, and inorganic lead, which is also a kidney toxin. We gave the lead one day before we gave the mercury. According to the EPA, they should be additive. But because we gave them in a preconditioning sense, the lead one day before a strong dose of mercury, we reduced the mercury toxicity by 70 or 80%! It was just like what happened with the dogs in the Duke University experiment of 1986. It really showed that the regulatory approach to chemical mixtures, when you separate them in time within a conditioning framework, is basically not supported at all! It's a new toxicology today, and much of our toxicology was based upon ideas that need improvement. It's very difficult for regulatory agencies to change, to admit they made a mistake and to follow the data. They are tied to defending past decisions even when these can no longer be supported and are generally known to be wrong. And that's the case with the LNT.
By the way, what quantities of mercury and lead are we talking about?
I can't recall the exact amounts, because it was a number of years ago and the study was in mice. But in the framework of preconditioning the quantities can be very minor. For example, in rodent studies, you can take a blood-pressure cuff (all of us have had our blood pressure taken) and wrap it around the animal's thigh. There are protocols in which you squeeze it a few times, tighten it up, let it go, and that's a preconditioning stress. Something as minor as that! Then, if you induce damage to the heart or the brain a day or so later, that preconditioning stress actually protects the brain from the damage! It's the same concept as using low-level lead against mercury. It's a stress, just like dietary stress. People have shown that if you give an animal food only every other day, missing food for a day stresses the animal in such a way that many adaptive responses are up-regulated, and these protect it from subsequent stressor agents it might encounter in its environment. What we're learning about today is a whole series of ways the body has to protect itself against low-level, moderate-level or high-level stress by essentially using "preconditioning vehicles". Even low doses of radiation can serve as a preconditioning stress.
So, what's happening, and what will happen down the road, is very significant.
When one talks about radiation, most people's thoughts go immediately to the accidents of Chernobyl first and Fukushima more recently. I read from your CV, page 24, that you participated as guest editor for a publication with the title "Distribution of Artificial Radionuclides in the Abandoned Cattle in the Evacuation Zone of the Fukushima Daiichi Nuclear Power Plant". On page 29 of your CV, you mention your 2011 paper "Improving the scientific foundations for estimating health risks from the Fukushima incident". Can you please talk about these studies?
These were evaluations of some of the scientific literature that might be relevant to assessing risk at Fukushima, and perhaps other places as well. The message here is that, even though the science can improve, if you feed all your science through a linear dose-response model, the interpretations are going to come out wrong. Because whether you do a human study on populations or use animal models, you still have to somehow extrapolate from the patient's or animal's exposure down to levels that are extremely low.
So what I have tried to do, in these papers and elsewhere, is to say that our fundamental way of assessing risk was wrongly constructed and has led to incorrect estimates of risk as applied to Chernobyl, to Fukushima and to other places.
And the result of those grotesque overestimates of risk is policy decisions aiming to shut down areas, to evacuate people, to take all kinds of actions that are sometimes far worse than the exposures themselves. What we really need is a much more scientifically defensible risk-assessment process, one consistent with the toxicological and epidemiological literature. Once you believe that a single ionization can initiate the process of carcinogenesis, there is no way back, because everything then becomes fearful: you think there's a risk from a single anything. As I mentioned a few minutes ago about the amount of chemical carcinogens in drinking water (and you can relate this to radionuclides as well), under EPA drinking-water standards you can be exposed to 100 billion molecules per day for an entire 70-year lifetime and have no measurable increase in cancer risk. None! So the model is totally wrong! And yet if you applied the LNT model to this, you would conclude that a single molecule is a concern, and that for the general public no level is safe. It's a wrong message! It's not a scientifically defensible message! I call it a radicalized perversion of science that has now, like a disease, captured government policy when it comes to risk assessment, and that, in my opinion, needs to be challenged and changed. This would play out very differently in the management of places like Fukushima. Because there is also a psycho-social component to this: when you frighten people unnecessarily, you change the state of the whole family structure, and the mind can play strange games on the body, health-wise and in social relationships. This is a very serious situation that needs to be confronted. And if you try to challenge it, as I'm trying to, people attack you for your models; they make up models that don't exist.
All I'm asking is that people look at the data; let's go from there together and see what the data actually reveal.
APLP: Is the hormetic model beginning to be considered by nuclear regulatory commissions?
I think the hormetic model is growing in its scientific acceptance and utility. It's widely used by the pharmaceutical industry. To everyone listening, I can tell you that anti-anxiety drugs, anti-seizure drugs, memory drugs and many others are based upon hormetic dose-responses in the preclinical data from mouse or rat models, or whatever animal was used. They are [...] totally pervasive in the pharmaceutical literature. Even people who would totally oppose hormesis don't know that most of the pharmaceuticals they take to preserve their health are based upon the hormetic dose-response model. They totally accept it in one domain, without knowing it, and then in this other domain, called environmental regulation, they say "we can't go down this road!" when in fact they have already embraced it. It's a very interesting contradiction. I think things actually have to change within regulatory agencies such as the US EPA. But they set the rules of the game. In the game of risk assessment, the EPA has defined risk assessment in such a way that it excludes the concept of benefit. Hormesis could actually incorporate benefit into the equation. For example, if a low dose of radiation, or of a chemical, extended our lives, it would be significant to know whether it could reduce the occurrence of certain kinds of illnesses or adverse effects. But if an agent extended our lives by 30 years, the EPA would never consider it. They would regulate the agent and forgo the benefit! It's just a waste of taxpayers' money, as far as I'm concerned.
But the interesting thing along these lines is that if hormesis were to show, and it could, that a dose below the threshold is harmful, for instance a low dose of a drug that enlarges the prostate gland, and it did so in an inverted-U biphasic dose-response, the EPA would say: "OK! Let's set the standards upon that!" Hormesis is fine as long as it shows harmful effects! But hormesis is not fine if it shows that something is potentially beneficial. That kind of dichotomy and inconsistency makes no sense to me from a public-policy point of view, and it really needs to change. Better to follow the data and do what's best for the general public.
Alright! Last question, just an estimate if possible: how much would the safety costs of a nuclear power plant fall after an honest review of the LNT?
I think there would be profound savings across the community: the industry, the payers, the users, right across the board. If the LNT were replaced by a threshold model or the hormetic model, you would have profound savings from top to bottom. And the interesting thing is that not only would you save tremendously in terms of money, but the general health of the public would be significantly enhanced. It's difficult to understand how the public tolerates regulatory communities that impose costs on them and reduce their health status.
But that has been our regulatory position in the US and other countries. It's my hope that rationality and following the science will lead to a reversal of these kinds of actions and positions in the future.
APLP: Prof. Calabrese, thank you very much for your time and for being with us.
Thank you very much for the opportunity.