Monday, January 7, 2019

Another "That's Funny" About Radiation

Just about six years ago, I touched on the topic of radiation dosing and the effects of Chernobyl. The essence of that piece was that the "no safe amount" dosage doctrine, the official position of establishment science, wasn't holding up well against observed conditions in the exclusion zone.  Briefly,
If you believe no amount is tolerable, you have to ask yourself some tough questions.  Start with the fact that some places on earth have naturally occurring radioactivity at much greater than background levels.  Why doesn't life die off in these places - or have horrible cancer rates?  Colorado, with 15-20% higher natural radiation than surrounding states, has a lower cancer rate than the states with lower radiation.  When radioactivity was first discovered, all sorts of claims were made about radiation being good for you.  The claims resurface from time to time (here's one from 2002), but they get little traction with the general public. 
Another one of those papers has surfaced.  Real Clear Science ran the article, "Was Low-Dose Radiation From the Atomic Bombs Beneficial?" before the end of 2018, summarizing results from a paper published on 19 December 2018 in the journal Genes and Environment.  What I've read of it is fun reading.

The paper studied survivors of the Hiroshima and Nagasaki bombings and found they had lower mortality rates than people farther out who received less radiation.
Researcher Shizuyo Sutou of Shujitsu Women's University is the author of the paper. Sutou examined data from the Life Span Study, which has followed 120,000 survivors of the atomic bomb blasts since 1950. His analysis showed that survivors exposed to between 0.005 and 0.5 Grays of radiation had lower relative mortality than control subjects not exposed to atomic bomb radiation.

Sutou's finding is in line with the hormetic theory of radiation (hormesis), which states that very low doses of ionizing radiation might actually be beneficial, producing adaptive responses like stimulating the repair of DNA damage, removing aberrant cells via programmed cell death, and eliminating cancer cells through learned immunity.
Radiation hormesis is backed by a number of studies, but scoffed at by the establishment.  In that piece from six years ago, a couple of people left very interesting comments, including Mike Sivertson, who gave a really good introduction to the concept.

Real Clear Science puts it this way:
Radiation hormesis is backed by a number of studies, but it is currently not accepted by organizations like the National Academy of Sciences or the United Nations Scientific Committee on the Effects of Atomic Radiation, which support the linear no-threshold (LNT) model of radiation protection. This model effectively states that any dose of ionizing radiation is harmful. Scientists like Carol Marcus, a Professor in Nuclear Medicine at UCLA, think this stance is overly cautious to the point of itself being hazardous. Irrational fear of radiation, no matter the amount, is counterproductive, she says.

"Forced evacuations in Fukushima have caused some 1600 premature deaths; forced evacuees from Chernobyl have a higher death rate than the 'babooshkas' who returned to the area despite government policy against it," she wrote, referencing studies suggesting that potentially unnecessary Fukushima evacuations disrupted healthcare services.
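The difference between the two models is easy to sketch numerically. The code below is purely illustrative: the functional form of the hormetic curve and every constant in it are invented for demonstration, not fitted to any data from the study.

```python
import math

def lnt_risk(dose_gy, k=0.5):
    """Linear no-threshold: excess risk is proportional to dose, however small."""
    return k * dose_gy

def hormetic_risk(dose_gy, k=0.5, b=2.0, d0=0.1):
    """Illustrative hormetic J-curve: a protective term dominates at low doses
    and decays away at higher doses. Form and constants are made up."""
    return k * dose_gy - b * dose_gy * math.exp(-dose_gy / d0)

# Compare the two models across the dose range discussed in the paper.
for dose in (0.005, 0.05, 0.5, 2.0):
    print(f"{dose:5.3f} Gy  LNT: {lnt_risk(dose):+.4f}  hormetic: {hormetic_risk(dose):+.4f}")
```

Under LNT the excess risk is positive at every dose; the hormetic curve goes negative (a net benefit) at low doses and converges to the LNT line at high doses, which is why both models agree about the high-dose victims while disagreeing about the survivors in the 0.005-0.5 Gy range.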
Hiroshima and Nagasaki are easier areas to analyze than Fukushima, as well as being easier to visualize.  The A-bombs were point sources that created an exposure that was over quickly, compared to Chernobyl or Fukushima (itself a tiny fraction of Chernobyl), both of which went on for an extended time and spread radiation over a wider area.  The model is easy to visualize: those closest to the blasts died instantly, or almost instantly for those a little farther out.  Deaths farther out were due to the radiation, particularly damage to the immune system.  The low-dose survivors in this study were much farther out than that; all of them lived because they weren't close enough to be killed.  The complication is that there appears to be a range of doses where lifespan went up instead of down.
Taking these facts into consideration, the effects on lifespan and cancer incidence of A-bomb survivors were reexamined for the present analyses. Letting the data speak, one would hear that low-dose radiation from A-bombs has extended survivor lifespan and reduced cancer mortality on average for A-bomb survivors and not-in-the-city control subjects (NIC). The key to resolving the apparent discrepancy between the received notions and actual data is radiation hormesis and the radiation doses of a hormesis range to which a large fraction of A-bomb survivors and NIC were exposed. Of course, A-bomb survivors who received high doses exhibited shortened lifespan and increased cancer mortality, but they accounted for a minor fraction of all local residents. Therefore, results show that the “average lifespan” was longer and that “average cancer mortality” was reduced overall.
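The averaging argument in that passage is simple arithmetic: if the large low-dose fraction gains a little lifespan and only a small fraction received high doses, the population average can come out ahead. The numbers below are invented to show the mechanism, not the paper's actual figures.

```python
# Illustrative only: made-up lifespans and population shares, not the study's data.
# A large low-dose group with a small gain can outweigh a small high-dose
# group with a large loss, raising the average above the control value.
control_lifespan = 75.0  # hypothetical unexposed-control mean lifespan, years

groups = [
    # (fraction of survivors, mean lifespan in years) -- invented numbers
    (0.85, 77.0),  # hormesis-range doses, slight gain
    (0.15, 68.0),  # high doses, large loss
]

avg = sum(frac * lifespan for frac, lifespan in groups)
# 0.85 * 77 + 0.15 * 68 = 65.45 + 10.20 = 75.65, above the 75.0 control
print(f"survivor average: {avg:.2f} years vs control: {control_lifespan:.2f} years")
```

This is why "average lifespan was longer" is compatible with the high-dose survivors dying earlier: the average is dominated by whichever dose range holds most of the population.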
I concluded that piece six years ago by saying, "The conclusion here isn't "Chernobyl Was Good!" ".  We can substitute "the conclusion here isn't that dropping atomic bombs was good".  I just see evidence that our body's homeostasis systems, which keep us normal despite everything that happens to us, deal with radiation better than the linear no-threshold (LNT) model says.  I see that, perhaps, persistent low-level radiation leads to some sort of adaptation.  That makes me think the LNT model of radiation protection is not well-supported by evidence. 

It's admittedly hard to study in a controlled environment because it would be unethical to dose people with radiation without knowing how much is harmful or how much is beneficial.  Population association studies like this one (and like the vast majority of epidemiological studies) are the only way to keep getting data.


Hiroshima. 

"Offered for your consideration".



11 comments:

  1. There is a good article here at: https://gettingstronger.org/hormesis/
    The EPA, of course, denies there are "good" levels of anything they have labeled toxic, but the article states that even dioxin - treated as one of the nastiest substances in their lexicon of toxic substances - can be beneficial in low doses, as shown by testing with rats, anyway.

    This has been known to medicine for thousands of years. Cults that have dealt with poisonous snakes have indicated that getting small doses over a period of time can provide immunity from a bite that injects a large amount that would normally be fatal. As a child back in the fifties, I was treated with small injected amounts of substances to which I was allergic. After getting those injections bi-weekly for about a year (IIRC), I developed the ability to tolerate exposure to them - possibly an actual immunity to some, if not all, of those substances.

    So it isn't difficult for me to accept it could be so with radiation as well. I don't know how someone would set parameters on what amount would lead to beneficial amounts as opposed to those which would cause significant sickening and/or death, but it's good to know that ol' Nietzsche might have been right about _something_. :-)

    ReplyDelete
  2. The problem with radiation and long term effects is that it is VERY difficult to study.
    We have a decent handle on the outcomes from higher level exposures but it's virtually impossible to study low level exposures over time because it's not possible to control for ALL the variables. And ethics also is an issue. How many people are going to sign up to be dosed with low levels of radiation just to see what MIGHT happen 30 years in the future? And while the government could and has done such studies unethically on people, it's not very good science. To prove hormesis one must be able to assign cause and effect. We still can't assign cause and effect to the negative outcomes from low level exposure. It's almost impossible to say that Leukemia A in 2015 was caused by Cat Scan B done in 1980. There may have been some other trigger for the disease. The ONLY methodology we have for low level radiation outcomes is LARGE population studies where you compare the population downwind from, say, Chernobyl or Fukushima over a period of time to a similar control of unexposed people. These studies usually show an increase in some types of cancers, specifically thyroid and leukemia, over time. But even then there are OTHER CAUSES for such cancers that contribute to inaccuracies. So while the theory of hormesis is not debunked, it's virtually impossible to prove.

    ReplyDelete
  3. What we do know is that the current standards for acceptable radiation exposure are insane. Sleeping next to your spouse exposes you to unacceptable levels of radiation emitted from the human body.

    ReplyDelete
  4. All other substances have some level of 'no harm' dose (even if some people don't admit it), so logically radiation must also.
    After all, radio waves and other non-ionizing radiation have acceptable doses, so nuclear radiation must also.

    ReplyDelete
  5. Perhaps a longitudinal study of airline pilots and flight attendants with a cut into groups of five years exposure would shed light on the question.

    ReplyDelete
  6. Research on radiation improving crop yields was all the rage in the 1930s and 1940s. One could find pictures of irradiated seed potatoes producing twice as much as regular seed potatoes.

    The conventional thinking, now, is that radiation at appropriate levels removed the latent virus from the seed potatoes. Imagine a clone of you with the flu or a severe head cold competing with a non-impaired version of you.

    This may have been the genesis of the Captain America and Spiderman meme, super-human performance as a result of appropriate radiation.

    Just another perspective on history.

    ReplyDelete
  7. My "Radiological Science for Technologists" textbook says the population of radiological technologists (normal people call 'em X-ray techs) has no greater incidence of cancer than the general population.

    I ran the numbers and realized getting my Maximum Daily Dose would bring me to my Maximum Lifetime Limit just as I wanted to retire. I could be careless and have an excuse to retire earlier rather than later. Win-win. If I had stayed in X-ray, maybe I would be healthier than I am now...

    ReplyDelete
  8. The no safe level of (man made) radiation (Linear No Threshold) approach is a leftover from the Manhattan Project and the immediate post war era. With an expanding nuclear industry, a set of standards for exposure to ionizing radiation and radioactive elements was needed quickly. There was some data on high levels of exposure from bomb survivors and accidents. Research was ongoing for low level exposures in workers and animals, but coming to a conclusion from that research would take years or even decades. Atomic workers were being exposed now, and administrative limits were needed now. The available data indicated a linear correlation between absorbed dose and biological effect. Those early scientists assumed that, like other toxins and insults to the body, there was a threshold below which there would be no detrimental effects. The problem was that there wasn't enough data to reliably determine that threshold. The solution was to assume the linear effect extended to zero. Everyone knew that was wrong, but it was conservative. Based on anecdotal observations of workers and those living in high background areas, an exposure below which only minimal changes were observed was identified; a high safety factor was applied, and this became the occupational limit; an additional factor was applied, and this became the public exposure limit. Note that the limits excluded medical exposure.

    As nuclear energy became a fixture of modern life and politics reared its ugly head, the linear no threshold model became an icon of the antinuclear movement. After all, if there is no safe level of radiation exposure, why are you building nuclear power plants? Some sixty years later, we have better data but discussion of a threshold or even hormesis is not permitted outside certain areas of health physics, and no one will fund any research or compilation of existing data from radiation workers.

    ReplyDelete
    Replies
    1. Thanks for that summary. It sounds supremely believable - that really seems to be the way standards get set. Which is one reason people still argue over standards for non-ionizing radiation, such as whether or not they should be in a house with a WiFi router.

      Depending on how one sets their "precautionary principle", you get the majority of Western countries with one set of limits that agree pretty well, and another few countries that set limits 1/10 or 1/100 of the majority. Can they prove the higher limit causes harm? Can anyone prove it doesn't?

      "Some sixty years later, we have better data but discussion of a threshold or even hormesis is not permitted outside certain areas of health physics, and no one will fund any research or compilation of existing data from radiation workers."  This is probably the same for the study of pilots and flight attendants that Ole Grump suggested above. (1/8 at 1506) Yet we know that Colorado, with 15-20% higher natural radiation than surrounding states, has a lower cancer rate than the states with lower radiation.

      Delete
  9. Interesting question.

    I am always thinking of this as firing a rifle round into an aircraft and hoping its capabilities would improve. Maybe if the administered radiation were part of a chemical or drug that guided the hot stuff to somewhere useful. If we knew where that was.

    As far as radiology technicians, all the ones that I have seen or known step behind a shield before pressing the "On" button. One of my step sons spent years on a team that implanted pacemakers and defibrillators using fluoroscopes and x-ray devices. He has a new job and was over-joyed to give away his lead gown, apron and face shield. Older kiddie docs [now mostly gone] and nurses frequently had "burned" hands from holding infants still for X-rays.

    Colorado is a strange place. Assessing the effects of radiation might require factoring in age and duration of exposure, Oxygen stress, composition of drinking water, composition of air borne dust, food sources, immune status, etc.

    In Colorado you might be able to do population studies if you could get the funding. Administering systemic doses of radiation as part of a long term study is likely to get you in prison or shot. ANYTHING that happened to test subjects would be "your" fault.

    Interesting, however. Maybe if you knew what you were actually trying to accomplish, a precision delivery system [probably chemical, e.g. a targeted drug such as those used for functional MRI] might be useful and keep you out of prison or the morgue.

    And then there is the matter of the off-spring of the test subjects [One head or two?].

    Sounds interesting, although your [academic] children or grandchildren will get to write the report.

    It sounds like understanding the target system would be more useful than "dropping the bomb" and crossing your fingers.

    ReplyDelete
    Replies
    1. It is an interesting question, maybe because a randomized, double-blind controlled trial just isn't possible or even ethical. All we can get are population studies from something like the occasional Chernobyl or Fukushima (and thank God they're occasional!).

      Population studies are notoriously misleading because of factors that aren't considered and can't be controlled for. The diet/health world is full of them - almost nothing but bad studies. Here, all the researchers know is that people in one ring around the blast areas had lower relative mortality rates than those closer in or farther out. It seems hard to believe it was something to do with the people, e.g., all one genetic background, and just as hard to believe it was something in the physical environment such as a mineral in the ground, or water, or something. In the case of Chernobyl, we know that the ecosystem hasn't collapsed but has become an oasis of wildlife not seen elsewhere in the region. I have to assume that's partly due to lack of pressure from people in the area. But it also bears a resemblance to something like our homeostasis systems that keep the processes in our bodies working within strict limits, only on an ecosystem level.

      It's possible that the radiation had nothing to do with the lower mortality rates around the bomb sites, but that something else that happened to those people was what did it. Perhaps famine while waiting for rescue and supplies is what made the difference. The 2016 Nobel prize in medicine was awarded for the discovery of autophagy - "self-eating" - how the body "cleanses itself" of damaged or unneeded cells during fasting. Perhaps radiation-damaged cells were destroyed during starvation after the bombing.

      I'm a fan of the saying that the most important things in science don't follow "Eureka, I found it!" but rather, "that's funny..." and following up on the oddity. The "that's funny" things are where you get some effect named after you, or a new field is discovered. More often you find you screwed up the measurement or the math. This is a "that's funny".

      Delete