Comparing the “hormesis” theory of radiation with the Linear No-Threshold (LNT) theory

US Nuclear Regulatory Commission (NRC): Consultation. Dr Ian Fairlie, Consultant on Radioactivity in the Environment, London, United Kingdom, www.ianfairlie.org, 28 Aug 15.

“……Comments on Hormesis

It is true that some cell and animal experiments indicate that if small amounts of radiation are administered before later larger amounts, the damage done is less than if no previous small amount had been given. (The word “tickle” is used in radiobiology lingo to denote such small amounts.)

On the other hand, other cell and animal studies using different doses, durations and endpoints fail to show this effect, and there is no human evidence, ie from epidemiology. But it is true that some evidence from chemistry indicates the same effect, and there is some theoretical support for an adaptive effect in animals and plants.

Hormesis advocates typically argue that although radiation attacks DNA and causes mutations, DNA repair mechanisms quickly correct these. These mechanisms are certainly numerous and busy – it is estimated that over 15,000 repairs per hour are carried out in each cell – but given the sheer number of repairs, many misrepairs occur, and it is these misrepairs that cause the damage.

But even if the existence of hormesis were accepted, the question remains – what relevance would it have for radiation protection? The answer – as stated repeatedly in official reports by UNSCEAR, BEIR and others – is zero.

For example, do we give “tickle” doses to people about to undergo radiation therapy, or to nuclear workers? Of course, we don’t. And what about background radiation? All of us receive small “tickle” doses of radiation – about 3 mSv per year of which about 1 mSv is from external gamma radiation.

Do these somehow protect us from subsequent radiation? How would we notice? And if they did, so what? That is, what relevance would it have for radiation protection, eg setting radiation standards? The answer is, again, none.

Indeed, as we show below, increasing evidence exists that even background radiation itself is harmful.

Comments on LNT

On the other hand, the scientific evidence for the LNT is plentiful, powerful and persuasive. It comes from epidemiological studies, radiobiological evidence, and official reports. Let’s examine these in turn.

A. Epidemiological Studies

Does the available epidemiological evidence show risks declining linearly with dose at low doses? Yes, recent epidemiological studies do indeed show this, and the important new points are that these are (a) very large studies with good confidence intervals, and (b) at very low doses, even down to background levels. In other words, the usual caveats about the validity of the linear shape of the dose-response relationship down to low doses are unjustified. The most recent evidence is from a particularly powerful study by Leuraud et al (2015), which shows linearly-related risks down to very low levels (average dose rate = 1.1 mGy per year). http://www.thelancet.com/journals/lanhae/article/PIIS2352-3026%2815%2900094-0/fulltext The main findings from the Leuraud study are shown in graph 1.

[Graph 1 on original document: leukaemia risk versus cumulative red bone marrow dose]

Two interesting things about this study are (a) that 5 of the 13 authors are from US scientific institutions, including the Centers for Disease Control and Prevention, the National Institute for Occupational Safety and Health, the Department of Health and Human Services, the University of North Carolina, and the Drexel University School of Public Health; and (b) that the study was funded by many international agencies, including the US Centers for Disease Control and Prevention, the US National Institute for Occupational Safety and Health, the US Department of Energy, and the US Department of Health and Human Services.
It is legitimate to ask whether the NRC is in contact with these official US agencies about its consultation. The Leuraud et al study is merely the latest of many studies providing good evidence for the LNT model. Second is the Zablotska study after Chernobyl. Graph 2 below [on original document], reproduced from Zablotska et al (2012), shows statistically significant risks for all leukaemias and for chronic lymphocytic leukaemia (CLL) in over 110,000 Chernobyl cleanup workers. It can also be seen that there are 6 data points showing increased risks below 100 mSv – a commonly cited cut-off point.
Third is the very recent cohort study of radiation exposures from medical CT scans in the UK by Pearce et al (2012), which analysed 74 diagnoses of leukaemia among 178,604 patients and 135 diagnoses of brain tumours among 176,587 patients. As shown in graph 3, reproduced from their study [on original document], the authors noted a positive association between radiation doses from CT scans and both leukaemia and brain tumours. The large dashed line shows a linear fit to the data, with the 95% confidence interval shown by small dashed lines.
Fourth are the risks from background radiation – yes, even from background radiation. Kendall et al (2012) conducted a large UK record-based case–control study testing associations between childhood cancer and natural background radiation, with over 27,000 cases and 37,000 controls. Surprisingly, they observed an elevated risk of childhood leukaemia with cumulative red bone marrow dose from natural background gamma radiation. See the similar findings in a very recent study by Spycher et al (2015) discussed on page 10 below… [more explanation and graph on original document]
Fifth is the final analysis of the UK National Registry for Radiation Workers (NRRW). This study observed 11,000 cancer cases and 8,000 cancer deaths in 175,000 UK radiation workers, with an average individual cumulative dose of 25 mSv and an average follow-up of 22 years. Graph 5 [on original document], reproduced from the study, shows the relative risks for all solid cancers, with the continuous blue line representing the NRRW data and the continuous red line the results from the US BEIR VII report for comparison – the two are very similar, as can be seen. An estimated ERR of 0.27 per Sv can be derived from this graph.
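To make the ERR figure concrete, the linear model it sits in can be sketched in a few lines. This is an illustrative calculation, not part of the NRRW study itself: under the LNT model, relative risk is assumed to rise linearly with cumulative dose as RR(D) = 1 + ERR × D.

```python
# Illustrative only: applying the NRRW-derived ERR of 0.27 per Sv under the
# LNT assumption that relative risk rises linearly with cumulative dose.
ERR_PER_SV = 0.27  # excess relative risk per sievert, from the graph above

def relative_risk(dose_sv):
    """Relative risk of solid cancer at a cumulative dose, assuming LNT."""
    return 1.0 + ERR_PER_SV * dose_sv

# At the NRRW workers' average cumulative dose of 25 mSv (0.025 Sv):
print(f"{relative_risk(0.025):.3f}")  # ~1.007, i.e. a ~0.7% excess risk
```

At the study’s average cumulative dose the predicted excess is small, which is exactly why very large cohorts with long follow-up are needed to detect it.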
Sixth is the meta-analysis of 13 European studies in 9 EU countries on indoor radon exposure risks by Darby et al (2005). This examined lung cancer risks at measured residential Rn concentrations, with over 7,000 cases of lung cancer and 14,000 controls. The action level for indoor radon in most EU countries is 200 Bq per m³, corresponding to about 10 mSv per year. (This is derived from a UNSCEAR (2000) reference value of 9 nSv per Bq·h/m³: people living two-thirds of their time indoors (5,780 h/year) at a Rn concentration of 200 Bq/m³ would receive an effective dose of ~10 mSv/year.) Graph 6 [on original document], reproduced from the study, shows elevated risks at concentrations well below this level. The solid line is the authors’ linear fit to the data.
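The radon dose conversion just quoted can be checked arithmetically. A minimal sketch, using only the figures given in the text (9 nSv per Bq·h/m³, 200 Bq/m³, 5,780 h/year):

```python
# Checking the indoor radon dose arithmetic quoted above.
DOSE_COEFF_NSV = 9        # nSv per Bq·h/m^3 (UNSCEAR 2000 reference value)
RN_CONC = 200             # Bq/m^3, the common EU indoor action level
HOURS_INDOORS = 5780      # ~2/3 of the year spent indoors

annual_dose_nsv = DOSE_COEFF_NSV * RN_CONC * HOURS_INDOORS
annual_dose_msv = annual_dose_nsv / 1e6   # 1 mSv = 10^6 nSv
print(f"{annual_dose_msv:.1f} mSv/year")  # ~10.4 mSv/year, i.e. ~10 mSv
```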
No evidence below 100 mSv?

It is necessary at this point to directly address the argument often raised by hormesis advocates – that there is little evidence of effects below 100 mSv. This is incorrect. Older evidence exists – see http://www.ianfairlie.org/news/a-100-msv-threshold-forradiation-effects/ for a list of studies – and the newer evidence, as we have just seen, clearly shows effects below 100 mSv as well.
B. Radiobiological Evidence

Current radiobiological theory is consistent with a linear dose-response relationship down to low doses (ie below ~10 mSv). The radiobiological rationale for linearity comes from the stochastic nature of energy deposition of ionising radiation. It was explained by 15 of the world’s most eminent radiation biologists and epidemiologists in a famous article (Brenner et al, 2003) as follows:
“1. Direct epidemiological evidence demonstrates that an organ dose of 10 mGy of diagnostic x-rays is associated with an increase in cancer risk.
2. At an organ dose of 10 mGy of diagnostic x-rays, most irradiated cell nuclei will be traversed by one or, at most, a few physically distant electron tracks. Being so physically distant, it is very unlikely that these few electron tracks could produce DNA damage in some joint, cooperative way; rather, these electron tracks will act independently to produce stochastic damage and consequent cellular changes.
3. Decreasing the dose, say by a factor of 10, will simply result in proportionately fewer electron tracks and fewer hit cells. It follows that those fewer cells that are hit at the lower dose will be subject to (i) the same types of electron damage and (ii) the same radiobiological processes as would occur at 10 mGy
4. Thus, decreasing the number of damaged cells by a factor of 10 would be expected to decrease the biological response by the same factor of 10; i.e., the response would decrease linearly with decreasing dose. One could not expect qualitatively different biological processes to be active at, say, 1 mGy that were not active at 10 mGy, or vice versa. The argument suggests that the risk of most radiation-induced endpoints will decrease linearly, without a threshold, from ~10 mGy down to arbitrarily low doses.”………

http://www.ianfairlie.org/wp-content/uploads/2015/08/US-NRC-Consultation-4-1.pdf
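The proportionality argued in steps 2–4 can be illustrated numerically. This is a toy model, not from the Brenner et al article: track counts per nucleus are taken as Poisson-distributed, with an assumed mean of one track per nucleus at 10 mGy (loosely based on point 2 of the quote); the fraction of nuclei hit by at least one track then becomes strictly proportional to dose as the dose falls.

```python
import math

# Toy Poisson model of the Brenner et al linearity argument. The value of
# one track per nucleus at 10 mGy is an illustrative assumption, not a
# measured figure.
MEAN_TRACKS_AT_10_MGY = 1.0

def hit_fraction(dose_mgy):
    """Expected fraction of nuclei traversed by at least one track."""
    mean_tracks = MEAN_TRACKS_AT_10_MGY * dose_mgy / 10.0
    return 1.0 - math.exp(-mean_tracks)

for dose in (10.0, 1.0, 0.1, 0.01):
    frac = hit_fraction(dose)
    # as dose falls, the hit fraction per unit dose tends to a constant:
    # the response becomes linear in dose, with no threshold
    print(f"{dose:6.2f} mGy -> hit fraction {frac:.6f} "
          f"({frac / dose:.4f} per mGy)")
```

Because the tracks act independently, the per-mGy hit fraction converges to a constant at low doses, which is the no-threshold linearity the quoted passage argues for.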