Archive for the ‘radiation’ Category

Nuclear bomb tests at Maralinga triggered Hedley Marston to study fallout over Australia

November 3, 2022

ABC Radio Adelaide / By Daniel Keane 10 Aug 22,

Hedley Marston could be charming, genial and witty but he was not above fulmination, especially where fulminations of a different kind were concerned.

In the mid-1950s, the CSIRO biochemist emerged as arguably the most significant contemporary critic of Britain’s nuclear weapons testing program, which was launched on Australia’s Montebello Islands almost 70 years ago in October 1952.

Despite the imminent anniversary, Marston remains an obscure figure, but his biographer Roger Cross believes that should change.

“He appears to be totally unknown to the Australian public and, of course, to South Australians — he was a South Australian after all,” Dr Cross said.

Marston’s reservations about the nuclear program were far from spontaneous; indeed, his strongest concerns weren’t voiced until several years after the first test, when he recorded a radioactive plume passing over Adelaide.

The source of that plume was Operation Buffalo, a series of four nuclear blasts in 1956, and Marston was especially outraged by the fact that the general population was not warned.

“Sooner or later the public will demand a commission of enquiry on the ‘fall out’ in Australia,” he wrote to nuclear physicist and weapons advocate Sir Mark Oliphant.

“When this happens some of the boys will qualify for the hangman’s noose.”

What made Marston’s fury difficult to dismiss, especially for those inclined to deride opposition to nuclear testing as the exclusive preserve of ‘commies’ and ‘conchies’, was the fact that he was no peacenik.

Detractors might have damned him as an arriviste, but never as an activist: his cordial relations with Oliphant and other scientific grandees demonstrate that Marston was, in many respects, an establishment man.

Dr Cross has described Marston’s elegant prose as “Churchillian”, and the adjective is apposite in other ways.

While the roguish Marston might not have gone as far as the British wartime leader’s assertion that, during conflict, truth is so precious “that she should always be attended by a bodyguard of lies”, he had, in a 1947 letter to the editor, publicly defended scientific secrecy:

“Under present conditions of fear and mistrust among nations it is obvious that military technology must be kept secret; and to achieve this end it should be conducted in special military laboratories where strictest security measures may be observed.”

But by late 1956, Marston’s alarm at radioactive fallout across parts of Australia was such that he was privately demanding greater disclosures to the general public.

Much of his ire was aimed at the Atomic Weapons Tests Safety Committee — a body established before the Maralinga tests, but after blasts had already occurred at Emu Field and the Montebello Islands.

“He was the only senior Australian scientist to express concerns and, because of his character, the concerns that he expressed were very forthright,” said Dr Cross, whose biography of Marston, aptly entitled Fallout, inspired the documentary Silent Storm.

“When the safety committee after each explosion said there was absolutely no effect on Australians, he believed that they were lying.”

‘If the wind changes, we need to go’

The experiments that led Marston, whose reputation largely rested on his expertise in sheep nutrition, to reach this conclusion were two-fold.

In the more protracted one, he analysed the presence of radioactive iodine-131 — a common component of nuclear fallout — in the thyroids of sheep.

“One group he kept penned up under cover eating dried hay, which had been cut some time before. The other group, he put outside eating the grass,” Dr Cross said.

“He tested the thyroids in each group – the ones on the hay only had background amounts of iodine-131.

“But the ones in the fields had a tremendously high concentration of this radioactive isotope, both north and south of the city.”

A fallout map from the 1985 royal commission, which stated that while fallout at Maralinga Village from the October 11, 1956, test was “considered to be ‘negligible from a biological point of view’ it does suggest difficulties with the forecast prior to the test”. (Royal Commission into British Nuclear Tests in Australia)

For the other experiment, Marston conducted air monitoring in Adelaide.

He was especially alarmed by what he found for the period following the Maralinga test of October 11, 1956.

“There was a wind shear and at least part, maybe the major part, of that cloud, blew in a south-easterly direction and that took it towards Adelaide and the country towns in between,” Dr Cross said.

“The safety committee — who must have known of the wind shear — had done nothing about warning Adelaide people perhaps to stay indoors.” …

Despite Marston’s reservations, the nuclear program carried on regardless.

Less than a year after the Operation Buffalo tests, Maralinga was hosting Operation Antler.

In September 1957, newspapers around Australia reported on an upcoming “second test” that would, weather permitting, proceed as part of a “spring series”.

If it hadn’t been for the presence of the words “atomic” and “radioactive”, a reader might easily have inferred that what was being described was as commonplace as a game of cricket.

 https://www.abc.net.au/news/2022-08-10/hedley-marston-maralinga-nuclear-bomb-tests-and-fallout/101310032

How Iodine Tablets Block Some Nuclear Radiation

November 3, 2022

Associated Press, News18, October 18, 2022

… This radioactive material can increase the risk of thyroid cancer if it gets into the body, for example by breathing it in or eating contaminated food. It’s especially dangerous for children, and its health risks can last for many years after exposure, according to the World Health Organization.

Iodine tablets work by filling up the thyroid with a stable version of iodine so that the radioactive kind can’t get in. If the thyroid is already packed with potassium iodide, it won’t be able to pick up the harmful iodine that’s left after a nuclear accident.

What are iodine pills? And what can they do — and what can’t they do — in the case of a nuclear leak or attack?

Potassium iodide, or KI, offers specific protection against one kind of exposure. It prevents the thyroid — a hormone-producing gland in the neck — from picking up radioactive iodine, which can be released into the atmosphere in a nuclear accident.

The pills are cheap and sold all over the world, and many countries, including the U.S., stockpile them.

But potassium iodide doesn’t protect against other kinds of radioactive threats. A nuclear bomb, for example, can release many different kinds of radiation and radioactive material that can harm many parts of the body.

Health authorities caution that potassium iodide should only be taken in certain nuclear emergencies, and works best if it’s taken close to the time of exposure. It shouldn’t be taken as a preventive measure ahead of time.

Potassium iodide doses can come with some side effects like rash, inflammation or an upset stomach. Those over 40 years old generally shouldn’t take iodine tablets unless their expected exposure is very high, according to guidelines from the U.S. Food and Drug Administration.

https://www.news18.com/news/explainers/explained-how-iodine-tablets-block-some-nuclear-radiation-6187801.html

Race Correction and the X-Ray Machine — The Controversy over Increased Radiation Doses for Black Americans in 1968

November 3, 2022

New England Journal of Medicine, by Itai Bavli, Ph.D., and David S. Jones, M.D., Ph.D.

On May 23, 1968, Howard Goldman, director of the New York Bureau of X-Ray Technology, acknowledged that x-ray technicians routinely exposed Black patients to doses of radiation that were higher than those White patients received.1 This practice, which adhered to guidelines from x-ray machine manufacturers, may have been widespread in the 1960s. Senate hearings held that month, as political unrest rocked the country, prompted public outcry and led to calls from state and federal officials to end the practice. Yet in the 21st century, despite growing interest in the problems of race and racism in medicine, race adjustment of x-rays has received little attention.2-6 It’s important to understand the origins of this practice, its rationales, its possible harms, and related controversies. The history shows how assumptions about biologic differences between Black and White people affected the theory and practice of medicine in the United States in ways that may have harmed patients. These insights can inform ongoing debates about the uses of race in medicine.7-10

… despite recent attempts to mitigate the harmful effects of racial biases in medicine, race-based beliefs and practices, especially the use of racial categories, remain widespread.8 The history of race adjustment for x-ray dosing reveals how mistaken assumptions can be admitted into medical practices — and how those practices can be ended.

Racialization of the X-Ray

The discovery of x-rays in 1895 revolutionized medicine. It allowed doctors to diagnose and treat many medical problems more easily.22 The ability to image teeth also transformed dental care. However, as x-ray technology developed in the early 20th century, false beliefs about biologic differences between Black and White people affected how doctors used this technology.

Ideas about racial differences in bone and skin thickness appeared in the 19th century and remained widespread throughout the 20th.

… The belief that Black people have denser bones, more muscle, or thicker skin led radiologists and technicians to use higher radiation exposure during x-ray procedures.

… In the 1950s and 1960s, x-ray technologists were told to use higher radiation doses to penetrate Black bodies. Roentgen Signs in Clinical Diagnosis, published in 1956, described the radiographic examination of a Black person’s skull as a “technical problem” that required a modified technique …

Debate and Denial in the Senate

The practice of giving larger x-ray doses to Black patients was brought to national attention in May 1968, when the U.S. Senate held hearings about the Radiation Control for Health and Safety Act of 1968.27

… At the hearings on May 15, Ralph Nader mentioned that technicians exposed Black patients to higher x-ray doses: “A practice widespread around the country is that by technologists and their supervisors giving Negroes one-fourth to one-half larger X-ray dosages than white patients because of a generalized intuition or folklore.”27

… Race classifications have traditionally been based on skin pigmentation and other superficial physical traits. One might have expected x-ray technologies, which see through the skin to deeper structures beneath, to be spared racialization. They were not. During the 20th century, radiologists and device manufacturers embedded racial assumptions in the basic practices of radiology. Nader, a consumer advocate working on radiation safety, exposed the practices of race adjustment to public scrutiny, triggering investigation and rapid action by federal and state officials and by physicians and device manufacturers. However, radiologists and technicians retained the ability to determine x-ray exposures. We do not know how long the practice of race adjustment actually endured. … More: https://www.nejm.org/doi/full/10.1056/NEJMms2206281

British soldiers used as radiation guinea pigs in nuclear bomb tests in Australia

August 4, 2022

British veterans ‘ordered to march through smoking craters’ in nuclear bomb tests. Brian Tomlinson claims the state dumped him and his comrades, many of whom died from cancer after being used in a shocking human experiment.

Susie Boniface, Reporter, 24 Jul 2022. https://www.mirror.co.uk/news/uk-news/british-veterans-ordered-march-through-27563987

A veteran of nuclear bomb tests has told how British servicemen were ordered to march through a smoking crater to find how radioactive it was.

Brian Tomlinson said he also had to dig out scientific instruments buried in the contaminated soil and revealed he was left with bleeding ulcers on his palms for two decades.

But he claims the state dumped him and his comrades, many of whom died from cancer in the years after they were used in a shocking human experiment in the Australian outback.

And Brian supports the Mirror’s campaign for a medal for heroes of the nuclear tests in the 50s.

“That place is still radioactive, it’s in the soil for a hell of a long time, so what chance does a human being have?” he said.

“A medal would get us a little bit of recognition for those who took part. It says you’re someone who’s been noticed and not discarded, which is how we’ve felt for so long.”

Last month, Boris Johnson became the first PM to meet veterans, and promised action before October’s 70th anniversary of the first test. His resignation threw it into doubt and campaigners are seeking reassurances from Rishi Sunak or Liz Truss that they will do the same.

Brian, now 85, was a sapper sent to Maralinga, South Australia, in 1957 to take part in Operation Antler, a series of three atomic bomb tests designed to help build the more powerful H-bomb.

His troop of Royal Engineers were blended with Australian soldiers, and 40 of them lived for a year inside the blast zone in canvas tents.

The main base, where scientists, top brass, and most troops stayed, was called Maralinga Village. Brian’s unit was 14 miles deeper into the testing grounds, at Roadside Camp. From there, it was just 9 miles to Ground Zero.

Brian, a 20-year-old corporal at the time, said: “Nobody told us what it was all about, or checked us for radiation, but every morning we went into the forward area.

“We had pneumatic drills, and had to blast down through the soil. There was about 12 inches of earth, red dust, and below that was rock.”

For each of the three blasts, the crew had to bury dozens of large steel containers 8ft square. Each had instruments inside to measure the explosion, with pipes protruding above ground level. Those closest to the bombs were sandbagged and concreted to protect them from the shockwave.

A few hours after each bomb, Brian and his crew – wearing only shorts, socks, boots and a hat – had to drive back in, remove the sandbags and concrete, and extract the instruments.

Scientists who went with them wore radiation suits and badges, but Brian said for the first two blasts he had neither.

He added: “After the third bomb, we were given little rubber boots, and a white overall, and a dose badge. We were told to walk through the crater. The mushroom cloud was still overhead. The wind had started to push it away. It was only a few hours after, not very long.”

The first two bombs, codenamed Tadje and Biak, were one kiloton and six kilotons respectively.

But the third, Taranaki, was 25 kilotons, as powerful as the weapon which destroyed Nagasaki in 1945.

Brian, of Yate, near Bristol, said: “As you approached the bomb site it was quite amazing, because it was like a bowling green. Everything was green and smooth. It was only when you were on it you realised the heat from the bomb had crystallised the earth underneath it. It was a crust of molten sand, like glass.

“The crater left there was huge. They told us to walk into that, down into the crater, and up the other side, and then check our meters to see how high the dose was.”

Brian said: “When it reached a certain point they told us to come out. It didn’t take long for it to reach that point. We weren’t told at the time what the dose was supposed to be. But it was just as bad as going through the centre of the bomb as soon as it had gone off.”

The first two bombs were detonated on top of 100ft-high towers built by the sappers, but desert sand was sucked into the fireball and fell to the ground as toxic fallout. The third bomb was tethered to barrage balloons 980ft up, supposedly minimising the risk.

But the size of the bomb, and perhaps the fact the same site was used for previous weapons tests, meant there was still fallout.

After they left the crater, Brian was taken to a decontamination area. The men’s clothes were stripped off and taken away, and the men were put through showers.

“We spent 5 or 6 minutes scrubbing away, then put ourselves in this meter, it was like standing on a weighing machine, and you push your hands through these bars to be tested. If a bell rang, you were still radioactive and had to go back in and scrub under your nails, everywhere, in your hair. I had to do it 3 times. They didn’t give us any more information.”

Documented safety measures at Maralinga included wire fences through which sand could easily be blown, and one wooden post barrier that Brian’s unit passed through each morning.

Brian was not checked for radiation while excavating amid the fallout, nor given long-term medical follow-ups. Six years later, he was medically discharged with a duodenal ulcer.

Radiation is known to cause problems with the lining of the gut, and earlier this year a government study reported nuclear test veterans were 20 per cent more likely than other servicemen to die from stomach cancer.

Brian said: “It wasn’t until later I started having skin problems. It would cover me from head to toes, rashes on my back, chest, legs, thighs. They used to come out on the palms of my hands.

“I’d get a little itchy blister in the centre of my palm, it would break and then spread over the fingers. I used to wear white cotton gloves to ease the pain and itching.

“The skin would go hard, then crack and bleed, and it would start all over again. I had that for 20 years, and no doctor could work out what it was.”


Today, cancer patients are warned radiotherapy using beta radiation can lead to radiodermatitis, which causes rashes, skin peeling, and ulceration. It is caused by the decay of isotopes, including plutonium and cobalt-60, both of which were in the Antler bombs.

Brian said: “I would have a constant itch, all over, and had to take cold showers just to stop the itching and have something of a normal life. I got depressed, to the point where I didn’t want to go and see the doctors because they just gave me the same old medication and it never did me any good. Then one day, after 20 years, it just stopped, as suddenly as it came.”

Two decades after his discharge, Brian also had an operation to finally cure his ulcer. It involved cutting the vagus nerve, which controls digestion as well as carrying sensory information from the skin’s surface.

“I told all my consultants what was done to me out there in Maralinga, and asked if it was due to fallout. They all denied it,” said Brian. “Nobody’s ever done anything for us nuclear test veterans except withhold information from us.”

Campaigners have asked the Prime Minister for a medal and a service of national recognition at Westminster Abbey to mark the Plutonium Jubilee in 3 months’ time.

A spokesman for the MoD said it was grateful to veterans, and claimed they were well-monitored and protected. He added: “The Prime Minister met with veterans recently, and asked ministers to explore how their dedication can be recognised. We remain committed to considering any new evidence.”

************************************************************************

For 40 years, the Mirror has campaigned for justice for the brave men who took part in Britain’s nuclear weapons tests.

The Ministry of Defence has fought back every step of the way.

We have told countless heartbreaking stories of grieving mums, children with deformities, men aged before their time and widows struggling to hold their families together, all while campaigning for recognition.

Two years ago we launched an appeal for a medal for the 1,500 survivors.

For the first time we were able to prove some were unwittingly used in experiments.

Our appeal was backed by then-Defence Secretary Gavin Williamson but his review foundered after he lost his job.

It had only six meetings in two years. They never asked to meet veterans. They never questioned the evidence.

Instead they asked for information from the MoD, which has a track record of denying what its own paperwork later proves.

And as our medal campaign gathered steam, civil servants simultaneously withdrew public documents from the National Archives.

Would anyone working in Whitehall today stay there, if 3 megatons of plutonium exploded south of the river?

The test veterans and their families will never stop fighting. The Mirror will never cease to demand they are heard.

Prime Minister, listen to them. Overturn this disgraceful decision.

Decadal trends in 137Cs concentrations in the bark and wood of trees contaminated by the Fukushima nuclear accident.

August 4, 2022

Published: 04 July 2022

Abstract

Understanding the actual situation of radiocesium (137Cs) contamination of trees caused by the Fukushima nuclear accident is essential for predicting the future contamination of wood. Particularly important is determining whether the 137Cs dynamics within forests and trees have reached apparent steady state. We conducted a monitoring survey of four major tree species (Japanese cedar, Japanese cypress, konara oak, and Japanese red pine) at multiple sites. Using a dynamic linear model, we analyzed the temporal trends in 137Cs activity concentrations in the bark (whole), outer bark, inner bark, wood (whole), sapwood, and heartwood during the 2011–2020 period. The activity concentrations were decay-corrected to September 1, 2020, to exclude the decrease due to the radioactive decay. The 137Cs concentrations in the whole and outer bark samples showed an exponential decrease in most plots but a flat trend in one plot, where 137Cs root uptake is considered to be high. The 137Cs concentration ratio (CR) of inner bark/sapwood showed a flat trend but the CR of heartwood/sapwood increased in many plots, indicating that the 137Cs dynamics reached apparent steady state within one year in the biologically active parts (inner bark and sapwood) and after several to more than 10 years in the inactive part (heartwood). The 137Cs concentration in the whole wood showed an increasing trend in six plots. In four of these plots, the increasing trend shifted to a flat or decreasing trend. Overall, the results show that the 137Cs dynamics within forests and trees have reached apparent steady state in many plots, although the amount of 137Cs root uptake in some plots is possibly still increasing 10 years after the accident. Clarifying the mechanisms and key factors determining the amount of 137Cs root uptake will be crucial for predicting wood contamination.

Introduction

After the Fukushima Dai-ichi Nuclear Power Plant (FDNPP) accident in March of 2011, a wide area of forests in eastern Japan was contaminated with radionuclides. In particular, radiocesium (137Cs) has the potential to threaten the forestry and wood production in the contaminated area for many decades because it was released in large amounts (10 PBq)1 and has a relatively long half-life (30 years). Radiocesium levels for some wood uses are strictly regulated in Japan (e.g., 40 Bq kg⁻¹ for firewood2 and 50 Bq kg⁻¹ for mushroom bed logs3), meaning that multipurpose uses of wood from even moderately contaminated areas are restricted. Although a guidance level of radiocesium in construction wood has not been declared in Japan, the permissible levels in some European countries (370–740 Bq kg⁻¹)4,5,6 suggest that logging should be precautionary within several tens of kilometers from the FDNPP, where the 137Cs activity concentration in wood potentially exceeds 1,000 Bq kg⁻¹ [refs. 7,8]. To determine whether logging should proceed, the long-term variation in wood 137Cs concentration must be predicted as accurately as possible. Many simulation models successfully reproduce the temporal variations in the early phase after the FDNPP accident, but produce large uncertainties in long-term predictions9. To understand the 137Cs dynamics in forests and trees and hence refine the prediction models, it is essential to provide and analyze the observational data of 137Cs activity concentrations in tree stem parts.

Accident-derived 137Cs causes two types of tree contamination: direct contamination by 137Cs fallout shortly after the accident, and indirect contamination caused by surface uptake from directly contaminated foliage/bark10,11 and root uptake from contaminated soil12. The 137Cs concentration in bark that pre-existed the accident was affected by both 137Cs drop/wash off from bark surfaces and 137Cs uptake, because the bark consists of a directly contaminated outer bark (rhytidome) and an indirectly contaminated inner bark (phloem). Given that the 137Cs content was 10 times higher in the outer bark than in the inner bark in 201213 and the 137Cs concentration in the whole bark decreased during the 2011–2016 period at many study sites8, the temporal variation in the whole bark 137Cs concentration during the early post-accident phase must have been driven mainly by drop/wash off of 137Cs from the outer bark surface.

In contrast, stem wood (xylem) covered by bark was contaminated only indirectly. Although 137Cs distribution in sapwood (outer part of stem wood; containing living cells) and heartwood (inner part of stem wood; containing no living cells) is non-uniform and species-specific8,13,14,15, the 137Cs concentration in whole wood depends on the amount of 137Cs uptake. Because the dissolvable 137Cs on the foliar/bark surface decreased significantly within 201116, the main route of 137Cs uptake since 2012 is likely root uptake rather than surface uptake. A monitoring survey during 2011–2016 showed that the temporal trend in the whole wood 137Cs concentration can be increasing, decreasing, or flat8, suggesting that 137Cs root uptake widely differs among sites and species.

Meanwhile, many simulation models have predicted an initial increase in the whole wood 137Cs concentration after the accident, followed by a gradual decline9. The initial increase is attributable to the increase in soil 137Cs inventory, and the following decline is mainly attributed to radioactive decay, dilution by wood biomass increment, and immobilization in the soil. Therefore, the trend shift from increasing to decreasing is a good indicator that shows the 137Cs dynamics within the forest have reached apparent steady state, which is characterized by slower changes in 137Cs concentration, bioavailability, and partitioning in the forest12,17,18. However, the timing of the trend shift predicted by the models has large uncertainty, varying from several years to a few decades from the accident9. Moreover, the trend shift has not been confirmed by observational data after the FDNPP accident. Although our monitoring survey cannot easily identify the key driving factors of the temporal trends, it can directly discern the trend shift from increasing to decreasing, and the timeframe of the increasing trend. The confirmation of the trend shift will accelerate the understanding of key factors of 137Cs root uptake, because important parameters such as transfer factor and CR are originally defined for a steady state condition18.

The present study aims to clarify the temporal trends of 137Cs concentrations in bark and wood of four major tree species (Japanese cedar, Japanese cypress, konara oak, and Japanese red pine) at multiple sites during the 10 years following the FDNPP accident. Detecting a trend shift from increasing to decreasing in the wood 137Cs concentration was especially important to infer whether the 137Cs dynamics within the forest have reached apparent steady state. We update Ohashi et al.8, who analyzed the monotonic increasing or decreasing trends during 2011–2016, with observational data of 2017–2020 and a more flexible time-series analysis using a dynamic linear model (DLM). The DLM is suitable for analyzing data including observational errors and autocorrelation, and has the advantage of being applicable to time-series data with missing years. For a more detailed understanding of bark contamination and the 137Cs dynamics in tree stems, we also newly provide data on the 137Cs concentrations in the outer and inner barks. The temporal trends in the 137Cs CRs of outer bark/inner bark, heartwood/sapwood, and inner bark/sapwood were analyzed to confirm whether the 137Cs dynamics within the trees have reached apparent steady state.

Materials and methods

Monitoring sites and species

The monitoring survey was conducted at five sites in Fukushima Prefecture (sites 1–4 and A1) and at one site in Ibaraki Prefecture (site 5), Japan (Fig. 1). Sites 1, 2, and A1 are located in Kawauchi Village, site 3 in Otama Village, site 4 in Tadami Town, and site 5 in Ishioka City. Monitoring at sites 1–5 started in 2011 or 2012, and site A1 was added to the monitoring in 2017. The tree species, age, mean diameter at breast height, initial deposition density of 137Cs, and sampling year of each sample at each site are listed in Table 1. The dominant tree species in the contaminated area, namely, Japanese cedar (Cryptomeria japonica [L.f.] D.Don), Japanese cypress (Chamaecyparis obtusa [Siebold et Zucc.] Endl.), konara oak (Quercus serrata Murray), and Japanese red pine (Pinus densiflora Siebold et Zucc.) were selected for monitoring. Japanese chestnut (Castanea crenata Siebold et Zucc.) was supplementally added in 2017. The cedar, cypress, and pine are evergreen coniferous species, and the oak and chestnut are deciduous broad-leaved species. Sites 1 and 3 each have three plots, and each plot contains a different monitoring species. Site A1 has one plot containing two different monitoring species, and the remaining sites each have one plot with one monitoring species, giving ten plots in total.

Locations of the monitoring sites and initial deposition densities of 137Cs (decay-corrected to July 2, 2011) following the Fukushima nuclear accident in Fukushima and Ibaraki Prefectures. Open circles indicate the monitoring sites and the cross mark indicates the Fukushima Dai-ichi Nuclear Power Plant. Data on the deposition density were provided by MEXT19,20 and refined by Kato et al.21. The map was created using R (version 4.1.0)22 with ggplot2 (version 3.3.5)23 and sf (version 1.0–0)24 packages.

Sample collection and preparation

Bulk sampling of bark and wood disks was conducted by felling three trees per year at all sites during 2011–20168,25 and at sites 3–5 and A1 during 2017–2020. Partial sampling from six trees per year was conducted at sites 1 and 2 during 2017–2020 (from seven trees at site 2 in 2017) to sustain the monitoring trees. All the samples were obtained from the stems around breast height. During the partial sampling, bark pieces sized approximately 3 cm × 3 cm (axial length × tangential length) were collected from four directions of the tree stem using a chisel, and 12-mm-diameter wood cores were collected from two directions of the tree stem using an automatic increment borer (Smartborer, Seiwa Works, Tsukuba, Japan) equipped with a borer bit (10–101-1046, Haglöf Sweden, Långsele, Sweden). Such partial sampling increases the observational errors in the bark and wood 137Cs concentrations in individual trees26. To mitigate this error and maintain an accurate mean value of the 137Cs concentration, we increased the number of sampled trees from three to six. The sampling was conducted mainly in July–September of each year; the exceptions were site-5 samples in 2011 and 2012, which were collected irregularly during January–February of the following year. The collected bark pieces were separated into outer and inner barks, and the wood disks and cores were split into sapwood and heartwood. The outer and inner bark samples during 2012–2016 were obtained by partial sampling of barks sized approximately 10 cm × 10 cm from 2–3 directions on 2–3 trees per year.

The bulk samples of bark, sapwood, and heartwood were air-dried and then chipped into flakes using a cutting mill with a 6-mm mesh sieve (UPC-140, HORAI, Higashiosaka, Japan). The pieces of the outer and inner bark were chipped into approximately 5 mm × 5 mm pieces using pruning shears, and the cores of the sapwood and heartwood were chipped into semicircles of thickness 1–2 mm. Each sample was packed into a container for radioactivity measurements and its mass was measured after oven-drying at 75 °C for at least 48 h. Multiplying this mass by the conversion factor (0.98 for bark and 0.99 for wood)8 yielded the dry mass at 105 °C.

Radioactivity measurements

The radioactivity of 137Cs in the samples was determined by γ-ray spectrometry with a high-purity Ge semiconductor detector (GEM20, GEM40, or GWL-120, ORTEC, Oak Ridge, TN). For measurements, the bulk and partial samples were placed into Marinelli containers (2.0 L or 0.7 L) and cylindrical containers (100 mL or 5 mL), respectively. The peak efficiencies of the Marinelli containers, the 100-mL container, and the 5-mL container were calibrated using standard sources of MX033MR, MX033U8PP (Japan Radioisotope Association, Tokyo, Japan), and EG-ML (Eckert & Ziegler Isotope Products, Valencia, CA), respectively. For the measurement of the 5-mL container, a well-type Ge detector (GWL-120) was used under the empirical assumption that the difference in γ-ray self-absorption between the standard source and the samples is negligible27. The measurement was continued until the counting error became less than 5% (higher counting errors were allowed for small or weakly radioactive samples). The activity concentration of 137Cs in the bark (whole) collected by partial sampling was calculated as the mass-weighted mean of the concentrations in the outer and inner barks; meanwhile, the concentration in the wood (whole) was calculated as the cross-sectional-area-weighted mean of sapwood and heartwood concentrations. The activity concentrations were decay-corrected to September 1, 2020, to exclude the decrease due to the radioactive decay.
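
To make the decay correction and the two weighted means described above concrete, here is a minimal sketch (not the authors' code). The 137Cs half-life used is the commonly cited ~30.1 years, and all numerical inputs are invented for illustration; only the correction-to-a-common-reference-date logic and the mass- and area-weighting rules follow the text.

```python
import math

CS137_HALF_LIFE_YEARS = 30.1  # approximate half-life of 137Cs

def decay_correct(activity_bq_per_kg, years_to_reference_date):
    """Decay-correct a measured 137Cs concentration forward to a later
    reference date, so that remaining temporal variation excludes decay."""
    decay_constant = math.log(2) / CS137_HALF_LIFE_YEARS
    return activity_bq_per_kg * math.exp(-decay_constant * years_to_reference_date)

def whole_bark(c_outer, mass_outer, c_inner, mass_inner):
    """Mass-weighted mean of outer- and inner-bark concentrations (Bq/kg)."""
    return (c_outer * mass_outer + c_inner * mass_inner) / (mass_outer + mass_inner)

def whole_wood(c_sap, area_sap, c_heart, area_heart):
    """Cross-sectional-area-weighted mean of sapwood and heartwood (Bq/kg)."""
    return (c_sap * area_sap + c_heart * area_heart) / (area_sap + area_heart)

# Invented example: a sample measured in September 2016, referenced to September 1, 2020
print(round(decay_correct(850.0, 4.0), 1))            # -> ~775.2 Bq/kg
print(round(whole_bark(1200.0, 0.4, 300.0, 0.6), 1))  # -> 660.0 Bq/kg
print(round(whole_wood(90.0, 350.0, 30.0, 150.0), 1)) # -> 72.0 Bq/kg
```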

Discussion

Causes of temporal trends in bark 137Cs concentration

The 137Cs concentration in the whole bark decreased in many plots, clearly because the outer bark 137Cs concentration decreased. However, the whole bark 137Cs concentration showed a relatively small decrease or even a flat trend in some plots (site-2 cedar and site-1 cypress and oak). In the site-1 cypress plot, where the whole bark 137Cs concentration decreased relatively slowly, the inner bark 137Cs concentration notably increased. Similarly, although we lack early phase monitoring data in the site-2 cedar and site-1 oak plots, the inner bark 137Cs concentration in both plots is considered to have increased prior to monitoring because the sapwood 137Cs concentration increased in both plots and the CR of inner bark/sapwood was constant in all other plots. Therefore, the low-rate decrease or flat trend in the whole bark 137Cs concentration in some plots was probably caused by an increase in the inner bark 137Cs concentration, itself likely caused by high 137Cs root uptake (as discussed later).

The 137Cs concentration in the outer bark decreased in all four plots monitored since 2012 (site-1 and site-3 cedar, site-1 cypress, and site-3 pine), confirming the 137Cs drop/wash off from the bark surface. The constant (exponential) decrease in three of these plots indicates that the 137Cs drop/wash off was still continuing in 2020 but with smaller effect on the outer bark 137Cs concentration. In contrast, the decrease in the site-1 cypress plot appeared to slow down from around 2017. Furthermore, Kato et al.32 reported no decrease in 137Cs concentration in the outer bark of Japanese cedar during the 2012–2016 period. Such cases cannot be fitted by a simple decrease of the outer bark 137Cs concentration. From a longer-term perspective, in the outer bark of Norway spruces (Picea abies) affected by the Chernobyl nuclear accident, the biological half-life of 137Cs concentration was extended in areas with higher precipitation, suggesting that high root uptake of 137Cs hinders the decreasing trend33. The present study showed that 70–80% or more of the 137Cs deposited on the bark surface (outer bark) was removed by drop/wash off after 10 years from the accident and that the 137Cs CR of outer bark/inner bark became constant in some plots. These facts suggest that the longer-term variations in outer bark 137Cs concentration will be more influenced by 137Cs root uptake, although it is uncertain whether root uptake caused the slowing down of the decrease rate seen in the site-1 cypress plot. Further studies are needed to understand the 137Cs concentration in newly formed outer bark and to determine the 137Cs CR of outer bark/inner bark at steady state.

Causes of temporal trends in wood 137Cs concentration

The temporal trends of the 137Cs concentration in the whole wood basically corresponded to those in the sapwood. The exceptions were the site-3 and site-4 cedar plots, where the sapwood 137Cs concentration did not increase but the whole wood 137Cs concentration was raised by the notable increase in the heartwood 137Cs concentration. This behavior can be attributed to a species-specific characteristic of Japanese cedar, which facilitates Cs transfer from sapwood to heartwood8,15,34. The present study newly found that the increase in the 137Cs CR of heartwood/sapwood in the cedar plots became smaller or shifted to a flat trend around 2015–2016, indicating that 137Cs transfer between the sapwood and heartwood has reached apparent steady state at many sites 10 years after the accident. Therefore, after 2020, the whole wood 137Cs concentration in cedar is unlikely to increase without a concomitant increase in the sapwood 137Cs concentration.

The increasing trends in the 137Cs concentrations in whole wood and sapwood (site-2 cedar, site-1 cypress, and site-1 and site-3 oak plots) are seemingly caused by the yearly increase in 137Cs root uptake; however, the wood 137Cs concentration can also increase when the 137Cs root uptake is constant or even slightly decreases each year. This behavior can be shown in a simple simulation of the temporal variation in the wood 137Cs content (the amount of 137Cs in stem wood of a tree). If the 137Cs dynamics within a tree have reached steady state and the proportion of 137Cs allocated to stem wood become apparently constant, the wood 137Cs content in a given year can be considered to be determined by the amount of 137Cs root uptake and the amount of 137Cs emission via litterfall. The flat 137Cs CR trend of inner bark/sapwood during 2012–2020 (see Fig. 5) indicates that the 137Cs dynamics, at least those between the inner bark and sapwood, reached apparent steady state within 2011. Here we assume that (1) the annual amount of 137Cs root uptake is constant, (2) the proportion of 137Cs allocated to stem wood is apparently constant, and as assumed in many forest Cs dynamics models17,35,36,37, (3) a certain proportion of 137Cs in the stem wood is lost via litterfall each year. Under these conditions, the simulated amount of 137Cs emission balanced the amount of 137Cs root uptake after sufficient time, and the wood 137Cs content approached an asymptotic value calculated as [root uptake amount × allocation proportion × (1/emission proportion − 1)]. Note that the asymptotic value increases with increasing root uptake amount and decreasing emission proportion and does not depend on the amount of 137Cs foliar/bark surface uptake in the early post-accident phase. Nevertheless, the amount of 137Cs surface uptake in the early phase critically determines the trend of the wood 137Cs content. More specifically, the trend in the early phase will be increasing (decreasing) if the surface uptake is smaller (larger) than the asymptotic value. Finally, the temporal variation of the 137Cs concentration in wood is thought to be the sum of the dilution effect of the increasing wood biomass and the above-simulated variation in the wood 137Cs content. Therefore, in the early post-accident phase, the wood 137Cs concentration will increase when the wood 137Cs content increases at a higher rate than the wood biomass. As the wood 137Cs content approaches its asymptotic value (i.e., steady state), its increase rate slows and the dilution effect proportionally increases. Then, the wood 137Cs concentration shifts from an increasing trend to a decreasing trend. The trends of the 137Cs concentrations in whole wood and sapwood in the site-3 oak plot follow this basic temporal trend, which is similarly predicted by many simulation models9.
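
The box-model reasoning in the preceding paragraph can be made concrete with a short numerical sketch (my own illustration, not the authors' model). Every parameter value is invented; the point is only to show that, with constant annual root uptake, the wood 137Cs content approaches the asymptote of root uptake x allocation x (1/emission - 1) given in the text, while the concentration first rises and then falls once biomass dilution dominates.

```python
# Illustrative box model of stem-wood 137Cs content and concentration.
# Every parameter value below is invented for demonstration only.

root_uptake = 100.0    # Bq of 137Cs taken up by roots each year (assumed constant)
allocation = 0.2       # proportion of the uptake allocated to stem wood
emission = 0.1         # proportion of stem-wood 137Cs lost via litterfall each year
surface_uptake = 50.0  # Bq allocated to wood from early foliar/bark surface uptake
biomass = 100.0        # kg of stem wood at the time of the accident
growth = 5.0           # kg of new stem wood added per year (dilution effect)

# Asymptotic wood 137Cs content, as given in the text:
asymptote = root_uptake * allocation * (1.0 / emission - 1.0)
print(f"asymptotic content: {asymptote:.0f} Bq")  # 180 Bq > 50 Bq, so an initial increase

content = surface_uptake
for year in range(1, 21):
    # gain from root uptake, then proportional loss via litterfall
    content = (content + root_uptake * allocation) * (1.0 - emission)
    biomass += growth
    concentration = content / biomass  # Bq per kg of wood
    print(f"year {year:2d}: content {content:6.1f} Bq, concentration {concentration:.2f} Bq/kg")
```

With these invented numbers the concentration rises for roughly the first decade and then turns downward as the content levels off near its asymptote, which is the trend shift the paragraph describes; an initial surface uptake larger than the asymptote would instead give a decreasing trend from the start.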

In other plots with the increasing trend (site-2 cedar and site-1 cypress and oak), the increase in the 137Cs concentrations in whole wood and sapwood became smaller or shifted to a flat trend around six years after the accident; however, it did not shift to a decreasing trend. This lack of any clear shift to a decreasing trend, which was similarly seen at sites with hydromorphic soils after the Chernobyl nuclear accident38,39, cannot be well explained by the above simulation. A core assumption of the simulation that the yearly amount of 137Cs root uptake is constant is probably violated in these plots, leading to underestimations of the root uptake amount. Although the inventory of exchangeable 137Cs in the organic soil layer has decreased yearly since the accident, that in the mineral soil layer at 0–5 cm depth has remained constant40. In addition, the downward migration of 137Cs has increased the 137Cs inventory in the mineral soil layer below 5-cm depth41,42. If the steady state 137Cs inventory of the root uptake source can be regarded as sufficient for trees, any increase in the 137Cs root uptake is likely explained by expansion of the root distribution and the increase in transpiration (water uptake) with tree growth. When the wood 137Cs content increases at a similar rate to the wood biomass, the increasing trend will not obviously shift to a decreasing trend. Therefore, assuming the 137Cs allocation and emission proportions in the mature trees do not change considerably with time, the amount of 137Cs root uptake is considered to be increasing yearly in these four plots.

In the remaining plots with the decreasing or flat trend (site-1 cedar, site-4 cedar without outliers, site-5 cypress, and site-3 pine), according to the above simulation, the amount of initial 137Cs surface uptake was larger than or similar to the asymptotic value, i.e. the amount of 137Cs root uptake is relatively small and/or the proportion of 137Cs emission via litterfall is relatively high. However, the amount of 137Cs root uptake in the plots with the flat trend is possibly increasing because the flat trend has not shifted to a decreasing trend. In these plots, although it is difficult to confirm apparent steady state of the soil–tree 137Cs cycling because of the lack of an initial increasing trend, the recent flat trends in the 137Cs CRs of heartwood/sapwood and inner bark/sapwood indicate that the 137Cs dynamics, at least within the trees, have reached apparent steady state.

Various factors were found to increase the 137Cs root uptake after the Chernobyl nuclear accident; for example, high soil water content, high soil organic and low clay content (i.e., low radiocesium interception potential [RIP]), low soil exchangeable K concentration, and high soil exchangeable NH4 concentration12,43. After the FDNPP accident, the 137Cs transfer from soil to Japanese cypress and konara oak was found to be negatively correlated with the soil exchangeable K concentration44,45 and the 137Cs mobility is reportedly high in soils with low RIP46. However, neither the soil exchangeable K and Cs concentrations nor the RIP have explained the different 137Cs aggregated transfer factors (defined as [137Cs activity concentration in a specified component/137Cs activity inventory in the soil]) of Japanese cedars at sites 1–446,47. Because the 137Cs dynamics within the forest and trees in many plots reached apparent steady state at 10 years after the FDNPP accident, the 137Cs aggregated transfer factor is now considered to be an informative indicator of the 137Cs root uptake. Therefore, a comprehensive analysis of the 137Cs aggregated transfer factor and the soil properties at more sites than in the present study will be important to understand key factors determining the amount of 137Cs root uptake by each tree species at each site.
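
For readers unfamiliar with the quantity, a tiny sketch of the aggregated transfer factor exactly as defined in the parentheses above; the numbers are invented and serve only to show the units involved.

```python
def aggregated_transfer_factor(component_bq_per_kg, soil_inventory_bq_per_m2):
    """Aggregated transfer factor: 137Cs activity concentration in a specified
    component (Bq/kg) divided by the 137Cs activity inventory in the soil
    (Bq/m2), giving units of m2/kg."""
    return component_bq_per_kg / soil_inventory_bq_per_m2

# Invented example: 500 Bq/kg in wood over a deposition of 500,000 Bq/m2
print(aggregated_transfer_factor(500.0, 500_000.0))  # -> 0.001 m2/kg
```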

Validity and limitation of the trend analyses

Although the application of the smooth local linear trend model failed in plots monitored for less than five years, it was deemed suitable for analyzing the decadal trend because it removes annual noise, which is probably caused by relatively large observational errors (including individual variability)26. Moreover, the algorithm that determines the trend and its shift between 2 and 4 delimiting years was apparently reasonable, because the detected trends matched our intuition well. However, when judging a trend, the algorithm simply assesses whether the true state values significantly differ between the delimiting years. Therefore, it cannot detect changes in the increase/decrease rate (i.e., whether an increasing/decreasing trend is approaching a flat trend). For example, the whole bark 137Cs concentration in the site-1 cypress plot was determined to decrease throughout the monitoring period. In fact, the decrease rate slowed around 2014 and the decreases were slight between 2014 and 2020 (see Fig. 2). Similarly, the sapwood 137Cs concentration in the site-1 cypress and oak plots was determined to increase throughout the monitoring period, but the increase rate has clearly slowed since around 2017. To more sensitively detect the shift from an increasing/decreasing trend to a flat trend, other algorithms are required. Nevertheless, this algorithm is acceptable for the chief aim of the present study; that is, to detect a trend shift from increasing to decreasing.
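
As an illustration of the general approach (not a reproduction of the authors' analysis), the sketch below fits a smooth local linear trend, one common form of dynamic linear model, to an invented ten-year series using statsmodels, and then makes a crude increasing/decreasing/flat judgement by comparing the smoothed state between the first and last years. The paper's actual algorithm compares the true state values between 2 and 4 delimiting years; the two-year comparison, the data, and the 2-sigma rule here are my assumptions.

```python
# Illustration only: a smooth local linear trend (integrated random walk)
# fitted to an invented annual series of 137Cs concentrations.
import numpy as np
import statsmodels.api as sm

years = np.arange(2011, 2021)
y = np.array([120, 150, 175, 190, 215, 220, 230, 228, 225, 218], dtype=float)  # invented Bq/kg

model = sm.tsa.UnobservedComponents(y, level="smooth trend")
result = model.fit(disp=False)

level = result.level.smoothed                  # smoothed "true state" for each year
level_se = np.sqrt(result.level.smoothed_cov)  # its standard error

# Crude two-delimiting-year check (first vs last year of the series):
diff = level[-1] - level[0]
se = np.hypot(level_se[0], level_se[-1])
verdict = "increasing" if diff > 2 * se else "decreasing" if diff < -2 * se else "flat"
print(f"{years[0]}-{years[-1]}: {verdict} (difference {diff:.1f} +/- {se:.1f})")
```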

Conclusions

In many plots monitored at Fukushima and Ibaraki Prefectures, the 137Cs concentrations in the whole and outer bark decreased at almost the same yearly rate for 10 years after the FDNPP accident, indicating that the direct contamination of the outer bark was mostly but not completely removed during this period. Moreover, the 137Cs concentration in the whole bark decreased at relatively low rates or was stable in plots where the 137Cs root uptake was considered to be high. This fact suggests that indirect contamination through continuous root uptake can reach the same magnitude as direct contamination by the accident.

In all of our analyzed plots, the 137Cs CR of inner bark/sapwood has not changed since 2012, indicating that 137Cs transfer among the biologically active parts of the tree stem had already reached apparent steady state in 2011. In contrast, the 137Cs CR of heartwood/sapwood in six out of nine plots increased after the accident. In four of these plots, the 137Cs CR of heartwood/sapwood plateaued after 3–6 years; in the other two plots, the plateau was not reached even after 10 years. Therefore, saturation of 137Cs in heartwood (an inactive part of the tree stem) requires several years to more than one decade.

The 137Cs concentration in the whole wood showed an increasing trend in six out of nine plots. In four of these plots, the increasing trend shifted to a flat or decreasing trend, indicating that the 137Cs dynamics in many forests reached apparent steady state at 10 years after the accident. However, the lack of the clear shift to a decreasing trend indicates that the 137Cs root uptake is probably still increasing in some plots. Continuous monitoring surveys and further studies clarifying the complex mechanisms of 137Cs root uptake in forests are needed in order to refine the simulation models and improve their prediction accuracy.

https://www.nature.com/articles/s41598-022-14576-1

Putting People First in Low-Dose Radiation Research

August 4, 2022

Putting People First in Low-Dose Radiation Research, Bemnet Alemayehu, Natural Resources Defense Council, 7 June 22. It is urgent and feasible to improve our understanding of low-dose and low-dose-rate ionizing radiation health effects, according to a new report released by the National Academies of Sciences, Engineering, and Medicine (NAS). At the request of the U.S. Congress, the NAS formed a committee of experts to conduct the study, sponsored by the U.S. Department of Energy. The report’s primary goal was to recommend a research program to increase the certainty of how exposure to low-dose and low-dose-rate radiation affects human health.

NRDC agrees that this is the right time to reconsider low-dose interdisciplinary radiation research in the United States and explore opportunities that advances in radiation health physics and information technology are providing. A large fraction of the U.S. population is exposed to low-dose and low-dose-rate radiation, and this number is increasing. Low-dose radiation research is especially relevant to impacted communities, which have experienced a disproportionate level of radiation exposure compared with the general U.S. population because of activities carried out as part of the U.S. nuclear weapons program. Going forward, the study should give an opportunity for stakeholders and impacted communities to have deep and meaningful engagement at all stages of the research program by identifying the research priorities that concern them. The study should also prioritize trust building and make use of local community expertise.

How are we exposed to low-dose radiation?

People are exposed to ionizing radiation from a variety of sources. Most of this exposure comes from background radiation sources and from medical procedures.

Ionizing radiation is radiation that carries with it enough energy to remove an electron from an atom. This process can initiate a chain of events leading to health problems. When considering the health effects of radiation, understanding the amount of radiation dose absorbed by a person or an organ is critical.

Low dose and low dose rate (a low dose accumulated over several years) are defined as a dose below 100 milligray and a dose rate below 5 milligray per hour, respectively. The gray is a unit used to measure the amount of radiation absorbed by an object or person, reflecting the amount of energy that radioactive sources deposit in materials through which they pass. Low-dose radiation exposure includes exposure to natural radiation, medical applications, and occupational exposures. According to the NAS report, low doses of radiation delivered over long periods do not cause prompt tissue or organ damage but may cause cellular damage that increases an individual’s long-term risk of cancer and hereditary disorders in a stochastic (or probabilistic) fashion.
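
As a quick concreteness check of the thresholds just quoted, here is a tiny sketch; the threshold values come from the text, while the function name and the example exposure are my own invention.

```python
LOW_DOSE_GY = 0.100             # 100 milligray, per the definition above
LOW_DOSE_RATE_GY_PER_H = 0.005  # 5 milligray per hour, per the definition above

def classify_exposure(total_dose_gy, dose_rate_gy_per_h):
    """Report whether an exposure falls under the low-dose and/or
    low-dose-rate definitions quoted from the NAS report."""
    return {
        "low dose": total_dose_gy < LOW_DOSE_GY,
        "low dose rate": dose_rate_gy_per_h < LOW_DOSE_RATE_GY_PER_H,
    }

# Invented example: 20 mGy accumulated at an average of 0.002 mGy per hour
print(classify_exposure(0.020, 0.000002))  # -> {'low dose': True, 'low dose rate': True}
```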

The NAS report identified the following seven low-dose and low-dose-rate radiation exposure sources to be relevant for the study:

  • exposure from natural radiation sources
  • exposure to patients from medical applications
  • occupational exposures
  • exposure of workers that results from routine nuclear power operations and accidents
  • exposure from nuclear or radiological incidents
  • exposures from the nuclear weapons program, and
  • exposure from nuclear waste.

Key recommendations from the report

Research agenda

Ionizing radiation occurs in a wide range of settings and the number of exposed individuals is increasing. However, the relationship between exposure to radiation and cancer risk at very low doses is not well established. Currently, there is also no dedicated low-dose and low-dose-rate radiation research program or coordinated research strategy in the United States.

The report recommended research programs that leverage advances in modern science to obtain direct information on low-dose and low-dose-rate radiation health effects. These are:

  • advances in epidemiological study design and analysis
  • advances in radiobiological research
  • advances in biotechnology and research infrastructure

For the research to achieve its goals, integration and interaction between these research programs is critical.

Program funding

The report found that a significant investment over a sustained period spanning several decades is necessary to accomplish the research goals. The report estimated that $100 million annually is needed during the first 10 to 15 years with periodic assessments. The report cautioned that inadequate funding for the program would lead to the possible inadequate protection of patients, workers, and members of the public from the adverse effects of radiation.

Leadership for low-dose research in the United States

The report proposed joint Department of Energy and National Institutes of Health leadership for low-dose radiation research that involves division of tasks based on capabilities. The report also recommended that the Department of Energy take strong and transparent steps to mitigate the issues of distrust toward research that it manages.

Engagement with impacted communities

Success of the low-dose radiation program would depend not only on its scientific integrity but also on its ability to meaningfully engage and communicate with stakeholders, including impacted communities.

Impacted communities, according to the report, include indigenous communities; atomic veterans; nuclear workers; uranium miners, transporters, and their families; and individuals or communities impacted by radioactive contamination or nuclear fallout due to nuclear weapons testing, offsite radiation releases from nuclear weapons production sites, and nuclear waste cleanup activities. 

Impacted communities have strongly objected to the Department of Energy’s management of the low-dose radiation program because the Department’s responsibility for the management and cleanup of nuclear sites conflicts with its role as a manager of studies on low-dose and low-dose-rate radiation health effects.

For the success of the low-dose radiation program, the program needs to:

  • develop a transparent process for stakeholder identification, engagement, and communication
  • include members of the impacted communities in the independent advisory committee so that they may participate in various aspects of research planning and implementation, and
  • set up additional advisory subcommittees with substantial stakeholder participation to advise on specific projects that involve human populations exposed to low-dose radiation.

Tritium isn’t harmless

August 4, 2022

Dumping Fukushima’s radioactive water is one of many wrong options

Tritium isn’t harmless — Beyond Nuclear International. Japan’s plan to dump tritiated water into the ocean comes with big risks. https://wordpress.com/read/feeds/72759838/posts/4028994254
On May 18, Japan’s Nuclear Regulation Authority gave its initial approval for Tokyo Electric Power to release radioactive water from the destroyed Fukushima nuclear power plant into the Pacific Ocean, claiming that there are no safety concerns. But science disagrees with this conclusion. In a September 2019 blog entry, now updated by the author, Dr. Ian Fairlie looks at the implications of dumping largely tritiated water into the sea and whether there are any viable alternatives.
By Ian Fairlie

At the present time, over a million tonnes of tritium-contaminated water are being held in about a thousand tanks at the site of the Fukushima Daiichi nuclear power station in Japan. This is being added to at the rate of ~300 tonnes a day from the water being pumped in to cool the melted nuclear fuels from the three destroyed reactors at Fukushima. Therefore new tanks are having to be built each week to cope with the influx.

These problems constitute a sharp reminder to the world’s media that the nuclear disaster at Fukushima did not end in 2011 and is continuing with no end in sight.

Recently TEPCO / Japanese Government have been proposing to dilute, then dump, some or all of these tritium-contaminated waters from Fukushima into the sea off the coast of Japan. This has been opposed by Japanese fishermen and environment groups.

There has been quite a media debate, especially in Japan, about the merits and demerits of dumping tritium into the sea. 

Many opinions have been voiced in the debate: most are either incorrect or uninformed or both. This post aims to rectify matters and put the discussion on a more sound technical basis.

  1. TEPCO / Japanese Government have argued that, as tritium is naturally occurring, it is OK to discharge more of it. This argument is partly correct but misleading. It is true that tritium is created in the stratosphere by cosmic ray bombardment, but the argument that, because it exists naturally, it’s OK to dump more is false. For example, dioxins, furans and ozone are all highly toxic and occur naturally, but dumping more of them into the environment would be regarded as anti-social and to be avoided.
  2. TEPCO / Japanese Government have argued that it is safe to dump tritium because it already exists in the sea. Yes, tritium is there, but at low concentrations of a few becquerels per litre (Bq/l). The tritium concentrations in the holding tanks at Fukushima, by contrast, are typically about a megabecquerel per litre (MBq/l). In layman’s terms, that’s about a million times more concentrated.
  3. TEPCO / Japanese Government have argued that coastal nuclear plants routinely dump water that contains tritium into the ocean. Yes, this does (regrettably) occur, as their cooling waters become tritiated during their transits of reactor cooling circuits. But two wrongs do not make a right. Moreover, the annual amounts are small compared with what is being proposed at Fukushima. A one GW(e) BWR reactor typically releases about a terabecquerel (a trillion Bq) of tritium to sea annually, whereas Fukushima’s tanks hold about one petabecquerel (PBq, or a thousand trillion Bq) of tritium – a thousand times more, and a much bigger problem. (A quick calculation of both ratios appears after the Telegraph excerpt below.)
  4. Readers may well ask where all this tritium is coming from. Most (or maybe all) of the tritium will come from the concrete structures of the ruined Fukushima reactor buildings. After ~40 years’ operation they are extremely contaminated with tritium. (Recall that tritium is both an activation product and a ternary fission product of nuclear fission.) And, yes, this is the case for all decommissioned (and, by corollary, existing) reactors: their concrete structures are all highly contaminated with tritium. The older the station, the more contaminated it is. In my view, this problem constitutes an argument for not building more nuclear power stations: at the end of their lives, all reactor hulks will remain radioactive for over 100 years.
  5. What about other radioactive contaminants? Reports are emerging that the tank waters also remain contaminated with other nuclides such as caesium-137 and especially strontium-90, due to the poor performance of Hitachi’s Advanced Liquid Processing System (ALPS). Their concentrations are much lower than the tritium concentrations, but they are still unacceptably high.

For example, on 16 October 2018, the UK Daily Telegraph stated:

“Tokyo Electric Power Co (Tepco), which runs the plant, has until recently claimed that the only significant contaminant in the water is safe levels of tritium, which can be found in small amounts in drinking water, but is dangerous in large amounts. The [Japanese] government has promised that all other radioactive material [apart from tritium] is being reduced to “non-detect” levels by the sophisticated Advanced Liquid Processing System (ALPS).

“However documents provided to The Telegraph by a source in the Japanese government suggest that the ALPS has consistently failed to eliminate a cocktail of other radioactive elements, including iodine, ruthenium, rhodium, antimony, tellurium, cobalt and strontium. 

“That adds to reports of a study by the regional Kahoko Shinpo newspaper which it said confirmed that levels of iodine-129 and ruthenium-106 exceeded acceptable levels in 45 samples out of 84 in 2017. Iodine 129 has a half-life of 15.7 million years and can cause cancer of the thyroid; ruthenium 106 is produced by nuclear fission and high doses can be toxic and carcinogenic when ingested. 

“In late September 2017, TEPCO was forced to admit that around 80 per cent of the water stored at the Fukushima site still contains radioactive substances above legal levels after the Ministry of Economy, Trade and Industry held public hearings in Tokyo and Fukushima at which local residents and fishermen protested against the plans. It admitted that levels of strontium 90, for example, are more than 100 times above legally permitted levels in 65,000 tons of water that has been through the ALPS cleansing system and are 20,000 times above levels set by the government in several storage tanks at the site.”
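To make the scale of points 2 and 3 in the list above concrete, here is a minimal order-of-magnitude sketch (Python; the seawater value of 3 Bq/l stands in for the article's "a few becquerels per litre"):

```python
# Order-of-magnitude comparison for points 2 and 3 above,
# using only the figures quoted there.

sea_tritium_bq_per_l  = 3        # "a few becquerels per litre" in seawater
tank_tritium_bq_per_l = 1e6      # "about a megabecquerel per litre" in the tanks

routine_release_bq_per_year = 1e12   # ~1 TBq/year from a typical 1 GW(e) BWR
fukushima_tank_inventory_bq = 1e15   # ~1 PBq of tritium held in the tanks

concentration_ratio = tank_tritium_bq_per_l / sea_tritium_bq_per_l
inventory_ratio = fukushima_tank_inventory_bq / routine_release_bq_per_year

print(f"tank water vs seawater concentration        : ~{concentration_ratio:,.0f}x")
print(f"tank inventory vs one year's routine release: ~{inventory_ratio:,.0f}x")
# -> a few hundred thousand to a million times more concentrated,
#    and roughly a thousand years' worth of routine releases held in the tanks
```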

So what is to be done?

First of all, the ALPS system has to be drastically improved. After that, some observers have argued that, ideally, the tritium should be separated out of the tank waters. Some isotopic tritium removal technologies have been proposed, for example by the International Atomic Energy Agency, but the picture is complicated. The only operating facility I’m aware of is located at Darlington near Toronto in Canada, though secret military separation facilities may exist in the US or France.

However, the Darlington facility was extremely difficult and expensive to construct (it took ~12 years to build and get working properly), and its operation consumes large amounts of electricity from the nearby Darlington nuclear power station. Its raison d’être is to recover very expensive deuterium for Canadian heavy water reactors.

Other proposed remedies will probably be more expensive. One problem is basic physics. The tritium is in the form of tritiated water, which is effectively the same as water itself, so that chemical separation or filtration methods simply do not work. 

Another problem is inefficiency: with isotope separation, one would have to pass the source hydrogen through the process thousands of times to get even small amounts of separated non-radioactive hydrogen. A third problem is that hydrogen, as the smallest element, is notoriously difficult to contain, so gaseous tritium emissions would be very large each year.

None of these technologies is recommended as a solution for Japan: any such facility would release large amounts of tritium gas and tritiated water vapor to air each year, as occurs at Darlington. Tritium gas is quickly converted to tritiated water vapor in the environment. The inhalation of tritiated water vapor from any mooted Japanese facility would likely result in higher collective doses than the ingestion of tritiated seafood, were the tritium to be dumped in the sea.

I recommend neither of these proposed solutions.

There are no easy answers here. Barring a miraculous technical discovery, which is unlikely, I think TEPCO/Japanese Government will have to buy more land and keep on building more holding tanks to allow for tritium decay to take place. Tritium’s half-life is about 12.3 years, so ten half-lives is 123 years: that’s how long these tanks will have to last – at least.
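For readers who want to verify the 123-year figure, a minimal decay calculation follows (Python; it assumes nothing beyond the 12.3-year half-life mentioned above):

```python
# How much tritium is left after ten half-lives?
half_life_years = 12.3          # physical half-life of tritium
elapsed_years = 123             # ten half-lives, as stated above

remaining_fraction = 0.5 ** (elapsed_years / half_life_years)
print(f"fraction remaining after {elapsed_years} years: {remaining_fraction:.4%}")
# -> about 0.10% (roughly 1/1024) of the original activity,
#    which is why ~123 years is taken as the minimum storage horizon
```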

This will allow time not only for tritium to decay, but also for politicians to reflect on the wisdom of their support for nuclear power.

Radiation: Does iodine help?

April 30, 2022

Radiation: Does iodine help?  https://www.dw.com/en/radiation-does-iodine-help/a-61020889 4 Mar 22,

Fears have grown about radiation exposure since Russia’s attack on a Ukrainian nuclear facility. But taking iodine won’t always help. It can, in fact, be dangerous.

When there is an accident at a nuclear power plant — if there’s an explosion or a leak or it’s damaged in some way in war — radioactive iodine is one of the first substances that’s released into the atmosphere.

If that radioactive iodine gets into the body, it can damage cells in the thyroid and result in cancer.

You can inhale radioactive particles, or they can enter your body through the skin. But you can’t see, smell or taste them in the air. It’s an invisible threat.

Some of the worst effects of an overexposure to radiation are thyroid cancer, tumors, acute leukemia, eye diseases and psychological or mental disorders. Radiation can even damage your genes for generations to come.

In the most extreme cases, a high dose of radiation over a short period of time will cause death within days or even hours.    

Is it worth taking iodine against radiation?

Our bodies do not produce iodine themselves. But we do need it, so we consume iodine through food or supplements.

You can purchase iodine in the form of a tablet. When consumed, the iodine is collected and stored in the thyroid gland, where it is used to produce hormones. These hormones regulate various bodily functions and even support the development of the brain.

The thyroid can, however, become saturated with iodine. And when that happens, it can’t store any more.

So, the theory is that if you take enough “good” iodine, there will be no room left in the thyroid for any “bad” or radioactive iodine. That radioactive iodine should then simply pass through the body and get excreted via the kidneys.

But don’t take iodine as a precaution

There is no point in taking iodine purely as a precaution against possible radiation exposure from a leak or attack on a nuclear power plant.

The thyroid only stores iodine for a limited amount of time.

And taking too much iodine — even the good stuff — can be dangerous.

Many people in Germany, for instance, suffer from an overactive thyroid. And health experts advise against taking any iodine supplements unless there is an acute medical reason to do so. 

Germany’s Federal Ministry for the Environment, Nature Conservation, Nuclear Safety and Consumer Protection (BMUV) says iodine supplements can help after a nuclear power plant accident in a radius of up to 100 kilometers (62 miles).

But the iodine is still only effective if taken when it is needed. Experts say an iodine “block” only has a chance of helping if the good iodine is taken just before or during contact with radioactive iodine. 

Cesium, strontium absorbed by the body

The radioactive isotopes iodine-131 and iodine-133 can cause thyroid cancer. They are also the isotopes most associated with radiation exposure caused by a leak or explosion at a nuclear power plant.

The radioactive isotopes strontium 90 and cesium 137 are also part of the mix. They settle in bone tissue and likewise increase the risk of cancer.

Our body mistakes strontium for calcium and builds it into bone, while cesium behaves like potassium and is taken up by muscle and other soft tissue. Radiation from isotopes lodged in or near the bone can damage the bone marrow.

Bone marrow is responsible for producing new blood cells. And when it fails, it can lead to a blood cancer known as leukemia, which is often fatal.

Damage to genetic material

Radioactive exposure can also damage genetic material in the body.

Such damage was reported after atomic bombs were dropped on the Japanese cities of Nagasaki and Hiroshima at the end of World War II — children were born with deformities after the war.

Long-term effects were also observed after an accident at the Chernobyl nuclear facility in Ukraine in April 1986.

Twenty years after the catastrophe, cancer rates in most of the affected regions had risen by 40%. An estimated 25,000 people in Russia died as a result of having helped clean up the reactor site.

Almost no treatment for radiation exposure

There is hardly any treatment for radiation exposure. What’s decisive is whether a person has been “contaminated” or whether the radiation has been “incorporated” into the body.

In the case of a contamination, radioactive waste settles on the surface of the body.

It may sound ridiculous, but the first thing people should do in those cases is wash off the radioactive waste with normal soap and water.

A “radioactive incorporation” is far more dangerous. Once radioactive waste has made its way into the body, it’s almost impossible to flush it out again.

Intensity and time

Radiation dose is measured in sieverts; in practice, doses are usually given in millisieverts, or thousandths of a sievert.

Exposure to 250 millisieverts (0.25 sievert) over a short period of time is enough to cause radiation sickness.

To put that in context, Germany’s Federal Office for Radiation Protection (BfS) measures an average environmental dose of about 2.1 millisieverts per year.

At a dose of 4,000 millisieverts (4 sieverts), acute radiation sickness sets in quickly and the risk of death rises significantly. At 6 sieverts, the chances of survival are close to zero; without intensive medical treatment, death usually follows within days to weeks.
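To give those dose figures some context, here is a minimal comparison (Python, using only the numbers quoted above) against Germany's average annual background dose:

```python
# Put the doses quoted above alongside average annual background in Germany.
background_msv_per_year = 2.1   # BfS average environmental dose per year
sickness_threshold_msv  = 250   # short-term dose said to cause radiation sickness
acute_dose_msv          = 4000  # dose at which acute radiation sickness sets in

print(f"250 mSv   ~ {sickness_threshold_msv / background_msv_per_year:.0f} years of background")
print(f"4,000 mSv ~ {acute_dose_msv / background_msv_per_year:.0f} years of background")
# -> roughly 120 and 1,900 years' worth of average background radiation,
#    received within a short period of time
```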

Electromagnetic radiation, said by telecom companies to be harmless, could be hurting wildlife.

April 30, 2022

Report says wireless radiation, said by telecom companies to be harmless, could be hurting wildlife. Source: Environmental Health Trust / Santa Fe New Mexican, by Scott Wyland, swyland@sfnewmexican.com, Feb 5, 2022

Health researchers raised concerns in the 1990s about the possible harmful effects of wireless radiation from cellphones and towers, and their warnings met pushback from telecommunications companies on the verge of growing a mega-industry.

Industry-backed researchers assured federal agencies that health concerns — especially those centered on the possibility of low-level microwaves causing cancer — lacked conclusive evidence.

Regulators accepted their assessments, and the alarm bells went silent.

Now a trio of researchers have compiled a report saying the widespread installation of cell towers and antennas is generating electromagnetic fields — EMFs for short — that could be physiologically harmful.

The report focuses on potential impacts on wildlife, trees, plants and insects, such as bees, because there are no regulations protecting them from EMFs emanating from wireless antennas. Wildlife protections are becoming more vital as this radiation — known more specifically as radiofrequency EMFs — escalates through 5G technologies, the researchers warn.

“There needs to be regulatory standards to address EMFs affecting wildlife,” said Albert Manville, a retired U.S. Fish and Wildlife Service biologist and one of the paper’s authors.

Manville also is an adjunct science professor at Johns Hopkins University.

He said he provided the Federal Communications Commission with some research on how the electromagnetic pollution can hurt wildlife and the steps that could be taken to lessen the impacts.

But the FCC has been unresponsive, Manville said, arguing the agency tends to accommodate the industry it’s supposed to regulate.

“That’s unfortunate, but that’s just the way it is,” he said.

The FCC did not respond to questions about whether it would consider making efforts to reduce animals’ EMF exposure.

The three authors drew from 1,200 peer-reviewed studies to compile a three-part, 210-page report titled “Effects of non-ionizing electromagnetic fields on flora and fauna.” It was published in the journal Reviews on Environmental Health.

Science journalist Blake Levitt, who also co-wrote the report, said they dug up overlooked studies that contained compelling research on how living organisms react to low-level EMFs. Their compilation invalidates any claims that the EMFs don’t cause biological effects, she said.

“We just blew the whole thing out of the water and took it to the ecosystem level, which is really where it needed to go,” Levitt said. “Nobody had done that before. We need a whole lot more scrutiny put to the low-intensity stuff.”

Ambient EMFs have risen exponentially in the past quarter-century, as cellphones were widely adopted, to become a ubiquitous and continuous environmental pollutant, even in remote areas, the report said, adding that studies indicate EMFs can affect animals’ orientation, migration, food finding, reproduction, nest building, territorial defense, vitality, longevity and survival.

EMFs’ toxic effects on an animal’s cells, DNA and chromosomes have been observed in laboratory specimens — and thus would apply to wildlife, according to the report.

Many types of wildlife are exposed to EMFs from wireless sources, such as deer, seals, whales, birds, bats, insects, amphibians and reptiles, the report said. Many species have been found more sensitive to EMFs than humans in some ways.

The report recommends new laws that include the redesign of wireless devices and infrastructure to reduce the rising ambient levels.

It comes several months after a federal court in Washington, D.C., ordered the FCC to review its guidelines for wireless radiation and justify why it should retain them, as the standards haven’t been updated since 1996. This radiation should not be confused with radioactivity, the court noted, adding that the microwaves used to transmit signals are too weak to heat tissue (the mechanism known as “thermal effects”).

But medical studies suggest the lower-level radiation could cause cancer, reproductive problems, impaired learning and motor skills, disrupted sleep and decreased memory.

These studies and others were submitted to the FCC after it opened a notice of inquiry in 2013 under the administration of former President Barack Obama to probe the adequacy of the 1996 guidelines, which were geared toward avoiding thermal effects, the court said.

In 2019, the Trump administration’s FCC deemed the inquiry unnecessary, saying the 1996 rules were sufficient and required no revision.

Two judges called that FCC action “arbitrary and capricious,” saying the FCC made the decision out of hand, ignoring all the science presented and offering no reasonable, fact-based argument to back it up.

The agency also failed to look at the technological developments in the past 25 years and how they’ve changed the degree of exposure, the judges wrote. And they said it refused to examine possible health effects from EMFs that fall below the threshold set in 1996………………………………..     https://www.santafenewmexican.com/news/local_news/report-says-wireless-radiation-said-by-telecom-companies-to-be-harmless-could-be-hurting-wildlife/article_1ae80fc0-7d5d-11ec-8c13-4f3411ea8ea1.html

Scientists trace the path of radioactive cesium in the ecosystem of Fukushima

April 30, 2022

Scientists trace the path of radioactive cesium in the ecosystem of Fukushima  https://phys.org/news/2022-01-scientists-path-radioactive-cesium-ecosystem.html

by National Institute for Environmental Studies

In 2011, the nuclear accident at Fukushima, Japan, resulted in the deposition of radioactive cesium (radiocesium) into habitats in the vicinity. A decade after the accident, researchers from the National Institute for Environmental Studies, Japan, have collated the complicated dynamics of radiocesium within forest-stream ecosystems. Understanding radiocesium flow in the environment could help mitigate contamination and inform future containment strategies.

In the aftermath of the Fukushima nuclear accident, the Japanese government carried out intensive decontamination in the inhabited parts of the affected area by removing surface soil layers. But a major affected region consists of dense, uninhabited forests, where such decontamination strategies are not feasible. Finding ways to prevent the spread of radioactive contaminants like radiocesium to areas of human activity downstream of these contaminated forests is therefore crucial.

The first step is to understand the dynamics of radiocesium flow through forest-stream ecosystems. In the decade since the accident, a vast body of research has been dedicated to doing just that. Scientists from the National Institute for Environmental Studies, Japan, sifted through the data and disentangled the threads of individual radiocesium transport processes in forest-stream ecosystems. “We identified that radiocesium accumulates primarily in the organic soil layer in forests and in stagnant water in streams, thereby making them potent sources for contaminating organisms. Contamination management in these habitats is crucial to provisioning services in forest-stream ecosystems,” says Dr. Masaru Sakai, who led the study. The findings of this study were made available online on 6 July 2021 and published in volume 288 of the journal Environmental Pollution on 1 November 2021.

The research team reviewed a broad range of scientific research on radiocesium in forests and streams to identify regions of radiocesium accumulation and storage. After the accident, radiocesium was primarily deposited onto the forest canopy and forest floor. This radiocesium eventually reaches the ground—through rainfall and falling leaves—where it builds up in the upper layers of the soil. Biological activity, such as that of detritivores (insects and fungi that live off leaf debris), ensures that radiocesium is circulated through the upper layers of the soil and subsequently incorporated into plants and fungi. This allows radiocesium to enter the food web, eventually making its way into higher organisms. Radiocesium is chemically similar to potassium, an essential mineral in living organisms, which contributes to its uptake by plants and animals. “Fertilizing” contaminated areas with an excess of potassium provides an effective strategy to suppress the biological absorption of radiocesium.

Streams and water bodies in the surrounding area get their share of radiocesium from runoff and fallen leaves. Most radiocesium in streams is likely to be captured by the clay minerals on stream beds, but a small part dissolves in the water. Unfortunately, there is little information on the relationship between dissolved radiocesium and aquatic organisms, like fish, which could be important to the formulation of contamination management strategies. Radiocesium in streams also accumulates in headwater valleys, pools, and other areas of stagnant water. Constructions such as reservoir dams provide a way to effectively trap radiocesium, but steady leaching from the reservoir sediments causes re-contamination downstream.

This complicated web of radiocesium transport is hard to trace, making a one-stop solution to radiocesium contamination impossible. Dr. Sakai and his team recommend interdisciplinary studies to accelerate a full understanding of radiocesium pathways in forest-stream ecosystems so that measures can be developed to reduce future contamination. “This review can serve as basal knowledge for exploring future contamination management strategies. The tangled radiocesium pathways documented here may also imply the difficulties of creating successful radiation contamination management strategies after unwished-for nuclear accidents,” explains Dr. Sakai.

Nuclear power is often touted as a solution to the energy crisis, but it is important to plan response measures to unpredictable contamination events. To address the essential need for clean energy in view of the climate crisis, contamination management in societies depending on nuclear power is integral. Fully understanding the behavior of radiocesium in ecosystems can not only lead to the successful management of existing contamination but can also ensure the swift containment of potential future accidents.