Archive for March, 2017

Top salesman for nuclear war – Lockheed Martin

March 9, 2017


Trump Is Bankrupting Our Nation to Enrich the War Profiteers
March 6, 2017, by Jonathan King and Richard Krushnic, Truthout | News Analysis

“……..Corporations that contract with the Department of Defense (DOD) for nuclear weapons complex work do not report revenues and profits from this work separately from their other military work, although they do break out government work from civilian work, and sometimes break out military work from other government work. Hence, it is not possible to determine profits made from nuclear weapons complex work from the annual reports and Securities and Exchange Commission (SEC) filings of large military corporations. However, it is possible to estimate, and to demonstrate how a significant amount of military R&D and production not recorded as nuclear weapons work is in fact partially nuclear weapons work. The nuclear weapons work financed by the US Department of Energy (DOE) is (not surprisingly) carried out in a semi-secret insiders club that insulates it from public knowledge and oversight. The first contracts for the upgrading of the nuclear weapons triad have already been awarded — one to Northrop Grumman — for a new generation of long-range bomber. But the public remains in the dark as to how many tens of billions of their tax dollars will be spent on the project.

From 2012 to 2014, according to Lockheed Martin’s 2014 annual report, the company realized an average of $46 billion a year in revenue, with an average of $3.2 billion in profits — 7 percent of revenue, and a 76 percent return on $4.2 billion of investor equity. The annual report informs us that 59 percent of 2014 revenue came from the Pentagon. We know from other sources that $1.4 billion a year is coming from the DOE for operation of the Sandia nuclear weapons lab, and we are estimating that an additional $600 million a year is coming for DOE nuclear weapons complex work. Information in the annual report indicates that around $6.1 billion came from foreign military sales. This adds up to around $35 billion of military revenue, or 75.3 percent of total 2014 revenue. The single biggest revenue earner in recent years is the F-35 jet fighter, bringing in $8.2 billion, 17 percent of total corporation revenue, in 2014. (William Hartung’s recent report describes additional aspects of Lockheed Martin’s military business, and his book Prophets of War: Lockheed Martin and the Making of the Military Industrial Complex provides extensive background).
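As a back-of-the-envelope check, the revenue components quoted above can be summed. This is a minimal sketch, not the authors' own calculation; the $45.6 billion total for 2014 is an assumption, since the article gives only a 2012 to 2014 average of $46 billion:

```python
# Rough reconciliation of the Lockheed Martin figures quoted above.
# All amounts are in billions of dollars and come from the article,
# except total_2014, which is an assumed figure.
total_2014 = 45.6

pentagon = 0.59 * total_2014   # 59% of 2014 revenue from the Pentagon
doe_sandia = 1.4               # DOE payment for operating Sandia
doe_other = 0.6                # estimated other DOE weapons-complex work
foreign_military_sales = 6.1   # foreign military sales

military = pentagon + doe_sandia + doe_other + foreign_military_sales
print(f"military revenue: ${military:.1f}bn")                    # ~ $35bn
print(f"share of total:   {100 * military / total_2014:.1f}%")
```

The components do sum to roughly $35 billion as stated; the implied share comes out a little above the quoted 75.3 percent, suggesting the article's percentage was computed against a slightly higher revenue total.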

The only references to Lockheed Martin’s nuclear weapons complex work in its 2014 annual report are a sentence noting provision of infrastructure and site support to the DOE’s Hanford complex and a phrase noting continuing work on the Trident missile. The words “nuclear weapons” never appear in the report.

Lockheed Martin’s Nuclear Weapons Operations

In spite of the lack of mention in the annual report, Lockheed Martin is a partner with Bechtel, ATK, SOC LLC and subcontractor Booz Allen Hamilton in Consolidated Nuclear Security LLC (CNS), running the DOE Pantex Plant and the Y-12 Complex. Pantex does nuclear weapons life extension, dismantlement, development, testing and fabrication of high-explosive nuclear warhead components. Y-12 stores and processes uranium, and fabricates uranium weapons components.

Lockheed Martin produced the Trident strategic nuclear missile for the 14 US Ohio-class nuclear submarines and for the four British Vanguard-class submarines. Each of the 24 Tridents on an Ohio-class submarine carries either eight or 12 warheads, all of them 20 to 50 times more powerful than the bombs dropped on Hiroshima and Nagasaki. Each warhead is capable of killing most of the people in any one of the world’s largest cities — either immediately or later, from radiation, burns, other injuries, starvation and disease. Lockheed Martin is not producing new Trident missiles now, but it maintains and modifies them. Previously, Lockheed Martin and its subcontractors received $65 million for each of the 651 Trident missiles, in addition to the $35 billion in earlier development costs.

The other primary strategic nuclear weapon delivery vehicle is Boeing’s land-based Minuteman III strategic missile, also with many warheads per missile. About 450 of them are in silos in Colorado and northern plains states. Lockheed Martin produced and continues to produce key systems for the Minuteman III, and plays a large role in maintaining them. It was awarded a $452 million contract for this work in 2014.

Lockheed’s Sandia Subsidiary

Regarding the Pentagon’s nuclear weapons upgrades planned for the next decade, particularly important is the role of Sandia National Laboratories (SNL). Outside of Albuquerque, New Mexico, this DOE lab’s 10,600 employees make 95 percent of the roughly 6,500 non-nuclear components of all seven US nuclear warhead types. Components arm, fuse, fire, generate neutrons to start nuclear reactions, prevent unauthorized firing, preserve the aging nuclear weapons stockpile and mate the weapons to the missiles, planes and ships that deliver them to targets. Sandia Corporation LLC, wholly owned by Lockheed Martin, operates Sandia. The DOE is spending at least $1.4 billion a year on Sandia nuclear weapons work. The secret Lockheed Martin nuclear warhead assembly plant uncovered in Sunnyvale in 2010 is an extension of Lockheed Martin’s Sandia operations. Again, none of this received any mention or revenue numbers in Lockheed Martin’s 2014 annual report.

Lockheed Martin Used Pentagon Dollars to Lobby Congress for Nuclear Weapons Funding

One of the uses of the billions of dollars from these contracts is to recycle them back into lobbying the government to push for additional conventional and nuclear weapons spending, as reported by William Hartung and Stephen Miles. Of course, in addition, these funds are used to support a general environment of fear and insecurity, through contributions supporting hawkish think tanks. Technically, the federal government does not allow military contracting firms to use awarded funds to lobby Congress. Lobbying funds must come from other parts of the companies’ businesses. In reality, this is a non-functional restriction, since profits from various business segments are fungible; that is, once they are profits, they are intermingled, so in reality, the firms can use the profits from military contracts to lobby Congress. But Lockheed Martin went ahead and spent military contract funds from 2008 to 2012 as part of the contract expenditures. It didn’t even bother to book the lobbying expenditures as expenditures of profits. In 2015, the US Department of Justice required Lockheed Martin’s Sandia subsidiary to repay $4.9 million of a Sandia contract award to the Pentagon that the firm had spent under the contract on lobbying of members of Congress, the DOE secretary, and the secretary’s family and friends………

Military operations in Italy left lasting thorium pollution

March 9, 2017

Subject:  Alarming levels of thorium-232 at the military firing range lying between Cordenons, San Quirino, Vivaro and San Giorgio della Richinvelda, in the province of Pordenone

The Italian Army operates a military firing range lying between the districts of Cordenons, San Quirino, Vivaro and San Giorgio della Richinvelda in the province of Pordenone, in the vicinity of the River Cellina and the River Meduna, and the drills carried out at this firing range have led to the area becoming radioactively contaminated.

As has been reported by the press, in late December 2013 the Commander of the 132nd Ariete Armoured Division in Cordenons, the Commander-in-Chief of the Italian Army, the offices of the region of Friuli-Venezia Giulia, the province of Pordenone and the affected districts, the prefect of Pordenone, and lastly Local Health Authority (ASS) No 6, were all sent the results of tests that had been carried out by the Friuli-Venezia Giulia provincial department of the Italian Regional Environmental Protection Agency (ARPA), which showed alarming levels of thorium-232 in the area.

Thorium-232 is a notoriously radioactive metal, which emits particles that are six times more hazardous to human health than those released by depleted uranium. It is at its most toxic around 20 to 25 years after use. More specifically, out of the eight targets (the shells of armoured tanks used for firing practice) tested by the ARPA, four were found to contain thorium-232 at markedly higher levels than those that generally occur naturally; these levels were therefore unnatural, and presumably attributable to military firing operations.

In all likelihood, such levels are the legacy left behind by the drills carried out at the site in the 1980s and 1990s: between 1986 and 2003, the Italian Army’s units were equipped with ‘Milan’ shoulder-fired anti-tank missiles, which emitted thorium-232(1). The ARPA has indicated that it will shortly carry out more extensive tests in the area. It is recalled that, as a result of the area’s geological make-up, materials tend to trickle down to the lowest layers, which makes their future recovery appear rather difficult.

Consequently, there is an acute risk that the ‘Magredi’ region, and the rocky terrain that makes it so distinctive, will be devastated; what is more, the area is protected as both a site of Community importance and a Special Protection Area within the meaning of the Habitats Directive (92/43/EEC) and the Birds Directive (2009/147/EC), due to the wide variety of flora and fauna present there(2).

1. Is the Commission aware of this contamination?

2. Can it report whether any similar cases have occurred in the EU, how they were tackled and whether the areas affected were restored to their original state?

3. What initiatives does it intend to implement in order to prevent similar episodes from occurring in the EU, and in particular to prevent the contamination of aquifers?

The war profiteers – USA

March 9, 2017

Trump Is Bankrupting Our Nation to Enrich the War Profiteers, March 6, 2017, by Jonathan King and Richard Krushnic, Truthout | News Analysis “………The Role of Weapons Contractors

We have previously argued that it is the guaranteed profits from nuclear weapons manufacture that leads contractors to resist nuclear disarmament and promote the concept of danger from abroad.

The profitability derives from three distinct aspects of such weapons contracts:

  • First, by congressional edict, they cannot be outsourced to lower-cost suppliers, such as those in China or Mexico.
  • Second, the contracts are cost-plus. That is, no matter what the companies spend on the manufacture, they are guaranteed a healthy profit on top. And, of course, the more they run up the costs, the more they make.
  • And third, the contracts are screened from oversight, such as proper audits, by national security considerations.

The current 2017 congressional military authorization calls for spending of some $350 billion over the next decade for upgrades of our nuclear weapons ($35 billion a year) — land-based missiles in silos, long-range bombers and their bombs, new Trident submarines and upgraded Trident missiles and new nuclear-capable cruise missiles. The so-called “modernization” program that Trump supports will spend more than $1 trillion — a thousand billion — income tax dollars over the next 30 years.

Given that the Soviet Union no longer exists, that China has become a capitalist economy and that the major difficulties faced abroad are ISIS (also known as Daesh) and related groups, it is deeply questionable why the congressional budget still devotes tens of billions of dollars to Cold War-era nuclear weapons. Yet the Trump administration is proposing to spend a trillion dollars or more over the next three decades upgrading the US nuclear weapons triad.

Where does the pressure for these wasteful and provocative programs — which almost certainly decrease national security — come from? While military high command and the intelligence agencies also press for nuclear weapons upgrades, corporate profits derived from nuclear weapons contracts may be the most powerful driving force, supported by members of Congress with military research and development (R&D) and production facilities in their districts.

A closer look at Lockheed Martin, the largest weapons contractor in the world, reveals how this coupling between corporate profits and the continuation of nuclear weapons delivery programs operates……….

USA’s nuclear weapons testing – and its toll on health

March 9, 2017

U.S. nuclear testing ceased in 1992. In 2002, the Centers for Disease Control estimated that virtually every American that has lived since 1951 has been exposed to nuclear fallout, and that the cumulative effects of all nuclear testing by all nations could ultimately be responsible for up to eleven thousand deaths in the United States alone.

America’s Forgotten Nuclear War (On Itself), National Interest
Kyle Mizokami, 4 Mar 2017. Nuclear weapons have a mysterious quality. Their power is measured in plainly visible blast pressure and thermal energy common to many weapons, but also in invisible yet equally destructive radiation and electromagnetic pulse. Between 1945 and 1992, the United States conducted 1,032 nuclear tests seeking to get the measure of these enigmatic weapons. Many of these tests would today be considered unnecessary, overly dangerous and just plain bizarre. These tests, undertaken on the atomic frontier, gathered much information about these weapons—enough to cease actual use testing—yet scarred the land and left many Americans with long-term health problems.

The majority of U.S. nuclear tests occurred in the middle of the Western desert, at the Nevada Test Site. The NTS hosted 699 nuclear tests, utilizing both above-ground and later underground nuclear devices. The average yield for these tests was 8.6 kilotons. Atmospheric tests could be seen from nearby Las Vegas, sixty-five miles southeast of the Nevada Test Site, and even became a tourist draw until the Limited Test Ban Treaty banned them in 1963. Today the craters and pockmarks from underground tests are still visible in satellite map imagery.

The bulk of the remaining nuclear tests took place in the Pacific, at the islands of Bikini, Enewetak, Johnston Island and Christmas Island. The second nuclear test, after 1945’s Trinity Test, took place at Bikini Atoll. The Pacific tests were notable not only for their stunning visuals, the most compelling imagery of nuclear weapons since Hiroshima, but also for the forced relocation of native islanders. Others who were near tests were exposed to dangerous levels of radioactive fallout and forced to flee. In 1954, the crew of the Japanese fishing boat Daigo Fukuryu Maru accidentally sailed through fallout from the nearby fifteen-megaton Castle Bravo test. Contaminated with nuclear fallout, one crew member died, and the rest were sickened by radiation.

The first test of a thermonuclear, or fusion, bomb took place on November 1, 1952, at Enewetak. Nicknamed Ivy Mike, the huge eighty-two-ton device was more of a building than a usable nuclear weapon. The device registered a yield of 10.4 megatons, or the equivalent of 10,400,000 tons of TNT. (Hiroshima, by contrast, was roughly eighteen thousand tons of TNT.) Ivy Mike was the biggest test up to that time, creating a fireball 1.8 miles wide and a mushroom cloud that rose to an altitude of 135,000 feet.
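Putting the yields quoted above into a common unit (tons of TNT) makes the jump from fission to fusion weapons concrete; a quick sketch using the article's figures:

```python
# Yields from the article, expressed in tons of TNT equivalent.
ivy_mike_tons = 10.4e6    # Ivy Mike: 10.4 megatons
hiroshima_tons = 18e3     # Hiroshima: roughly eighteen thousand tons

# The fusion device was several hundred times the Hiroshima bomb.
print(f"Ivy Mike / Hiroshima: {ivy_mike_tons / hiroshima_tons:.0f}x")  # ~578x
```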

One of the strangest atmospheric tests occurred in 1962 at the NTS, with the testing of the Davy Crockett battlefield nuclear weapon. Davy Crockett was a cartoonish-looking recoilless rifle that lobbed a nuclear warhead with an explosive yield of just ten to twenty tons of TNT. The test, code-named Little Feller I, took place on July 17, 1962, with attorney general and presidential adviser Robert F. Kennedy in attendance. Although hard to believe, Davy Crockett was issued at the battalion level in both Germany and South Korea.

Also in 1962, as part of a series of high-altitude nuclear experiments, a Thor rocket carried a W49 thermonuclear warhead approximately 250 miles into the exoatmosphere. The test, known as Starfish Prime, had an explosive yield of 1.4 megatons, or 1,400,000 tons of TNT, and resulted in a large amount of electromagnetic pulse being released over the Eastern Pacific Ocean. The test, conducted off Johnston Island, sent a man-made electrical surge as far as Hawaii, more than eight hundred miles away. The surge knocked out three hundred streetlights and a telephone exchange, and caused burglar alarms to go off and garage doors to open by themselves.

Nuclear tests weren’t just restricted to the Pacific Ocean and Nevada. In October 1964, as part of Operation Whetstone, the U.S. government detonated a 5.3-kiloton device just twenty-eight miles southwest of Hattiesburg, Mississippi. The test, nicknamed Salmon, was an experiment designed to determine if nuclear tests could be detected by seismometer. This was followed up in 1966 with the Sterling test, which had a yield of 380 tons.

In 1967, as part of a misguided attempt to use nuclear weapons for peaceful purposes, the United States detonated a nuclear device near Farmington, New Mexico. Project Gasbuggy was an early attempt at nuclear “fracking,” detonating a twenty-nine-kiloton nuke 4,227 feet underground just to see if the explosion would fracture surrounding rock and expose natural-gas reserves. The experiment was unsuccessful. Two similar tests, Rulison and Rio Blanco, took place in nearby Colorado. Although Rulison was a success in that it uncovered usable gas reserves, the gas was contaminated with radiation, leaving it unsuitable for practical commercial use.

A handful of nuclear tests were conducted in Alaska, or more specifically the Aleutian island of Amchitka. The first test, in October 1965, was designed to test nuclear detection techniques and had a yield of eighty kilotons. A second test occurred four years later, and had a yield of one megaton, or one thousand kilotons. The third and largest test, Cannikin, was a test of the Spartan antiballistic-missile warhead and had a yield of less than five megatons.

During the early years of nuclear testing it was anticipated that nuclear weapons would be used on the battlefield, and that the Army and Marine Corps had better get used to operating on a “nuclear battlefield.” During the 1952 Big Shot test, 1,700 ground troops took shelter in trenches just seven thousand yards from the thirty-three-kiloton explosion. After the test, the troops conducted a simulated assault that took them to within 160 meters of ground zero. This test and others like it led to increases in leukemia, prostate and nasal cancers among those who participated.

The United States did indeed learn much about how to construct safe and reliable nuclear weapons, and their effects on human life and the environment. In doing so, however, it paid a terrible and tragic price.

Kyle Mizokami is a defense and national-security writer based in San Francisco who has appeared in the Diplomat, Foreign Policy, War Is Boring and the Daily Beast. In 2009, he cofounded the defense and security blog Japan Security Watch. You can follow him on Twitter: @KyleMizokami.

Radiation and milk

March 9, 2017

What’s up with milk and radiation?, Connect Savannah, 14 Sept 2011

1. It’s a food. While an external dusting of radionuclides isn’t healthy, for efficient long-term irradiation of vulnerable organs there’s no substitute for actually ingesting the stuff.

2. It’s fast. Not to knock potatoes and chicken, but growing these items can take weeks or months. With milk, the fallout simply drifts over the pasture and lands on the grass, which the cows then eat. The radioactive particles are deposited in the cows’ milk, the farmers milk the cows, and in a day or two the contaminated product shows up in the dairy case.

3. Because it’s processed quickly, milk makes effective use of contaminants that would otherwise rapidly decay. A byproduct of uranium fission is the radioactive isotope iodine-131. Iodine is critical to the functioning of the thyroid gland, and any iodine-131 consumed will be concentrated there. However, iodine-131 has a half-life of just eight days. The speed of dairying eliminates this impediment.

4. Milk also does a good job of delivering other radioactive contaminants, such as cesium-134 and cesium-137. Although not itself needed by the body, radioactive cesium mimics potassium, which we do need, and so is readily absorbed. Another uranium breakdown product is strontium-90, which is especially hazardous to children, since it can be incorporated into growing bones. In contrast to radioactive iodine, strontium-90 has a half-life of about 29 years, so once it gets embedded in you, you are, as the Irish say, fooked.

5. That brings us to the most fiendish property of radioactive milk-it targets the young. Children (a) drink a lot more milk and (b) are smaller, which when you add it up means they get a much stiffer dose. Some cancers triggered by radioactivity have a long latency period; older people may die of something else first, but kids bear the full brunt.

For all these reasons, testing milk and dumping any contaminated is at the top of the list of disaster-response measures following a nuclear accident, and it’s unusual, though not unknown, for bad milk to find its way into the food supply. For example:

• Iodine contamination during the 1979 Three Mile Island accident was negligible, 20 picocuries per liter. The FDA’s “action level” at the time was 12,000 picocuries per liter; the current limit of 4,600 picocuries is still far in excess of what was observed.

• After the problems with the Fukushima reactors in Japan, one batch of hot milk did test at nine times the current limit, and milk and vegetable consumption was prohibited in high-risk areas. But most bans were rescinded after a couple months.

• In 1957, after a fire at the Windscale plutonium processing plant in the UK, radiation levels of 800,000 picocuries per liter and higher were found in local milk. Though contamination of milk wasn’t well understood at the time, authorities figured 800,000 of anything involving curies can’t be good and banned the stuff.

• Then there’s Chernobyl. Milk sales were banned in nearby cities after the 1986 reactor explosion, but feckless Soviet officials let the sizable rural population fend for itself. Not surprisingly, 6,000 cases of thyroid cancer subsequently developed, proving there’s no catastrophic situation that stupidity can’t make worse.

One last thing. We’ve been talking about cow’s milk, but be aware that iodine-131, strontium-90, and other radioactive contaminants can also be transferred through human milk…..

UK’s nuclear waste cleanup costs – up to £219 billion, with development of autonomous robots

March 9, 2017

UK funding development of autonomous robots to help clear up nuclear waste. A new UK consortium will be developing robots to handle nuclear sites, bomb disposal, space and mining. International Business Times, 28 February 2017. The UK government is funding a new consortium of academic institutions and industrial partners to jump-start the robotics industry and develop a new generation of robots to help deal with situations that are hazardous for humans.

It is estimated that it will cost between £95 billion and £219 billion to clean up the UK’s existing nuclear facilities over the next 120 years or so. The environment is so harsh that humans cannot physically be on site, and robots that are sent in often encounter problems: like the small IRID Toshiba shape-shifting scorpion robot used to explore Fukushima’s nuclear reactors, they often break down and cannot be retrieved. Remote-controlled robots are needed to enter dangerous zones that haven’t been accessed in over 40 years to carry out relatively straightforward tasks that a human could do in an instant.

The problem is that robots are just not at the level they need to be yet, and it is very difficult to build a robot that can successfully navigate staircases, move over rough terrain and turn valves.

To fix this problem, the Engineering and Physical Sciences Research Council is investing £4.6m ($5.7m) into a new group consisting of the University of Manchester, the University of Birmingham, the University of the West of England (UWE) and industrial partners Sellafield, EDF Energy, UKAEA and NuGen…….

Ocean acidification spreading rapidly in Arctic Ocean,

March 9, 2017

International team reports ocean acidification spreading rapidly in Arctic Ocean, EurekAlert, 28 Feb 2017, UNIVERSITY OF DELAWARE. Ocean acidification (OA) is spreading rapidly in the western Arctic Ocean in both area and depth, according to new interdisciplinary research reported in Nature Climate Change by a team of international collaborators, including University of Delaware professor Wei-Jun Cai.

The research shows that, between the 1990s and 2010, acidified waters expanded northward approximately 300 nautical miles from the Chukchi slope off the coast of northwestern Alaska to just below the North Pole. Also, the depth of acidified waters was found to have increased, from approximately 325 feet to over 800 feet (or from 100 to 250 meters).

“The Arctic Ocean is the first ocean where we see such a rapid and large-scale increase in acidification, at least twice as fast as that observed in the Pacific or Atlantic oceans,” said Cai, the U.S. lead principal investigator on the project and Mary A.S. Lighthipe Professor of Earth, Ocean, and Environment at UD.

“The rapid spread of ocean acidification in the western Arctic has implications for marine life, particularly clams, mussels and tiny sea snails that may have difficulty building or maintaining their shells in increasingly acidified waters,” said Richard Feely, NOAA senior scientist and a co-author of the research. Sea snails called pteropods are part of the Arctic food web and important to the diet of salmon and herring. Their decline could affect the larger marine ecosystem.

Among the Arctic species potentially at risk from ocean acidification are subsistence fisheries of shrimp and varieties of salmon and crab.

Other collaborators on the international project include Liqi Chen, the Chinese lead principal investigator and scientist with the Third Institute of Oceanography of State Oceanic Administration of China; and scientists at Xiamen University, China and the University of Gothenburg, Sweden, among other institutions…….

Summer melt of Arctic Ocean ice, once found only in shallow waters less than 650 feet (200 meters) deep, now spreads farther into the Arctic Ocean.

“It’s like a melting pond floating on the Arctic Ocean. It’s a thin water mass that exchanges carbon dioxide rapidly with the atmosphere above, causing carbon dioxide and acidity to increase in the meltwater on top of the seawater,” said Cai. “When the ice forms in winter, acidified waters below the ice become dense and sink down into the water column, spreading into deeper waters.”

Possibility of drastic cooling in North Atlantic

March 9, 2017

Drastic cooling in North Atlantic beyond worst fears, scientists warn

Climatologists say the Labrador Sea could cool within a decade before the end of this century, leading to unprecedented disruption, reports Climate News Network. Guardian, 25 Feb 2017. For thousands of years, parts of northwest Europe have enjoyed a climate about 5C warmer than many other regions on the same latitude. But new scientific analysis suggests that this could change much sooner and much faster than thought possible.

Climatologists who have looked again at the possibility of major climate change in and around the Atlantic Ocean, a persistent puzzle to researchers, now say there is an almost 50% chance that a key area of the North Atlantic could cool suddenly and rapidly, within the space of a decade, before the end of this century.

That is a much starker prospect than even the worst-case scientific scenario proposed so far, which does not see the Atlantic Ocean current shutdown happening for several hundred years at least.

A scenario even more drastic (but fortunately fictional) was the subject of the 2004 US movie The Day After Tomorrow, which portrayed the disruption of the North Atlantic’s circulation leading to global cooling and a new Ice Age.

To evaluate the risk of extreme climate change, researchers from the Environnements et Paléoenvironnements Océaniques et Continentaux laboratory (CNRS/University of Bordeaux, France) and the University of Southampton developed an algorithm to analyse the 40 climate models considered by the IPCC’s Fifth Assessment Report.

The findings by the British and French team, published in the journal Nature Communications, in sharp contrast to the IPCC, put the probability of rapid North Atlantic cooling during this century at almost an even chance – nearly 50%.

Current climate models foresee a slowing of the meridional overturning circulation (MOC), sometimes known also as the thermohaline circulation, which is the phenomenon behind the more familiar Gulf Stream that carries warmth from Florida to European shores. If it did slow, that could lead to a dramatic, unprecedented disruption of the climate system.

In 2013, drawing on 40 climate change projections, the IPCC judged that this slowdown would occur gradually, over a long period. Its findings suggested that fast cooling of the North Atlantic during this century was unlikely.

But oceanographers from EU emBRACE had also re-examined the 40 projections by focusing on a critical spot in the northwest of the North Atlantic: the Labrador Sea.

The Labrador Sea is host to a convection system ultimately feeding into the ocean-wide MOC. The temperatures of its surface waters plummet in the winter, increasing their density and causing them to sink. This displaces deep waters, which bring their heat with them as they rise to the surface, preventing the formation of ice caps.

The algorithm developed by the Anglo-French researchers was able to detect quick sea surface temperature variations. With it they found that seven of the 40 climate models they were studying predicted a total shutdown of convection, leading to abrupt cooling of the Labrador Sea by 2C to 3C over less than 10 years. This in turn would drastically lower North Atlantic coastal temperatures.

But because only a handful of the models supported this projection, the researchers focused on the critical parameter triggering winter convection: ocean stratification. Five of the models that included stratification predicted a rapid drop in North Atlantic temperatures.

The researchers say these projections can one day be tested against real data from the international OSnap project, whose teams will be anchoring scientific instruments within the sub-polar gyre (a gyre is any large system of circulating ocean currents).

If the predictions are borne out and the North Atlantic waters do cool rapidly over the coming years, the team says, with considerable understatement, climate change adaptation policies for regions bordering the North Atlantic will have to take account of this phenomenon.

NASA project – Oceans Melting Greenland (OMG) studies future sea level rise

March 9, 2017

OMG measurements of Greenland give us a glimpse of future sea rise 24 February 2017 by John Abraham  If you meet a group of climate scientists and ask them how much sea levels will rise by, say, the year 2100, you will get a wide range of answers. But those with most expertise in sea level rise will tell you perhaps 1 meter (a little over three feet). Then, they will immediately say, “but there is a lot of uncertainty on this estimate.” It doesn’t mean they aren’t certain there will be sea level rise – that is guaranteed as we add more heat to the oceans. Here, uncertainty means it could be a lot more or a little less.

Why are scientists not certain about how much the sea level will rise? Because there are processes that are occurring that have the potential for causing huge sea level rise, but we’re uncertain about how fast they will occur. Specifically, two very large sheets of ice sit atop Greenland and Antarctica. If those sheets melt, sea levels will rise hundreds of feet.

Parts of the ice sheets are melting, but how much will melt and how fast will the melting occur? Are we talking decades? Centuries? Millennia? Scientists really want to know the answer to this question. Not only is it interesting scientifically, but it has huge impacts on coastal planning.

One reason the answer to this question is elusive is that melting of ice sheets can occur from above (warm air and sunlight) or from below (warm ocean waters). In many instances, it’s the melting from below that is most significant – but this melting from below is really hard to measure.

With luck, we will soon have a much clearer sense of ice sheet melting and sea level rise because of a new scientific endeavor that is part of a NASA project – Oceans Melting Greenland (OMG). This project has brought together some of the best oceanographers and ice experts in the world. The preliminary results are encouraging and are discussed in two recent publications here and here.

In the papers, the authors note that Greenland ice loss has increased substantially in recent decades. It now contributes approximately one-third of total sea level rise. The authors want to know whether this contribution will change over time, and they recognize that underwater processes may be the most important to study. In fact, they note in their paper:

Specifically, our goal is improved understanding of how ocean hydrographic variability around the ice sheet impacts glacial melt rates, thinning and retreat.

In plain English, they want to know how water flow around Greenland affects the ice melt.

Their experiments are measuring a number of key attributes. First, yearly changes in the temperature of ocean water near Greenland. Second, the yearly changes to the glaciers on Greenland that extend into the ocean waters. Third, they are observing marine topography, or bathymetry (the shape of the land beneath the ocean surface).

The sea floor shape is quite complicated, particularly near Greenland. Past glaciers carved deep troughs in the sea floor in some areas, allowing warm salty water to reach huge glaciers that are draining the ice sheet. As lead OMG investigator Josh Willis said:

What’s interesting about the waters around Greenland is that they are upside down. Warm, salty water, which is heavy, sits below a layer of cold, fresh water from the Arctic Ocean. That means the warm water is down deep, and glaciers sitting in deep water could be in trouble.

As the warm water attacks marine glaciers (glaciers that extend into the ocean), the ice tends to break and calve, retreating toward land. In some cases, the glaciers retreat until their grounding line coincides with the shore. But in other cases the undulating seafloor allows warm water to wear away the glacier underside for long distances, thereby increasing the risk of large calving events.

Oftentimes, when glaciers near the coast break off they uncork other ice that can then more easily flow into the oceans.

Click here to read the rest

Transatomic Power’s false claims about Generation IV nuclear reactors

March 9, 2017

It’s interesting the way that dubious nuclear enterprises like to put a young woman at the top. Is this to make the nuclear image look young and trendy? Or is it so that she can cop the flak when it all goes wrong?

Nuclear Energy Startup Transatomic Backtracks on Key Promises The company, backed by Peter Thiel’s Founders Fund, revised inflated assertions about its advanced reactor design after growing concerns prompted an MIT review. MIT Technology Review by James Temple  February 24, 2017 
Nuclear energy startup Transatomic Power has backed away from bold claims for its advanced reactor technology after an informal review by MIT professors highlighted serious errors in the company’s calculations, MIT Technology Review has learned.

The Cambridge, Massachusetts-based company, founded in 2011 by a pair of MIT students in the Nuclear Science & Engineering department, asserted that its molten salt reactor design could run on spent nuclear fuel from conventional reactors and generate energy far more efficiently than them. In a white paper published in March 2014, the company proclaimed its reactor “can generate up to 75 times more electricity per ton of mined uranium than a light-water reactor.”

Those lofty claims helped it raise millions in venture capital, secure a series of glowing media profiles (including in this publication), and draw a rock-star lineup of technical advisors. But in a paper on its site dated November 2016, the company downgraded “75 times” to “more than twice.” In addition, it now specifies that the design “does not reduce existing stockpiles of spent nuclear fuel,” or use them as its fuel source. The promise of recycling nuclear waste, which poses tricky storage and proliferation challenges, was a key initial pledge of the company that captured considerable attention.

“In early 2016, we realized there was a problem with our initial analysis and started working to correct the error,” cofounder Leslie Dewan said in an e-mail response to an inquiry from MIT Technology Review.

The dramatic revisions followed an analysis in late 2015 by Kord Smith, a nuclear science and engineering professor at MIT and an expert in the physics of nuclear reactors.

At that point, there were growing doubts in the field about the company’s claims and at least some worries that any inflated claims could tarnish the reputation of MIT’s nuclear department, which has been closely associated with the company. Transatomic also has a three-year research agreement with the department, according to earlier press releases.

In reviewing the company’s white paper, Smith noticed immediate red flags. He relayed his concerns to his department head and the company, and subsequently conducted an informal review with two other professors.

“I said this is obviously incorrect based on basic physics,” Smith says. He asked the company to run a test, which ended up confirming that “their claims were completely untrue,” Smith says.

He notes that promising to increase the reactor’s fuel efficiency 75-fold is the rough equivalent of claiming that, in a single step, you’d developed a car that could get 2,500 miles per gallon.

Ultimately, the company redid its analysis, and produced and posted a new white paper………

The company has raised at least $4.5 million from Peter Thiel’s Founders Fund, Acadia Woods Partners, and Daniel Aegerter of Armada Investment AG. Venture capital veteran Ray Rothrock serves as chairman of the company.

Founders Fund didn’t immediately respond to an inquiry……