Archive for the ‘TECHNOLOGY’ Category

Computer errors that almost started nuclear wars

December 4, 2018

The argument from cyberspace for eliminating nuclear weapons, 9 November 2018: “…….Computer errors that almost started nuclear wars

Unclassified reports reveal that problems within the computers of nuclear command and control date back to at least the 1970s, when a deficient computer chip signalled that 200 Soviet missiles were headed towards the U.S. Computer problems have persisted: In 2010, a loose circuit card caused a U.S. launch control centre to lose contact with 50 nuclear missiles. In both cases, the accident might have been mistaken for a deliberate attack. Failing to recognize the mistake could have resulted in the U.S. launching nuclear weapons.

These cases were presumably the result of unintentional errors, not deliberate actions. But hacking and other forms of targeted cyberattacks greatly increase the risk of accidental nuclear launch or other devastating actions. Overconfidence on the part of the officials overseeing the nuclear arsenal is therefore negligent and dangerous.

A more recent compounding factor is the ongoing, roughly trillion-dollar upgrade of the U.S. nuclear arsenal started by the Obama administration. This so-called modernization effort included upgrades to the nuclear command and control system. The Trump administration continues to make this a priority.

Modernization increases the possibility that changes to the nuclear command and control system will introduce new vulnerabilities into the system or reveal hitherto unknown ones. The evidence from the GAO report and other publicly available documents indicates that the officials in charge will emphasize speed, convenience, or cost over cybersecurity.

In its conclusion, the GAO report explained that the DOD “has taken several major steps to improve weapon systems cybersecurity.” But the DOD “faces barriers that may limit its ability to achieve desired improvements,” such as constraints on information sharing and workforce shortages. That is not reassuring.

There is a more basic problem that we have emphasized above: the risks associated with cyberattacks can be ameliorated but not fully eliminated. When this intrinsic risk is integrated with the sheer destructiveness of nuclear weapons, the only way to avoid a catastrophic accident at some point in time is to embrace efforts to abolish the weapons themselves.


Thorium Molten Salt Nuclear Reactor (MSR) No Better Than Uranium Process

November 3, 2018

The safety issue is also not resolved, as stated above: pressurized water leaking from the steam generator into the hot, radioactive molten salt will explosively turn to steam and cause incredible damage.  The chances are great that the radioactive molten salt would be discharged out of the reactor system and create more than havoc.  Finally, controlling the reaction and power output, finding materials that last safely for 3 or 4 decades, and consuming vast quantities of cooling water are all serious problems.  

The greatest problem, though, is likely the scale-up by a factor of 500 to 1, from the tiny project at ORNL to a full-scale commercial plant with 3500 MWth output.   Perhaps these technical problems can be overcome, but why would anyone bother to try, knowing in advance that the MSR plant will be uneconomic due to huge construction costs and operating costs, plus will explode and rain radioactive molten salt when (not if) the steam generator tubes leak.
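A rough back-of-the-envelope check of the 500-to-1 claim; this is a sketch only, and the ~7.4 MWth rating for ORNL’s Molten-Salt Reactor Experiment is a commonly cited public figure rather than something stated in this post:

```python
# Back-of-the-envelope check of the 500:1 scale-up claim.
MSRE_THERMAL_MW = 7.4     # ORNL Molten-Salt Reactor Experiment, ~7.4 MWth (public figure, assumed here)
COMMERCIAL_MWTH = 3500.0  # full-scale commercial plant output cited in the post

scale_up = COMMERCIAL_MWTH / MSRE_THERMAL_MW
print(f"Scale-up factor: {scale_up:.0f}x")  # ~473x, consistent with "500 to 1"
```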

The Truth About Nuclear Power – Part 28: Thorium MSR No Better Than Uranium Process, Sowell’s Law Blog, 14 July 2014

Preface: This article, number 28 in the series, discusses nuclear power via a thorium molten-salt reactor (MSR) process. (Note: this is also sometimes referred to as LFTR, for Liquid Fluoride Thorium Reactor.) The thorium MSR is frequently trotted out by nuclear power advocates whenever the numerous drawbacks to uranium fission reactors are mentioned. To this point in the TANP series, uranium fission, via PWR or BWR, has been the focus. Some critics of TANP have already stated that thorium solves all of those problems and therefore should be vigorously pursued. Some of the critics have stated that Sowell obviously has never heard of thorium reactors. Quite the contrary, I am familiar with the process and have serious reservations about the numerous problems with thorium MSR.

It is interesting, though, that nuclear advocates must bring up the MSR process. If the uranium fission process was any good at all, there would be no need for research and development of any other type of process, such as MSR and fusion.

Debunking the claims about generation IV nuclear waste

November 3, 2018

Generation IV nuclear waste claims debunked, Nuclear Monitor 24 Sept 18   Lindsay Krall and Allison Macfarlane have written an important article in the Bulletin of the Atomic Scientists debunking claims that certain Generation IV reactor concepts promise major advantages with respect to nuclear waste management. Krall is a post-doctoral fellow at the George Washington University. Macfarlane is a professor at the same university, a former chair of the US Nuclear Regulatory Commission from July 2012 to December 2014, and a member of the Blue Ribbon Commission on America’s Nuclear Future from 2010 to 2012.

Krall and Macfarlane focus on molten salt reactors and sodium-cooled fast reactors, and draw on the experiences of the US Experimental Breeder Reactor II and the US Molten Salt Reactor Experiment.

The article abstract notes that Generation IV developers and advocates “are receiving substantial funding on the pretense that extraordinary waste management benefits can be reaped through adoption of these technologies” yet “molten salt reactors and sodium-cooled fast reactors – due to the unusual chemical compositions of their fuels – will actually exacerbate spent fuel storage and disposal issues.”

Here is the concluding section of the article:

“The core propositions of non-traditional reactor proponents – improved economics, proliferation resistance, safety margins, and waste management – should be re-evaluated. The metrics used to support the waste management claims – i.e. reduced actinide mass and total radiotoxicity beyond 300 years – are insufficient to critically assess the short- and long-term safety, economics, and proliferation resistance of the proposed fuel cycles.

“Furthermore, the promised (albeit irrelevant) actinide reductions are only attainable given exceptional technological requirements, including commercial-scale spent fuel treatment, reprocessing, and conditioning facilities. These will create low- and intermediate-level waste streams destined for geologic disposal, in addition to the intrinsic high-level fission product waste that will also require conditioning and disposal.

“Before construction of non-traditional reactors begins, the economic implications of the back end of these non-traditional fuel cycles must be analyzed in detail; disposal costs may be unpalatable. The reprocessing/treatment and conditioning of the spent fuel will entail costs, as will storage and transportation of the chemically reactive fuels. These are in addition to the cost of managing high-activity operational wastes, e.g. those originating from molten salt reactor filter systems. Finally, decommissioning the reactors and processing their chemically reactive coolants represents a substantial undertaking and another source of non-traditional waste. …

“Issues of spent fuel management (beyond temporary storage in cooling pools, aka “wet storage”) fall outside the scope of the NRC’s reactor design certification process, which is regularly denounced by nuclear advocates as narrowly applicable to light water reactor technology and insufficiently responsive to new reactor designs. Nevertheless, new reactor licensing is contingent on broader policies, including the Nuclear Waste Policy Act and the Continued Storage Rule. Those policies are based on the results of radionuclide dispersion models described in environmental impact statements. But the fuel and barrier degradation mechanisms tested in these models were specific to oxide-based spent fuels, which are inert, compared to the compounds that non-traditional reactors will discharge.

“The Continued Storage Rule explicitly excludes most non-oxide fuels, including those from sodium-cooled fast reactors, from the environmental impact statement. Clearly, storage and disposal of non-oxide commercial fuels should require updated assessments and adjudication.

“Finally, treatment of spent fuels from non-traditional reactors, which by Energy Department precedent is only feasible through their respective (re)processing technologies, raises concerns over proliferation and fissile material diversion. Pyroprocessing and fluoride volatility-reductive extraction systems optimized for spent fuel treatment can – through minor changes to the chemical conditions – also extract plutonium (or uranium 233 bred from thorium). Separation from lethal fission products would eliminate the radiological barriers protecting the fuel from intruders seeking to obtain and purify fissile material. Accordingly, cost and risk assessments of predisposal spent fuel treatments must also account for proliferation safeguards.

“Radioactive waste cannot be “burned”; fission of actinides, the source of nuclear heat, inevitably generates fission products. Since some of these will be radiotoxic for thousands of years, these high-level wastes should be disposed of in stable waste forms and geologic repositories. But the waste estimates propagated by nuclear advocates account only for the bare mass of fission products, rather than that of the conditioned waste form and associated repository requirements.

“These estimates further assume that the efficiency of actinide fission will surge, but this actually relies on several rounds of recycling using immature reprocessing technologies. The low- and intermediate-level wastes that will be generated by these activities will also be destined for geologic disposal but have been neglected in the waste estimates. More important, reprocessing remains a security liability of dubious economic benefit, so the apparent need to adopt these technologies simply to prepare non-traditional spent fuels for storage and disposal is a major disadvantage relative to light water reactors. Theoretical burnups for fast and molten salt reactors are too low to justify the inflated back-end costs and risks, the latter of which may include a commercial path to proliferation.

“Reductions in spent fuel volume, longevity, and total radiotoxicity may be realized by breeding and burning fissile material in non-traditional reactors. But those relatively small reductions are of little value in repository planning, so utilization of these metrics is misleading to policy-makers and the general public. We urge policy-makers to critically assess non-traditional fuel cycles, including the feasibility of managing their unusual waste streams, any loopholes that could commit the American public to financing quasi-reprocessing operations, and the motivation to rapidly deploy these technologies. If decarbonization of the economy by 2050 is the end-goal, a more pragmatic path to success involves improvements to light water reactor technologies, adoption of Blue Ribbon Commission recommendations on spent fuel management, and strong incentives for commercially mature, carbon-free energy technologies.”

Lindsay Krall and Allison Macfarlane, 2018, ‘Burning waste or playing with fire? Waste management considerations for non-traditional reactors’, Bulletin of the Atomic Scientists, 74:5, pp.326-334, https://tandfonline.com/doi/10.1080/00963402.2018.1507791

Molten salt reactors and sodium-cooled fast reactors make the radioactive waste problem WORSE

October 9, 2018
Burning waste or playing with fire? Waste management considerations for non-traditional reactors, https://www.tandfonline.com/doi/full/10.1080/00963402.2018.1507791, Lindsay Krall & Allison Macfarlane, 31 Aug 18

 ABSTRACT

Nuclear energy-producing nations are almost universally experiencing delays in the commissioning of the geologic repositories needed for the long-term isolation of spent fuel and other high-level wastes from the human environment. Despite these problems, expert panels have repeatedly determined that geologic disposal is necessary, regardless of whether advanced reactors to support a “closed” nuclear fuel cycle become available. Still, advanced reactor developers are receiving substantial funding on the pretense that extraordinary waste management benefits can be reaped through adoption of these technologies. 

Here, the authors describe why molten salt reactors and sodium-cooled fast reactors – due to the unusual chemical compositions of their fuels – will actually exacerbate spent fuel storage and disposal issues. Before these reactors are licensed, policymakers must determine the implications of metal- and salt-based fuels vis-à-vis the Nuclear Waste Policy Act and the Continued Storage Rule.

Robots the hope for cleaning up the world’s riskiest and most massive nuclear waste storage pool, at Sellafield, UK.

October 9, 2018

Only Cthulhu can solve Sellafield’s sludgy nuclear waste problem, Wired, 14 June 18

Cleaning up Sellafield’s nuclear waste costs £1.9 billion a year. To help with the toxic task, robots are evolving fast.  Sellafield has been called the most dangerous place in the UK, the most hazardous place in Europe and the world’s riskiest nuclear waste site. At its heart is a giant pond full of radioactive sludge, strewn with broken metal, dead animals and deadly nuclear rods. The solution to clearing up Sellafield’s nuclear waste and retrieving the missing nuclear fuel? Robots, of course. And to tackle this mammoth task, the robots are being forced to evolve.

Sellafield’s First-Generation Magnox Storage Pond is a giant outdoor body of water that’s the same size as two Olympic swimming pools. It was built in the 1960s to store used fuel rods from the early Magnox reactors – which had magnesium alloy cladding on the fuel rods – as part of Britain’s booming nuclear program. In 1974, there was a delay in reprocessing; fuel rods started corroding and the pond became murky. The pool was active for 26 years until 1992 and is now finally being decommissioned as part of the £1.9 billion spent each year on Sellafield’s mammoth cleanup operation.

The pond contains about six metres of radioactive water and half a metre of sludge, composed of wind-blown dirt, bird droppings and algae – the usual debris that builds up in any open body of water. Unlike other mud, it conceals everything from dropped tools and bird carcasses to corroded Magnox cladding and the remains of uranium fuel rods.
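For scale, a rough volume estimate from those figures; the 50 m by 25 m Olympic-pool footprint is a standard figure and an assumption here, not something the article states:

```python
# Rough volume of the First-Generation Magnox Storage Pond, from the article's figures.
# Assumes "two Olympic swimming pools" refers to surface area (50 m x 25 m each).
pool_area_m2 = 2 * (50 * 25)  # ~2,500 m^2 of surface
water_depth_m = 6.0           # ~6 m of radioactive water
sludge_depth_m = 0.5          # ~0.5 m of sludge

print(f"Water:  ~{pool_area_m2 * water_depth_m:,.0f} m^3")   # ~15,000 m^3
print(f"Sludge: ~{pool_area_m2 * sludge_depth_m:,.0f} m^3")  # ~1,250 m^3
```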

A number of robotic creations have been used to try to get to the bottom of the pool’s sludge, but they struggle in the hostile environment. Tethered swimming robots do not have the sensors to find objects in the fine mud, and lack the leverage to lift chunks of metal. Experience at Fukushima has shown that robots not well adapted to the environment are a waste of time.

Enter Cthulhu, a tracked robot that can drive along the pond bed, feeling its way with tactile sensors and sonar. The robot, which is currently in development, is approaching Sellafield’s problem differently. The robot will be able to identify nuclear rods and then pick them up. “Rather than trying to mimic a human, we’re building a robot that can do things humans can’t do with senses that humans don’t have,” says Bob Hicks of QinetiQ, which is leading the project.

TerraPower’s Traveling Wave Nuclear Reactor sounds great – BUT!

October 9, 2018

TerraPower’s Nuclear Reactor Could Power the 21st Century. The traveling-wave reactor and other advanced reactor designs could solve our fossil fuel dependency, IEEE Spectrum, by Michael Koziol, 3 June 18: “….In a world defined by climate change, many experts hope that the electricity grid of the future will be powered entirely by solar, wind, and hydropower. Yet few expect that clean energy grid to manifest soon enough to bring about significant cuts in greenhouse gases within the next few decades. Solar- and wind-generated electricity are growing faster than any other category; nevertheless, together they accounted for less than 2 percent of the world’s primary energy consumption in 2015, according to the Renewable Energy Policy Network for the 21st Century.

¥1.13 trillion of taxpayers’ money later, Japan’s Monju fast-breeder reactor a spectacular failure

October 9, 2018

Monju reactor project failed to pay off after swallowing ¥1.13 trillion of taxpayers’ money: auditors https://www.japantimes.co.jp/news/2018/05/11/national/monju-reactor-project-failed-pay-off-swallowing-%C2%A51-13-trillion-taxpayers-money-auditors/#.WvZw_u-FPGg

The Monju fast-breeder reactor experiment yielded insufficient results despite an investment of at least ¥1.13 trillion ($10.3 billion) of taxpayers’ money since 1994, state auditors confirmed on Friday.

The trouble-plagued prototype, which only ran for 250 days, was designed to play a key role in Japan’s quest to set up a nuclear fuel recycling program, but the project only achieved 16 percent of the intended results, the Board of Audit said.

The government finally decided to scrap Monju in December 2016 at an estimated additional cost of ¥375 billion. But the audit board noted that the 30-year decommissioning plan could cost even more.

The reactor, which started operations in 1994 and was designed to produce more plutonium than it consumes while generating electricity, experienced several problems over its more than two-decade run, including a sodium coolant leak and attempted cover-up, and equipment inspection failures.

“Flawed maintenance led to the decommissioning,” the auditors concluded in their report.

But the report also spotlights the absence of a systematic evaluation system for the project. During the entire experiment, the auditors expressed their opinions on Monju’s research and development costs only once — in 2011.

Monju was only up and running for 250 days in total after repeatedly failing to complete test items, according to the report.
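A quick illustrative calculation from the auditors’ figures; the dollar conversion simply reuses the article’s $10.3 billion equivalence:

```python
# Cost of the Monju project per day of actual operation, using the article's figures.
total_cost_yen = 1.13e12         # ¥1.13 trillion invested since 1994
operating_days = 250             # total days the reactor actually ran
usd_per_yen = 10.3e9 / 1.13e12   # implied rate from the article's $10.3 billion figure

cost_per_day_yen = total_cost_yen / operating_days
print(f"~¥{cost_per_day_yen / 1e9:.1f} billion per operating day")                # ~¥4.5 billion
print(f"~${cost_per_day_yen * usd_per_yen / 1e6:.0f} million per operating day")  # ~$41 million
```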

As for the decommissioning costs, the report said they might expand because the current estimate does not include personnel costs and taxes. It also noted that the cost of removing the radioactive sodium coolant could change.

Energy Hogs: Can World’s Huge Data Centers Be Made More Efficient?

June 2, 2018

The gigantic data centers that power the internet consume vast amounts of electricity and emit 3 percent of global CO2 emissions. To change that, data companies need to turn to clean energy sources and dramatically improve energy efficiency.
Yale Environment 360

The cloud is coming back to Earth with a bump. That ethereal place where we store our data, stream our movies, and email the world has a physical presence – in hundreds of giant data centers that are taking a growing toll on the planet.

Data centers are the factories of the digital age. These mostly windowless, featureless boxes are scattered across the globe – from Las Vegas to Bangalore, and Des Moines to Reykjavik. They run the planet’s digital services. Their construction alone costs around $20 billion a year worldwide.

The biggest, covering a million square feet or more, consume as much power as a city of a million people. In total, they eat up more than 2 percent of the world’s electricity and produce 3 percent of CO2 emissions, as much as the airline industry. And with global data traffic more than doubling every four years, they are growing fast.
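That “more than doubling every four years” implies compound growth of roughly 19 percent a year; a minimal sketch of the arithmetic, with an arbitrary projection horizon for illustration:

```python
# Compound annual growth implied by "doubling every four years".
doubling_period_years = 4
annual_growth = 2 ** (1 / doubling_period_years) - 1
print(f"Implied annual growth: {annual_growth:.1%}")  # ~18.9%

# Arbitrary illustration: traffic relative to today after 12 years.
years = 12
print(f"Traffic multiple after {years} years: {2 ** (years / doubling_period_years):.0f}x")  # 8x
```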

Yet if there is a data center near you, the chances are you don’t know about it. And you still have no way of knowing which center delivers your Netflix download, nor whether it runs on renewable energy using processors cooled by Arctic air, or runs on coal power and sits in desert heat, cooled by gigantically inefficient banks of refrigerators.

We are often told that the world’s economy is dematerializing – that physical analog stuff is being replaced by digital data, and that this data has minimal ecological footprint. But not so fast. If the global IT industry were a country, only China and the United States would contribute more to climate change, according to a Greenpeace report investigating “the race to build a green internet,” published last year.

Storing, moving, processing, and analyzing data all require energy. Lots of it. The processors in the biggest data centers hum with as much energy as can be delivered by a large power station, 1,000 megawatts or more. And it can take as much energy again to keep the servers and surrounding buildings from overheating.
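“As much energy again” for cooling corresponds to a power usage effectiveness (PUE) of about 2.0, the industry’s standard efficiency metric; a minimal sketch, with hypothetical sample loads:

```python
# Power usage effectiveness: total facility energy over IT equipment energy.
def pue(it_energy_kwh: float, overhead_energy_kwh: float) -> float:
    """PUE = (IT energy + cooling/other overhead) / IT energy. The ideal is 1.0."""
    return (it_energy_kwh + overhead_energy_kwh) / it_energy_kwh

# Hypothetical facility where cooling and overhead equal the IT load, as the article describes.
print(pue(it_energy_kwh=1_000_000, overhead_energy_kwh=1_000_000))  # 2.0
# A hypothetical efficient facility with much lower overhead.
print(pue(it_energy_kwh=1_000_000, overhead_energy_kwh=150_000))    # 1.15
```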

Almost every keystroke adds to this. Google estimates that a typical search using its services requires as much energy as illuminating a 60-watt light bulb for 17 seconds and typically is responsible for emitting 0.2 grams of CO2. Which doesn’t sound like a lot until you begin to think about how many searches you might make in a year.
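Google’s figures work out to roughly 0.3 watt-hours per search; a small sketch of the conversion, with a hypothetical annual search count for illustration:

```python
# Energy and CO2 per Google search, from the article's figures.
bulb_watts = 60
seconds = 17
joules_per_search = bulb_watts * seconds  # 1,020 J
wh_per_search = joules_per_search / 3600  # ~0.28 Wh
co2_grams_per_search = 0.2

searches_per_year = 1000  # hypothetical user, for illustration only
print(f"{wh_per_search:.2f} Wh and {co2_grams_per_search} g CO2 per search")
print(f"~{searches_per_year * wh_per_search / 1000:.2f} kWh "
      f"and ~{searches_per_year * co2_grams_per_search / 1000:.1f} kg CO2 per year")
```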

And these days, Google is data-lite. Streaming video through the internet is what really racks up the data count. IT company Cisco, which tracks these things, reckons video will make up 82 percent of internet traffic by 2021, up from 73 percent in 2016. Around a third of internet traffic in North America is already dedicated to streaming Netflix services alone.

Two things matter if we are to tame these runaway beasts: One is making them use renewable or other low-carbon energy sources; the other is ramping up their energy efficiency. On both fronts, there is some good news to report. Even Greenpeace says so. “We are seeing a significant increase in the prioritization of renewables among some of the largest internet companies,” last year’s report concluded.

More and more IT companies are boasting of their commitment to achieving 100 percent reliance on renewable energy. To fulfil such pledges, some of the biggest are building their own energy campuses. In February, cloud giant Switch, which runs three of the world’s top 10 data centers, announced plans for a solar-powered hub in central Nevada that will be the largest anywhere outside China.

More often, the data titans sign contracts to receive dedicated supply from existing wind and solar farms. In the U.S., those can still be hard to come by. The availability of renewable energy is one reason Google and Microsoft have recently built hubs in Finland, and Facebook in Denmark and Sweden. Google last year also signed a deal to buy all the energy from the Netherlands’ largest solar energy park, to power one of its four European data centers.

Of the mainstream data crunchers for consumers, Greenpeace singled out Netflix for criticism. It does not have its own data centers. Instead, it uses contractors such as Amazon Web Services, the world’s largest cloud-computing company, which Greenpeace charged with being “almost completely non-transparent about the energy footprint of its massive operations.” Amazon Web Services contested this. A spokesperson told Yale Environment 360 that the company had a “long-term commitment to 100 percent renewable energy” and had launched a series of wind and solar farm projects now able to deliver around 40 percent of its energy. Netflix did not respond to requests for comment.

Amazon Web Services has some of its largest operations in Northern Virginia, an area just over the Potomac River from Washington D.C. that has the largest concentration of data centers in the world. Virginia gets less than 3 percent of its electricity from renewable sources, plus 33 percent from nuclear, according to Greenpeace.

Some industry insiders detect an element of smoke and mirrors in the green claims of the internet giants. “When most data center companies talk about renewable energy, they are referring to renewable energy certificates,” Phillip Sandino, vice-president of data centers at RagingWire, which has centers in Virginia, California, and Texas, claimed in an online trade journal recently. In the U.S. and some other countries, renewable energy certificates are issued to companies generating renewable energy for a grid, according to the amount generated. The certificates can then be traded and used by purchasers to claim their electricity is from a renewable source, regardless of exactly where their electricity comes from. “In fact,” Sandino said, “the energy [the data centers] buy from the power utility is not renewable.”

Others, including Microsoft, help sustain their claims to carbon neutrality through carbon offsetting projects, such as investing in forests to soak up the CO2 from their continued emissions.

All this matters because the differences in carbon emissions between data centers with different energy sources can be dramatic, says Geoff Fox, innovation chief at DigiPlex, which builds and operates centers in Scandinavia. Using data compiled by Swedish state-owned energy giant Vattenfall, he claims that in Norway, where most of the energy comes from hydroelectricity, generating a kilowatt-hour of electricity emits only 3 grams of CO2. By comparison, in France it is 100 grams, in California 300 grams, in Virginia almost 600 grams, in New Mexico more than 800 grams.
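The spread in those grid intensities dominates a data center’s footprint; a minimal sketch comparing annual emissions for a hypothetical 10-megawatt facility running around the clock, using the Vattenfall figures quoted above:

```python
# Annual CO2 for a hypothetical 10 MW data center under different grid intensities.
GRID_G_CO2_PER_KWH = {  # figures quoted in the article (Vattenfall data)
    "Norway": 3, "France": 100, "California": 300,
    "Virginia": 600, "New Mexico": 800,
}

FACILITY_MW = 10  # hypothetical facility size, assumed to run 24/7
annual_kwh = FACILITY_MW * 1000 * 24 * 365  # ~87.6 million kWh

for region, grams in GRID_G_CO2_PER_KWH.items():
    tonnes = annual_kwh * grams / 1e6  # grams -> tonnes
    print(f"{region:>11}: {tonnes:,.0f} t CO2/year")  # Norway ~263 t vs Virginia ~52,560 t
```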

Meanwhile, there is growing concern about the carbon footprint of centers being built for Asian internet giants such as Tencent, Baidu, and Alibaba in China; Naver in South Korea; and Tulip Telecom in India. Asia is where the fastest global growth in data traffic is now taking place. These corporations have been tight-lipped about their energy performance, claims Greenpeace. But with most of the region’s energy coming from coal-fired power stations, their carbon footprint cannot be anything but large.

Vattenfall estimates the carbon emissions in Bangalore, home of Tulip’s giant Indian data center, at 900 grams per kilowatt-hour. Even more troubling, the world’s largest center is currently the Range International Information Hub, a cloud-data store at Langfang near the megacity of Tianjin in northeast China, where it takes more than 1,000 grams of CO2 for every kilowatt-hour.

Almost as important as switching data centers to low-carbon energy sources is improving their energy efficiency. Much of this comes down to the energy needed to keep the processors cool. Insanely, most of the world’s largest centers are in hot or temperate climates, where vast amounts of energy are used to keep them from overheating. Of the world’s 10 largest, two are in the desert heat of Nevada, and others are in Georgia, Virginia, and Bangalore.

Most would dramatically reduce their energy requirements if they relocated to a cool climate like Scandinavia or Iceland. One fast-emerging data hub is Iceland, where Verne Global, a London company, set up its main operation.

…….. Greenpeace says the very size of the internet business, and its exposure to criticism for its contribution to climate change, has the potential to turn it from being part of the problem to part of the solution. Data centers have the resources to change rapidly. And pressure is growing for them to do so. The hope is that they will bring many other giant corporations with them. “The leadership by major internet companies has been an important catalyst among a much broader range of corporations to adopt 100 percent renewable goals,” says Gary Cook, the lead author of the Greenpeace report. “Their actions send an important market signal.”

But the biggest signal, says Fox, will come from us, the digital consumers. Increasingly, he says, “they understand that every cloud lives inside a data center. And each has a different footprint.” We will, he believes, soon all demand to know the carbon footprint of our video streams and internet searches. The more far-sighted of the big data companies are gearing up for that day. “I fully expect we may see green labelling for digital sources as routine within five years.” https://e360.yale.edu/features/energy-hogs-can-huge-data-

David Noonan’s Submissions to Australian Senate regarding Reprocessing Nuclear Fuel and Safety of Intermediate Level Wastes

April 2, 2018

Two David Noonan submissions to the current Federal Parliamentary Inquiry by the Joint Standing Committee on Treaties (JSCT), Reprocessing Nuclear fuel – France (to report by 19 June), have been made public.

An ARPANSA Submission (23 Feb, 2 pages) “regarding the safety of intermediate level waste” has also been made public, at: https://www.aph.gov.au/DocumentStore.ashx?id=0739bc51-9403-4490-b0ce-c8cc6ed074a2&subId=563939

See the URLs and extracts below for the Noonan submissions; the JSCT Inquiry homepage is at: https://www.aph.gov.au/Parliamentary_Business/Committees/Joint/Treaties/NuclearFuel-France

D Noonan Submission (14 Feb): “Public Interest Questions, Scenarios and Consequences of ‘Reprocessing Nuclear fuel – France’ treaty actions & associated nuclear actions”

https://www.aph.gov.au/DocumentStore.ashx?id=eab981b4-146d-4b66-aad9-59f64b275db0&subId=563627

ANSTO is without a Plan B to address key public interest scenarios which demand answers:

·         Reprocessing in France will not prove to be available throughout the OPAL reactor Operating License to 2057. At most, this treaty covers the first 2 of 5 decades of OPAL fuel wastes;

 ·         AND the proposed above ground Store in SA for ANSTO’s nuclear waste will damage and divide community and fall over and fail just as prior attempts have in SA and in NT.

If the OPAL reactor is to continue to operate ANSTO must address required contingencies:

·         Extended Storage of OPAL nuclear fuel waste on-site at Lucas Heights in secure cask storage. Lucas Heights operates a Store for HIFAR nuclear fuel wastes with capacity to do so until availability of a final disposal option and can now set up to do so for OPAL fuel wastes;

 ·         AND to have to manage ANSTO nuclear fuel wastes entirely within Australia through to final disposal. Sending OPAL nuclear fuel waste overseas for reprocessing is used as an excuse to produce a burden of further nuclear waste without capacity or answers for its disposal. …

My Supplementary Submission (28 Feb) provides further evidence on three key aspects:

https://www.aph.gov.au/DocumentStore.ashx?id=f42dce88-9ecf-44f0-8195-5e9e552de078&subId=563627

1. Reprocessing is not International Best Practice, is in decline, and may leave ANSTO stranded

… A key Reprocessing review for consideration by JSCT is: ‘Plutonium Separation in Nuclear Power Programs. Status, Problems, and Prospects of Civilian Reprocessing around the World‘ (IPFM, July 2015), see: http://fissilematerials.org/library/2015/07/plutonium_separation_in_nuclea.html

“France is currently the only country in the world that operates a commercial-scale spent fuel reprocessing plant.” (IPFM Report, Country Studies, Chapter 3: France, p.30)

… ANSTO should disclose the additional cost of reprocessing compared to dry-cask storage:

“The cost of spent-fuel reprocessing also is about ten times the cost of the alternative option for managing spent fuel, dry-cask spent-fuel storage.” (IPFM, Intro p.11)

 2. Extended Storage of ANSTO nuclear fuel waste at Lucas Heights is a viable option

& Contingency to return OPAL reactor Reprocessed fuel waste to Storage at LHs

3. ANSTO failure to provide a disposal strategy for OPAL nuclear fuel wastes flouts best practice

MIT’s $millions plan for small nuclear fusion station

April 2, 2018

MIT Receives Millions to Build Fusion Power Plant Within 15 Years, by Ryan F. Mandelbaum, 10 Mar 18, https://gizmodo.com/mit-receives-millions-to-build-fusion-power-plant-withi-1823644634?IR=T  Nuclear fusion is like a way-more-efficient version of solar power—except instead of harnessing energy from the rays of a distant sun, scientists create miniature suns in power plants here on Earth. It would be vastly more efficient, and more importantly, much cleaner, than current methods of energy production. The main issue is that actually realizing fusion power has been really difficult.

Some, like the folks at the Bulletin of the Atomic Scientists, still worry that the excess neutrons produced in fusion could lead to radioactive waste or contaminants, as well as high costs.

Nature points out that there are plenty of others in the fusion-with-high-temperature-superconductors game, too. Princeton has its own tokamak, and there’s a British company called Tokamak Energy using a similar device to produce fusion energy. But all of the cash towards the MIT effort is significant.

“If MIT can do what they are saying—and I have no reason to think that they can’t — this is a major step forward,” Stephen Dean, head of Fusion Power Associates, in Maryland, told Nature.  Perhaps all fusion power needed to become reality was, well, a lot of money. Mumgaard said that CFS’ collaboration with MIT will “provide the speed to take what’s happening in the lab and bring it to the market.”