
Idaho National Laboratory Gets New Supercomputer for Simulation of Advanced Reactor Designs and the Fuels that Will Power Them

A powerful new supercomputer arrived in December at Idaho National Laboratory’s Collaborative Computing Center. The machine has the capability to run complex modeling and simulation applications, which are essential to developing next-generation nuclear technologies and the fuels that will power them.

The Sawtooth Mountains as viewed from a boat on Redfish Lake near Stanley, ID

Named after a central Idaho mountain range, the Sawtooth supercomputer will be available to users in early 2020.

The $19.2 million system will enable researchers at INL and elsewhere to simulate new fuels and reactor designs, greatly reducing the time, resources and funding needed to transition advanced nuclear technologies from the concept phase into the marketplace.

By using simulations to predict how new fuels and designs will perform in a reactor environment, engineers can select the most promising technologies for the real-world experiments, saving time and money.
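As a rough illustration of that screening workflow, the sketch below ranks hypothetical candidate designs against a made-up performance limit before any of them would go to physical testing. The design names, inputs, and surrogate model are invented for illustration and do not come from INL's actual toolchain.

```python
# Hypothetical screening loop: rank simulated candidates before committing
# any of them to expensive in-reactor experiments.

def simulated_peak_cladding_temp_c(enrichment_pct: float, pin_pitch_cm: float) -> float:
    """Toy surrogate standing in for a full reactor physics simulation (illustrative only)."""
    # Hotter with higher enrichment, cooler with wider pin pitch -- invented trend.
    return 600.0 + 35.0 * enrichment_pct - 120.0 * (pin_pitch_cm - 1.0)

candidates = [
    {"name": "design-A", "enrichment_pct": 4.5, "pin_pitch_cm": 1.26},
    {"name": "design-B", "enrichment_pct": 9.5, "pin_pitch_cm": 1.40},
    {"name": "design-C", "enrichment_pct": 15.0, "pin_pitch_cm": 1.60},
]

for c in candidates:
    c["peak_clad_temp_c"] = simulated_peak_cladding_temp_c(
        c["enrichment_pct"], c["pin_pitch_cm"]
    )

# Keep only designs whose predicted peak cladding temperature stays under a limit.
shortlist = [c for c in candidates if c["peak_clad_temp_c"] < 900.0]
print(sorted(shortlist, key=lambda c: c["peak_clad_temp_c"]))
```

In practice the surrogate would be replaced by full multiphysics simulations running on a machine like Sawtooth; the point is only that simulation lets engineers narrow the field before committing to real-world experiments.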

John Wagner, the associate laboratory director for INL’s Nuclear Science and Technology directorate, said Sawtooth plays an important role in developing and deploying advanced nuclear technologies and is a key capability for the National Reactor Innovation Center (NRIC).

In August, the U.S. Department of Energy designated INL to lead NRIC, which was established to provide developers the resources to test, demonstrate and assess performance of new nuclear technologies, critical steps that must be completed before they are available commercially.

Use of Technology Readiness Levels to Assess Development Progress

One such process is the establishment of technology readiness levels, which allow for objective evaluation of the maturity of development efforts.

The Technology Readiness Level (TRL) process is used to quantitatively assess the maturity of a given technology. It was developed and has been used successfully by DOD and NASA to develop and deploy new technologies and systems; NASA, for example, uses it to qualify new space systems for flight.

The development of advanced nuclear fuels and materials is critical to closing the nuclear fuel cycle. Because deploying a new nuclear fuel form requires a lengthy and expensive research, development, and demonstration program, applying the TRL concept to advanced reactor design and fuel development provides an essential management and tracking tool.
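For reference, the standard 1-9 TRL scale is simple enough to capture in a small lookup. The descriptions below paraphrase the DOD/NASA scale; assigning an actual fuel or reactor program to a level comes from program evidence, not from code, so this is purely an illustrative sketch.

```python
# The standard 1-9 Technology Readiness Level scale, paraphrased.
TRL_SCALE = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental proof of concept",
    4: "Component validation in a laboratory environment",
    5: "Component validation in a relevant environment",
    6: "System/subsystem prototype demonstrated in a relevant environment",
    7: "System prototype demonstrated in an operational environment",
    8: "Actual system completed and qualified through test and demonstration",
    9: "Actual system proven through successful operations",
}

def describe_trl(level: int) -> str:
    """Return the description for a TRL, raising on out-of-range input."""
    if level not in TRL_SCALE:
        raise ValueError(f"TRL must be 1-9, got {level}")
    return f"TRL {level}: {TRL_SCALE[level]}"

# Example: a fuel form that has only seen out-of-pile laboratory testing.
print(describe_trl(4))
```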

“With advanced modeling and simulation and the computing power now available, we expect to be able to dramatically shorten the time it takes to test, manufacture and commercialize new nuclear technologies,” Wagner said.

“Other industries and organizations, such as aerospace, have relied on modeling and simulation to bring new technologies to market much faster without compromising safety and performance.”

Sawtooth is funded by the DOE’s Office of Nuclear Energy through the Nuclear Science User Facilities program. It will provide computer access to researchers at INL, other national laboratories, industry and universities. Idaho’s three research universities will be able to access Sawtooth and INL’s other supercomputers remotely via the Idaho Regional Optical Network (IRON), an ultra-high-speed fiber optic network.

# # #


Discussions

Matt Chester on Jan 6, 2020 10:18 pm GMT

Presumably these types of calculations have been done for a while, but the idea is that this will get us even more accuracy -- what's the big difference? Is it just a more complex computer that can get more granular with the simulation and/or do it in a more reasonable time frame? Or is there something fundamentally different and new going on?

Dan Yurman on Jan 6, 2020 11:51 pm GMT

Yes, something fundamentally new is going on. See the quoted material below and the citation that follows it.

---------

Advanced nuclear reactors offer safe, clean, and reliable energy at the global scale. The development of such devices relies heavily upon computational models, from the pre-conceptual stages through detailed design, licensing, and operation.

An integrated reactor modeling framework that enables seamless communication, coupling, automation, and continuous development brings significant new capabilities and efficiencies to the practice of reactor design.

In such a system, key performance metrics (e.g., optimal fuel management, peak cladding temperature in design-basis accidents, levelized cost of electricity) can be explicitly linked to design inputs (e.g., assembly duct thickness, tolerances), enabling an exceptional level of design consistency.

Coupled with high-performance computing, thousands of integrated cases can be executed simultaneously to analyze the full system, perform complete sensitivity studies, and efficiently and robustly evaluate various design tradeoffs.

Citation:

Nicholas W. Touran, John Gilleland, Graham T. Malmgren, Charles Whitmer, and William H. Gates III, "Computational Tools for the Integrated Design of Advanced Nuclear Reactors," Engineering 3 (2017): 518-526, Elsevier.
https://www.sciencedirect.com/science/article/pii/S2095809917306124
Corresponding author: John Gilleland, TerraPower, LLC, Bellevue, WA 98005, USA (johng@terrapower.com)
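To make the quoted description concrete, here is a minimal sketch of linking design inputs to a performance metric and sweeping many cases in parallel, which is the pattern an integrated framework plus high-performance computing enables. The inputs, the surrogate model, and the numbers are invented for illustration and are not taken from the cited paper or from INL's codes.

```python
# Illustrative parameter sweep: link design inputs to a performance metric
# and evaluate many cases in parallel.
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def evaluate_case(duct_thickness_mm: float, coolant_flow_kg_s: float) -> dict:
    """Toy surrogate standing in for a coupled neutronics/thermal-hydraulics run."""
    peak_clad_temp_c = 550.0 + 8.0 * duct_thickness_mm - 0.5 * coolant_flow_kg_s
    return {
        "duct_thickness_mm": duct_thickness_mm,
        "coolant_flow_kg_s": coolant_flow_kg_s,
        "peak_clad_temp_c": peak_clad_temp_c,
    }

if __name__ == "__main__":
    thicknesses = [2.0, 3.0, 4.0, 5.0]          # design input 1
    flows = [80.0, 100.0, 120.0]                # design input 2
    cases = list(product(thicknesses, flows))   # 12 cases here; thousands in practice

    with ProcessPoolExecutor() as pool:
        results = list(pool.map(evaluate_case, *zip(*cases)))

    # Simple design-tradeoff check across the whole sweep.
    best = min(results, key=lambda r: r["peak_clad_temp_c"])
    print("lowest peak cladding temperature:", best)
```

Scaling the same pattern from a dozen cases on a workstation to thousands of coupled multiphysics cases is where a machine like Sawtooth comes in.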

 

Bob Meinetz on Jan 7, 2020 2:44 pm GMT

Matt, when so much power is being generated in such a small area, things can go very wrong, very fast.

As work on the Trinity project proceeded at Los Alamos in the 1940s, Hans Bethe, Robert Oppenheimer, and Richard Feynman used slide rules, pens, and paper to calculate neutron flux to the microsecond, with the goal of making something go "wrong" extremely fast. In the 1950s and 1960s, the first generation of nuclear reactors, with the benefit of only the simplest mainframe computers, was designed to prevent uncontrolled fission.

With modern computing it became possible to create accurate digital models of the progression of neutron flux in a reactor core, saving years of laborious (and dangerous) experimentation with highly enriched uranium.
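For a rough sense of what "modeling the progression of neutron flux" means at its simplest, here is a toy one-delayed-group point-kinetics integration showing how a neutron population responds to a small reactivity insertion. The constants are generic textbook-scale values, not data for any real core, and production codes solve far more detailed transport and thermal-hydraulics equations.

```python
# Toy one-delayed-group point-kinetics model: how the neutron population n(t)
# responds to a small reactivity insertion. Purely illustrative numbers.
beta = 0.0065        # delayed-neutron fraction
lam = 0.08           # precursor decay constant (1/s)
Lambda = 1.0e-4      # neutron generation time (s)
rho = 0.001          # inserted reactivity

n = 1.0                          # relative neutron population
c = beta * n / (lam * Lambda)    # precursor concentration at equilibrium
dt = 1.0e-5                      # explicit Euler time step (s)

t = 0.0
while t < 1.0:
    dn = ((rho - beta) / Lambda) * n + lam * c
    dc = (beta / Lambda) * n - lam * c
    n += dn * dt
    c += dc * dt
    t += dt

print(f"relative neutron population after 1 s: {n:.3f}")
```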

Matt Chester on Jan 7, 2020 5:21 pm GMT

I definitely understand the value of supercomputers and such simulations, but was more asking about what makes these supercomputers/simulations more valuable and/or accurate than the ones we had previously been using. Thanks to your and Dan's responses, though, I'm seeing that I was assuming such highly accurate supercomputer calculations were already standard, and that this push is in fact a great breakthrough, thanks to the speed at which computing continues to accelerate.
