Posted in | News | Machine-Vision

New Machine Learning Platform for Immediate Testing of Nuclear Components

A recently designed machine vision system can detect, in real time, radiation-induced damage such as defects and swelling in materials and components for nuclear reactors.

Priyam Patki observes the evolution of the material’s microstructure as irradiation takes place and performs small adjustments to the imaging conditions in the electron microscope. The real-time quantification and visual overlay can be seen in the top monitor where Priyam tries to quantify the number of large “black dot” defects as an evolution of radiation damage. Image credit: Kevin Field, Nuclear Oriented Materials and Examination Group, University of Michigan.

It could accelerate the development of parts for advanced nuclear reactors, which may play a crucial role in decreasing greenhouse gas emissions to combat climate change.

We believe we are the first research team to ever demonstrate real-time image-based detection and quantification of radiation damage on the nanometer length scale in the world.

Kevin Field, Associate Professor of Nuclear Engineering and Radiological Sciences, University of Michigan

Field is also vice president of Theia Scientific, a machine vision startup. The analysis method could be applied to other types of image-based microscopy.

“We see clear pathways to accelerate discoveries in the energy, transportation, and biomedical sectors,” Field said.

The new technology was put to the test at the Michigan Ion Beam Laboratory. By firing beams of charged atoms, called ions, at material samples, the lab can quickly emulate the cumulative damage caused by years or decades of use in a nuclear reactor.

The researchers used an ion beam of the noble gas krypton to irradiate a sample of a chromium, iron, and aluminum alloy, a radiation-tolerant material of interest for both fusion and fission reactors.

If radiation exposure makes your metal like Swiss cheese instead of a good Wisconsin cheddar, you would know it’s not going to have structural integrity.

Kevin Field, Associate Professor of Nuclear Engineering and Radiological Sciences, University of Michigan

The krypton ions hammering the sample produce radiation defects, in this case a plane of missing or additional atoms crammed between two ordinary crystal lattice planes. These defects appear as black dots in electron microscope images. The lab watched them develop with an electron microscope that operates during the irradiation, recording video.

Previously, we would record the whole video for the irradiation experiments and then characterize just a few frames. But now, with the help of this technique, we are able to do it for each and every frame, giving us an insight into the dynamic behavior of the defects—in real-time.

Priyam Patki, Postdoctoral Researcher in Nuclear Engineering and Radiological Sciences, University of Michigan

Patki ran the experiment with Christopher Field, President of Theia Scientific.

To evaluate radiation-induced damage, scientists would usually download the video, take it back to the office, and count every defect in designated frames. With the hundreds or even thousands of video frames produced by modern microscopes, much of the in-depth data was lost, because counting the defects manually in every frame is impractical.

Rather than counting manually, the team used Theia Scientific’s technology to detect and measure the radiation-induced defects immediately during the experiment. The software displays the results as graphics overlaid on the electron microscope imagery, labeling the defects with their size, location, number, and density, and summarizing this data as a measure of structural integrity.
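Theia Scientific’s software is proprietary, but the kind of per-frame summary described above is straightforward to sketch. The snippet below is an illustrative stand-in, not Theia’s actual API: the `Detection` record and `summarize_frame` function are hypothetical names, assuming the detector emits a list of defect centroids and sizes for each frame.

```python
from dataclasses import dataclass

# Hypothetical detection record; field names are illustrative, not Theia's API.
@dataclass
class Detection:
    x_nm: float       # centroid position within the frame, in nanometers
    y_nm: float
    diameter_nm: float

def summarize_frame(detections, frame_area_nm2):
    """Reduce one frame's detections to the overlay metrics described above:
    defect count, number density, and mean defect size."""
    n = len(detections)
    return {
        "count": n,
        "density_per_nm2": n / frame_area_nm2,
        "mean_diameter_nm": (sum(d.diameter_nm for d in detections) / n) if n else 0.0,
    }

# Two detections in a notional 500 nm x 500 nm field of view.
frame = [Detection(12.0, 40.5, 3.1), Detection(88.2, 15.0, 4.7)]
stats = summarize_frame(frame, frame_area_nm2=500 * 500)
```

Tracking these metrics frame by frame is what turns a recorded video into the real-time structural-integrity readout the article describes.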

The machine learning software employs a convolutional neural network, a type of artificial neural network well suited to interpreting images, to analyze the electron microscope video frames. The network achieved high-speed, high-quality interpretation across samples of varying quality, which in turn enabled the leap from manual analysis to real-time machine vision.
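The trained network itself is not public, so as a simple classical stand-in for what it learns to do, the sketch below counts “black dot” defects in a grayscale frame by thresholding dark pixels and grouping them into connected clusters. This is a deliberately naive substitute for the convolutional neural network described above, intended only to make the detection task concrete; the threshold value and toy frame are assumptions.

```python
from collections import deque

def count_dark_dots(frame, threshold=50):
    """Count connected clusters of dark pixels ("black dots") in a grayscale
    frame given as a list of rows of 0-255 intensities. A classical stand-in
    for the learned detector described in the article."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    dots = 0
    for y in range(h):
        for x in range(w):
            if frame[y][x] < threshold and not seen[y][x]:
                dots += 1                 # found a new dark cluster
                seen[y][x] = True
                queue = deque([(y, x)])   # flood-fill the whole cluster
                while queue:
                    cy, cx = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and not seen[ny][nx]
                                and frame[ny][nx] < threshold):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return dots

# Synthetic 6x6 frame: two separate dark dots on a bright background.
frame = [[255] * 6 for _ in range(6)]
frame[1][1] = frame[1][2] = 10   # dot 1 (two adjacent dark pixels)
frame[4][4] = 20                 # dot 2
```

A fixed threshold fails on the noisy, low-contrast imagery real microscopes produce, which is precisely why a learned detector that generalizes across samples of varying quality was needed.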

The real-time assessment of structural integrity allows us to stop early if a material is performing badly and cuts out any extensive human-based quantification. We believe that our process reduces the time from idea to conclusion by nearly 80 times.

Kevin Field, Associate Professor of Nuclear Engineering and Radiological Sciences, University of Michigan

The project was financially supported by Phase 1 of the U.S. Department of Energy Small Business Innovation Research Program. Theia Scientific is currently preparing its proposal for Phase 2, which would fund completion of the main research and development effort. The company anticipates that preproduction units will be available in 2022.

Other partners on the project include Fabian Naab, a research lab specialist in nuclear engineering and radiological sciences at U-M; Kai Sun, an associate research scientist in materials science and engineering at U-M; and Dane Morgan, the Harvey D. Spangler Professor of Engineering at the University of Wisconsin.

The machine learning algorithm was trained on datasets created at the University of Michigan, which Theia Scientific intends to license. The Field brothers co-founded Theia Scientific in 2020, so Kevin Field has a financial interest in the company.

Source: https://umich.edu
