Deep learning stretches electron microscopy limits

Editorial

Rebecca Pool

Monday, January 7, 2019 - 12:30
Image: Defects labelled by a human expert (left) and the neural network (right).
 
Oak Ridge National Laboratory researchers have used artificial intelligence to design deep learning networks that can extract structural information from raw electron microscopy data as effectively as human experts and in a fraction of the time.
 
So-called MENNDL - Multi-node Evolutionary Neural Networks for Deep Learning - creates artificial neural networks that can identify atomic-level structural information in scanning transmission electron microscopy data.
 
The evolutionary algorithm runs on ORNL's Summit supercomputer - currently the fastest supercomputer in the world - and performs an incredible 152,000 trillion calculations a second.
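 
The article does not spell out MENNDL's internals, but the broad idea of an evolutionary search over network designs can be sketched as follows. The genome encoding, mutation scheme and placeholder fitness function below are illustrative assumptions only; in MENNDL itself, each candidate network is trained and scored on labelled STEM data across Summit's GPUs.

```python
# Minimal sketch of an evolutionary search over network designs,
# in the spirit of (but not identical to) MENNDL.
import random

def random_genome():
    # A "genome" encodes a candidate convolutional network for defect labelling.
    return {
        "num_layers": random.randint(2, 8),
        "filters": random.choice([16, 32, 64, 128]),
        "kernel": random.choice([3, 5, 7]),
        "learning_rate": 10 ** random.uniform(-4, -2),
    }

def mutate(genome):
    # Re-sample one gene; a fuller system would also cross over parents.
    child = dict(genome)
    key = random.choice(list(child))
    child[key] = random_genome()[key]
    return child

def fitness(genome):
    # Placeholder score: in practice each candidate network is trained on
    # labelled STEM images and scored on how well it finds defects.
    return -abs(genome["num_layers"] - 5) - abs(genome["kernel"] - 5) / 10

def evolve(generations=20, population_size=16, keep=4):
    population = [random_genome() for _ in range(population_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:keep]                      # keep the fittest
        children = [mutate(random.choice(parents))
                    for _ in range(population_size - keep)]
        population = parents + children
    return max(population, key=fitness)

print(evolve())
```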
 
In only hours, Robert Patton and colleagues used MENNDL to create a neural network that performed as well as a human expert, shaving months off the time needed to analyse electron microscopy images.
 
One image, three analysis methods: a) Raw electron microscopy image. b) Defects (white) as labelled by a human expert. c) Defects (white) as labelled by a Fourier transform method. d) Defects (white) as labelled by the neural network. Defects that don't exist are shown in purple, and defects that weren't identified are shown in orange.
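 
The colour coding in the figure corresponds to the two familiar error types for a binary defect map: false positives (defects that don't exist, purple) and false negatives (missed defects, orange). The short sketch below, with illustrative array names and toy masks, shows how such a comparison can be computed; it is not ORNL's evaluation code.

```python
import numpy as np

def compare_masks(predicted, expert):
    """Compare a network's defect mask with a human expert's labels."""
    predicted = predicted.astype(bool)
    expert = expert.astype(bool)
    false_positives = predicted & ~expert   # defects that don't exist (purple)
    false_negatives = ~predicted & expert   # defects that were missed (orange)
    iou = (predicted & expert).sum() / max((predicted | expert).sum(), 1)
    return false_positives, false_negatives, iou

# Toy 4x4 masks standing in for full-size STEM defect maps
pred  = np.array([[0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0], [1, 0, 0, 0]])
truth = np.array([[0, 1, 1, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 0, 0]])
fp, fn, iou = compare_masks(pred, truth)
print(fp.sum(), fn.sum(), round(iou, 2))
```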
 
Patton and colleagues reckon that MENNDL is the first known approach to automatically identify atomic-level structural information in STEM data.
 
By using the system to better understand electron beam-matter interactions and to provide real-time image-based feedback, the researchers hope to drive automated materials nano-fabrication forward.
 