| Date | Aug 30, 2020 |
|---|---|
Illinois researchers have linked electron microscope imaging and machine learning, making it much easier to study nanoparticles in action. The schematic shows how a neural network, middle, works as a bridge between liquid-phase electron microscope imaging, left, and streamlined data output, right. Credit: Graphic courtesy ACS and the Qian Chen group
In the nanoworld, tiny particles such as proteins appear to dance as they transform and assemble to perform various tasks while suspended in a liquid. Recently developed methods have made it possible to watch and record these otherwise-elusive tiny motions, and researchers now take a step forward by developing a machine learning workflow to streamline the process.
The new study, led by Qian Chen, a professor of materials science and engineering at the University of Illinois, Urbana-Champaign, builds upon her past work with liquid-phase electron microscopy and is published in the journal ACS Central Science.
Graduate student Zihao Ou, left, professor Qian Chen, center, and graduate student and lead author Lehan Yao. Credit: Photo by L. Brian Stauffer
Being able to see – and record – the motions of nanoparticles is essential for understanding a variety of engineering challenges. Liquid-phase electron microscopy, which allows researchers to watch nanoparticles interact inside tiny aquarium-like sample containers, is useful for research in medicine, energy and environmental sustainability, and in the fabrication of metamaterials, to name a few. However, the resulting datasets are difficult to interpret, the researchers said. The video files are large, dense with temporal and spatial information, and noisy due to background signals – in other words, they require a great deal of tedious image processing and analysis.
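To give a sense of what that manual processing involves, the sketch below runs a classical denoise-and-threshold step on a synthetic noisy frame. The `sigma` and `threshold` values are illustrative only, not values from the study – in practice such parameters must be retuned per dataset, which is part of the tedium the new workflow aims to remove.

```python
import numpy as np
from scipy import ndimage

def segment_frame(frame, sigma=2.0, threshold=0.5):
    """Smooth a noisy microscopy frame and binarize it.

    frame: 2D array of pixel intensities in [0, 1].
    sigma and threshold are illustrative values; real frames
    need per-dataset hand tuning.
    """
    smoothed = ndimage.gaussian_filter(frame, sigma=sigma)
    return smoothed > threshold

# Synthetic noisy frame: one bright "particle" on a dark background.
rng = np.random.default_rng(0)
frame = rng.normal(0.2, 0.05, size=(64, 64))
frame[28:36, 28:36] += 0.6  # particle region
mask = segment_frame(frame)
print(mask.sum())  # number of pixels classified as particle
```

With real liquid-phase video, background intensity drifts from frame to frame, so a single global threshold like this quickly breaks down – one reason a learned segmenter is attractive.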
“Developing a method even to see these particles was a huge challenge,” Chen said. “Figuring out how to efficiently get the useful data pieces from a sea of outliers and noise has become the new challenge.”
The schematic shows a simplified version of the steps taken by researchers to connect liquid-phase electron microscopy and machine learning to produce a streamlined data output that is less tedious to process than past methods. Credit: Graphic courtesy ACS and the Qian Chen group
To confront this problem, the team developed a machine learning workflow based on an artificial neural network that mimics, in part, the learning capacity of the human brain. The program builds on an existing neural network, known as U-Net, that does not require handcrafted features or predetermined inputs and has yielded significant breakthroughs in identifying irregular cellular features using other types of microscopy, the study reports.
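To illustrate the general shape of a U-Net-style encoder-decoder – not the study's actual network, which is deeper – a minimal PyTorch sketch with a single skip connection might look like this. All layer sizes here are arbitrary choices for illustration:

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """A two-level U-Net-style encoder-decoder with one skip connection.

    This only shows the shape of the architecture: contract, expand,
    and concatenate features across the "U" so fine spatial detail
    survives the bottleneck.
    """
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU())
        self.down = nn.MaxPool2d(2)
        self.mid = nn.Sequential(nn.Conv2d(8, 16, 3, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose2d(16, 8, 2, stride=2)
        # Decoder sees upsampled features concatenated with the skip.
        self.dec = nn.Conv2d(16, 1, 3, padding=1)

    def forward(self, x):
        skip = self.enc(x)               # full-resolution features
        mid = self.mid(self.down(skip))  # bottleneck
        up = self.up(mid)                # back to full resolution
        return torch.sigmoid(self.dec(torch.cat([up, skip], dim=1)))

net = TinyUNet()
frame = torch.rand(1, 1, 64, 64)  # one grayscale frame
mask = net(frame)                 # per-pixel particle probability
print(mask.shape)
```

The skip connection is the key design choice: without it, fine particle boundaries would be lost in the downsampling and the predicted masks would be blurry.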
“Our new program processed information for three types of nanoscale dynamics: motion, chemical reaction and self-assembly of nanoparticles,” said lead author and graduate student Lehan Yao. “These represent the scenarios and challenges we have encountered in the analysis of liquid-phase electron microscopy videos.”
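For the motion case, once each frame has been segmented into particle centroids, consecutive frames must be linked into trajectories. A greedy nearest-neighbor match, sketched below, is a common baseline for this step; the study's actual tracking procedure may differ, and `max_disp` is an illustrative parameter, not one from the paper:

```python
import numpy as np

def link_frames(prev_pts, next_pts, max_disp=5.0):
    """Link particle centroids in consecutive frames by nearest neighbor.

    prev_pts, next_pts: (N, 2) and (M, 2) arrays of (x, y) centroids.
    Returns a list of (i, j) index pairs; max_disp caps how far a
    particle may plausibly move between frames.
    """
    links = []
    taken = set()
    for i, p in enumerate(prev_pts):
        d = np.linalg.norm(next_pts - p, axis=1)
        j = int(np.argmin(d))
        if d[j] <= max_disp and j not in taken:
            links.append((i, j))
            taken.add(j)
    return links

prev_pts = np.array([[10.0, 10.0], [40.0, 40.0]])
next_pts = np.array([[41.0, 39.0], [11.0, 12.0]])
print(link_frames(prev_pts, next_pts))  # [(0, 1), (1, 0)]
```

Chaining such links across every frame pair turns per-frame detections into per-particle trajectories, from which diffusion and interaction statistics can be computed.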
The researchers collected measurements from approximately 300,000 pairs of interacting nanoparticles, the study reports.
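Tallies of this kind can be accumulated frame by frame by flagging every pair of detected particles that comes within some interaction range. The sketch below shows one way such pairs could be counted; the `cutoff` value is illustrative, not a number from the paper:

```python
import numpy as np

def interacting_pairs(centroids, cutoff):
    """Return index pairs of particles closer than `cutoff`.

    centroids: (N, 2) array of particle positions from one frame.
    Uses broadcasting to form the full N x N distance matrix, then
    keeps each unordered pair once (a < b).
    """
    diff = centroids[:, None, :] - centroids[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    i, j = np.where((dist < cutoff) & (dist > 0))
    return [(a, b) for a, b in zip(i.tolist(), j.tolist()) if a < b]

pts = np.array([[0.0, 0.0], [1.0, 0.0], [10.0, 10.0]])
print(interacting_pairs(pts, cutoff=2.0))  # [(0, 1)]
```

Summing the pair counts over every frame of every video is how a dataset on the order of hundreds of thousands of interacting pairs accumulates from individually modest frames.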
