
High-performance computing enables record-breaking ptychographic imaging at the ALS


By Erica Yee August 2, 2018

The Science IT Profiles series highlights how the Scientific Computing Group supports the work of Berkeley Lab researchers spanning various disciplines.

COSMIC members
From left to right: postdoc Bjoern Enders, CAMERA scientist Stefano Marchesini, ALS staff scientist David Shapiro, graduate student Kasra Nowrouzi at the COSMIC beamline. (Credit: Erica Yee)


Technologies from smartphones to electric cars are becoming indispensable, but they’re only useful as long as their batteries have juice. In order to build longer-lasting and faster-charging batteries for the world’s increasing power needs, scientists must observe the nanoscale processes in action.


Earlier this year, Advanced Light Source (ALS) researchers at Berkeley Lab used a technique called ptychographic computed tomography to image reactions in a battery cathode material at 11-nanometer resolution, the finest ever achieved in three dimensions (3D). The group had previously set the 2D record in 2014 by imaging structures down to five nanometers.


Ptychographic imaging combines x-ray diffraction measurements and computation to characterize the structure and properties of objects at the nanoscale. In the 2018 study published in Nature Communications, ALS researchers used the technique to map locations of reactions inside a lithium-ion battery in 3D. Their sample was a cluster of nanoparticles harvested from a rechargeable coin-cell battery. Even imaging this small portion allowed them to observe the chemical states and morphology (shape) of the material and interpret what happens during the charge cycle. These results could help solve problems such as shortening the time it takes to charge an electric car and improving battery performance.


3D chemical map

3D chemical map of a particle (left) and its segmentation into three chemical-phase groups (right). (Full image and caption at ALS)


The experiment was used to test the Nanosurveyor tool for the new COSMIC (COherent Scattering and MICroscopy) Imaging beamline at the ALS. Nanosurveyor’s permanent installation at Beamline 7.0.1 a few months ago opens up possibilities for similar explorations in magnetic materials and porous rocks, for example.


Integrating computing for the beamline


Since 2013, the ptychography group behind COSMIC has partnered with the Lab’s Scientific Computing Group (SCG) under the Science IT Department to implement a high-performance data pipeline that paved the way for the record-breaking ptychographic imaging.


From the Nanosurveyor tool, the pipeline integrates high-performance computing (HPC) by running algorithms for transmission, processing, analysis, and visualization of data on the Phasis compute cluster in a neighboring building. Phasis is an HPC GPU cluster set up by the SCG specifically for preprocessing and reconstructing soft x-ray diffraction data at very high spatial resolution.


“Having a high-performance GPU compute cluster as part of the experimental instrument via a dedicated, high-speed network is really important for this research,” said David Shapiro, a staff scientist at the ALS and the lead scientist for COSMIC’s microscopy experiments.


With the penetrating power of ALS x-rays and the computational power of the Phasis GPU nodes, COSMIC users can quickly turn large datasets of diffraction images into high-resolution 3D images of a sample.


“Designing the computing workflow was a challenge,” said SCG HPC systems engineer Susan James. “Over a period of several years, we had many configuration changes. From relocating to different beamlines to integrating new development, compute, and storage nodes in the beamline and the Phasis cluster, the challenge was to optimize the workflow in a fast data pipeline from collecting, processing, to analyzing the data.”


Real-time streaming of ptychographic data

During their time at the beamline, users operate Nanosurveyor around the clock to continuously generate data. They place each sample into the endstation, then control the microscope and beamline from a screen in a neighboring hutch. The optical properties of the material change as the users manipulate the x-ray wavelength, revealing information about internal chemical states. This high-resolution, high-dimensional data includes the full diffraction pattern, energy, and rotation of the sample. The images are recorded by a CCD (charge-coupled device), a camera-like detector that can generate 100 frames per second, or about 400 MB/s of data.


“We can measure millions of x-ray diffraction patterns in a day. Each 2D sample image might be a few tens of thousands of such diffraction patterns,” said Shapiro.


A single 2D image at a single x-ray energy will typically be reduced from a few tens of gigabytes to a few tens of megabytes after processing. With all the diffraction patterns, the users may end up with a few thousand images from their few days at the beamline.
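The figures above imply a large reduction factor between raw frames and the processed image. A back-of-the-envelope sketch, using the article's rough numbers (the ~4 MB frame size is inferred from the stated rates, not given directly):

```python
# Figures from the article: the CCD records ~100 frames/s at ~400 MB/s,
# a 2D image may comprise "a few tens of thousands" of diffraction patterns,
# and it reduces to a few tens of MB after processing.

FRAMES_PER_SEC = 100
DETECTOR_RATE_MB_S = 400

frame_size_mb = DETECTOR_RATE_MB_S / FRAMES_PER_SEC  # ~4 MB per frame (inferred)

def raw_image_size_gb(num_patterns: int) -> float:
    """Raw data volume for one 2D image built from num_patterns frames."""
    return num_patterns * frame_size_mb / 1024

raw_gb = raw_image_size_gb(10_000)       # tens of GB of raw frames
reduction = raw_gb * 1024 / 30           # vs. ~30 MB processed: >1000x smaller
```

At roughly 10,000 frames per image, this works out to about 39 GB of raw data reduced by a factor of more than a thousand.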


“The computation is very extensive, and it has to be accurate because we want quantitative results. Then this all has to come back from the supercomputer and be presented to the user at the microscope,” he continued. “The challenge for this collection of people is to make this complex data pipeline and computation really transparent so that non-specialists can come and use it.”


Nanosurveyor

The Nanosurveyor microscopy tool located at the COSMIC beamline. (Credit: Erica Yee)


The firehose of data coming out of the detector is preprocessed through software built by postdoctoral fellow Bjoern Enders. Running on CPU threads of Phasis, his code orchestrates the interaction between camera and control software, buffers the raw data flow, assembles and crops diffraction images, and applies a series of corrections and reductions to purify the data for reconstruction. The data are then piped to the GPUs using a high-throughput InfiniBand network.
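The kind of per-frame cleanup described — assembling, cropping, and correcting frames before reconstruction — can be illustrated with a minimal sketch. The specific steps below (dark-frame subtraction, center crop, clipping) and the array sizes are illustrative assumptions, not the actual COSMIC preprocessing code:

```python
import numpy as np

def preprocess_frame(raw: np.ndarray, dark: np.ndarray, crop: int = 256) -> np.ndarray:
    """Illustrative preprocessing of one diffraction frame.

    Hypothetical steps: dark-frame correction, center crop around the
    beam axis, and clipping of negative counts.
    """
    frame = raw.astype(np.float32) - dark   # remove detector dark signal
    cy, cx = np.array(frame.shape) // 2
    h = crop // 2
    frame = frame[cy - h:cy + h, cx - h:cx + h]  # crop around beam center
    frame[frame < 0] = 0                          # clip unphysical negatives
    return frame

# Usage: a synthetic 512x512 frame reduced to a clean 256x256 crop.
raw = np.full((512, 512), 100, dtype=np.uint16)
dark = np.full((512, 512), 10, dtype=np.uint16)
clean = preprocess_frame(raw, dark)
```

In the real pipeline these operations run on CPU threads as frames stream in, so the GPUs downstream only ever see corrected, reduced data.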


The Phasis GPU infrastructure is necessary for high-performance analysis using SHARP, a new reconstruction algorithm developed by Berkeley Lab’s CAMERA (Center for Advanced Mathematics for Energy Research Applications). The SHARP (Scalable Heterogeneous Adaptive Real-time Ptychography) approach analyzes the ptychographic data from the detector to produce high resolution images in seconds.


“Essentially, SHARP iterates back and forth between the image that you're trying to reconstruct and the data that you're trying to fit. It recovers the image of the sample, the calibration of the illumination, and the background noise in order to fit the data and satisfy the overlap between all the different frames that have been acquired,” said Stefano Marchesini, a CAMERA scientist who was the PI on the grant that started the Lab’s ptychography program in 2009. The SHARP code developed by his team takes all the diffraction patterns for many overlapping positions on the sample and reconstructs the images sent back to the COSMIC microscope user. The 3D volume is currently assembled from these images using external tomography packages, but a new algorithmic workflow under construction will incorporate advanced tomographic algorithms as well.


Traditional ptychography requires three distinct steps: data acquisition, pre-processing, and reconstruction. The collaboration of expertise and technology from the ALS scientists, CAMERA, and the SCG now enables real-time ptychographic streaming — pre-processing and reconstruction of the data as it is acquired.


“We have a closed, automated streaming loop of data analysis from the detector through the cluster and back to the user,” said Shapiro.


The data is then transferred from the Phasis data transfer node (DTN) through ESnet's Science DMZ 10 Gb/s network to the National Energy Research Scientific Computing Center (NERSC). At NERSC, the large datasets are analyzed, shared, and archived. Long term, the ptychography group wants to store only the images the SHARP code produces from raw detector data. Enders is working on a searchable catalog of ptychographic data at NERSC for users to access. He has already set up connections using Globus Online, a research data transfer management service, so scientists from outside universities can copy the data.


“It's important to use Globus for the data transfer because if you want to push eight or 10 terabytes from one week of beamtime, it takes days to use the conventional internet. But if you use a Globus endpoint, it can be done in hours,” said Enders.
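Enders' days-versus-hours estimate can be checked with simple arithmetic. The effective throughputs below — roughly 100 Mb/s for a conventional connection and an assumed 5 Gb/s sustained through a Globus endpoint on the 10 Gb/s Science DMZ path — are illustrative, not measured values:

```python
def transfer_hours(terabytes: float, gbit_per_s: float) -> float:
    """Hours to move a dataset at a sustained link rate (decimal TB)."""
    bits = terabytes * 8e12                 # TB -> bits
    return bits / (gbit_per_s * 1e9) / 3600

slow = transfer_hours(10, 0.1)   # conventional internet: ~222 hours, i.e. days
fast = transfer_hours(10, 5.0)   # Globus over the fast path: ~4.4 hours
```

Ten terabytes at 100 Mb/s takes over nine days, while a sustained multi-gigabit Globus transfer finishes the same dataset in a few hours, matching the quote.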


Preparing for ALS-U


COSMIC will soon need a much faster detector to take advantage of the upcoming ALS-U upgrade, which is projected to provide x-rays one hundred times brighter. The ptychography researchers hope to use the increased brightness to achieve a significant gain in time resolution. For their research on chemical transformations in battery cathodes, the current time resolution lets them create an image every few seconds. ALS-U beams could potentially take them down to milliseconds, approaching the timescale of domain-wall motion and letting them observe real-time chemistry in nanomaterials.


In the upcoming ALS cycle, the group plans to test a new detector designed by the ALS detector group led by Peter Denes. This detector has a data rate 30 times faster and will produce more data than the current computing architecture can handle.


“If developments in detector technology and computation keep pace with those in synchrotron design, then we will be able to make efficient use of the brightness provided by the new facilities,” said Shapiro. “This will revolutionize our ability to study chemical morphology and dynamics in functional nano-materials.”





The Scientific Computing Group (also known as High Performance Computing Services) under Science IT supports the mission of Lawrence Berkeley National Laboratory by providing technology and consulting support for science and technical programs, in the areas of data management, HPC cluster computing, and Cloud services.