HPC-POWERED SCIENTIFIC DISCOVERY
Accelerating Light Source Science with Multi-Facility HPC Workflows
By linking data processing and analysis at the Advanced Light Source (ALS) to the NERSC and ALCF supercomputing facilities, our infrastructure dramatically improves time-to-insight for complex experiments.
Transforming Scientific Workflows
Our integrated infrastructure dramatically improves data throughput, analysis speed, and reproducibility for synchrotron light source experiments at LBNL.
Deep Analysis & Enterprise Applications
Select a topic to dive deeper, then explore the specific findings from the research, rebuilt as interactive, enterprise-focused modules.
The system integrates the ALS microtomography beamline with NERSC and ALCF HPC facilities, automating data movement, processing, and visualization. It uses a mix of LabVIEW, EPICS, Slurm, PBS, and Globus Compute interfaces.
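As a concrete illustration of the HPC submission step, the sketch below renders a Slurm batch script for a reconstruction job. The account, partition, and `recon_cli` command are illustrative placeholders, not the facility's actual configuration; a real deployment would dispatch equivalent work through Slurm at NERSC, PBS at ALCF, or a Globus Compute endpoint.

```python
def make_recon_batch_script(dataset, nodes=1, walltime="00:30:00",
                            account="als", partition="realtime"):
    """Render a Slurm batch script for one reconstruction job.

    All names here (account, partition, recon_cli) are hypothetical
    examples standing in for site-specific configuration.
    """
    return "\n".join([
        "#!/bin/bash",
        f"#SBATCH --account={account}",
        f"#SBATCH --partition={partition}",
        f"#SBATCH --nodes={nodes}",
        f"#SBATCH --time={walltime}",
        # The reconstruction command itself is a placeholder.
        f"srun recon_cli --input {dataset} --output {dataset}_rec",
    ])

script = make_recon_batch_script("scan_0421.h5")
print(script.splitlines()[0])  # → #!/bin/bash
```

Generating the script programmatically, rather than hand-editing it per experiment, is what lets the workflow submit jobs automatically as data arrive.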
End-to-End Workflow Process
Our infrastructure significantly reduces experimental friction, improves data throughput, and ensures reproducibility. Median file-based reconstruction times fall consistently in the 20-30 minute range.
Streaming reconstructions provide previews in under 10 seconds, fundamentally changing how experiments are conducted and enabling real-time decision-making.
| Feature | Traditional Workflow | HPC-Augmented Workflow |
|---|---|---|
| Preview Turnaround | | Under 10 seconds via streaming reconstruction |
| Full Reconstruction | | 20-30 minutes (median, file-based) |
| Data Volume Handling | | |
| Access to HPC | | |
The system enables rapid sample exchange and side-by-side volumetric comparison for materials science, and it facilitates reanalysis and sharing of historical data, enhancing scientific communication.
Feather Morphology Comparison
Rapid sample exchange and side-by-side volumetric comparison allowed scientists to immediately reveal structural differences between chicken and sandgrouse feathers, cutting study turnaround from hours to minutes. This highlights the power of fast feedback in experimental design.
Fracking Proppant Analysis
Using our infrastructure, historical micro-CT data were reanalyzed (reconstructed and segmented), then textured in Blender and exported for VR, enhancing scientific communication and outreach. This demonstrates the reproducibility and shareability of complex results.
Calculate Your Potential ROI
Estimate the time and cost savings your enterprise could achieve by automating advanced scientific data workflows with HPC integration.
Estimated Annual Impact
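One plausible model behind such a calculator is a simple linear estimate: time saved per experiment, scaled by annual experiment count and a fully loaded hourly cost. The sketch below is an illustrative assumption, not the calculator's actual formula, and the input figures are examples only.

```python
def estimate_annual_savings(experiments_per_year, hours_saved_per_experiment,
                            hourly_cost):
    """Rough ROI model: hours saved per year times hourly cost.

    A linear model with hypothetical inputs; real savings depend on
    beamtime rates, staffing, and workflow specifics.
    """
    hours = experiments_per_year * hours_saved_per_experiment
    return {"hours_saved": hours, "cost_saved": hours * hourly_cost}

result = estimate_annual_savings(experiments_per_year=200,
                                 hours_saved_per_experiment=3.5,
                                 hourly_cost=120.0)
# 200 * 3.5 = 700.0 hours saved; 700.0 * 120.0 = 84000.0 cost saved
```

Swapping in your own experiment count and rates gives a first-order estimate to compare against implementation cost.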
Our Implementation Roadmap
A phased approach to integrate multi-facility HPC workflows into your research operations, ensuring a smooth transition and rapid value realization.
Discovery & Planning
Assess current workflows, define requirements, and establish integration points with existing control systems at your facility.
Infrastructure Setup
Deploy containerized services, configure high-speed data movement solutions (e.g., Globus, pvapy), and set up HPC job submission for NERSC/ALCF.
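A small part of the data-movement setup is deciding, for every acquired file, where it lands on the HPC filesystem. The sketch below pairs source and destination paths while preserving the directory layout; the paths are hypothetical examples, and in practice the resulting pairs would be handed to a Globus transfer task rather than copied directly.

```python
from pathlib import PurePosixPath

def build_transfer_items(scan_files, src_root, dst_root):
    """Map each acquired file to its destination path on the HPC
    filesystem, preserving the layout under src_root.

    Paths are illustrative placeholders, not real facility mounts.
    """
    src_root = PurePosixPath(src_root)
    dst_root = PurePosixPath(dst_root)
    items = []
    for f in scan_files:
        # Keep the per-scan directory structure intact on the far side.
        rel = PurePosixPath(f).relative_to(src_root)
        items.append((str(src_root / rel), str(dst_root / rel)))
    return items

pairs = build_transfer_items(
    ["/data/als/bl832/scan_0421/raw.h5"],
    src_root="/data/als/bl832",
    dst_root="/global/cfs/project/tomo")
# → [("/data/als/bl832/scan_0421/raw.h5",
#     "/global/cfs/project/tomo/scan_0421/raw.h5")]
```

Keeping the mapping in one pure function makes the transfer plan easy to test before any bytes move.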
Workflow Integration
Integrate beamline control software with Prefect orchestration and adapt technique-specific analysis codes for efficient HPC execution.
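To make the orchestration idea concrete without requiring a Prefect installation, the stand-in below models the pipeline as plain Python functions run in dependency order, the way an orchestrator would sequence its tasks. Stage names and the scan identifier are illustrative; a real integration would decorate these as Prefect tasks inside a flow.

```python
# Stand-in for a Prefect flow: plain functions model the tasks so the
# control flow is visible. Stage names and payloads are illustrative.
log = []

def transfer(scan):
    log.append(f"transfer:{scan}")
    return scan

def reconstruct(scan):
    log.append(f"reconstruct:{scan}")
    return f"{scan}_rec"

def visualize(volume):
    log.append(f"visualize:{volume}")

def reconstruction_flow(scan):
    """Run the stages in dependency order, as the orchestrator would."""
    staged = transfer(scan)
    volume = reconstruct(staged)
    visualize(volume)
    return log

reconstruction_flow("scan_0421")
# log == ["transfer:scan_0421", "reconstruct:scan_0421",
#         "visualize:scan_0421_rec"]
```

The value of an orchestrator over this hand-rolled version is retries, scheduling, and observability; the dependency graph itself stays this simple.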
Testing & Optimization
Validate end-to-end workflows, optimize performance for diverse data volumes, and fine-tune resource allocations for specific experiments.
Training & Rollout
Provide comprehensive user training, establish robust support channels, and plan expansion to additional beamlines or user facilities.
Ready to Accelerate Your Scientific Research?
Connect with our experts to design an HPC-integrated workflow tailored to your facility's unique needs and scientific goals.