“Get non-Real”: Department of Energy grant funds novel research in High-Performance Algorithms at U-M


Source: www.energy.gov/science/articles/department-energy-invests-28-million-novel-research-high-performance-algorithms

New research from the University of Michigan will help revolutionize the data processing pipeline with state-of-the-art algorithms that optimize the collection and processing of many kinds of data. Most algorithms available today are built for real data, meaning real numbers; however, much of the data we encounter, such as discrete or categorical data, is non-real. The project is part of a $2.8 million Department of Energy investment in algorithms research, the backbone of predictive modeling and simulation. The research will enable DOE to set new frontiers in physics, chemistry, biology, and other domains.

“Preparing for the future means that we must continue to invest in the development of next-generation algorithms for scientific computing,” said Barbara Helland, Associate Director for Advanced Scientific Computing Research, DOE Office of Science. “Foundational research in algorithms is essential for ensuring their efficiency and reliability in meeting the emerging scientific needs of the DOE and the United States.”

The U-M project, led by associate professor Laura Balzano and assistant professor Hessam Mahdavifar, both of electrical engineering and computer science, is one of six chosen by DOE to cover several topics at the leading-edge of algorithms research. According to the DOE, researchers will explore algorithms for analyzing data from biology, energy storage, and other applications. They will develop fast and efficient algorithms as building blocks for tackling increasingly large data analysis problems from scientific measurements, simulations, and experiments. Projects will also address challenges in solving large-scale computational fluid dynamics and related problems.


Laura Balzano, associate professor of electrical engineering and computer science (left); Hessam Mahdavifar, assistant professor of electrical engineering and computer science (right)

Balzano and Mahdavifar, both Michigan Institute for Computational Discovery and Engineering (MICDE) affiliated faculty members, will use a $300,000 portion of the overall grant to study randomized sketching and compression for high-dimensional non-real-valued data with low-dimensional structures.

“Randomized sketching and subsampling algorithms are revolutionizing the data processing pipeline by allowing significant compression of redundant information,” said Balzano. “Sketches work well because scientific data are generally highly redundant in nature, often following a perturbed low-dimensional structure. Hence, low-rank models and sketching that preserves those model structures are ubiquitous in many machine learning and signal processing applications.” 

Even though much of the data used and processed in scientific and technological applications is best modeled mathematically as discrete, categorical, or ordinal, most state-of-the-art randomized sketching algorithms focus on real-valued data. Moreover, in practical applications, handling high-dimensional data can be challenging in terms of computational and memory demands. The proposed project will thus significantly expand the applicability of randomized sketching.
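The basic idea behind randomized sketching can be illustrated with a toy example (not the project's actual algorithms): a random Gaussian matrix compresses a tall, low-rank data matrix by a large factor while roughly preserving its dominant singular values, which is what makes sketches compatible with the low-rank models described above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tall data matrix with exact rank-5 structure, mimicking the (perturbed)
# low-dimensional structure common in scientific data.
n, d, r = 2000, 200, 5
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, d))

# Gaussian sketch: compress the n rows down to k rows while roughly
# preserving the row space (a Johnson-Lindenstrauss-style embedding).
k = 25
S = rng.standard_normal((k, n)) / np.sqrt(k)
SA = S @ A  # k x d sketch: 80x less data to store and process

# The sketch approximately preserves the dominant singular values of A.
sv_full = np.linalg.svd(A, compute_uv=False)[:r]
sv_sketch = np.linalg.svd(SA, compute_uv=False)[:r]
```

Extending guarantees of this kind from real-valued matrices to discrete and categorical data is precisely the gap the project targets.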

“A key to data-driven modeling is to carefully reformulate the computational and data analysis challenges and take full advantage of the underlying mathematical structure that is often common across application areas,” said Krishna Garikipati, MICDE director and professor of mechanical engineering and mathematics. “This research and the work that Laura and Hessam are doing is critically important to the advancement of computational discovery.”

MICDE catalyst grant leads to new NSF funding to study cascade “ecohydromics” in the Amazonian headwater system


The Amazon Basin cycles more water through streamflow and evaporation than any other contiguous forest in the world, and transpiration by trees is a critical part of this cycle. Understanding how plant roots, stems, and leaves interact with soil water to regulate forest transpiration across landscapes is a critical knowledge gap, especially as the climate changes.

Professor Valeriy Ivanov, from the Department of Civil and Environmental Engineering at U-M, is the lead investigator in a newly NSF-funded project that links diverse disciplines, including plant ecophysiology, ecology, and hydrology, and will build a unique modeling framework to characterize landscape variation in physiological and hydrological processes in the Amazon Basin. The framework will integrate a wide array of field observations with detailed watershed modeling for hypothesis testing. The team includes Tyeen Taylor, a research fellow also in the Civil and Environmental Engineering Department at U-M; U.S. collaborators at the University of Arizona, West Virginia University, and the University of Nebraska; and Brazilian researchers at the Federal University of Eastern Para, the Federal University of Amazonas, the National Institute for Amazonian Research, and the Eastern Amazon Agricultural Agency.

Detailed, physical models of ecophysiology and above- and below-ground hydrology will be informed by observations of leaf physiology, tree morphological traits, soil moisture, groundwater, and streamflow. Data and models will be integrated employing novel tools in probabilistic learning and uncertainty quantification. The computational framework tools to be used in this project were developed in part with support from the MICDE Catalyst Grant program, through the 2018 project “Urban Flood Modeling at “Human Action” Scale: Harnessing the Power of Reduced-Order Approaches and Uncertainty Quantification” led by Prof. Ivanov.

Figure: (a) Given a mechanistic model M (e.g., a stomatal conductance model), (b) its inputs 𝛏 (e.g., parameters) are treated as random variables; these inputs are sampled and model simulations are carried out. (c) Polynomial chaos expansions (PCEs) are used to construct a surrogate model that best approximates the model output (left-hand side of (c)). The surrogate is then evaluated with Monte Carlo simulations and used for (d) parameter inference: in (d.1), outputs from the surrogate model flow into a likelihood function L(D | 𝛏) that compares them with observed data D, producing the posterior distribution for 𝛏. In (d.2), this pdf is fed back to the surrogate to reduce the uncertainty in the inputs and to obtain the pdf of a quantity of interest (e).
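The sampling and surrogate steps of this workflow can be sketched in miniature. The snippet below is a hypothetical toy example, not the project's actual ecophysiology codes: a simple analytic function stands in for the mechanistic model, a Legendre-polynomial expansion fit by least squares stands in for the PCE surrogate, and cheap Monte Carlo sampling is then run through the surrogate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "mechanistic model" M(xi): a stand-in for, e.g., a stomatal
# conductance model with one uncertain input xi on [-1, 1].
def model(xi):
    return np.exp(-xi) * np.sin(3 * xi)

# (b) Treat the input as a random variable: sample it and run the model.
xi_train = rng.uniform(-1.0, 1.0, 200)
y_train = model(xi_train)

# (c) Surrogate: a polynomial chaos-style expansion in Legendre polynomials
# (orthogonal for uniform inputs), fit by least squares -- a common
# non-intrusive PCE construction.
deg = 8
Phi = np.polynomial.legendre.legvander(xi_train, deg)  # design matrix
coef, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)

def surrogate(xi):
    return np.polynomial.legendre.legvander(np.atleast_1d(xi), deg) @ coef

# Monte Carlo through the cheap surrogate instead of the expensive model,
# e.g. to estimate statistics (or the pdf) of a quantity of interest (e).
xi_mc = rng.uniform(-1.0, 1.0, 100_000)
mean_qoi = surrogate(xi_mc).mean()
```

The payoff is that the 100,000 Monte Carlo evaluations hit the polynomial surrogate, not the expensive simulator, which is what makes the Bayesian inference loop in (d) tractable.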

“The reduced-order modeling approach developed during the MICDE Catalyst grant project is a key element of the new project,” said Prof. Ivanov. “The MICDE seed funding has allowed us to build a general framework that is applicable to a wide range of computational applications in earth-system science, and thus made our project proposal more competitive.”

The MICDE Catalyst Grants program funds projects that have the potential to catalyze and reorient the directions of their research fields by developing and harnessing powerful paradigms of computational science. This new NSF project is an example of the reach of the program.


2021-2022 Catalyst Grant awardees continue to forge new fronts in computational science


The Michigan Institute for Computational Discovery and Engineering (MICDE) announced the awardees of the 2021-2022 round of Catalyst Grants. Since 2017, the MICDE Catalyst Grants program has funded a wide spectrum of cutting-edge research; this year's awards focus on AI for physically-based biomedicine, quantum computing, the convergence of natural hazards with economic dislocation, and computational integration across scales and disciplines in biology. The five projects awarded in this round represent new frontiers of computational research spearheaded by the Institute through its initiatives.

Prof. Shravan Veerapaneni (Mathematics) is working on advancing quantum algorithm research. His team will develop a Variational Quantum Monte Carlo algorithm that can potentially be applied to a wide range of linear-algebraic tasks, such as QR factorization and singular value decomposition (SVD).

Profs. Salar Fattahi (Industrial and Operations Engineering) and Arvind Rao (Computational Medicine and Bioinformatics, Biostatistics) are revisiting theoretically powerful maximum-likelihood estimation methods to identify their areas of weakness and strengthen them for use in biomedical, largely genomic, applications.

Profs. Gary Luker (Microbiology and Immunology), Nikola Banovic (Electrical Engineering and Computer Science), Xun Huan (Mechanical Engineering), Jennifer Linderman (Biomedical Engineering and Chemical Engineering), and Kathryn Luker (Radiology) will develop a physics- and chemistry-aware inverse reinforcement learning (IRL) computational framework to support the understanding of single-cell and cooperative decision-making that drives tumor growth, metastasis, and recurrence.

Profs. Seth Guikema (Civil and Environmental Engineering and Industrial and Operations Engineering) and Jeremy Bricker (Civil and Environmental Engineering) will develop an integrated computational modeling approach to studying equity and resilience during natural hazard events, specifically estimating which essential services are the main constraints on individuals returning to a more normal life post-hazard and assessing inequities in resilience to coastal flooding events.

Profs. Jesse Capecelatro (Mechanical Engineering and Aerospace Engineering) and Alberto Figueroa (Biomedical Engineering and Vascular Surgery) will develop a versatile, physics-driven, computationally efficient, and massively parallel numerical framework to simulate the interaction between fluids and biological particles in patient-specific vasculature geometries. This framework will enable next-generation computer-aided diagnostics.

“This year’s cohort of MICDE Catalyst Grants spans quantum computing for engineering science, AI for the physics of cancer, computational advances in hazards engineering, and mathematical advances in data science and bioengineering,” said MICDE Director Krishna Garikipati, a professor of mathematics and mechanical engineering. “These projects represent new frontiers of computational research spearheaded by MICDE through its initiatives.”

Learn more about MICDE’s Catalyst Grant program and funded projects here.


The crucial role of massively parallel simulations in future space exploration missions


The NASA Mars 2020 Mission was launched with the goal of seeking signs of ancient life and collecting samples of rock and regolith (broken rock and soil) for possible return to Earth. Perseverance, the mission’s rover, is testing technologies to help pave the way for future human exploration of Mars. While Perseverance was launched in the summer of 2020 and landed on the Martian surface on February 18, 2021, the journey started years earlier, when the mission’s objectives were outlined, including realistic surface operations, a proof-of-concept instrument suite, and suggestions for threshold science measurements that would meet the proposed objectives. The success of this mission, as well as past and future missions, is the collective result of thousands of NASA-funded projects from teams of researchers and scientists across the country, spanning many decades. University of Michigan Professor Jesse Capecelatro (Mechanical Engineering & Aerospace Engineering) leads one of these projects. In 2016, his research group began developing high-fidelity models of plume-induced soil erosion during lunar and planetary landings that will be used in future missions.

During descent, exhaust plumes fluidize surface soil and dust, forming craters and buffeting the lander with coarse, abrasive particles. The SkyCrane technology, used by the Curiosity rover in 2012 and by Perseverance in 2021, was designed to avoid plume-surface interactions by keeping the jets far above the surface. Despite this precaution, a wind sensor on Curiosity was damaged during landing, and Perseverance’s video footage of its own landing shows significant erosion and high-speed ejecta. The SkyCrane approach is also not practical for future crewed and sample-return missions.

NASA aims to improve rovers’ landing gear for future missions. Computational models and simulations are critical to achieving this, as it is not feasible to mimic Martian (or other celestial bodies’) entry conditions and run thousands of tests in any lab on Earth. This is where the work of Prof. Capecelatro’s research group, including doctoral candidates Greg Shallcross and Meet Patel and postdoctoral fellow Medhi Khalloufi, comes in: accurate prediction of surface-plume interactions is necessary for the overall success of future space missions. While simulations of surface-plume interactions have been conducted in the past, they are outdated and typically relied on simplified assumptions that prevent a detailed, dynamic analysis of the fluid-particle coupling. Capecelatro’s research code will provide NASA with a framework to better predict how different rover designs would affect the landing, specifically the effects of the collision forces on the planet’s surface and the ability to adjust the rover’s landing trajectory independently of the NASA mission control team on Earth.

Prof. Capecelatro’s research project utilizes predictive simulation tools to capture the complex multiphase dynamics associated with rocket exhaust impingement during touchdown. Even on the most powerful supercomputers, a direct solution approach can only account for about a thousand particles at a time, so accurate and predictive multi-scale models of the unresolved flow physics are essential.

Full landing site image credit: NASA/JPL-Caltech (mars.nasa.gov/resources/24762/mars-sample-return-lander-touchdown-artists-concept/); particle and intermediate scale images: Capecelatro’s Research Group

Particle Scale

The group has been developing simulation capabilities to directly resolve the flow at the sub-particle scale, shedding light on important physics under the extreme conditions relevant to particle-structure interactions. Their model uses a massively parallel compressible particle-laden flow simulation tool in which the exhaust plume and its corresponding flow features are computed in an Eulerian-Lagrangian framework. At this scale, for example, the flow between individual particles is resolved, providing important insight into drag and turbulence under these extreme conditions.

Intermediate Scale

As a next step, the particle-scale results inform models used in the intermediate-scale simulations developed by the group, where particles are still tracked individually but the flow is not resolved at sub-particle resolution, allowing them to simulate upwards of one billion particles. At this scale, an Eulerian-Lagrangian framework is used to couple the ground’s particle flow with the jet’s plume.
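The Eulerian-Lagrangian idea can be illustrated with a drastically simplified one-dimensional sketch (not the group's actual solver): the gas velocity lives on a fixed Eulerian grid, while particles are tracked as Lagrangian points that relax toward the locally interpolated gas velocity through a Stokes-drag model. The drag law and parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Eulerian side: a frozen 1-D gas velocity field on a periodic grid (toy).
nx, L = 64, 1.0
x_grid = np.linspace(0.0, L, nx)
u_gas = np.sin(2 * np.pi * x_grid)

# Lagrangian side: point particles with positions and velocities.
n_p = 1000
x_p = rng.uniform(0.0, L, n_p)   # particle positions
v_p = np.zeros(n_p)              # particle velocities (start at rest)
tau_p = 0.05                     # particle response (relaxation) time

dt, n_steps = 1e-3, 500
for _ in range(n_steps):
    u_at_p = np.interp(x_p, x_grid, u_gas)   # gather: grid -> particles
    v_p += dt * (u_at_p - v_p) / tau_p       # Stokes drag relaxation
    x_p = (x_p + dt * v_p) % L               # advect, periodic domain
```

In a production solver the gas field would itself be evolved (and, in two-way coupling, receive momentum back from the particles); the gather-update-advect loop above is the structural core that the group scales to a billion particles.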

Full Landing Site Scale

While the intermediate-scale simulations allow the study of erosion and cratering, a full landing site containing trillions of particles is still out of reach even on the most powerful HPC clusters. After further modeling, Capecelatro’s multi-scale framework will be handed over to NASA, where it will be incorporated into simulations of the full landing site. At this scale, NASA’s framework uses an Eulerian two-fluid model that treats both fluid and particles as continua, informed by the particle- and intermediate-scale models.

The Mars 2020 mission is expanding NASA’s robotic presence on the red planet. While it is a big step toward setting the stage for future human exploration, the Perseverance rover needs further redesign to make the voyage safe for humans. Capecelatro’s physics-based models are aiding this task by helping to predict more accurately the outcome of a spacecraft attempting to land safely millions of miles from home. As in many other fields, computational science will continue to play a critical role in the future of humanity’s quest to conquer space. #computationalscience everywhere!

Related links:
Sticking the landing on Mars: High-powered computing aims to reduce guesswork
Capecelatro’s Research Group
NASA 2020 Mars Mission: Perseverance Rover

Across six continents, scientists use computation to optimize cities’ responses to hazardous events


“Community resilience is a manifestation of the human trait of adaptation. A resilient community is able to withstand and recover from hazardous events with minimal disruption to its way of life.”

Sherif El-Tawil
Antoine E. Naaman Collegiate Professor,
Department of Civil and Environmental Engineering

The combination of natural hazards, climate change, and the COVID-19 pandemic has demonstrated the importance of community resilience. Community resilience is a manifestation of the human trait of adaptation. A resilient community is able to withstand and recover from hazardous events with minimal disruption to its way of life. As humans, we use our capacity for engineering to adapt to the threat of natural hazards. Although achieving resilience is technically challenging and expensive, communities must strive to accomplish the highest level of resilience attainable with the engineering and financial resources available.

The science behind resilience engineering involves many disciplines, each dedicated to a subset of the overall problem. Complex issues lie at the intersection of these subsets, but interdisciplinary research is difficult to achieve because researchers in various disciplines frame problems and perform research from different perspectives and along distinct pathways. However, as computational models are well established in each discipline, computation is a natural language that links the disciplines together.

Last fall, the Michigan Institute for Computational Discovery and Engineering and the Department of Civil and Environmental Engineering brought together established leaders and some of the most innovative rising scholars in computational hazards research to present and discuss different computational approaches used in modeling, assessing, and defining standards for community resilience. The speakers included representatives from leading research centers in the field: keynote speaker Terri McAllister, from the National Institute of Standards and Technology (NIST); John van de Lindt (Colorado State University), co-director of the NIST-funded Center of Excellence (CoE) for Risk-Based Community Resilience Planning; Gregory Deierlein (Stanford University) from the SimCenter, which represents a consortium of universities on the U.S. West Coast; Sherif El-Tawil (University of Michigan) from ICoR; and Wael El-Dakhakhni (McMaster University) from INTERFACE. They were joined by other leaders in the field, including Tasos Sextos from the University of Bristol, UK; Xinzheng Lu, head of the Institute of Disaster Prevention and Mitigation at Tsinghua University; Hiba Baroud from Vanderbilt University; and Seth Guikema from the University of Michigan. The speakers highlighted their centers’ or research groups’ capabilities and contributions, then reconvened for a panel discussion to address questions from the audience of nearly 250 participants from 30 countries across six continents.

The event also included a hands-on workshop that highlighted the Simple Run-Time Infrastructure software toolkit (SRTI), a free, open-source solution developed at the University of Michigan. The SRTI enables researchers to connect computer programs and simulators written in different languages, share data during execution, and design hybrid systems using disparate simulator modules, with a primary goal of being user friendly. The applications in this workshop demonstrated how one tool can be used to bring together multiple computational dialects to create a single language in the context of natural hazards research. The SRTI toolkit is a product of Dr. Sherif El-Tawil’s research group at the University of Michigan, supported by the National Science Foundation’s Office of Advanced Cyberinfrastructure (OAC) under grant CRISP TYPE II – 1638186 (icor.engin.umich.edu).

The range of techniques and principles detailed at this workshop can be applied to the current COVID-19 crisis. The pandemic is a perfect example that investing in risk mitigation reduces the cost, both human and material, of a hazard, and that even hazards with a low probability of occurrence require enough investment to make ourselves resilient to them. The pandemic also illustrates that computational hazards research is a rich field with many opportunities at the intersection of its various disciplines. One of the most interesting ideas to explore is how to fuse sensor data from the field with simulation data to achieve models that can help predict, in real time, the effect of a natural hazard.

Link to event information and recordings