MICDE Symposium: Emerging and Future Paradigms for High Performance Computing


By Eric Shaw
Office of the Vice President for Research

The Michigan Institute for Computational Discovery and Engineering (MICDE) welcomed distinguished scientists to discuss emerging and future paradigms for High Performance Computing during its 2023 symposium, held on Friday, March 24. 

“Computational advancements have reached a pivotal juncture, empowering researchers to make progress in scientific discoveries and engineering innovations. This is enabled by the confluence of algorithms, software and hardware, and it is critical for experts across disciplines to work together to continue the pace of progress and achieve desired outcomes,” said Karthik Duraisamy, Director of Michigan Institute for Computational Discovery and Engineering.

The symposium featured a wide range of topics related to high performance computing and other computational science-related issues, demonstrating the multidisciplinary nature of the field. These leading-edge developments highlight the vast potential for the field to address some of the most significant challenges facing society today. From improving weather and climate forecasting to advancing materials science and drug discovery to understanding nuclear fusion, the power of high-performance computing is truly remarkable.

Dr. Fariba Fahroo, program officer at the Air Force Office of Scientific Research (AFOSR), spoke on challenges, opportunities, and national needs in computational science. During the talk, she introduced AFOSR and the computational mathematics program she manages. She also discussed the challenges and new directions emerging in computational mathematics as a field bridging areas in applied mathematics and computational science. Dr. Fahroo shared her insights on large-scale projects in machine learning for modeling physical systems, rare events, data assimilation, and reduced order modeling. The talk highlighted the importance of basic research programs in various areas of computational math and control theory, such as multiscale modeling and computation, design under uncertainty, distributed, multi-agent control and estimation, and computational control theory.

Stanford professor Alex Aiken presented Legion, a programming model and runtime system designed to handle the increasingly complex and hierarchical nature of modern machines. Aiken discussed the design of Legion, its rationale, and recent work in developing its libraries before highlighting the importance of considering data movement in parallel programming and the potential for Legion to improve the efficiency and productivity of programming for modern machines. Aiken also highlighted usability, demonstrating in particular how simple codes written for course projects scale to a massive number of GPU nodes.

Princeton University professor of astrophysical sciences Amitava Bhattacharjee presented his research on the High-Fidelity Whole Device Model of Magnetically Confined Fusion Plasma (WDMApp) as part of the Department of Energy’s Exascale Computing Project (ECP). Bhattacharjee explained that WDMApp is a ten-year project that brings together plasma physicists, applied mathematicians, and computer scientists to simulate whole device burning plasmas applicable to an advanced tokamak regime. Bhattacharjee explained that the most crucial step of the project was coupling two existing, well-established extreme-scale gyrokinetic codes, the GENE continuum code and the XGC particle-in-cell (PIC) code, and developing novel algorithms for both GENE-XGC and GEM-XGC coupling. The WDMApp codes (GENE, GEM, and XGC) were optimized, leveraging the ECP Co-Design and Software Technologies projects for portability and performance.

Dr. Patty Lee, the chief scientist of hardware technology development at Quantinuum, presented the current capabilities of quantum computing hardware and discussed the scientific and industrial applications that have been run on the hardware. Dr. Lee also provided insights into the software development toolkits available to support the quantum programmer community and the outlook for achieving quantum advantage in the near term. She highlighted the exponential improvement in the computational capability of state-of-the-art quantum computing hardware compared to classical computers.

Dr. Christiane Jablonowski, a professor in the Department of Climate and Space Sciences and Engineering at the University of Michigan, gave a talk on “Computational Frontiers in Weather and Climate Modeling”. She reviewed the state-of-the-art weather and climate modeling approaches at NOAA, the National Center for Atmospheric Research (NCAR), and the Department of Energy, and discussed the emerging computational frontiers. The talk focused on high-resolution weather and climate modeling trends, the ‘digital twin’ concept, and emerging computational paradigms.

The much-anticipated exascale computing era is here, with the arrival of the Frontier system at Oak Ridge National Laboratory in the US. The US Department of Energy’s Exascale Computing Project (ECP) is poised to take full advantage of Frontier’s capabilities in tackling problems of national and international interest. According to Doug Kothe, the Director of the ECP and the day’s final speaker, “When we collaborate, we get the most powerful tools and discoveries.” The ECP’s mission is to deliver targeted exascale systems such as Frontier that can address high-priority strategic problems of national interest that are otherwise intractable, providing at least 50 times the computational power of the HPC systems available in 2016 at very high efficiency. The ECP’s software technology effort is developing an expanded and vertically integrated software stack that includes advanced mathematical libraries, extreme-scale programming environments, development tools, visualization libraries, and the software infrastructure to support large-scale data management and data science for science and security applications.

In addition to the inspiring talks, symposium attendees also had the opportunity to engage in a lively panel discussion with the day’s speakers. Moderated by Venkat Raman, Professor of Aerospace Engineering and Mechanical Engineering at the University of Michigan, the panelists – Fariba Fahroo, Doug Kothe, Amitava Bhattacharjee, Patty Lee, Christiane Jablonowski, and Alex Aiken – tackled some of the most pressing issues and challenges in the field of high performance computing and computational discovery. The audience was able to ask questions and participate in the discussion, making it an engaging experience. It was a fitting end to an informative and thought-provoking day at the MICDE Symposium.

The 2023 MICDE Symposium featured a poster competition where 58 participants showcased their research. The competition winners were announced on Friday, March 24. Tommy Waltmann took first place for his work on “Fast and Efficient Particle Trajectory Analysis with the freud Library.” Doruk Aksoy took second place for “An Incremental Tensor Train Decomposition for High-Dimensional Data Streams,” and Archana Sridhar and Parameshwaran Pasupathy shared third place for their respective works on “Simulation and modeling of particle-laden compressible flows” and “A Fractional Viscoelastic Model of the Axon in Brain White Matter.” Fourth place was shared among Keith Phuthi, Srinivasan Arunachalam, Kyle Bushick, and Vishal Subramanian for their works on various topics related to simulation and modeling.

“It has truly been an honor hosting these distinguished speakers, and to attend the poster session which highlighted the incredible depth and breadth of research in computational science at the University of Michigan. It is a testament to our pursuit of knowledge and innovation and a reminder of the direct impact computational science has on science and society,” Duraisamy said.


Principles of Intelligent Behavior in Biological and Social Systems (PIBBSS) Fellowship


Summer fellowship for researchers, primarily PhD students and postdocs, with experience studying complex and intelligent behavior in biological and social systems. Fellows will work on selected projects at the intersection of the fellow’s field of expertise and AI alignment and/or governance, in close collaboration with a mentor.

Date: June – August, 2023

Place: Europe (most likely one near Oxford and one near Prague)

Stipend: 3,000 USD/month

Deadline to apply: February 15, 2023

Application materials:

  • CV/résumé
  • Personal statement: A 600-800 word statement discussing a) your research background and interests, and b) why you are motivated to participate in the fellowship.
  • Past work (optional): We are interested in examples representative of your research interests, expertise, and/or scientific writing style.

More information.

4th Rising Stars in Computational and Data Sciences workshop


The Oden Institute for Computational Engineering and Sciences at UT Austin, Sandia National Laboratories (SNL), and Lawrence Livermore National Laboratory (LLNL) are partnering to host the 4th Rising Stars in Computational and Data Sciences, an intensive workshop for women graduate students and postdocs who are interested in pursuing academic and research careers.

Date: April 12-13, 2023

Place: Oden Institute for Computational Engineering and Sciences, The University of Texas at Austin, Austin, TX

The Oden Institute is seeking nominations for outstanding candidates in their final year of PhD or within three years of having graduated. We will select approximately 30 women to come for two days of research presentations, poster sessions, and interactive discussions about academic and research careers, with financial support for travel provided.

The nomination consists of sending (1) a letter of nomination and (2) a copy of the nominee’s 2-page resume to Karissa Vail at karissa.vail@austin.utexas.edu. More information can be found at https://risingstars.oden.utexas.edu/

Please consider nominating one of your outstanding current/recent PhD students or postdocs.

Nominations are due: January 23, 2023.

NVIDIA Academic Hardware Grant Program


The NVIDIA Academic Hardware Grant Program endeavors to advance education and research by:

  1. Enabling groundbreaking, innovative, and unique academic research projects with world-class computing resources.
  2. Providing educators with a hands-on platform to teach AI, deep learning, and data science to students in any discipline.

Eligibility

Must be a member of the NVIDIA Developer Program to qualify

  • For Researchers:
    • Applicant must be a faculty or PhD student researcher at a university or research institute
    • Application must demonstrate clear understanding of how to use NVIDIA technology to accelerate research and significantly impact the success of the project
  • For Teachers and Instructors:
    • Applicant must be a teacher or administrator at a college, university, primary/secondary school, or non-profit STEM organization
    • Course must make use of NVIDIA SDKs and give students a hands-on opportunity to hone skills

This is a competitive program. Not all projects that meet the eligibility requirements will be awarded.

Application Window – Key Dates:

  • Opens: June 20, 2022 9:00 AM PT
  • Closes: July 1, 2022 6:00 PM PT
  • Award Decisions Sent By: August 26, 2022

You can find the application form and more information about the grant on NVIDIA’s website.

New physics-based computation and AI framework to understand the aggressive behavior of cancer cells


Cancer is an illness caused by the uncontrolled division of transformed cells, which can originate in almost any organ of the body. Cancer is not a single disease, even when it arises in the same site of the body. Tremendous variability exists in the progression of disease and response to therapy among different people with the same general type of cancer, such as breast cancer. Even at the level of a single person, cancer cells show tremendous heterogeneity within a single tumor and between a primary tumor and its metastases. This heterogeneity causes drug resistance and fatal disease. The prevailing dogma is that heterogeneity among cancer cells arises randomly, generating greedy individual cancer cells that compete for growth factors and optimal environments. The rare “winners” in this competition survive and metastasize. However, tumors consistently maintain heterogeneous subpopulations of cancer cells, some of which appear less able to grow and spread. This observation prompted Gary and Kathy Luker, cancer cell biologists at the University of Michigan, to hypothesize that cancer cells may actually collaborate under some circumstances to cause disease, and not just compete. The idea that single, heterogeneous cancer cells work collectively within a constrained range of variability to drive population-level outputs in tumor progression is a ground-breaking concept that may revolutionize how we approach cancer biology and therapy.

The team is using innovative approaches to extract and merge data streams from models that generate heterogeneous cell behaviors

...cancer cell biologists have teamed up with computational scientists and experts in artificial intelligence to focus the power of these fields on understanding and overcoming heterogeneity in cancer.

To understand the causes of single-cell heterogeneity in cancer and the conditions that motivate cancer cells to collaborate, an interdisciplinary team of scientists at U-M formulated an entirely new conceptual approach to this challenging problem. The cancer cell biologists have teamed up with computational scientists and experts in artificial intelligence to focus the power of these fields on understanding and overcoming heterogeneity in cancer. Building on large, single-cell data sets unique to the team, they will combine inverse reinforcement learning, an artificial intelligence method typically applied to discover motivations for human behaviors, with computational models inferred from the physics and chemistry of cell signaling and migration. The approach combines single-cell data, physics-based modeling, and artificial intelligence to address single-cell heterogeneity and intercellular interactions. By discovering testable molecular processes underlying “decision-making” by single cells and their “motivations” for acting competitively or collaboratively, this research blazes a new path to understanding and treating cancer. Their high-risk, high-reward approach to understanding how each cell in a population processes information and translates it into actions driving cancer progression has attracted a $1 million award from the W. M. Keck Foundation.

The team includes Gary Luker (Radiology, Microbiology and Immunology; Biomedical Engineering), and Kathryn Luker (Radiology), who are leading the experimental studies of cell signaling and migration; Jennifer Linderman (Chemical Engineering; Biomedical Engineering); and Krishna Garikipati (Mechanical Engineering; Mathematics), who are leading the machine learning and modeling side of the project. Nikola Banovic (Electrical Engineering and Computer Science) and Xun Huan (Mechanical Engineering) are using artificial intelligence approaches to discover decision-making policies and rewards for cancer cells, working with the rest of the investigators to incorporate experimental data and physics/chemistry-based models into their approaches.

The W. M. Keck Foundation was established in 1954 in Los Angeles by William Myron Keck, founder of The Superior Oil Company. One of the nation’s largest philanthropic organizations, the W. M. Keck Foundation supports outstanding science, engineering and medical research. The Foundation also supports undergraduate education and maintains a program within Southern California to support arts and culture, education, health and community service projects. This project incorporates elements from all the W. M. Keck Foundation’s focus research areas to tackle cancer with a novel, physics-based modeling and AI-centered approach. The idea for this project originated in the 2020 MICDE faculty workshop on AI for Physically Based Biomedicine, which brought together an interdisciplinary group of faculty members to discuss ways to advance artificial intelligence and machine learning methods for biomedical problems. After seeding the idea, a subset of these researchers were awarded an MICDE Catalyst grant and a MIDAS PODS grant. These funds were used to establish the proof of concept and to generate preliminary results.

Computational science is becoming increasingly indispensable in many areas of biomedical science. While the current proposal focuses on cancer, this innovative computational framework represents a transformative leap with widespread applications in multiple other biomedical, physical, and social sciences. MICDE supports innovative and interdisciplinary projects aiming to advance the current paradigms.


Project’s principal investigators (left to right): Kathryn Luker (Radiology), Gary Luker (Radiology, Microbiology and Immunology, and Biomedical Engineering), Krishna Garikipati (Mechanical Engineering and Mathematics), Jennifer Linderman (Chemical Engineering and Biomedical Engineering), Nikola Banovic (Electrical Engineering and Computer Science) and Xun Huan (Mechanical Engineering).

“Get non-Real”: Department of Energy grant funds novel research in High-Performance Algorithms at U-M


“Preparing for the future means that we must continue to invest in the development of next-generation algorithms for scientific computing,”

Barbara Helland, Associate Director for Advanced Scientific Computing Research, DOE Office of Science
Source: www.energy.gov/science/articles/department-energy-invests-28-million-novel-research-high-performance-algorithms

New research from the University of Michigan will help revolutionize the data processing pipeline with state-of-the-art algorithms that optimize the collection and processing of many kinds of data. Most algorithms available now are built for real data, meaning real numbers; however, much of the data we see on the internet is non-real-valued, such as discrete or categorical data. This project is part of a $2.8 million grant from the Department of Energy for algorithms research, which is the backbone of predictive modeling and simulation. The research will enable DOE to set new frontiers in physics, chemistry, biology, and other domains.

“Preparing for the future means that we must continue to invest in the development of next-generation algorithms for scientific computing,” said Barbara Helland, Associate Director for Advanced Scientific Computing Research, DOE Office of Science. “Foundational research in algorithms is essential for ensuring their efficiency and reliability in meeting the emerging scientific needs of the DOE and the United States.”

The U-M project, led by associate professor Laura Balzano and assistant professor Hessam Mahdavifar, both of electrical engineering and computer science, is one of six chosen by DOE to cover several topics at the leading-edge of algorithms research. According to the DOE, researchers will explore algorithms for analyzing data from biology, energy storage, and other applications. They will develop fast and efficient algorithms as building blocks for tackling increasingly large data analysis problems from scientific measurements, simulations, and experiments. Projects will also address challenges in solving large-scale computational fluid dynamics and related problems.


Laura Balzano, associate professor of electrical engineering and computer science (left); Hessam Mahdavifar, assistant professor of electrical engineering and computer science (right)

Balzano and Mahdavifar, both Michigan Institute for Computational Discovery and Engineering (MICDE) affiliated faculty members, will use a $300,000 portion of the overall grant to study randomized sketching and compression for high-dimensional non-real-valued data with low-dimensional structures.

“Randomized sketching and subsampling algorithms are revolutionizing the data processing pipeline by allowing significant compression of redundant information,” said Balzano. “Sketches work well because scientific data are generally highly redundant in nature, often following a perturbed low-dimensional structure. Hence, low-rank models and sketching that preserves those model structures are ubiquitous in many machine learning and signal processing applications.” 

Even though much of the data used and processed in scientific and technological applications is best modeled mathematically as discrete, categorical, or ordinal data, most state-of-the-art randomized sketching algorithms focus on real-valued data. In addition, in practical applications, handling high-dimensional data can be challenging in terms of computational and memory demands. The proposed project will therefore significantly expand the applicability of randomized sketching.
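To make the idea concrete, here is a minimal, hypothetical sketch of randomized sketching, not the project’s actual algorithms: a Gaussian random projection compresses data whose columns follow a low-rank structure, and that low-dimensional structure survives the compression.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: data with low-dimensional (low-rank) structure
n, d, r = 1000, 200, 5          # samples, ambient dimension, true rank
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, d))

# Gaussian sketch: compress the d-dimensional rows down to k << d dimensions
k = 20
S = rng.standard_normal((d, k)) / np.sqrt(k)
A_sketch = A @ S                 # n x k, a 10x compression of A

# The sketch preserves the low-rank structure of the original data
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A_sketch))
```

Because the rows of `A` live near a 5-dimensional subspace, the ten-times-smaller sketch retains the same rank; designing sketches with comparable guarantees for discrete or categorical data is the kind of gap the project aims to close.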

“A key to data-driven modeling is to carefully reformulate the computational and data analysis challenges and take full advantage of the underlying mathematical structure that is often common across application areas,” said Krishna Garikipati, MICDE director and professor of mechanical engineering and mathematics. “This research and the work that Laura and Hessam are doing is critically important to the advancement of computational discovery.”

MICDE catalyst grant leads to new NSF funding to study cascade “ecohydromics” in the Amazonian headwater system


The Amazon Basin cycles more water through streamflow and evaporation than any other contiguous forest in the world, and transpiration by trees is a critical part of this cycle. Understanding how plant roots, stems, and leaves interact with soil water to regulate forest transpiration across landscapes is a critical knowledge gap, especially as the climate changes. Professor Valeriy Ivanov, of the Department of Civil and Environmental Engineering at U-M, is the lead investigator in a newly NSF-funded project that links diverse disciplines (plant ecophysiology, ecology, and hydrology) and will build a unique modeling framework to characterize landscape variation in physiological and hydrological processes in the Amazon Basin. The framework will integrate a wide array of field observations with detailed watershed modeling for hypothesis testing. The team includes Tyeen Taylor, a research fellow also in the Civil and Environmental Engineering Department at U-M, and many collaborators in the U.S. at the University of Arizona, West Virginia University, and the University of Nebraska, as well as Brazilian researchers at the Federal University of Eastern Para, the Federal University of Amazonas, the National Institute for Amazonian Research, and the Eastern Amazon Agricultural Agency. Detailed, physical models of ecophysiology and above- and below-ground hydrology will be informed by observations of leaf physiology, tree morphological traits, soil moisture, groundwater, and streamflow. Data and models will be integrated employing novel tools in probabilistic learning and uncertainty quantification. The computational framework tools to be used in this project were developed in part with support from the MICDE Catalyst grant program for the 2018 project “Urban Flood Modeling at ‘Human Action’ Scale: Harnessing the Power of Reduced-Order Approaches and Uncertainty Quantification,” led by Prof. Ivanov.

Given (a) a mechanistic model M (e.g., a stomatal conductance model), (b) its inputs θ (e.g., parameters) are treated as random variables; these inputs are sampled and model simulations are carried out. Using (c) polynomial chaos expansions (PCEs), a surrogate model is constructed that best approximates the model output (left-hand side of (c)). The surrogate is then evaluated with Monte Carlo simulations and used for (d) parameter inference. (d.1) is the flow of outputs from the surrogate model into a likelihood function L(D | θ) that compares the surrogate model output with observed data D. This inference produces the posterior distribution for θ. This pdf can then be fed back to the surrogate in (d.2) to reduce the uncertainty in the inputs and to obtain the pdf of a quantity of interest (e).
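The surrogate-plus-inference workflow described in the caption can be sketched in a few lines. This is a toy stand-in only: a plain polynomial least-squares fit replaces a true PCE, and the model M is made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for an expensive mechanistic model M(theta)
def model(theta):
    return theta**2 + theta

# (b) sample the uncertain input theta and run the model
theta_samples = rng.uniform(0.0, 4.0, 200)
outputs = model(theta_samples)

# (c) fit a cheap polynomial surrogate (a simple stand-in for a PCE)
surrogate = np.poly1d(np.polyfit(theta_samples, outputs, deg=3))

# (d) parameter inference: compare surrogate output with observed data D
theta_true, sigma = 1.5, 0.05
D = model(theta_true) + rng.normal(0.0, sigma)      # synthetic observation
theta_grid = np.linspace(0.0, 4.0, 401)
log_like = -0.5 * ((surrogate(theta_grid) - D) / sigma) ** 2   # (d.1)
posterior = np.exp(log_like - log_like.max())
posterior /= posterior.sum()                        # posterior pdf for theta

# (e) the posterior concentrates near theta_true
print(theta_grid[np.argmax(posterior)])
```

Once the surrogate is fit, every likelihood evaluation costs a polynomial evaluation rather than a full model run, which is what makes Monte Carlo inference over many parameter values affordable.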

“The reduced-order modeling approach developed during the MICDE Catalyst grant project is a key element of the new project,” said Prof. Ivanov. “The MICDE seed funding has allowed us to build a general framework that is applicable to a wide range of computational applications in earth-system science, and thus made our project proposal more competitive.”

The MICDE Catalyst Grants program funds projects that have the potential to catalyze and reorient the directions of their research fields by developing and harnessing powerful paradigms of computational science. This new NSF project is an example of the reach of the program.

Read more.

2021-2022 Catalyst Grant awardees continue to forge new fronts in computational science


The Michigan Institute for Computational Discovery and Engineering (MICDE) announced the awardees of the 2021-2022 round of Catalyst Grants. Since 2017, the MICDE Catalyst Grants program has funded a wide spectrum of cutting-edge research, this year focusing on AI for physically-based biomedicine, quantum computing, the convergence of natural hazards with economic dislocation, and computational integration across scales and disciplines in biology. The five projects awarded in this round represent these new frontiers of computational research, spearheaded by the Institute through its initiatives.

Prof. Shravan Veerapaneni (Mathematics) is working on advancing quantum algorithm research. His team will develop a Variational Quantum Monte Carlo algorithm that can potentially be applied to a wide range of linear algebraic tasks, like QR and Singular Value Decomposition (SVD). 

Profs. Salar Fattahi (Industrial and Operations Engineering) and Arvind Rao (Computational Medicine and Bioinformatics, Biostatistics) are revisiting existing theoretically powerful maximum-likelihood estimation mathematical methods to identify areas of weakness and strengthen them for use in biomedical, largely genomic, applications.

Profs. Gary Luker (Microbiology and Immunology), Nikola Banovic (Electrical Engineering and Computer Science), Xun Huan (Mechanical Engineering), Jennifer Linderman (Biomedical Engineering and Chemical Engineering), and Kathryn Luker (Radiology) will develop a physics/chemistry-aware inverse reinforcement learning (IRL) computational framework to support understanding of the single-cell and cooperative decision-making that drives tumor growth, metastasis, and recurrence.

Profs. Seth Guikema (Civil and Environmental Engineering and Industrial and Operations Engineering) and Jeremy Bricker (Civil and Environmental Engineering) will develop an integrated computational modeling approach to studying equity and resilience during natural hazard events, specifically estimating which essential services are the main constraints on individuals returning to a more normal life post-hazard, and assessing inequities in resilience to coastal flooding events.

Profs. Jesse Capecelatro (Mechanical Engineering and Aerospace Engineering) and Alberto Figueroa (Biomedical Engineering and Vascular Surgery) will develop a versatile, physics-driven, computationally efficient, and massively parallel numerical framework to simulate the interaction between fluids and biological particles in patient-specific vasculature geometries. This framework will enable next-generation computer-aided diagnostics.

“This year’s cohort of MICDE Catalyst Grants range from quantum computing for engineering science, AI for the physics of cancer, and computational advances in hazards engineering, through mathematical advances in data science, and bioengineering,” said MICDE Director Krishna Garikipati, a professor of mathematics and mechanical engineering. “These projects represent new frontiers of computational research spearheaded by MICDE through its initiatives.”

Learn more about MICDE’s Catalyst Grant program and funded projects here.

“This year’s cohort of MICDE Catalyst Grants … represent new frontiers of computational research spearheaded by MICDE through its initiatives.”

Krishna Garikipati
Director, MICDE

The crucial role of massively parallel simulations in future space exploration missions


The NASA Mars 2020 Mission was launched with the goal of seeking signs of ancient life and collecting samples of rock and regolith (broken rock and soil) for possible return to Earth. Perseverance, the mission’s rover, is testing technologies to help pave the way for future human exploration of Mars. While Perseverance was launched in the summer of 2020, landing on the Martian surface on February 18, 2021, the journey started years earlier when the mission’s objectives were outlined, including realistic surface operations, a proof-of-concept instrument suite, and suggestions for threshold science measurements that would meet the proposed objectives. The success of this, as well as past and future missions, is the collective result of thousands of NASA-funded projects from teams of researchers and scientists all over the country, spanning many decades. University of Michigan Professor Jesse Capecelatro (Mechanical Engineering & Aerospace Engineering) leads one of these projects. In 2016, his research group started working on a project aimed at developing high-fidelity models of plume-induced soil erosion during lunar and planetary landings, to be used in future missions.

During descent, exhaust plumes fluidize surface soil and dust, forming craters and buffeting the lander with coarse, abrasive particles. The SkyCrane technology, used by the Curiosity rover in 2012 and by Perseverance in 2021, was designed to avoid plume-surface interactions by keeping the jets far above the surface. Despite this design, a wind sensor on NASA’s Curiosity rover was damaged during landing, and in Perseverance’s video footage of the landing, significant erosion and high-speed ejecta were observed. The SkyCrane is also not a practical option for future crewed and sample-return missions.

NASA is aiming to improve rovers’ landing gear for future missions. Computational models and simulations are a critical component of this effort, as it is not feasible to mimic the entry conditions of Mars or other celestial bodies and run thousands of tests in a lab anywhere on Earth. This is where the work of Prof. Capecelatro’s research group, including doctoral candidates Greg Shallcross and Meet Patel and postdoctoral fellow Medhi Khalloufi, comes in: accurate prediction of surface-plume interactions is necessary for the overall success of future space missions. While simulations of surface-plume interactions have been conducted in the past, they are outdated and typically relied on simplified assumptions that prevent a detailed, dynamic analysis of the fluid-particle coupling. Capecelatro’s research code will provide NASA with a framework to better predict how different rover designs would affect the landing, specifically the effects of the force of the collision on the planet’s surface and the ability to adjust the rover’s landing trajectory independently of the NASA mission control team on Earth.

Prof. Capecelatro’s research project uses predictive simulation tools to capture the complex multiphase dynamics associated with rocket exhaust impingement during touchdown. Even on the most powerful supercomputers, a direct solution approach can only account for about a thousand particles at a time, so accurate and predictive multi-scale models of the unresolved flow physics are essential. 
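To get a feel for the scale gap, a short back-of-envelope calculation helps. The cells-per-particle figure below is an assumed, illustrative order of magnitude (not a number from the research itself); the point is simply that fully resolving every grain at a landing site is far beyond any computer:

```python
# Illustrative back-of-envelope estimate of why a fully particle-resolved
# landing-site simulation is out of reach. The 10,000 cells-per-particle
# figure is an assumption chosen for illustration.

def resolved_cell_count(n_particles, cells_per_particle=10_000):
    """Rough grid-cell count needed to resolve the flow around each grain."""
    return n_particles * cells_per_particle

# ~1,000 particles: the rough ceiling quoted for direct solution approaches.
direct_limit = resolved_cell_count(1_000)

# A full landing site contains on the order of trillions of grains.
landing_site = resolved_cell_count(1_000_000_000_000)

print(f"direct-solution scale: {direct_limit:.1e} cells")
print(f"landing-site scale:    {landing_site:.1e} cells")
```

The nine-orders-of-magnitude gap between the two numbers is what the multi-scale modeling strategy described below is designed to bridge.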

Full landing site image credit: NASA/JPL-Caltech (mars.nasa.gov/resources/24762/mars-sample-return-lander-touchdown-artists-concept/); particle and intermediate scale images: Capecelatro’s Research Group

Particle Scale

The group has been developing simulation capabilities to directly resolve the flow at the sub-particle scale, shedding light on important physics under the extreme conditions relevant to particle-structure interactions. Their model uses a massively parallel compressible particle-laden flow simulation tool in which the exhaust plume and its corresponding flow features are computed in an Eulerian-Lagrangian framework. At this scale, for example, the flow between individual particles is resolved, providing important insight into drag and turbulence under these extreme conditions.
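One concrete example of the kind of closure that particle-resolved data can inform is a drag-coefficient correlation. The Schiller-Naumann correlation sketched below is a standard textbook model for an isolated sphere, shown purely as an illustration of the form such closures take; it is not necessarily the model Capecelatro’s group uses:

```python
def schiller_naumann_cd(re_p):
    """Drag coefficient of an isolated sphere vs. particle Reynolds number,
    per the classic Schiller-Naumann correlation (illustrative closure)."""
    if re_p <= 0.0:
        raise ValueError("particle Reynolds number must be positive")
    if re_p <= 1000.0:
        # Stokes drag (24/Re) with a finite-Reynolds-number correction.
        return 24.0 / re_p * (1.0 + 0.15 * re_p**0.687)
    # Newton (inertial) regime: roughly constant drag coefficient.
    return 0.44
```

Particle-resolved simulations at extreme, high-speed conditions let researchers test where simple correlations like this break down and fit improved ones for use at the coarser scales.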

Intermediate Scale

As a next step, the particle-scale results inform models used in the intermediate-scale simulations developed by the group, where particles are still tracked individually but the flow is not resolved at sub-particle resolution, allowing them to simulate upwards of 1 billion particles. At this scale, an Eulerian-Lagrangian framework is used to couple the ground’s particle flow with the jet’s exhaust plume. 
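In a point-particle Eulerian-Lagrangian method of this kind, each particle is advanced using the carrier-phase (gas) velocity interpolated to its position, with a drag model standing in for the unresolved sub-particle flow. A minimal sketch, assuming a simple linear drag law and function names chosen for illustration:

```python
import numpy as np

def advance_particles(x, v, gas_velocity, tau_p, dt):
    """One explicit step of point-particle tracking with linear drag.

    x, v        : particle positions and velocities, shape (n, 3)
    gas_velocity: callable returning the local gas velocity at each position
    tau_p       : particle response time (drag relaxation time scale)
    dt          : time step
    """
    u = gas_velocity(x)        # carrier-phase velocity seen by each particle
    a = (u - v) / tau_p        # linear (Stokes-like) drag acceleration
    v_new = v + dt * a
    x_new = x + dt * v_new
    return x_new, v_new

# Example: particles initially at rest relax toward a uniform gas stream.
x0 = np.zeros((2, 3))
v0 = np.zeros((2, 3))
uniform_jet = lambda x: np.broadcast_to([1.0, 0.0, 0.0], x.shape)
x1, v1 = advance_particles(x0, v0, uniform_jet, tau_p=1.0, dt=0.1)
```

In the real simulations, the drag law is the multi-scale closure informed by the particle-resolved results, and the gas field is itself evolved on an Eulerian grid.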

Full Landing Site Scale

While the intermediate-scale simulations allow the study of erosion and cratering, a full landing site containing trillions of particles is still out of reach even on the most powerful HPC clusters. After further modeling, Capecelatro’s multi-scale framework will be handed over to NASA, where it will be incorporated into simulations of the full landing site. At this scale, NASA’s framework uses an Eulerian two-fluid model that treats both fluid and particles as continua, informed by the particle- and intermediate-scale models. 
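In a two-fluid model, the particle phase is no longer tracked grain by grain; instead it is described by continuum fields such as its volume fraction. As a minimal illustration of one ingredient of such a model (a sketch only, not NASA’s solver), here is a first-order upwind transport step for the particle-phase volume fraction in 1-D:

```python
import numpy as np

def advect_volume_fraction(alpha, u_p, dx, dt):
    """First-order upwind update of the particle-phase volume fraction in a
    1-D two-fluid model, assuming a uniform positive phase velocity u_p.

    alpha : particle volume fraction on a uniform grid, shape (n,)
    """
    flux = u_p * alpha                       # particle-phase mass flux
    alpha_new = alpha.copy()
    # Upwind differencing: each interior cell receives flux from the left.
    alpha_new[1:] -= dt / dx * (flux[1:] - flux[:-1])
    return alpha_new
```

A full two-fluid solver evolves coupled mass and momentum equations for both phases; the closures calibrated at the particle and intermediate scales enter through the interphase exchange terms.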

Mission Mars 2020 is expanding NASA’s robotic presence on the red planet. While it is a big step toward setting the stage for future human exploration, the Perseverance rover needs further redesign to make the voyage safe for humans. Capecelatro’s physics-based models are aiding this task by helping predict more accurately the outcome of a spacecraft attempting to land safely millions of miles from home. As in many other fields, computational science will continue to play a critical role in humanity’s quest to conquer space. #computationalscience everywhere!

Related links:
Sticking the landing on Mars: High-powered computing aims to reduce guesswork
Capecelatro’s Research Group
NASA 2020 Mars Mission: Perseverance Rover

Across six continents, scientists use computation to optimize cities’ responses to hazardous events

By | Events, Research, Uncategorized

“Community resilience is a manifestation of the human trait of adaptation. A resilient community is able to withstand and recover from hazardous events with minimal disruption to its way of life.”

Sherif El-Tawil
Antoine E. Naaman Collegiate Professor,
Department of Civil and Environmental Engineering

The combination of natural hazards, climate change, and the COVID-19 pandemic has demonstrated the importance of community resilience. Community resilience is a manifestation of the human trait of adaptation. A resilient community is able to withstand and recover from hazardous events with minimal disruption to its way of life. As humans, we seek to use our capacity for engineering to adapt to the threat of natural hazards. Although achieving resilience is technically challenging and expensive, communities must strive to accomplish the highest level of resilience attainable with the engineering and financial resources available.

The science behind resilience engineering involves many disciplines, each dedicated to a subset of the overall problem. Complex issues lie at the intersection of these subsets, but interdisciplinary research is difficult to achieve because researchers in various disciplines frame problems and perform research from different perspectives and along distinct pathways. However, as computational models are well established in each discipline, computation is a natural language that links the disciplines together.

Last fall, the Michigan Institute for Computational Discovery and Engineering and the Department of Civil and Environmental Engineering brought together established leaders and some of the most innovative rising scholars in computational hazards research to present and discuss different computational approaches used in modeling, assessing, and defining standards for community resilience. The speakers included representatives from leading research centers in the field: keynote speaker Terri McAllister, from the National Institute of Standards and Technology (NIST); John van de Lindt (Colorado State University), co-director of the NIST-funded Center of Excellence (CoE) for Risk-Based Community Resilience Planning; Gregory Deierlein (Stanford University) from the SimCenter, which represents a consortium of universities on the U.S. West Coast; Sherif El-Tawil (University of Michigan) from ICoR; and Wael El-Dakhakhni (McMaster University) from INTERFACE. They were joined by other leaders in the field, including Tasos Sextos from Bristol University, UK; Xinzheng Lu, head of the Institute of Disaster Prevention and Mitigation of Tsinghua University; Hiba Baroud from Vanderbilt University; and Seth Guikema from the University of Michigan. The speakers highlighted their centers’ or research groups’ capabilities and contributions, then reconvened for a panel discussion to address questions from the audience of nearly 250 participants from 30 countries across six continents. The event also included a hands-on workshop that highlighted the Simple Run-Time Infrastructure software toolkit (SRTI), a free, open-source solution developed at the University of Michigan. It enables researchers to connect computer programs and simulators written in different languages, share data during execution, and design hybrid systems using disparate simulator modules, with a primary goal of being user friendly. The applications within this workshop demonstrated how one tool can be used to bring together multiple computational dialects to create a single language in the context of natural hazards research. The SRTI software toolkit is the result of the work of Dr. Sherif El-Tawil’s research group at the University of Michigan, supported by the National Science Foundation’s Office of Advanced Cyberinfrastructure (OAC) under grant CRISP TYPE II – 1638186 (icor.engin.umich.edu).
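The core idea behind connecting simulators written in different languages is that each one exchanges named, language-neutral messages (for example, JSON) through a central server rather than calling the others directly. The toy in-process broker below illustrates that publish/subscribe pattern; the class and method names are hypothetical and do not reflect the actual SRTI API:

```python
import json
from collections import defaultdict

class MiniBroker:
    """Toy illustration (not the actual SRTI API) of a publish/subscribe hub:
    simulators exchange named JSON messages through a central broker, so a
    publisher never needs to know what language its subscribers are written in."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, message_name, callback):
        self._subscribers[message_name].append(callback)

    def publish(self, message_name, payload):
        # Round-trip through JSON to mimic a language-neutral wire format.
        wire = json.dumps({"name": message_name, "content": payload})
        content = json.loads(wire)["content"]
        for callback in self._subscribers[message_name]:
            callback(content)

# Example: a wind-field simulator publishes, a structural model subscribes.
broker = MiniBroker()
received = []
broker.subscribe("WindField", received.append)
broker.publish("WindField", {"gust_mph": 85})
```

In a real coupled-simulation setting, the broker would be a standalone server and the callbacks would live in separate processes, possibly written in different languages.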

The range of techniques and principles detailed at this workshop can be applied to the current COVID-19 crisis. The pandemic is a perfect example of how investing in risk mitigation reduces the cost, both human and material, of a hazard, and of how even hazards with a low probability of occurrence require enough investment to make ourselves resilient to them. The pandemic also illustrates that computational hazards research is a rich field with many opportunities at the intersection of its various disciplines. One of the most interesting ideas to explore is how to fuse sensor data from the field with simulation data to achieve models that can help predict, in real time, the effects of a natural hazard.

Link to event information and recordings