MICDE catalyst grant leads to new NSF funding to study cascade “ecohydromics” in the Amazonian headwater system


The Amazon Basin cycles more water through streamflow and evaporation than any other contiguous forest in the world, and transpiration by trees is a critical part of this cycle. Understanding how plant roots, stems, and leaves interact with soil water to regulate forest transpiration across landscapes is a critical knowledge gap, especially as the climate changes. Professor Valeriy Ivanov, of the Department of Civil and Environmental Engineering at U-M, is the lead investigator of a newly NSF-funded project that links diverse disciplines – plant ecophysiology, ecology, and hydrology – and will build a unique modeling framework to characterize landscape variation in physiological and hydrological processes in the Amazon Basin. The framework will integrate a wide array of field observations with detailed watershed modeling for hypothesis testing. The team includes Tyeen Taylor, a research fellow also in the Civil and Environmental Engineering Department at U-M, and many collaborators in the U.S. at the University of Arizona, University of West Virginia, and University of Nebraska, as well as Brazilian researchers at the Federal University of Eastern Para, the Federal University of Amazonas, the National Institute for Amazonian Research, and the Eastern Amazon Agricultural Agency. Detailed, physical models of ecophysiology and above- and below-ground hydrology will be informed by observations of leaf physiology, tree morphological traits, soil moisture, groundwater, and streamflow. Data and models will be integrated using novel tools in probabilistic learning and uncertainty quantification. The computational framework tools to be used in this project were developed in part with support from the MICDE Catalyst Grants program for the 2018 project “Urban Flood Modeling at ‘Human Action’ Scale: Harnessing the Power of Reduced-Order Approaches and Uncertainty Quantification,” led by Prof. Ivanov.

Given (a) a mechanistic model M (e.g., a stomatal conductance model), one can (b) treat its inputs 𝛏 (e.g., parameters) as random variables. These inputs are sampled and model simulations are carried out. Using (c) polynomial chaos expansions (PCEs), a surrogate model is constructed that best approximates the model output (the left-hand side of (c)). The surrogate is then evaluated with Monte Carlo simulations and used for (d) parameter inference: in (d.1), outputs from the surrogate model flow into a likelihood function L(D | 𝛏) that compares the surrogate output with observed data D. This inference produces the posterior distribution for 𝛏, which can be fed back to the surrogate in (d.2) to reduce the uncertainty in the inputs and to obtain the pdf of a quantity of interest (e).
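
To make the workflow above concrete, the following minimal Python sketch runs the same loop on a toy problem: sample an uncertain input, fit a polynomial chaos surrogate by least squares, and use the surrogate for Monte Carlo-based Bayesian inference. The one-parameter model, prior, and noise level are illustrative assumptions, not the project’s actual ecophysiological model or code.

```python
# Minimal sketch of the surrogate-based inference loop described above (steps a-e),
# using a toy one-parameter model in place of the stomatal conductance model.
import numpy as np

rng = np.random.default_rng(0)

def mechanistic_model(xi):
    # (a) stand-in "mechanistic model" M with one uncertain input xi
    return np.exp(-0.5 * xi) + 0.1 * xi**2

# (b) treat the input as a random variable (uniform prior on [-1, 1]) and sample it
xi_train = rng.uniform(-1.0, 1.0, 200)
y_train = mechanistic_model(xi_train)

# (c) polynomial chaos expansion: Legendre basis for a uniform input,
#     coefficients fit by least squares -> inexpensive surrogate model
degree = 6
V = np.polynomial.legendre.legvander(xi_train, degree)
coeffs, *_ = np.linalg.lstsq(V, y_train, rcond=None)
surrogate = lambda xi: np.polynomial.legendre.legval(xi, coeffs)

# (d) parameter inference: compare surrogate output with observed data D
#     through a Gaussian likelihood L(D | xi), evaluated over Monte Carlo samples
xi_true, sigma = 0.3, 0.02
D = mechanistic_model(xi_true) + rng.normal(0.0, sigma, 25)   # synthetic data
xi_samples = rng.uniform(-1.0, 1.0, 100_000)                  # prior Monte Carlo samples
log_like = -0.5 * ((D[None, :] - surrogate(xi_samples)[:, None]) / sigma) ** 2
weights = np.exp(log_like.sum(axis=1))
weights /= weights.sum()                                       # (d.1) posterior weights for xi

# (d.2 / e) push the posterior back through the surrogate to characterize the pdf
#           (here, mean and standard deviation) of a quantity of interest
qoi = surrogate(xi_samples)
print("posterior mean of xi :", np.sum(weights * xi_samples))
print("QoI mean, std        :", np.sum(weights * qoi),
      np.sqrt(np.sum(weights * qoi**2) - np.sum(weights * qoi)**2))
```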

“The reduced-order modeling approach developed during the MICDE Catalyst grant project is a key element of the new project,” said Prof. Ivanov. “The MICDE seed funding has allowed us to build a general framework that is applicable to a wide range of computational applications in earth-system science, and thus made our project proposal more competitive.”

The MICDE Catalyst Grants program funds projects that have the potential to catalyze and reorient the directions of their research fields by developing and harnessing powerful paradigms of computational science. This new NSF project is an example of the reach of the program.


2021-2022 Catalyst Grant awardees continue to forge new fronts in computational science


The Michigan Institute for Computational Discovery and Engineering (MICDE) announced the awardees of the 2021-2022 round of Catalyst Grants. Since 2017, the MICDE Catalyst Grants program has funded a wide spectrum of cutting-edge research; this year it focuses on AI for physically based biomedicine, quantum computing, the convergence of natural hazards with economic dislocation, and computational integration across scales and disciplines in biology. The five projects awarded in this round represent these new frontiers of computational research spearheaded by the Institute through its initiatives.

Prof. Shravan Veerapaneni (Mathematics) is working on advancing quantum algorithm research. His team will develop a Variational Quantum Monte Carlo algorithm that can potentially be applied to a wide range of linear algebra tasks, such as QR factorization and singular value decomposition (SVD).

Profs. Salar Fattahi (Industrial and Operations Engineering) and Arvind Rao (Computational Medicine and Bioinformatics, Biostatistics) are revisiting existing, theoretically powerful mathematical methods for maximum-likelihood estimation to identify areas of weakness and strengthen them for use in biomedical, largely genomic, applications.

Profs. Gary Luker (Microbiology and Immunology), Nikola Banovic (Electrical Engineering and Computer Science), Xun Huan (Mechanical Engineering), Jennifer Linderman (Biomedical Engineering and Chemical Engineering), and Kathryn Luker (Radiology) will develop a physics- and chemistry-aware inverse reinforcement learning (IRL) computational framework to support the understanding of the single-cell and cooperative decision-making that drive tumor growth, metastasis, and recurrence.

Profs. Seth Guikema (Civil and Environmental Engineering and Industrial and Operations Engineering) and Jeremy Bricker (Civil and Environmental Engineering) will develop an integrated computational modeling approach to studying equity and resilience during natural hazard events, specifically estimating which essential services are the main constraints on individuals returning to a more normal life post-hazard, and assessing inequities in resilience to coastal flooding events.

Profs. Jesse Capecelatro (Mechanical Engineering and Aerospace Engineering) and Alberto Figueroa (Biomedical Engineering and Vascular Surgery) will develop a versatile, physics-driven, computationally efficient, and massively parallel numerical framework to simulate the interaction between fluids and biological particles in patient-specific vasculature geometries. This framework will enable next-generation computer-aided diagnostics.

“This year’s cohort of MICDE Catalyst Grants range from quantum computing for engineering science, AI for the physics of cancer, and computational advances in hazards engineering, through mathematical advances in data science, and bioengineering,” said MICDE Director Krishna Garikipati, a professor of mathematics and mechanical engineering. “These projects represent new frontiers of computational research spearheaded by MICDE through its initiatives.”

Learn more about MICDE’s Catalyst Grant program and funded projects here.


The crucial role of massively parallel simulations in future space exploration missions


The NASA Mars 2020 Mission was launched with the goal of seeking signs of ancient life and collecting samples of rock and regolith (broken rock and soil) for possible return to Earth. Perseverance, the mission’s rover, is testing technologies to help pave the way for future human exploration of Mars. While Perseverance was launched in the summer of 2020 and landed on the Martian surface on February 18, 2021, the journey started years earlier, when the mission’s objectives were outlined, including realistic surface operations, a proof-of-concept instrument suite, and suggestions for threshold science measurements that would meet the proposed objectives. The success of this mission, as well as of past and future missions, is the collective result of thousands of NASA-funded projects from teams of researchers and scientists all over the country, spanning many decades. University of Michigan Professor Jesse Capecelatro (Mechanical Engineering & Aerospace Engineering) leads one of these projects. In 2016, his research group started working on a project aimed at developing high-fidelity models of plume-induced soil erosion during lunar and planetary landings that will be used in future missions.

During descent, exhaust plumes fluidize surface soil and dust, forming craters and buffeting the lander with coarse, abrasive particles. The SkyCrane technology, used by the Curiosity rover in 2012 and by Perseverance in 2021, was designed to avoid plume-surface interactions by keeping the jets far above the surface. Despite this feature, a wind sensor on NASA’s Curiosity rover was damaged during landing, and in Perseverance’s video footage of the landing, significant erosion and high-speed ejecta were observed. The SkyCrane approach is also not practical for future crewed and sample-return missions.

NASA is aiming to improve rovers’ landing gear for future missions. Computational models and simulations are a critical component of achieving this, as it is not feasible to mimic Martian or other celestial bodies’ entry conditions and run thousands of tests in a lab anywhere on Earth. This is where the work of Prof. Capecelatro’s research group, including doctoral candidates Greg Shallcross and Meet Patel and postdoctoral fellow Medhi Khalloufi, comes in, as the accurate prediction of plume-surface interactions is necessary for the overall success of future space missions. While simulations of plume-surface interactions have been conducted in the past, they are outdated and typically relied on simplified assumptions that prevent a detailed and dynamic analysis of the fluid-particle coupling. Capecelatro’s research code will provide NASA with a framework to better predict how different rover designs would affect the landing, specifically the effects of the force of the collision on the planet’s surface and the ability to adjust the rover’s landing trajectory independently of the NASA mission control team on Earth.

Prof. Capecelatro’s research project utilizes predictive simulation tools to capture the complex multiphase dynamics associated with rocket exhaust impingement during touchdown. Even on the most powerful supercomputers, a direct solution approach is only capable of accounting for about a thousand particles at a time, so accurate and predictive multi-scale models of the unresolved flow physics are essential.

Full landing site image credit: NASA/JPL-Caltech (mars.nasa.gov/resources/24762/mars-sample-return-lander-touchdown-artists-concept/); particle and intermediate scale images: Capecelatro’s Research Group

Particle Scale

The group has been developing simulation capabilities to directly resolve the flow at the sub-particle scale to shed light on important physics under the extreme conditions relevant to particle-structure interactions. Their model uses a massively parallel compressible particle-laden flow simulation tool in which the exhaust plume and its corresponding flow features are computed in an Eulerian-Lagrangian framework. At this scale, for example, the flow between individual particles is resolved, providing important insight into drag and turbulence under these extreme conditions.

Intermediate Scale

As a next step, the particle-scale results inform models used in the intermediate-scale simulations developed by the group, where particles are still tracked individually but the flow is not resolved at sub-particle resolution, allowing them to simulate upwards of 1 billion particles. At this scale, an Eulerian-Lagrangian framework is used to couple the ground’s particle flow with the jet’s plume.
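
As an illustration of the Lagrangian side of such a framework, the short Python sketch below advances a cloud of point particles with a standard drag law through a prescribed gas field. The jet profile, particle properties, and thin-atmosphere values are invented for illustration; the group’s actual solver resolves the compressible plume and runs massively in parallel.

```python
# Minimal sketch of Lagrangian point-particle tracking of the kind used at the
# intermediate scale: each particle is advanced individually with a drag law,
# while the gas (Eulerian) field is prescribed analytically here instead of solved.
import numpy as np

rng = np.random.default_rng(1)

# hypothetical particle and gas properties (regolith-like grains, thin atmosphere)
n_p, d_p, rho_p = 10_000, 100e-6, 3000.0       # count, diameter [m], density [kg/m^3]
rho_g, mu_g = 0.02, 1.1e-5                     # gas density [kg/m^3], viscosity [Pa s]

def gas_velocity(x):
    """Toy axisymmetric wall jet standing in for the resolved exhaust plume."""
    r = np.linalg.norm(x[:, :2], axis=1, keepdims=True) + 1e-6
    radial = 50.0 * np.exp(-x[:, 2:3] / 0.5) * x[:, :2] / r   # outward sweep near the ground
    vertical = 5.0 * np.exp(-r / 2.0)                         # weak upward entrainment
    return np.hstack([radial, vertical])

# initial particle bed: a thin layer of grains near the surface
x = np.hstack([rng.uniform(-2, 2, (n_p, 2)), rng.uniform(0.0, 0.02, (n_p, 1))])
v = np.zeros_like(x)

dt, g = 1e-4, np.array([0.0, 0.0, -3.71])      # time step [s], Mars gravity [m/s^2]
tau_p = rho_p * d_p**2 / (18.0 * mu_g)         # Stokes response time of a particle
for _ in range(2000):
    u_rel = gas_velocity(x) - v
    re_p = rho_g * np.linalg.norm(u_rel, axis=1, keepdims=True) * d_p / mu_g
    cd_factor = 1.0 + 0.15 * re_p**0.687       # Schiller-Naumann finite-Re drag correction
    a = u_rel * cd_factor / tau_p + g          # drag + gravity acceleration
    v += a * dt
    x += v * dt
    x[:, 2] = np.maximum(x[:, 2], 0.0)         # crude ground constraint

print("mean particle height after 0.2 s:", x[:, 2].mean())
```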

Full Landing Site Scale

While the intermediate-scale simulations allow the study of erosion and cratering, a full landing site that contains trillions of particles is still out of reach even on the most powerful HPC clusters. After further modeling, Capecelatro’s multi-scale framework will be handed over to NASA, where it will be incorporated into simulations of the full landing site. At this scale, NASA’s framework uses an Eulerian-based two-fluid model that treats both the fluid and the particles as a continuum, informed by the particle- and intermediate-scale models.

The Mars 2020 mission is expanding NASA’s robotic presence on the red planet. While it is a big step toward setting the stage for future human exploration, the Perseverance rover needs further redesign to make the voyage safe for humans. Capecelatro’s physics-based models are aiding this task by helping predict and model more accurately the outcomes of a spacecraft attempting to safely land millions of miles from home. As in many other fields, computational science will continue to play a critical role in the future of humanity’s quest to conquer space. #computationalscience everywhere!

Related links:
Sticking the landing on Mars: High-powered computing aims to reduce guesswork
Capecelatro’s Research Group
NASA 2020 Mars Mission: Perseverance Rover

Across six continents, scientists use computation to optimize cities’ responses to hazardous events


“Community resilience is a manifestation of the human trait of adaptation. A resilient community is able to withstand and recover from hazardous events with minimal disruption to its way of life.”

Sherif El-Tawil
Antoine E. Naaman Collegiate Professor,
Department of Civil and Environmental Engineering

The combination of natural hazards, climate change, and the COVID-19 pandemic has demonstrated the importance of community resilience. Community resilience is a manifestation of the human trait of adaptation. A resilient community is able to withstand and recover from hazardous events with minimal disruption to its way of life. As humans, we seek to use our ability to engineer to adapt to the threat of natural hazards. Although achieving resilience is technically challenging and expensive, communities must strive to accomplish the highest level of resilience attainable with the engineering and financial resources available.

The science behind resilience engineering involves many disciplines, each dedicated to a subset of the overall problem. Complex issues lie at the intersection of these subsets, but interdisciplinary research is difficult to achieve because researchers in various disciplines frame problems and perform research from different perspectives and along distinct pathways. However, as computational models are well established in each discipline, computation is a natural language that links the disciplines together.

Last fall, the Michigan Institute for Computational Discovery and Engineering and the Department of Civil and Environmental Engineering brought together established leaders and some of the most innovative rising scholars in computational hazards research to present and discuss different computational approaches used in modeling, assessing, and defining standards for community resilience. The speakers included representatives from leading research centers in the field: keynote speaker Terri McAllister from the National Institute of Standards and Technology (NIST); John van de Lindt (Colorado State University), co-director of the NIST-funded Center of Excellence (CoE) for Risk-Based Community Resilience Planning; Gregory Deierlein (Stanford University) from the SimCenter, which represents a consortium of universities on the U.S. West Coast; Sherif El-Tawil (University of Michigan) from ICoR; and Wael El-Dakhakhni (McMaster University) from INTERFACE. They were joined by other leaders in the field, including Tasos Sextos from Bristol University, UK; Xinzheng Lu, head of the Institute of Disaster Prevention and Mitigation at Tsinghua University; Hiba Baroud from Vanderbilt University; and Seth Guikema from the University of Michigan. The speakers highlighted their centers’ or research groups’ capabilities and contributions, then reconvened for a panel discussion to address questions from the audience of nearly 250 participants from 30 countries across six continents.

The event also included a hands-on workshop that highlighted the Simple Run-Time Infrastructure software toolkit (SRTI). The SRTI is a free, open-source solution developed at the University of Michigan. It enables researchers to connect computer programs and simulators written in different languages, share data during execution, and design hybrid systems using disparate simulator modules, with a primary goal of being user friendly. The applications within this workshop demonstrated how one tool can be used to bring together multiple computational dialects to create a single language in the context of natural hazards research. The SRTI software toolkit is a result of the work of Dr. Sherif El-Tawil’s research group at the University of Michigan, supported by the National Science Foundation’s Office of Advanced Cyberinfrastructure (OAC) under grant CRISP TYPE II – 1638186 (icor.engin.umich.edu).

The range of techniques and principles detailed at this workshop can be applied to the current COVID-19 crisis. The pandemic is a perfect example demonstrating that investing in mitigating risk reduces the cost, both human and material, of a hazard, and that even hazards with a low probability of occurrence warrant enough investment to make ourselves resilient to them. The pandemic also illustrates that computational hazards research is a rich field with many opportunities at the intersection of the various disciplines. One of the most interesting ideas to explore is how to fuse sensor data from the field with simulation data to achieve models that can help predict, in real time, the effects of a natural hazard.
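
As a simple illustration of that idea (not something presented at the workshop), the sketch below fuses a simulated forecast with a noisy sensor reading through a scalar Kalman-filter update; all numbers are hypothetical.

```python
# Scalar Kalman-filter update: blend a model forecast (e.g., simulated flood depth
# at a gauge) with a noisy field measurement, weighting each by its uncertainty.
def kalman_update(forecast, forecast_var, sensor, sensor_var):
    """Return the fused estimate and its variance."""
    gain = forecast_var / (forecast_var + sensor_var)
    fused = forecast + gain * (sensor - forecast)
    fused_var = (1.0 - gain) * forecast_var
    return fused, fused_var

# made-up example: simulation predicts 1.8 m (uncertain), gauge reads 2.1 m (less uncertain)
estimate, variance = kalman_update(1.8, 0.25, 2.1, 0.04)
print(f"fused depth: {estimate:.2f} m (variance {variance:.3f})")
```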

Link to event information and recordings

Reducing lung cancer mortality through modeling and simulations


Lung cancer remains the leading cause of cancer-related mortality in the US and globally, accounting for 1.8 million deaths annually. Many of these deaths are preventable through the implementation of prevention strategies, including tobacco control policies and lung cancer screening recommendations, and through improvements in lung cancer treatment. In the US, these policies have generally been implemented based on the analysis and outcomes of the population as a whole, although data analyses have shown that smoking and lung cancer rates, and access to healthcare and interventions, vary significantly by education, income, and race/ethnicity.

The Cancer Intervention and Surveillance Modeling Network (CISNET) Lung Working Group (LWG), led by Rafael Meza, associate professor of Epidemiology in the School of Public Health and MICDE member, has been awarded a new $8.5M grant to investigate the synergistic impacts of tobacco control policies, lung cancer screening, and treatment interventions in the US and in middle-income nations. For the past 15 years, the CISNET LWG has contributed to the development of US national strategies for reducing the lung cancer burden by quantifying, through modeling and simulation, the impact of tobacco control on smoking, lung cancer, and overall mortality, as well as the population benefits and harms of lung cancer screening. The new grant will allow the group to expand its work to consider the impact of treatment improvements, including targeted therapies and immunotherapies, and the synergies between treatment and prevention interventions. It will also enable the researchers to continue their work in addressing smoking and lung cancer disparities. The consortium uses a comparative modeling approach, in which multiple, distinct models use the same data inputs and aim to answer a common question with different approaches. This allows the group to assess the strengths and weaknesses of the different models and aids the decision-making process.

Established in 2000, CISNET is a consortium of NCI-sponsored investigators who use modeling and simulation to improve their understanding of cancer control interventions in prevention, screening, and treatment and their effects on population trends in incidence and mortality. CISNET is committed to bringing the most sophisticated evidence-based planning tools to population health and public policy. These models have been used to guide public health research and priorities, and have aided the development of optimal cancer control strategies. Besides lung cancer, CISNET also includes breast, colon, cervical, esophageal, and prostate cancer groups.

Alternatives Research & Development Foundation to Support Research on COVID-19, Aiming for Advancement in Non-animal Methods of Drug Discovery


Pharmaceutical companies across the globe are racing to bring clinically tested and approved therapeutic drugs that fight the COVID-19 virus to market. As is typical in drug discovery research, animals have played a critical role in the development and testing of COVID-19 therapeutics. A proposal by U-M Professor Rudy J. Richardson, Dow Professor Emeritus of Toxicology, Professor Emeritus of Environmental Health Sciences, and Associate Professor Emeritus of Neurology at the University of Michigan, titled “Discovering host factor inhibitors in silico for SARS-CoV-2 entry and replication,” has been awarded funding to identify compounds that bind to human proteins that facilitate entry and/or replication of the SARS-CoV-2 virus. Awarded, in part, because of its potential to develop alternative methods that advance science and replace or reduce animal use, this research will employ in silico ligand-protein docking to discover existing drugs (repurposing) and/or new drug candidates capable of inhibiting host proteins involved in infection pathways of the COVID-19 virus, SARS-CoV-2.

Protein docking targets include four serine hydrolases. Using these targets, researchers will reversibly dock approximately 40,000 ligands from the Binding Database comprising FDA-approved drugs along with serine protease and PLA2 inhibitors, including organoboron compounds. Then, covalent docking will be conducted on a ligand subset containing pharmacophores capable of covalently binding serine hydrolases. Consensus ranking from four docking programs will be used to generate a penultimate list of candidate compounds. Those showing high predicted potency against off-target serine hydrolases will be excluded. The final list of compounds will be made publicly available for further evaluation in bioassays.
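
The sketch below illustrates, under assumed inputs, what such a consensus-ranking step can look like: scores from several docking programs are converted to per-program ranks, averaged, and filtered against a hypothetical off-target exclusion list. The ligand IDs, scores, and exclusions are placeholders, not results from the study.

```python
# Illustrative consensus ranking across four docking programs (assumed workflow,
# not the project's actual pipeline): lower docking score = better predicted binding.
import numpy as np

rng = np.random.default_rng(42)
ligands = [f"ligand_{i:05d}" for i in range(40_000)]        # placeholder ligand IDs
scores = rng.normal(size=(4, len(ligands)))                 # stand-in scores, one row per program

ranks = scores.argsort(axis=1).argsort(axis=1)              # per-program rank of each ligand
consensus = ranks.mean(axis=0)                              # average rank = consensus score

# hypothetical exclusion list: ligands predicted potent against off-target hydrolases
off_target = {"ligand_00007", "ligand_00123"}
candidates = [(ligands[i], consensus[i]) for i in np.argsort(consensus)
              if ligands[i] not in off_target]

print("top 5 consensus candidates:", candidates[:5])
```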

Professor Richardson’s grant, awarded by the Alternatives Research & Development Foundation, is part of the ARDF’s 2020 Open Grants program, which funds research projects that develop alternative methods to advance science and replace or reduce animal use. Although the immediate goal of this computational study is to support the identification or development of a COVID-19 vaccine, the long-range vision is to advance computational and in vitro approaches to eliminate animal use from drug discovery for humans and other species.

MICDE Affiliated Faculty member Rudy J. Richardson is a Dow Professor Emeritus of Toxicology and Professor Emeritus of Environmental Health Sciences within the School of Public Health, and Associate Professor Emeritus of Neurology within the Medical School at the University of Michigan.

MICDE funds wide-ranging computational discovery in galactic formation, drug discovery, bacterial biofilm colonies and turbulence simulations


Since 2017, the Michigan Institute for Computational Discovery & Engineering (MICDE) Catalyst Grants program has funded a wide spectrum of cutting-edge research that combines science, engineering, mathematics, and computer science. This year the program will fund four new projects that continue this tradition: Prof. Aaron Frank (Chemistry) and his group will spearhead efficient strategies to rapidly develop treatments for emerging diseases – a need made more compelling by the current COVID-19 pandemic. Their approach combines generative artificial intelligence models and molecular docking to rapidly explore the space of chemical structures and generate target-specific virtual libraries for drug discovery. The groups of Prof. Marisa Eisenberg (Epidemiology, Mathematics, and Complex Systems) and Prof. Alexander Rickard (Epidemiology) will develop novel computational techniques to study biofilm architectures. Biofilms are complex assemblages of microbial cells that form on almost any natural or man-made surface. They cause several debilitating diseases and can even damage machinery and equipment, elevating the understanding of their behavior to a critical need. Prof. Oleg Gnedin (Astronomy) will develop novel techniques to tailor the mathematical initial conditions from which chosen regions of the universe are simulated. The resulting insights will help uncover the origins of our own galaxy, the Milky Way. Finally, Prof. Aaron Towne (Mechanical Engineering) will advance the modeling of complex, turbulent flows and other large-scale systems in engineering science. His research will enable orders-of-magnitude acceleration in the computation of extremely large-scale flows in a number of engineering systems.

“These four projects have the potential to catalyze and reorient the directions of their research fields by developing and harnessing powerful paradigms of computational science,” said Krishna Garikipati, Professor of Mechanical Engineering and of Mathematics, and MICDE’s Director. “MICDE’s mission is to lead the advances in computational science research by bringing together interdisciplinary teams at U of M, and these projects embody that vision.”

More about MICDE’s catalyst grant program and the projects can be found at micde.umich.edu/catalyst.

Microsoft AI for Health Program to support an AI-facilitated Optimization Framework for Improving COVID-19 Testing


With the recent resurgence of COVID-19 infections, testing has become central to an integrated, global response to the pandemic. Accurate, effective, and efficient testing can lead to early detection and prompt an agile response by public health authorities. Strategic testing systems are critical for providing data that will inform disease prevention, preparation, and intervention. MICDE Associate Director and Associate Professor of Industrial and Operations Engineering and of Civil and Environmental Engineering Siqian Shen has recently published an article pinpointing a number of pivotal operations research and industrial engineering tools that can be brought to the fight against COVID-19. One of the key lessons from her research is the importance of expanding the availability of COVID-19 testing and making the resulting data transparent to the public as anonymized summary statistics. This enables informed decision making by individuals, public health officials, and governments.

Based on these high-impact findings, Professor Shen is striding ahead to design a comprehensive COVID-19 testing framework to efficiently serve the urgent needs of diverse population groups. A grant from Microsoft’s AI for Health program, part of the AI for Good initiative, will provide credits to use Microsoft’s Azure service. With this cyber resource, Professor Shen and her team will integrate and coordinate, on a cloud-based platform, the decision-making models and data analytics tools that they have developed for testing. In addition, their AI framework is dynamic, collecting daily infection data to improve testing-related decisions. Such a platform could have significant impacts on three major problems that exist with current testing design strategies (a simplified sketch of the first problem follows the list below):

1) Where to locate testing facilities and how to allocate test kits and other resources.
2) How to effectively triage different population groups through effective appointment scheduling.
3) How to visualize real-time testing capacities to better inform the public and serve ad-hoc needs of patients. 
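
The following simplified Python sketch illustrates the flavor of problem 1: a greedy heuristic opens a few testing sites to cover the most demand and then splits a fixed stock of test kits among them. The data and the heuristic are illustrative assumptions; Prof. Shen’s framework uses far richer stochastic and dynamic optimization models.

```python
# Greedy illustration of testing-site location and kit allocation (assumed data).
import numpy as np

rng = np.random.default_rng(7)
n_neighborhoods, n_candidate_sites, n_open, total_kits = 30, 8, 3, 5000

demand = rng.integers(50, 500, n_neighborhoods)                    # expected daily tests needed
travel = rng.uniform(1, 30, (n_neighborhoods, n_candidate_sites))  # travel times [min]
covers = travel <= 15.0                                            # site "covers" a neighborhood

opened, covered = [], np.zeros(n_neighborhoods, dtype=bool)
for _ in range(n_open):
    # greedily open the site that adds the most currently uncovered demand
    gains = [demand[~covered & covers[:, j]].sum() if j not in opened else -1
             for j in range(n_candidate_sites)]
    best = int(np.argmax(gains))
    opened.append(best)
    covered |= covers[:, best]

# allocate kits to opened sites in proportion to the demand closest to each one
nearest = travel[:, opened].argmin(axis=1)
site_demand = np.array([demand[nearest == k].sum() for k in range(n_open)])
kits = np.round(total_kits * site_demand / site_demand.sum()).astype(int)

print("opened sites:", opened)
print("kit allocation:", dict(zip(opened, kits)))
print(f"covered demand: {demand[covered].sum()} / {demand.sum()}")
```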

Prof. Shen’s research will integrate AI techniques with optimization to dynamically refine existing testing design methods for gathering and analyzing data from unexplored populations and regions around the globe. The development and refinement of these new models with the support of Microsoft Azure will create a transparent, data-informed testing system that will allow public health and government authorities to make agile, data-driven decisions to aid in the prevention, preparation, intervention, and management of COVID-19 and other outbreaks of infectious diseases.

Siqian Shen is a Professor of Industrial and Operations Engineering and of Civil and Environmental Engineering at the University of Michigan, an Associate Director of the Michigan Institute for Computational Discovery & Engineering, and an affiliated faculty member of the Michigan Institute for Data Science. Her research group works on both theoretical and applied aspects of problems by combining stochastic programming, integer programming, network optimization, machine learning, and statistics.

What is the right model? Different MRIO models yield very different carbon footprint estimates in China


Appropriate accounting of greenhouse gas emissions is the first step toward assigning mitigation responsibilities and developing effective mitigation strategies. Consistent methods are required to fairly assess a region’s impact on climate change. Two leading reasons for the existence of different accounting systems are political pressures and the actual costs of climate mitigation to local governments. At the international level there has been consensus, and global environmentally extended multi-regional input-output (EE-MRIO) models that capture the interdependence of regional economies and their environmental impacts have been constructed. However, in China, the largest greenhouse gas emitter, where accurate interregional trade-related emission accounts are critical to develop mitigation strategies and monitor progress at the regional level, this information is sporadic and inconsistent. Prof. Ming Xu of the School for Environment and Sustainability and his research group analyzed the available data from China, which date back to 2012. They showed that the results vary widely depending on the MRIO model used; for example, two MRIO models differed by as much as 208 million metric tons for a single region, which is equivalent to the emissions of Argentina, the United Arab Emirates, or the Netherlands. Their results show the need to prioritize future efforts to harmonize greenhouse gas emission accounting within China.
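
For readers unfamiliar with EE-MRIO accounting, the toy calculation below shows the basic mechanics with invented two-sector numbers: total output follows from the Leontief inverse, and emission intensities convert it into production-based and consumption-based accounts. It is a schematic of the method only, not the models or data compared in the study.

```python
# Toy environmentally extended input-output calculation: footprints = f (I - A)^{-1} y,
# where A holds technical coefficients, y is final demand, and f is emission intensity.
import numpy as np

A = np.array([[0.15, 0.10],      # intermediate inputs per unit of output
              [0.20, 0.25]])     # rows/cols = region-sectors 1 and 2 (invented)
y = np.array([120.0, 80.0])      # final demand in each region-sector
f = np.array([0.8, 1.5])         # emissions per unit of output (e.g., t CO2 per $)

x = np.linalg.solve(np.eye(2) - A, y)                       # total output via the Leontief inverse
production_based = f * x                                    # emissions where they are emitted
consumption_based = f @ np.linalg.inv(np.eye(2) - A) * y    # emissions attributed to final demand

print("total output          :", x)
print("production-based (t)  :", production_based)
print("consumption-based (t) :", consumption_based)
print("totals match          :", np.isclose(production_based.sum(), consumption_based.sum()))
```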

Ming Xu is an Associate Professor in the School for Environment and Sustainability and in the Department of Civil and Environmental Engineering at the University of Michigan, Ann Arbor. His research focuses on the broad fields of sustainable engineering and industrial ecology. 

Read the full article.

Modeling the transmission of infectious aerosols


Inhalation of micron-sized droplets represents the dominant transmission mechanism for influenza and rhinovirus, and recent research shows that this is likely also the case for the novel coronavirus. Increasing evidence suggests that the transmission of infectious aerosols is more complex than previously thought. Coughing, sneezing, and even talking yield a gaseous flow field near the infected person that is dynamic and turbulent in nature. Existing models commonly employed in simulations of aerosol transmission attempt to represent the effect of turbulence using random walk models that are often phenomenological in nature, employing adjustable parameters and inherently assuming that the turbulent fluctuations ‘felt’ by a droplet do not depend upon direction. To design physics-informed guidelines that minimize the spread of this virus, improved predictive modeling capabilities for effectively tracking the aerosol paths are needed. Dr. Aaron M. Lattanzi and Prof. Jesse Capecelatro, of Mechanical Engineering and MICDE, are tackling this problem by focusing on the mathematical modeling of aerosol dispersion. They derived analytical solutions for the mean-squared displacement resulting from systems of stochastic differential equations. A key element of their methodology is that the solution connects stochastic theory inputs to statistics present in high-fidelity simulations or experiments, providing a framework for developing improved models.

Simple simulation of aerosol dispersion from a single-point source. The grey, cone-like surface is the approximation using Force Langevin (FL) theory and the colored particles are from integration of Newton’s equations with stochastic drag forces.
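
The sketch below gives a minimal example of the class of stochastic models discussed above: an Ornstein-Uhlenbeck (Langevin-type) velocity forcing drives particle dispersion, and the simulated mean-squared displacement is compared with the long-time diffusive growth such a model predicts. Parameters are illustrative, not fitted to respiratory aerosols, and this is not the authors’ code.

```python
# Langevin-type random-walk dispersion: particles feel a fluctuating velocity that
# decorrelates over a time scale tau, mimicking unresolved turbulence, and the
# mean-squared displacement (MSD) is tracked over time.
import numpy as np

rng = np.random.default_rng(3)
n, dt, steps = 5000, 1e-3, 4000
tau = 0.05            # velocity decorrelation time scale [s]
sigma_u = 0.3         # rms turbulent velocity fluctuation [m/s]

x = np.zeros(n)
u = sigma_u * rng.normal(size=n)
msd = np.empty(steps)
for k in range(steps):
    # Ornstein-Uhlenbeck update for the fluctuating velocity "felt" by each particle
    u += -u * dt / tau + sigma_u * np.sqrt(2.0 * dt / tau) * rng.normal(size=n)
    x += u * dt
    msd[k] = np.mean(x**2)

t = dt * np.arange(1, steps + 1)
D = sigma_u**2 * tau                  # long-time diffusivity implied by the OU model
print("simulated MSD at t = 4 s :", msd[-1])
print("2*D*t diffusive estimate :", 2.0 * D * t[-1])
```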

Prof. Capecelatro’s research group develops physics-based models and numerical algorithms that leverage supercomputers for prediction and optimization of the complex flows relevant to energy and the environment. The main focus of their research is developing robust and scalable numerical tools to investigate multiphysics and multiscale phenomena under various flow conditions, such as those studied here. They recently submitted their findings to the Journal of Fluid Mechanics, and are continuing to work on this problem in the hope that it will help explain the transmission of COVID-19 and thereby help optimize current guidelines.