The crucial role of massively parallel simulations in future space exploration missions

By | HPC, News, Research

The NASA Mars 2020 mission was launched with the goal of seeking signs of ancient life and collecting samples of rock and regolith (broken rock and soil) for possible return to Earth. Perseverance, the mission's rover, is testing technologies to help pave the way for future human exploration of Mars. While Perseverance launched in the summer of 2020 and landed on the Martian surface on February 18, 2021, the journey started years earlier, when the mission's objectives were outlined, including realistic surface operations, a proof-of-concept instrument suite, and suggestions for threshold science measurements that would meet the proposed objectives. The success of this mission, like that of past and future missions, is the collective result of thousands of NASA-funded projects, carried out over many decades by teams of researchers and scientists from all over the country. University of Michigan Professor Jesse Capecelatro (Mechanical Engineering & Aerospace Engineering) leads one of these projects. In 2016, his research group began work on a project aimed at developing high-fidelity models of plume-induced soil erosion during lunar and planetary landings, models that will be used in future missions.

During descent, exhaust plumes fluidize surface soil and dust, forming craters and buffeting the lander with coarse, abrasive particles. The SkyCrane technology, used by the Curiosity rover in 2012 and by Perseverance in 2021, was designed to limit plume-surface interactions by keeping the jets far above the surface. Despite this precaution, a wind sensor on Curiosity was damaged during landing, and Perseverance's video footage of its own landing shows significant erosion and high-speed ejecta. The SkyCrane approach is also not a practical option for future crewed and sample-return missions.

NASA aims to improve landing systems for future missions. Computational models and simulations are critical to achieving this, since it is not feasible to reproduce the entry conditions of Mars or other celestial bodies and run thousands of tests in a laboratory on Earth. This is where the work of Prof. Capecelatro's research group, including doctoral candidates Greg Shallcross and Meet Patel and postdoctoral fellow Mehdi Khalloufi, comes in: accurate prediction of plume-surface interactions is necessary for the overall success of future space missions. While simulations of plume-surface interactions have been conducted in the past, they are outdated and typically relied on simplified assumptions that prevent a detailed, dynamic analysis of the fluid-particle coupling. Capecelatro's research code will provide NASA with a framework to better predict how different rover designs would affect the landing, specifically the effects of the impingement forces on the planet's surface and the ability to adjust the rover's landing trajectory independently of the NASA mission control team on Earth.

Prof. Capecelatro's research project utilizes predictive simulation tools to capture the complex multiphase dynamics associated with rocket exhaust impingement during touchdown. Even on the most powerful supercomputers, a direct solution approach can account for only about a thousand particles at a time, so accurate and predictive multi-scale models of the unresolved flow physics are essential.

Full landing site image credit: NASA/JPL-Caltech (mars.nasa.gov/resources/24762/mars-sample-return-lander-touchdown-artists-concept/); particle and intermediate scale images: Capecelatro’s Research Group

Particle Scale

The group has been developing simulation capabilities to directly resolve the flow at the sub-particle scale, shedding light on important physics under the extreme conditions relevant to particle-structure interactions. Their model uses a massively parallel compressible particle-laden flow simulation tool in which the exhaust plume and its corresponding flow features are computed in an Eulerian-Lagrangian framework. At this scale, for example, the flow between individual particles is resolved, providing important insight into drag and turbulence under these extreme conditions.
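Schematically, an Eulerian-Lagrangian formulation couples a grid-based gas phase to individually tracked particles (a simplified form for illustration; the group's compressible solver includes additional physics, such as the energy equation and two-way interphase coupling, omitted here):

\frac{\partial \rho}{\partial t} + \nabla \cdot (\rho \mathbf{u}) = 0, \qquad \frac{\partial (\rho \mathbf{u})}{\partial t} + \nabla \cdot (\rho \mathbf{u} \otimes \mathbf{u}) = -\nabla p + \nabla \cdot \boldsymbol{\tau}

\frac{d \mathbf{x}_p}{dt} = \mathbf{v}_p, \qquad m_p \frac{d \mathbf{v}_p}{dt} = \mathbf{F}_{\mathrm{drag}} + m_p \mathbf{g}

Here the gas density \rho, velocity \mathbf{u}, pressure p, and viscous stress \boldsymbol{\tau} live on the Eulerian grid, while each particle p carries its own position \mathbf{x}_p and velocity \mathbf{v}_p.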

Intermediate Scale

As a next step, the particle-scale results inform models used in the intermediate-scale simulations developed by the group, in which particles are still tracked individually but the flow is not resolved at sub-particle resolution, allowing the team to simulate upwards of one billion particles. At this scale, an Eulerian-Lagrangian framework is used to couple the ground's particle flow with the jet's plume.
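To make the point-particle idea concrete, the sketch below advances a single particle against an interpolated gas velocity using the standard Schiller-Naumann drag correction. This is a minimal Python illustration; the gas field, parameter values, and integrator are assumptions for demonstration, not the group's solver.

import numpy as np

# Illustrative parameters (assumed for demonstration, not from the study)
RHO_G = 0.02       # gas density [kg/m^3], thin Martian-like atmosphere
MU_G  = 1.1e-5     # gas dynamic viscosity [Pa s]
D_P   = 100e-6     # particle diameter [m]
RHO_P = 3000.0     # particle density [kg/m^3]
M_P   = RHO_P * np.pi * D_P**3 / 6.0    # particle mass [kg]
TAU_P = RHO_P * D_P**2 / (18.0 * MU_G)  # Stokes response time [s]
G     = np.array([0.0, 0.0, -3.71])     # Mars surface gravity [m/s^2]

def gas_velocity(x):
    """Stand-in for the Eulerian plume solution interpolated to x."""
    return np.array([10.0, 0.0, 5.0])   # placeholder velocity [m/s]

def drag_force(u_g, v_p):
    """Stokes drag with the Schiller-Naumann finite-Reynolds correction
    (commonly used for particle Reynolds numbers below ~1000)."""
    u_rel = u_g - v_p
    re_p = RHO_G * np.linalg.norm(u_rel) * D_P / MU_G
    return M_P / TAU_P * (1.0 + 0.15 * re_p**0.687) * u_rel

# Explicit Euler update of one particle (real codes use stronger integrators)
x, v, dt = np.zeros(3), np.zeros(3), 1e-5
for _ in range(1000):
    a = (drag_force(gas_velocity(x), v) + M_P * G) / M_P
    v += dt * a
    x += dt * v
print("particle position after 10 ms:", x)

In a production solver the same update runs for every tracked particle, with the gas velocity interpolated from the flow grid rather than prescribed.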

Full Landing Site Scale

While the intermediate-scale simulations allow the study of erosion and cratering, a full landing site containing trillions of particles is still out of reach even on the most powerful HPC clusters. After further modeling, Capecelatro's multi-scale framework will be handed over to NASA, where it will be incorporated into simulations of the full landing site. At this scale, NASA's framework uses an Eulerian two-fluid model that treats both fluid and particles as continua, informed by the particle- and intermediate-scale models.
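In a generic two-fluid formulation of this kind (shown schematically; closure terms are omitted), each phase is treated as an interpenetrating continuum weighted by its volume fraction:

\alpha_g + \alpha_s = 1, \qquad \frac{\partial (\alpha_g \rho_g)}{\partial t} + \nabla \cdot (\alpha_g \rho_g \mathbf{u}_g) = 0, \qquad \frac{\partial (\alpha_s \rho_s)}{\partial t} + \nabla \cdot (\alpha_s \rho_s \mathbf{u}_s) = 0

The corresponding momentum equations are coupled through interphase exchange terms, which is exactly where the particle- and intermediate-scale results feed in as closures.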

The Mars 2020 mission is expanding NASA's robotic presence on the red planet. While it is a big step toward setting the stage for future human exploration, landing systems will need further redesign before the voyage is safe for humans. Capecelatro's physics-based models are aiding this task by helping to predict and model more accurately the outcome of a spacecraft attempting to land safely millions of miles from home. As in many other fields, computational science will continue to play a critical role in the future of humanity's quest to conquer space. #computationalscience everywhere!

Related links:
Sticking the landing on Mars: High-powered computing aims to reduce guesswork
Capecelatro’s Research Group
NASA 2020 Mars Mission: Perseverance Rover

Across six continents, scientists use computation to optimize cities’ responses to hazardous events

By | Events, Research, Uncategorized

“Community resilience is a manifestation of the human trait of adaptation. A resilient community is able to withstand and recover from hazardous events with minimal disruption to its way of life.”

Sherif El-Tawil
Antoine E. Naaman Collegiate Professor,
Department of Civil and Environmental Engineering

The combination of natural hazards, climate change, and the COVID-19 pandemic has demonstrated the importance of community resilience. Community resilience is a manifestation of the human trait of adaptation. A resilient community is able to withstand and recover from hazardous events with minimal disruption to its way of life. As humans, we seek to use our capacity for engineering to adapt to the threat of natural hazards. Although achieving resilience is technically challenging and expensive, communities must strive to accomplish the highest level of resilience attainable with the engineering and financial resources available.

The science behind resilience engineering involves many disciplines, each dedicated to a subset of the overall problem. Complex issues lie at the intersection of these subsets, but interdisciplinary research is difficult to achieve because researchers in various disciplines frame problems and perform research from different perspectives and along distinct pathways. However, as computational models are well established in each discipline, computation is a natural language that links the disciplines together.

Last fall, the Michigan Institute for Computational Discovery and Engineering and the Department of Civil and Environmental Engineering brought together established leaders and some of the most innovative rising scholars in computational hazards research to present and discuss different computational approaches used in modeling, assessing, and defining standards for community resilience. The speakers included representatives from leading research centers in the field: keynote speaker Terri McAllister from the National Institute of Standards and Technology (NIST); John van de Lindt (Colorado State University), co-director of the NIST-funded Center of Excellence (CoE) for Risk-Based Community Resilience Planning; Gregory Deierlein (Stanford University) from the SimCenter, which represents a consortium of universities on the U.S. West Coast; Sherif El-Tawil (University of Michigan) from ICoR; and Wael El-Dakhakhni (McMaster University) from INTERFACE. They were joined by other leaders in the field, including Tasos Sextos from Bristol University, UK; Xinzheng Lu, head of the Institute of Disaster Prevention and Mitigation at Tsinghua University; Hiba Baroud from Vanderbilt University; and Seth Guikema from the University of Michigan. The speakers highlighted their centers' or research groups' capabilities and contributions, then reconvened for a panel discussion to address questions from the audience of nearly 250 participants from 30 countries across six continents. The event also included a hands-on workshop that highlighted the Simple Run-Time Infrastructure software toolkit (SRTI), a free, open-source solution developed at the University of Michigan. The SRTI enables researchers to connect computer programs and simulators written in different languages, share data during execution, and design hybrid systems using disparate simulator modules, with a primary goal of being user friendly. The applications in this workshop demonstrated how one tool can bring together multiple computational dialects to create a single language in the context of natural hazards research. The SRTI software toolkit is a result of the work of Dr. Sherif El-Tawil's research group at the University of Michigan, supported by the National Science Foundation's Office of Advanced Cyberinfrastructure (OAC) under grant CRISP TYPE II – 1638186 (icor.engin.umich.edu).

The range of techniques and principles detailed at this workshop can be applied to the current COVID-19 crisis. The pandemic is a clear demonstration that investing in risk mitigation reduces the cost, both human and material, of a hazard, and that even hazards with a low probability of occurrence warrant enough investment to make ourselves resilient to them. The pandemic also illustrates that computational hazards research is a rich field with many opportunities at the intersection of the various disciplines. One of the most interesting ideas to explore is how to fuse sensor data from the field with simulation data to achieve models that can predict the effects of a natural hazard in real time.
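One standard approach to such fusion is data assimilation. The toy one-dimensional Kalman update below (illustrative numbers and variable names, not a tool presented at the workshop) shows how a simulation forecast and a noisy field measurement can be blended into a single estimate:

def kalman_update(x_forecast, p_forecast, z_obs, r_obs):
    """Blend a simulated forecast with a noisy field measurement.

    x_forecast: state predicted by the simulation (e.g., flood depth, m)
    p_forecast: variance (uncertainty) of the simulation estimate
    z_obs, r_obs: sensor reading and its variance
    """
    gain = p_forecast / (p_forecast + r_obs)          # Kalman gain
    x_fused = x_forecast + gain * (z_obs - x_forecast)
    p_fused = (1.0 - gain) * p_forecast
    return x_fused, p_fused

# Toy example: the model predicts 2.0 m of flooding; a gauge reads 2.6 m
x, p = kalman_update(x_forecast=2.0, p_forecast=0.25, z_obs=2.6, r_obs=0.1)
print(f"fused estimate: {x:.2f} m (variance {p:.3f})")

The fused estimate is pulled toward whichever source is more trustworthy, and its variance is smaller than either input's, which is the essential payoff of combining field data with simulation.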

Link to event information and recordings

University of Michigan’s Ph.D. in Scientific Computing: A history of supporting research through education

By | Educational, Feature

#Computationalscience everywhere!

Left side, 2167 configuration console for the IBM/System 360 Model 67-2 (duplex) at the University of Michigan, c. 1969 [Picture by Scott Gerstenberger – Scott Gerstenberger, Public Domain]

The University of Michigan's joint Ph.D. program in Scientific Computing recently achieved a record enrollment of 137 students. Between 2015, when 15 students were enrolled, mainly from the Colleges of Engineering and of Literature, Science, and the Arts, and today, the program has witnessed an explosive growth of interest on the part of U-M students. The program now has students enrolled from over 30 departments spanning 8 different schools and colleges, and more than 130 students have graduated in the last 31 years, including 17 to date in 2020.

This popularity is emblematic of the dominant role that computation plays in the world today. With the breakneck pace at which new hardware and software architectures are being developed, the boom in simulation-based research in a growing number of disciplines, and the initiatives in data and computational sciences implemented at U-M in the last few years, including the establishment of the Michigan Institute for Computational Discovery & Engineering, and the Michigan Institute for Data Science (MIDAS), it may seem only natural that scientific computing should attract this level of interest. However, like all exceptionally successful undertakings, it owes a great deal to its past. We reached back more than three decades to piece together the history of the Ph.D. in Scientific Computing at U-M.

The broader history of computational science and high performance computing at the University of Michigan is rich and extensive. U-M has been at the forefront of cyberinfrastructure research for many decades, marked by the acquisition of U-M's first virtual memory computer in 1967, an IBM 360/67, one of the first computers of its kind in the world. This milestone was followed by many others, including further hardware acquisitions and the establishment of new units to support advanced research computing. An important early step was taken in 1985, when the College of Engineering established the Laboratory for Scientific Computation (LaSC). LaSC's goal was to foster and promote the use of scientific computation in research and instruction at U-M. During those years, several reports from national study committees established computational science as the third pillar of scientific methodology, along with theory and experimentation. Faculty members of LaSC, who were at the forefront of driving these trends, recognized that any initiative in this field needed to include a robust student training program.

left: Prof. Kenneth Powell (Aerospace Engineering), director of the Ph.D. in Scientific Computing program since 2005; right: Prof. William Martin (Nuclear Eng. and Rad. Sciences), director of the program from 1989 to 2004.

Prominent at that time in LaSC were Prof. William "Bill" Martin (Nuclear Engineering and Radiological Sciences – NERS), the laboratory's director, Prof. John Boyd (Atmospheric, Oceanic and Space Sciences), the laboratory's associate director, and Prof. Edward Larsen (NERS), who was hired as part of the College of Engineering's initiative to move aggressively in the area of scientific computing. Together, they designed a graduate academic program with the goal of giving students a more comprehensive training in numerical analysis and computer science than is typically possible within standard disciplinary programs housed within individual departments and schools. The fundamental idea was that, to excel in computational science and engineering, one must have a thorough understanding of the mathematical and physical problems to be solved, expertise in the methodologies and algorithms, and a foundation in computer science sufficient to apply this arsenal of techniques on modern computer platforms. The need for a thorough understanding of the physical problems led directly to the requirement that students be enrolled in a traditional Rackham degree program (i.e., a home department), while the need for mathematical underpinning and knowledge of algorithms and computer science topics led to the requirements for courses in numerical analysis, parallel algorithms, and related topics. The Ph.D. in Scientific Computing program was approved by the State of Michigan in 1988 and enrolled its first students in 1989, well in advance of a wider recognition of the centrality of computation in academia and industry. It is true today, as it was in 1988, that students can apply to the Ph.D. in Scientific Computing program from any Rackham-recognized Ph.D. program at U-M. This unique and flexible administrative structure has enabled the rapid growth experienced in recent years as scientific computing has become an indispensable tool in many fields of academic endeavor.

Prof. Quentin Stout, director of the Center for Parallel Computing 1992-2001 [Picture source: NASA Insights 1998]

The oversight of the degree program has evolved over the years as administrative structures around scientific computing have shifted. Regardless of its administrative home, the program has always been organized under the Rackham School of Graduate Studies. Originally, the College of Engineering had oversight of the program, with Prof. Martin appointed as director and with guidance from the LaSC Education Committee. This setup continued through the merger of LaSC and the Center for Parallel Computing1 into the Center for Advanced Computing in 2001. In 2005, Prof. Kenneth Powell (Aerospace Engineering) was named director of the program, succeeding Prof. Martin, and has continued in the role since. In 2008, the Office of Research Cyberinfrastructure (ORCI) was established, and oversight of the program moved to the U-M Office of Research. In 2013, when ORCI was renamed Advanced Research Computing and the Michigan Institute for Computational Discovery & Engineering (MICDE) was born, oversight was transferred to MICDE.

Since its inception, the program has been described as intended for students who will make intensive use of large-scale computation, computational methods, or algorithms in their doctoral studies. Although the requirements and goals of the program have not changed in 31 years, the research applications, the algorithms and methodologies, and the computer platforms have been in constant evolution, and the courses offered in support of the program have followed closely. In 1989, the core research areas behind the program were computational fluid dynamics, advanced computer architectures, and particle transport, with the majority of students coming from engineering and mathematics. Still, students working in areas where computation was less recognized, such as AIDS transmission or social research projects, were also enrolled. Over the next two decades, the tremendous increase in simulation-based research by faculty in engineering and the physical sciences added many other focus areas, including materials science, astronomy, and high energy physics, to name just a few. This growth added a new driver as data-intensive research gained importance in those fields.

Prof. Suzanne Weekes, Associate Dean of Undergraduate Studies, ad interim, and Professor of Mathematical Sciences at Worcester Polytechnic Institute (U-M 1995, Mathematics and Scientific Computing) [Picture source: SIAM News Sept. 2020]

Several faculty members have played an important role in shaping the program by offering fundamental courses and providing mentorship. Notably, Prof. Quentin Stout, from Computer Science and Engineering, has had a prominent role in the program. He was the founding director of the Center for Parallel Computing, which provided the basis for subsequent units in this sphere at U-M. He also developed, and has been teaching, Parallel Computing since 1985, innovating its curriculum to remain at the cutting edge of current techniques, important aspects of which have been based on his own research. Other foundational courses, such as the Department of Mathematics' Numerical Methods for Scientific Computing I & II and Numerical Linear Algebra, have been offered for more than 30 years. More recently, the Department of Physics course Computational Physics and the College of Engineering course Methods and Practice of Scientific Computing, along with an array of courses in machine learning, have played prominent roles in transforming the curriculum in scientific computing as research in these areas has likewise redefined the field.

Unsurprisingly, the Ph.D. in Scientific Computing has produced many exceptional alumni. The first student graduated from the program in 1992, and, notably for its time, two of the first four graduates were women, at a time when gender imbalance was barely recognized. A majority of the program's graduates went on to positions in academia or the national laboratories, with the rest working in varied fields in industry or government. These outstanding alumni include Suzanne Weekes, U-M 1995 (Mathematics and Scientific Computing), currently Associate Dean of Undergraduate Studies, ad interim, and Professor of Mathematical Sciences at Worcester Polytechnic Institute. Prof. Weekes has recently been named SIAM executive director and will start her new role on January 1, 2021. Another alumna, Rona Oran, U-M 2014 (Space Science and Scientific Computing), is a computational plasma physicist at MIT and a member of the NASA team that is designing and planning a mission to the metal asteroid Psyche, scheduled to launch in 2022.

The current goal of the program is still founded on the original idea of strengthening students' foundations in methodology and computer science. The leadership of the program strives to bring computational science to more research fields and, importantly, aims to do so while enhancing diversity in the field. An important marker of U-M's success on this front came in 2018 in the form of the Henry Luce Foundation's award to the University of two Clare Boothe Luce Ph.D. fellowships for women enrolled in the Ph.D. in Scientific Computing. The program is committed to pursuing other such opportunities and to creating an environment where students of all backgrounds and identities feel welcome and thrive.

1 In 1992 U-M was awarded a major equipment grant by the National Science Foundation to create a testbed of parallel computing architectures. The Center for Parallel Computing was established to operate the facility. The center installed and operated several different parallel computers over the years, including KSR-1, KSR-2, Convex Exemplar, SGI PowerChallenge, IBM SP2, and AMD and Apple clusters.

Reducing lung cancer mortality through modeling and simulations

By | Feature, Research

Lung cancer remains the leading cause of cancer-related mortality in the US and globally, accounting for 1.8 million deaths annually. Many of these deaths are preventable through prevention strategies, including tobacco control policies and lung cancer screening recommendations, and through improvements in lung cancer treatment. In the US, these policies have generally been implemented based on analyses and outcomes of the population as a whole, although data analyses have shown that smoking and lung cancer rates, as well as access to healthcare and interventions, vary significantly by education, income, and race/ethnicity.

The Cancer Intervention and Surveillance Modeling Network (CISNET) Lung Working Group (LWG), led by Rafael Meza, associate professor of Epidemiology in the School of Public Health and MICDE member, has been awarded a new $8.5M grant to investigate the synergistic impacts of tobacco control policies, lung cancer screening, and treatment interventions in the US and in middle-income nations. For the past 15 years, the CISNET LWG has contributed to the development of US national strategies for reducing the lung cancer burden by quantifying, through modeling and simulation, the impact of tobacco control on smoking, lung cancer, and overall mortality, as well as the population benefits and harms of lung cancer screening. This new grant will allow the group to expand its work to consider the impact of treatment improvements, including targeted therapies and immunotherapies, and the synergies between treatment and prevention interventions. It will also enable the researchers to continue their work in addressing smoking and lung cancer disparities. The consortium uses a comparative modeling approach, in which multiple, but distinct, models use the same data inputs and aim to answer a common question with different approaches. This allows the group to assess the strengths and weaknesses of the different models and aids the decision-making process.

Established in 2000, CISNET is a consortium of NCI-sponsored investigators who use modeling and simulation to improve understanding of cancer control interventions in prevention, screening, and treatment and their effects on population trends in incidence and mortality. CISNET is committed to bringing the most sophisticated evidence-based planning tools to population health and public policy. These models have been used to guide public health research and priorities, and have aided the development of optimal cancer control strategies. Besides lung cancer, CISNET also includes breast, colon, cervical, esophageal, and prostate cancer groups.

We welcome 15 students to the 2020-21 class of MICDE graduate fellows

By | Educational, News

MICDE is proud to announce the recipients of the 2020 MICDE graduate fellowships. The fellows' research projects involve the use and advancement of scientific computing techniques and practices. From political science, psychology, physics, and applied and interdisciplinary mathematics within the College of Literature, Science & the Arts, to aerospace engineering, mechanical engineering, materials science engineering, industrial & operations engineering, and civil & environmental engineering within the College of Engineering, the 2020 MICDE fellows epitomize the reach of computation across diverse scientific disciplines.

For the past six years, MICDE has awarded fellowships to over 120 graduate students from our large community of computational scientists. The MICDE graduate student top-off fellowship provides students with a stipend to use for supplies, technology, and other materials that will further their education and research. Among other things, awards have helped many to travel to conferences and meetings around the world to share the rich and diverse research in computational science being carried out at U-M.

The awardees are:

Eytan Adler, Aerospace Engineering
Hessa Al-Thani, Industrial and Operations Engineering
Zijie Chen, Mechanical Engineering
Alexander Coppeans, Aerospace Engineering
Xinyang Dong, Physics
Karthik Ganesan, Psychology
Iman Javaheri, Aerospace Engineering
Huiwen Jia, Industrial and Operations Engineering
Daeho Kim, Civil and Environmental Engineering
Yudan Liu, Chemistry
Emily Oliphant, Materials Science and Engineering
Ryan Sandberg, Applied and Interdisciplinary Mathematics
Patrick Wu, Political Science
Zhucong Xi, Materials Science and Engineering
Yi Zhu, Civil and Environmental Engineering

Learn more about the fellows and the MICDE Fellowship program

MICDE funds wide-ranging computational discovery in galactic formation, drug discovery, bacterial biofilm colonies and turbulence simulations

By | News, Research

Since 2017, the Michigan Institute for Computational Discovery & Engineering (MICDE) Catalyst Grants program has funded a wide spectrum of cutting-edge research that combines science, engineering, mathematics, and computer science. This year the program will fund four new projects that continue this tradition. Prof. Aaron Frank (Chemistry) and his group will spearhead efficient strategies to rapidly develop treatments for emerging diseases, a need made more compelling by the current COVID-19 pandemic; their approach combines generative artificial intelligence models and molecular docking to rapidly explore the space of chemical structures and generate target-specific virtual libraries for drug discovery. Prof. Marisa Eisenberg (Epidemiology, Mathematics, and Complex Systems) and Prof. Alexander Rickard (Epidemiology) and their groups will develop novel computational techniques to study biofilm architectures; biofilms are complex assemblages of microbial cells that form on almost any natural or man-made surface, cause several debilitating diseases, and can even damage machinery and equipment, making an understanding of their behavior a critical need. Prof. Oleg Gnedin (Astronomy) will develop novel techniques to tailor the mathematical initial conditions from which to simulate chosen regions of the universe; the resulting insights will help uncover the origins of our own galaxy, the Milky Way. Finally, Prof. Aaron Towne (Mechanical Engineering) will advance the modeling of complex, turbulent flows and other large-scale systems in engineering science, enabling orders-of-magnitude acceleration in the computation of extremely large-scale flows in a number of engineering systems.

“These four projects have the potential to catalyze and reorient the directions of their research fields by developing and harnessing powerful paradigms of computational science,” said Krishna Garikipati, Professor of Mechanical Engineering and of Mathematics, and MICDE’s director. “MICDE’s mission is to lead the advances in computational science research by bringing together interdisciplinary teams at U of M, and these projects embody that vision.”

More about MICDE’s catalyst grant program and the projects can be found at micde.umich.edu/catalyst.

Fabricio Vasselai wins the Irving Louis Horowitz Award from the Horowitz Foundation for Social Policy

By | News

Fabricio Vasselai, a dual Ph.D. candidate in Political Science and Scientific Computing, is a recipient of this year’s awards from the Horowitz Foundation for Social Policy. His proposal, titled “Elections in the AI era: using Machine Learning and Multi-Agent Systems to detect and study menaces to election integrity,” won the Irving Louis Horowitz Award, given to the overall most outstanding project of the year, as well as the Joshua Feigenbaum Award for the most outstanding project on arts, popular culture, and mass communication.

The proposal develops artificial intelligence tools to detect and study threats to election integrity. First, novel multi-agent simulations of elections (MASE) are derived and implemented as the data-generating process for synthetic data. These data are then used to train supervised machine learning (SML) classifiers to detect fraud in real election counts. The ability to create simulated training data makes it possible to properly bootstrap the SML classifications, allowing for novel estimation of the uncertainty around election fraud detection. Vasselai also uses MASE to perform virtual experiments on the spread of fake news, showing that biased misinformation is critical for political polarization to flourish in majoritarian elections.
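The general pipeline, simulating elections to generate labeled synthetic data and then training a classifier on it, can be sketched as follows. This is a toy Python illustration: the fraud fingerprint, features, and classifier are assumptions for demonstration, far simpler than Vasselai's MASE models.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def simulate_election(fraud=False, n_precincts=500):
    """Toy stand-in for an agent-based election simulation: returns
    summary features of per-precinct turnout and leader vote share."""
    turnout = rng.beta(6, 4, n_precincts)   # ~60% mean turnout
    share = rng.beta(5, 5, n_precincts)     # competitive race
    if fraud:
        # Crude ballot-stuffing fingerprint: inflate both turnout and
        # the leader's share in a subset of precincts.
        idx = rng.random(n_precincts) < 0.2
        turnout[idx] = np.minimum(turnout[idx] + 0.3, 1.0)
        share[idx] = np.minimum(share[idx] + 0.3, 1.0)
    return np.array([turnout.mean(), share.mean(),
                     np.corrcoef(turnout, share)[0, 1]])

# Generate labeled synthetic elections and train the detector
X = np.array([simulate_election(fraud=f) for f in [False] * 200 + [True] * 200])
y = np.array([0] * 200 + [1] * 200)
clf = LogisticRegression().fit(X, y)

# Score a new (simulated) election count
print("P(fraud) =", clf.predict_proba(simulate_election(fraud=True).reshape(1, -1))[0, 1])

Because the simulator can be rerun indefinitely, the classifier's error rates can be estimated by bootstrap over fresh synthetic elections, which is the uncertainty-quantification idea the proposal exploits.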

Fabricio Vasselai is an MICDE Fellow (awarded in 2018) and is currently a researcher at U-M’s Center for Political Studies and Center for the Study of Complex Systems.

Established in 1998, the Horowitz Foundation awards grants to scholars to assist them in completing their dissertations. The awards are highly competitive, with less than 3 percent of applicants receiving one this year.

Microsoft AI for Health Program to support an AI-facilitated Optimization Framework for Improving COVID-19 Testing

By | News, Research

With the recent resurgence of COVID-19 infections, testing has become central to an integrated, global response to the pandemic. Accurate, effective, and efficient testing can lead to early detection and prompt an agile response by public health authorities. Strategic testing systems are critical for providing data that will inform disease prevention, preparation, and intervention. MICDE Associate Director Siqian Shen, Associate Professor of Industrial and Operations Engineering and of Civil and Environmental Engineering, has recently published an article pinpointing a number of pivotal operations research and industrial engineering tools that can be brought to the fight against COVID-19. One of the key lessons from her research is the importance of expanding the availability of COVID-19 testing and making the resulting data transparent to the public as anonymized summary statistics, enabling informed decision-making by individuals, public health officials, and governments.

Based on these high-impact findings, Professor Shen is striding ahead to design a comprehensive COVID-19 testing framework to efficiently serve the urgent needs of diverse population groups. A grant from Microsoft’s AI for Health program, part of the AI for Good initiative, will provide credits to use Microsoft’s Azure service. With this cyber resource, Professor Shen and her team will integrate and coordinate, on a cloud-based platform, the decision-making models and data analytics tools they have developed for testing. In addition, their AI framework is dynamic, collecting daily infection data to improve testing-related decisions. Such a platform could have significant impact on three major problems with current testing design strategies:

1) Where to locate testing facilities and how to allocate test kits and other resources (a simple sketch of this problem follows the list).
2) How to effectively triage different population groups through effective appointment scheduling.
3) How to visualize real-time testing capacities to better inform the public and serve ad-hoc needs of patients. 
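As a toy illustration of problem 1 (hypothetical sites and coverage data, far simpler than the stochastic optimization models the team develops), a greedy heuristic for siting facilities to cover demand areas might look like:

# Toy greedy set-cover heuristic for siting testing facilities.
# All site names, coverage sets, and the budget are hypothetical.

candidate_sites = {
    "clinic_A": {"zip_1", "zip_2"},
    "clinic_B": {"zip_2", "zip_3", "zip_4"},
    "stadium":  {"zip_1", "zip_4", "zip_5"},
}
uncovered = {"zip_1", "zip_2", "zip_3", "zip_4", "zip_5"}
budget = 2  # number of facilities we can afford to open

opened = []
while uncovered and len(opened) < budget:
    # Open the site covering the most still-uncovered areas
    best = max(candidate_sites, key=lambda s: len(candidate_sites[s] & uncovered))
    opened.append(best)
    uncovered -= candidate_sites.pop(best)

print("open:", opened, "| still uncovered:", uncovered or "none")

Real testing-network designs replace this greedy rule with integer and stochastic programs that also capture capacities, demand uncertainty, and equity across population groups.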

Prof. Shen’s research will integrate AI techniques with optimization to dynamically refine existing testing design methods for gathering and analyzing data from unexplored populations and regions around the globe. The development and refinement of these new models with the support of Microsoft Azure will create a transparent, data-informed testing system that will allow public health and government authorities to make agile, data-driven decisions to aid in the prevention, preparation, intervention, and management of COVID-19 and other outbreaks of infectious diseases.

Siqian Shen is a Professor of Industrial and Operations Engineering, and of Civil and Environmental Engineering at the University of Michigan, an Associate Director of the Michigan Institute for Computational Discovery & Engineering, and an affiliated faculty member of the Michigan Institute for Data Science. Her research group works on both theoretical and applied aspects of problems by combining stochastic programming, integer programming, network optimization, machine learning, and statistics.

What is the right model? Different MRIO models yield very different carbon footprint estimates in China

By | Research

Appropriate accounting of greenhouse gas emissions is the first step toward assigning mitigation responsibilities and developing effective mitigation strategies. Consistent methods are required to fairly assess a region’s impact on climate change. Two leading reasons for the existence of different accounting systems are political pressures and the actual costs of climate mitigation to local governments. At the international level there has been consensus, and global environmentally extended multi-regional input-output (EE-MRIO) models that capture the interdependence of regional economies and their environmental impacts have been constructed. However, in China, the largest greenhouse gas emitter, where accurate interregional trade-related emission accounts are critical for developing mitigation strategies and monitoring progress at the regional level, this information is sporadic and inconsistent. Prof. Ming Xu of the School for Environment and Sustainability and his research group analyzed the available data from China, which date back to 2012, and showed that the results varied wildly depending on the MRIO model used. For example, they found that two MRIO models differed by as much as 208 million metric tons of emissions in a single region, an amount equivalent to the annual emissions of Argentina, the United Arab Emirates, or the Netherlands. These results show the need to prioritize future efforts to harmonize greenhouse gas emissions accounting within China.
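For context, EE-MRIO accounting rests on the standard Leontief formulation (shown here in generic form, not the specific models compared in the study). With interregional technical-coefficient matrix \mathbf{A}, final demand \mathbf{y}, and emission intensities \mathbf{f} (emissions per unit output),

\mathbf{x} = (\mathbf{I} - \mathbf{A})^{-1} \mathbf{y}, \qquad E = \mathbf{f}^{\top} (\mathbf{I} - \mathbf{A})^{-1} \mathbf{y},

so differences in how each model constructs \mathbf{A} and the underlying trade flows propagate through the Leontief inverse and can yield widely divergent footprint estimates E.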

Ming Xu is an Associate Professor in the School for Environment and Sustainability and in the Department of Civil and Environmental Engineering at the University of Michigan, Ann Arbor. His research focuses on the broad fields of sustainable engineering and industrial ecology. 

Read the full article.

Modeling the transmission of infectious aerosols

By | Feature, Research

Inhalation of micron-sized droplets represents the dominant transmission mechanism for influenza and rhinovirus, and recent research shows that this is likely also the case for the novel coronavirus. Increasing evidence suggests that the transmission of infectious aerosols is more complex than previously thought. Coughing, sneezing, and even talking yield a gaseous flow field near the infected person that is dynamic and turbulent in nature. Existing models commonly employed in simulations of aerosol transmission attempt to represent the effect of turbulence using random walk models that are often phenomenological in nature, employ adjustable parameters, and inherently assume that the turbulent fluctuations ‘felt’ by a droplet do not depend on direction. To design physics-informed guidelines to minimize the spread of this virus, improved predictive modeling capabilities for effectively tracking the aerosol paths are needed. Dr. Aaron M. Lattanzi and Prof. Jesse Capecelatro, of Mechanical Engineering and MICDE, are tackling this problem by focusing on the mathematical modeling of aerosol dispersion. They derived analytical solutions for the mean-squared displacement resulting from systems of stochastic differential equations. A key element of their methodology is that the solution connects stochastic theory inputs to statistics present in high-fidelity simulations or experiments, providing a framework for developing improved models.
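To illustrate the kind of analytical result involved (the classical single-component Langevin analysis, shown here for orientation; the authors' force-Langevin formulation differs in detail), modeling the fluctuating velocity seen by a droplet as an Ornstein-Uhlenbeck process with root-mean-square velocity v_{\mathrm{rms}} and correlation time \tau gives

dv = -\frac{v}{\tau}\, dt + \sqrt{\frac{2 v_{\mathrm{rms}}^2}{\tau}}\, dW_t, \qquad \langle x^2(t) \rangle = 2 v_{\mathrm{rms}}^2 \tau^2 \left( \frac{t}{\tau} - 1 + e^{-t/\tau} \right),

which recovers ballistic growth (\sim v_{\mathrm{rms}}^2 t^2) at short times and diffusive growth (\sim 2 v_{\mathrm{rms}}^2 \tau t) at long times. Connecting parameters such as \tau and v_{\mathrm{rms}} to statistics measured in high-fidelity simulations is the kind of linkage the authors' framework formalizes.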

Simple simulation of aerosol dispersion from a single-point source. The grey, cone-like surface is the approximation using Force Langevin (FL) theory and the colored particles are from integration of Newton’s equations with stochastic drag forces.

Prof. Capecelatro’s research group develops physics-based models and numerical algorithms that leverage supercomputers for prediction and optimization of the complex flows relevant to energy and the environment. The main focus of their research is developing robust and scalable numerical tools to investigate multiphysics and multiscale phenomena under a variety of flow conditions, like those studied here. They recently submitted their findings to the Journal of Fluid Mechanics and are continuing to work on this problem, hoping it will help explain the transmission of COVID-19 and thereby help optimize current guidelines.