New physics-based computational and AI framework to understand the aggressive behavior of cancer cells

By | Feature, Research

Cancer is an illness caused by uncontrolled division of transformed cells, which can originate in almost any organ of the body. Cancer is not a single disease, even when it arises in the same site of the body. Tremendous variability exists in disease progression and response to therapy among different people with the same general type of cancer, such as breast cancer. Even at the level of a single person, cancer cells show tremendous heterogeneity within a single tumor and between a primary tumor and its metastases. This heterogeneity causes drug resistance and fatal disease. The prevailing dogma is that heterogeneity among cancer cells arises randomly, generating greedy individual cancer cells that compete for growth factors and optimal environments. The rare “winners” in this competition survive and metastasize. However, tumors consistently maintain heterogeneous subpopulations of cancer cells, some of which appear less able to grow and spread. This observation prompted Gary and Kathy Luker, cancer cell biologists at the University of Michigan, to hypothesize that cancer cells may actually collaborate under some circumstances to cause disease, not just compete. The idea that single, heterogeneous cancer cells work collectively within a constrained range of variability to drive population-level outputs in tumor progression is a ground-breaking concept that may revolutionize how we approach cancer biology and therapy.

The team is using innovative approaches to extract and merge data streams from models that generate heterogeneous cell behaviors

...cancer cell biologists have teamed up with computational scientists and experts in artificial intelligence to focus the power of these fields on understanding and overcoming heterogeneity in cancer.

To understand the causes of single-cell heterogeneity in cancer and the conditions that motivate cancer cells to collaborate, an interdisciplinary team of scientists at UM formulated an entirely new conceptual approach to this challenging problem. The cancer cell biologists have teamed up with computational scientists and experts in artificial intelligence to focus the power of these fields on understanding and overcoming heterogeneity in cancer. Building on large, single-cell data sets unique to the team, they will combine inverse reinforcement learning, an artificial intelligence method typically applied to discover motivations for human behaviors, with computational models inferred from the physics and chemistry of cell signaling and migration. By discovering testable molecular processes underlying “decision-making” by single cells and their “motivations” for acting competitively or collaboratively, this research blazes a new path to understanding and treating cancer. Their high-risk, high-reward approach to understanding how each cell in a population processes information and translates it into actions driving cancer progression has attracted a $1 million award from the W. M. Keck Foundation.
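Inverse reinforcement learning flips the usual question: rather than learning behavior from a known reward, it infers the reward (the “motivation”) that best explains observed behavior. As a purely illustrative sketch, and not the team’s actual models or data, the toy example below asks which of two candidate rewards better explains a “cell” in a one-dimensional world that is always observed moving to the right:

```python
import numpy as np

# Toy illustration of reward inference: a 1-D world of 5 positions; a "cell"
# can step left or right. We observe trajectories that always step right and
# ask which candidate reward (goal on the right vs. left) explains them best.
N_STATES, ACTIONS = 5, (-1, +1)  # actions: step left / step right

def value_iteration(reward, gamma=0.9, iters=100):
    """Compute per-state, per-action values Q for a given reward vector."""
    V = np.zeros(N_STATES)
    for _ in range(iters):
        Q = np.array([[reward[min(max(s + a, 0), N_STATES - 1)]
                       + gamma * V[min(max(s + a, 0), N_STATES - 1)]
                       for a in ACTIONS] for s in range(N_STATES)])
        V = Q.max(axis=1)
    return Q

def log_likelihood(Q, trajectories, beta=5.0):
    """Softmax (Boltzmann) likelihood of the observed actions under Q."""
    ll = 0.0
    for traj in trajectories:
        for s, a_idx in traj:
            p = np.exp(beta * Q[s]) / np.exp(beta * Q[s]).sum()
            ll += np.log(p[a_idx])
    return ll

# Observed behavior: the cell steps right (action index 1) from states 0..3.
expert = [[(s, 1) for s in range(4)]]

# Candidate "motivations": a goal at the right end vs. the left end.
candidates = {"goal_right": np.array([0, 0, 0, 0, 1.0]),
              "goal_left":  np.array([1.0, 0, 0, 0, 0])}
scores = {name: log_likelihood(value_iteration(r), expert)
          for name, r in candidates.items()}
best = max(scores, key=scores.get)
print(best)  # → goal_right: the right-goal reward explains the behavior best
```

The actual project works in the reverse direction at scale, with physics- and chemistry-based models constraining the space of candidate rewards, but the core logic of scoring candidate motivations against observed behavior is the same.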

The team includes Gary Luker (Radiology, Microbiology and Immunology; Biomedical Engineering), and Kathryn Luker (Radiology), who are leading the experimental studies of cell signaling and migration; Jennifer Linderman (Chemical Engineering; Biomedical Engineering); and Krishna Garikipati (Mechanical Engineering; Mathematics), who are leading the machine learning and modeling side of the project. Nikola Banovic (Electrical Engineering and Computer Science) and Xun Huan (Mechanical Engineering) are using artificial intelligence approaches to discover decision-making policies and rewards for cancer cells, working with the rest of the investigators to incorporate experimental data and physics/chemistry-based models into their approaches.

The W. M. Keck Foundation was established in 1954 in Los Angeles by William Myron Keck, founder of The Superior Oil Company. One of the nation’s largest philanthropic organizations, the W. M. Keck Foundation supports outstanding science, engineering and medical research. The Foundation also supports undergraduate education and maintains a program within Southern California to support arts and culture, education, health and community service projects. This project incorporates elements from all the W. M. Keck Foundation’s focus research areas to tackle cancer with a novel, physics-based modeling and AI-centered approach. The idea for this project originated in the 2020 MICDE faculty workshop on AI for Physically-based Biomedicine. The workshop brought together an interdisciplinary group of faculty members to discuss ways to advance artificial intelligence and machine learning methods for biomedical problems. After seeding the idea, a subset of these researchers was awarded an MICDE Catalyst grant and a MIDAS PODS grant. These funds were used to establish the proof of concept and to generate preliminary results.

Computational science is becoming increasingly indispensable in many areas of biomedical science. While the current proposal focuses on cancer, this innovative computational framework represents a transformative leap with widespread applications in multiple other biomedical, physical, and social sciences. MICDE supports innovative and interdisciplinary projects aiming to advance the current paradigms.

Portraits of Kathryn Luker, Gary Luker, Krishna Garikipati, Jennifer Linderman, Nikola Banovic and Xun Huan

Project’s principal investigators (left to right): Kathryn Luker (Radiology), Gary Luker (Radiology, Microbiology and Immunology, and Biomedical Engineering), Krishna Garikipati (Mechanical Engineering, and Mathematics), Jennifer Linderman (Chemical Engineering, and Biomedical Engineering), Nikola Banovic (Electrical Engineering and Computer Science) and Xun Huan (Mechanical Engineering).

“Get non-Real”: Department of Energy grant funds novel research in High-Performance Algorithms at U-M

By | Feature, Research

“Preparing for the future means that we must continue to invest in the development of next-generation algorithms for scientific computing,

Barbara Helland, Associate Director for Advanced Scientific Computing Research, DOE Office of Science

New research from the University of Michigan will help revolutionize the data processing pipeline with state-of-the-art algorithms to optimize the collection and processing of many kinds of data. Most algorithms available today are built for real data, meaning real numbers; however, much of the data we see on the internet is non-real, such as discrete or categorical data. This project is part of a $2.8 million grant from the Department of Energy for algorithms research, which is the backbone of predictive modeling and simulation. The research will enable DOE to set new frontiers in physics, chemistry, biology, and other domains.

“Preparing for the future means that we must continue to invest in the development of next-generation algorithms for scientific computing,” said Barbara Helland, Associate Director for Advanced Scientific Computing Research, DOE Office of Science. “Foundational research in algorithms is essential for ensuring their efficiency and reliability in meeting the emerging scientific needs of the DOE and the United States.”

The U-M project, led by associate professor Laura Balzano and assistant professor Hessam Mahdavifar, both of electrical engineering and computer science, is one of six chosen by DOE to cover several topics at the leading-edge of algorithms research. According to the DOE, researchers will explore algorithms for analyzing data from biology, energy storage, and other applications. They will develop fast and efficient algorithms as building blocks for tackling increasingly large data analysis problems from scientific measurements, simulations, and experiments. Projects will also address challenges in solving large-scale computational fluid dynamics and related problems.

Laura Balzano and Hessam Mahdavifar portraits

Laura Balzano, associate professor of electrical engineering and computer science (left); Hessam Mahdavifar, assistant professor of electrical engineering and computer science (right)

Balzano and Mahdavifar, both Michigan Institute for Computational Discovery and Engineering (MICDE) affiliated faculty members, will use a $300,000 portion of the overall grant to study randomized sketching and compression for high-dimensional non-real-valued data with low-dimensional structures.

“Randomized sketching and subsampling algorithms are revolutionizing the data processing pipeline by allowing significant compression of redundant information,” said Balzano. “Sketches work well because scientific data are generally highly redundant in nature, often following a perturbed low-dimensional structure. Hence, low-rank models and sketching that preserves those model structures are ubiquitous in many machine learning and signal processing applications.” 

Even though much of the data used and processed in scientific and technological applications is best modeled mathematically as discrete, categorical, or ordinal, most state-of-the-art randomized sketching algorithms focus on real-valued data. Moreover, in practical applications, handling high-dimensional data can be challenging in terms of computational and memory demands. The proposed project will thus significantly expand the applicability of randomized sketching.
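The real-valued baseline that the project extends can be conveyed in a few lines. The sketch below (illustrative only; the matrix sizes and rank are made up) compresses a tall matrix with approximate low-rank structure using a Gaussian random test matrix, then reconstructs it from the compressed representation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic high-dimensional data with exact rank-5 structure.
n, d, r = 1000, 200, 5
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, d))

# Randomized sketch: multiply by a small Gaussian test matrix, then
# orthonormalize to capture the dominant range of A.
k = 10                                # sketch size, a bit above the rank
Y = A @ rng.standard_normal((d, k))   # n x k sketch of the n x d matrix
Q, _ = np.linalg.qr(Y)                # orthonormal basis for the range
A_approx = Q @ (Q.T @ A)              # rank-k reconstruction

rel_err = np.linalg.norm(A - A_approx) / np.linalg.norm(A)
print(f"relative error: {rel_err:.2e}")  # tiny: the sketch preserved A
```

Because the data follow a low-dimensional structure, a sketch only slightly larger than the rank preserves essentially all of the information; the open question the project tackles is how to get comparable guarantees when the entries are discrete or categorical rather than real numbers.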

“A key to data-driven modeling is to carefully reformulate the computational and data analysis challenges and take full advantage of the underlying mathematical structure that is often common across application areas,” said Krishna Garikipati, MICDE director and professor of mechanical engineering and mathematics. “This research and the work that Laura and Hessam are doing is critically important to the advancement of computational discovery.”

MICDE catalyst grant leads to new NSF funding to study cascade “ecohydromics” in the Amazonian headwater system

By | Feature, News, Research

The Amazon Basin cycles more water through streamflow and evaporation than any other contiguous forest in the world, and transpiration by trees is a critical part of this cycle. Understanding how plant roots, stems, and leaves interact with soil water to regulate forest transpiration across landscapes is a critical knowledge gap, especially as climate changes. Professor Valeriy Ivanov, from the Department of Civil and Environmental Engineering at U-M, is the lead investigator in a newly NSF-funded project that links diverse disciplines – plant ecophysiology, ecology, and hydrology – and will build a unique modeling framework to characterize landscape variation in physiological and hydrological processes in the Amazon Basin. The framework will integrate a wide array of field observations with detailed watershed modeling for hypothesis testing. The team includes Tyeen Taylor, a research fellow also in the Civil and Environmental Engineering Department at U-M, and many collaborators in the U.S. at the University of Arizona, University of West Virginia, and University of Nebraska, as well as Brazilian researchers at the Federal University of Eastern Para, Federal University of Amazonas, National Institute for Amazonian Research, and Eastern Amazon Agricultural Agency. Detailed, physical models of ecophysiology and above- and below-ground hydrology will be informed by observations of leaf physiology, tree morphological traits, soil moisture, groundwater, and streamflow. Data and models will be integrated employing novel tools in probabilistic learning and uncertainty quantification. The computational framework tools to be used in this project were developed in part with support from the MICDE Catalyst Grant program for the 2018 project “Urban Flood Modeling at “Human Action” Scale: Harnessing the Power of Reduced-Order Approaches and Uncertainty Quantification” led by Prof. Ivanov.

Given (a) a mechanistic model M (e.g., a stomatal conductance model), (b) its inputs 𝛏 (e.g., parameters) are treated as random variables; these inputs are sampled and model simulations are carried out. Using (c) polynomial chaos expansions (PCEs), a surrogate model is constructed that best approximates the model output (left-hand side of (c)). The surrogate is then evaluated with Monte Carlo simulations and used for (d) parameter inference: in (d.1), outputs from the surrogate model flow into a likelihood function L(D | 𝛏) that compares the surrogate output with observed data D. This inference produces the posterior distribution for 𝛏. This pdf can then be sent back to the surrogate in (d.2) to reduce the uncertainty in the inputs and to obtain the pdf of a quantity of interest (e).
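The workflow in the caption can be illustrated end to end in miniature. The stand-in model, prior, noise level, and true parameter value below are all invented for illustration; the project’s actual mechanistic models (e.g., stomatal conductance) and inference machinery are far richer:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(1)

# (a) A stand-in "mechanistic model"; xi is an uncertain input with a
# standard-normal prior (the real model would be far more expensive).
model = lambda xi: np.exp(0.5 * xi)

# (b)-(c) Sample the input, run the model, and fit a polynomial chaos
# expansion (probabilists' Hermite basis) by least squares -> cheap surrogate.
xi_samples = rng.standard_normal(200)
V = hermevander(xi_samples, 3)                      # basis matrix, degree 3
coeffs, *_ = np.linalg.lstsq(V, model(xi_samples), rcond=None)
surrogate = lambda xi: hermevander(xi, 3) @ coeffs

# (d) Parameter inference with the surrogate inside the likelihood:
# posterior(xi | D) ∝ L(D | xi) * prior(xi), evaluated on a grid.
xi_true, sigma = 0.8, 0.05
D = model(xi_true) + rng.normal(0, sigma)           # one noisy observation
grid = np.linspace(-3, 3, 601)
dx = grid[1] - grid[0]
log_post = -0.5 * ((D - surrogate(grid)) / sigma) ** 2 - 0.5 * grid**2
post = np.exp(log_post - log_post.max())
post /= post.sum() * dx                             # normalize the pdf

# (e) Posterior mean as the updated estimate of the uncertain input.
xi_mean = (grid * post).sum() * dx
print(f"posterior mean: {xi_mean:.2f}")             # close to xi_true = 0.8
```

The payoff of the surrogate is in step (d): the Monte Carlo and grid evaluations hit the cheap polynomial thousands of times instead of the expensive mechanistic model.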

“The reduced-order modeling approach developed during the MICDE Catalyst grant project is a key element of the new project,” said Prof. Ivanov. “The MICDE seed funding has allowed us to build a general framework that is applicable to a wide range of computational applications in earth-system science, and thus made our project proposal more competitive.”

The MICDE Catalyst Grants program funds projects that have the potential to catalyze and reorient the directions of their research fields by developing and harnessing powerful paradigms of computational science. This new NSF project is an example of the reach of the program.

Read more.

2021-2022 Catalyst Grant awardees continue to forge new fronts in computational science

By | Feature, News, Research

The Michigan Institute for Computational Discovery and Engineering (MICDE) announced the awardees of the 2021-2022 round of Catalyst Grants. Since 2017, the MICDE Catalyst Grants program has funded a wide spectrum of cutting-edge research; this year’s awards focus on AI for physically-based biomedicine, quantum computing, the convergence of natural hazards with economic dislocation, and computational integration across scales and disciplines in biology. The five projects awarded in this round represent these new frontiers of computational research spearheaded by the Institute through its initiatives.

Prof. Shravan Veerapaneni (Mathematics) is working on advancing quantum algorithm research. His team will develop a Variational Quantum Monte Carlo algorithm that can potentially be applied to a wide range of linear algebraic tasks, such as QR factorization and singular value decomposition (SVD).

Profs. Salar Fattahi (Industrial and Operations Engineering) and Arvind Rao (Computational Medicine and Bioinformatics, Biostatistics) are revisiting existing theoretically powerful maximum-likelihood estimation mathematical methods to identify areas of weakness and strengthen them for use in biomedical, largely genomic, applications.

Profs. Gary Luker (Microbiology and Immunology), Nikola Banovic (Electrical Engineering and Computer Science), Xun Huan (Mechanical Engineering), Jennifer Linderman (Biomedical Engineering and Chemical Engineering), and Kathryn Luker (Radiology) will develop a physics/chemistry-aware inverse reinforcement learning (IRL) computational framework to support the understanding of single-cell and cooperative decision-making that drives tumor growth, metastasis, and recurrence.

Profs. Seth Guikema (Civil and Environmental Engineering and Industrial and Operations Engineering) and Jeremy Bricker (Civil and Environmental Engineering) will develop an integrated computational modeling approach to studying equity and resilience during natural hazard events, specifically estimating which essential services are the main constraints on individuals returning to a more normal life post-hazard, and assessing inequities in resilience to coastal flooding events.

Profs. Jesse Capecelatro (Mechanical Engineering and Aerospace Engineering) and Alberto Figueroa (Biomedical Engineering and Vascular Surgery) will develop a versatile, physics-driven, computationally efficient, and massively parallel numerical framework to simulate the interaction between fluids and biological particles in patient-specific vasculature geometries. This framework will enable next-generation computer-aided diagnostics.

“This year’s cohort of MICDE Catalyst Grants range from quantum computing for engineering science, AI for the physics of cancer, and computational advances in hazards engineering, through mathematical advances in data science, and bioengineering,” said MICDE Director Krishna Garikipati, a professor of mathematics and mechanical engineering. “These projects represent new frontiers of computational research spearheaded by MICDE through its initiatives.”

Learn more about MICDE’s Catalyst Grant program and funded projects here.

“This year’s cohort of MICDE Catalyst Grants … represent new frontiers of computational research spearheaded by MICDE through its initiatives.”

Krishna Garikipati
Director, MICDE

The crucial role of massively parallel simulations in future space exploration missions

By | HPC, News, Research

The NASA Mars 2020 Mission was launched with the goal of seeking signs of ancient life and collecting samples of rock and regolith (broken rock and soil) for possible return to Earth. Perseverance, the mission’s rover, is testing technologies to help pave the way for future human exploration of Mars. While Perseverance was launched in the summer of 2020, landing on the martian surface on February 18, 2021, the journey started years earlier when the mission’s objectives were outlined, including realistic surface operations, a proof-of-concept instrument suite, and suggestions for threshold science measurements that would meet the proposed objectives. The success of this, as well as past and future missions, is the collective result of thousands of NASA-funded projects from teams of researchers and scientists from all over the country, spanning many decades. University of Michigan Professor Jesse Capecelatro (Mechanical Engineering & Aerospace Engineering) is the lead of one of these projects. In 2016, his research group started working on a project aimed at developing high-fidelity models of plume-induced soil erosion during lunar and planetary landings that will be used in future missions.

During descent, exhaust plumes fluidize surface soil and dust, forming craters and buffeting the lander with coarse, abrasive particles. The SkyCrane technology, used by the Curiosity rover in 2012 and by Perseverance in 2021, was designed to avoid plume-surface interactions by keeping the jets far above the surface. Despite this feature, a wind sensor on NASA’s Curiosity rover was damaged during landing, and in Perseverance’s video footage of the landing, significant erosion and high-speed ejecta were observed. The SkyCrane is also not a practical option for future crewed and sample-return missions.

NASA is aiming to improve rovers’ landing gear for future missions. Computational models and simulations are a critical component of achieving this, as it is not feasible to mimic the entry conditions of Mars or other celestial bodies and run thousands of tests in a lab on Earth. This is where the work of Prof. Capecelatro’s research group, including doctoral candidates Greg Shallcross and Meet Patel and postdoctoral fellow Medhi Khalloufi, comes in, as accurate prediction of surface-plume interactions is necessary for the overall success of future space missions. While simulations of surface-plume interactions have been conducted in the past, they are outdated and typically relied on simplified assumptions that prevent a detailed and dynamic analysis of the fluid-particle coupling. Capecelatro’s research code will provide NASA with a framework to better predict how different rover designs would affect the landing, specifically the effects of the force of the collision on the planet’s surface and the ability to adjust the rover’s landing trajectory independently of the NASA mission control team on Earth.

Prof. Capecelatro’s research project utilizes predictive simulation tools to capture the complex multiphase dynamics associated with rocket exhaust impingement during touchdown. Even on the most powerful supercomputers, a direct solution approach can only account for about a thousand particles at a time, so accurate and predictive multi-scale models of the unresolved flow physics are essential.

Full landing site image credit: NASA/JPL-Caltech; particle and intermediate scale images: Capecelatro’s Research Group

Particle Scale

The group has been developing simulation capabilities to directly resolve the flow at the sub-particle scale to shed light on important physics under the extreme conditions relevant to particle-structure interactions. Their model uses a massively parallel, compressible, particle-laden flow simulation tool in which the exhaust plume and its corresponding flow features are computed in an Eulerian-Lagrangian framework. At this scale, for example, the flow between individual particles is resolved, providing important insight on drag and turbulence under these extreme conditions.

Intermediate Scale

As a next step, the particle-scale results inform models used in the intermediate-scale simulations developed by the group, where particles are still tracked individually but the flow is not resolved at sub-particle resolution, allowing them to simulate upwards of 1 billion particles. At this scale, an Eulerian-Lagrangian framework is used to couple the ground’s particle flow with the jet’s plume.
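The essence of an Eulerian-Lagrangian scheme, gas carried on a fixed grid while each particle is tracked individually through it with a drag law, can be conveyed in a few lines. The one-dimensional setup below is purely illustrative (a frozen, jet-like gas profile and simple Stokes drag; not the group’s code or physics):

```python
import numpy as np

# Eulerian side: the gas velocity lives on a fixed 1-D grid.
nx, L = 64, 1.0
x_grid = np.linspace(0.0, L, nx)
u_gas = 10.0 * np.exp(-((x_grid - 0.5) / 0.1) ** 2)  # frozen jet-like profile

# Lagrangian side: a few particles, each tracked individually.
tau_p = 0.01            # particle response (drag) time scale
dt, steps = 1e-4, 2000  # 0.2 s of simulated time
xp = np.array([0.30, 0.35, 0.40])   # initial positions, near the jet
up = np.zeros_like(xp)              # initially at rest

for _ in range(steps):
    # interpolate the Eulerian gas velocity to each particle position
    u_at_p = np.interp(xp, x_grid, u_gas)
    # Stokes drag: du_p/dt = (u_gas - u_p) / tau_p  (semi-implicit Euler)
    up += dt * (u_at_p - up) / tau_p
    xp += dt * up

print(xp)  # all particles have been swept rightward through the jet
```

Because the gas is never resolved around each particle, the cost per particle is tiny, which is what makes billion-particle intermediate-scale runs feasible; the price is that drag and turbulence models (informed by the particle-scale simulations) must supply the physics the grid cannot see.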

Full Landing Site Scale

While the intermediate-scale simulations allow researchers to study erosion and cratering, a full landing site that contains trillions of particles is still out of reach even on the most powerful HPC clusters. After further modeling, Capecelatro’s multi-scale framework will be handed over to NASA, where it will be incorporated into simulations of the full landing site. At this scale, NASA’s framework uses an Eulerian two-fluid model that treats both fluid and particles as continua, informed by the particle- and intermediate-scale models.

The Mars 2020 mission is expanding NASA’s robotic presence on the red planet. While it is a big step toward setting the stage for future human exploration, the Perseverance rover needs further redesign to make the voyage safe for humans. Capecelatro’s physics-based models are aiding this task by helping predict and model more accurately the outcomes of a spacecraft attempting to safely land millions of miles from home. As in many other fields, computational science will continue to play a critical role in the future of humanity’s quest to conquer space. #computationalscience everywhere!

Related links:
Sticking the landing on Mars: High-powered computing aims to reduce guesswork
Capecelatro’s Research Group
NASA 2020 Mars Mission: Perseverance Rover