**Bio**: John Harlim is a Professor in the Department of Mathematics and the Department of Meteorology and Atmospheric Sciences. Harlim received his undergraduate degree in Mathematics from Universitas Padjadjaran (Indonesia), a master’s in Applied Mathematics from the University of Guelph, and a PhD in Applied Mathematics and Scientific Computation from the University of Maryland at College Park. His research interests in applied mathematics include parameter estimation, machine learning, manifold learning, operator estimation, and data assimilation.

### Learning Missing Dynamics through Data

The recent success of machine learning has drawn tremendous interest in applied mathematics and scientific computing. In this talk, I will address the classical closure problem, also known as model error, missing dynamics, or reduced-order modeling in various communities. In particular, I will discuss a general framework to compensate for the model error. The proposed framework reformulates the model-error problem as a supervised learning task: approximating a very high-dimensional target function arising from the Mori-Zwanzig representation of projected dynamical systems. The connection to traditional parametric approaches will be clarified as the specification of an appropriate hypothesis space for the target function. Theoretical convergence and numerical demonstrations on modeling problems arising from PDEs will be discussed.
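To make the supervised-learning viewpoint concrete, here is a minimal sketch (not the speaker's actual method): a resolved model is missing a cubic term, the model-error residual is collected as training data, and a polynomial hypothesis space recovers the missing dynamics. The dynamics, sampling range, and cubic hypothesis space are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# "True" dynamics: dx/dt = -x - 0.5*x**3; the resolved model knows only -x.
def true_rhs(x):
    return -x - 0.5 * x**3

def resolved_rhs(x):
    return -x

# Sample reduced states and record the model-error residual (the target function).
x = rng.uniform(-2.0, 2.0, 500)
residual = true_rhs(x) - resolved_rhs(x)

# Hypothesis space: cubic polynomials (a simple "parametric" closure).
coeffs = np.polyfit(x, residual, 3)       # highest-degree coefficient first
closure = np.poly1d(coeffs)

# Corrected model: resolved dynamics plus the learned closure term.
def corrected_rhs(x):
    return resolved_rhs(x) + closure(x)
```

On this noiseless toy data the fit recovers the missing `-0.5*x**3` term essentially exactly; in realistic closure problems the target function is high-dimensional and history-dependent, which is where richer hypothesis spaces come in.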

### Sperm Navigation in Complex Environments

### High-Accuracy Simulation of Free Surface Flows near Finite-Time Pinch-Off and Coalescence Singularities

**Due to unforeseen circumstances the originally scheduled talk by Professor Brandon Johnson has been cancelled and replaced with the following seminar.**

### Theoretical and Computational Contributions to the Modeling of Global Tsunamis

The distribution of tsunami amplitudes in the open ocean is controlled by the source mechanism as well as by bathymetry geometry and resolution, with the latter controlling far-field tsunami features. However, large, detailed bathymetry grids result in long simulation times for tsunamis. It is therefore of interest to investigate how much physical detail in bathymetric grids controls the most important features of tsunami amplitudes, in order to assess what constitutes a sufficient level of detail for grids in numerical simulations. By decomposing the Pacific bathymetry with a spherical harmonics approach, one can create “smoothed” versions of the original field. Using these simplified bathymetries to simulate tsunamis from potential ruptures around the Pacific, we see that for large megathrust events (M0 = 10^29 dyn-cm), a resolution of only ~1000 km (equivalent to l = 40), or ~1% surface smoothness of the Pacific, is needed to reproduce the main components of the true distribution of tsunami amplitudes. This would result in simpler simulations and faster computations in the context of tsunami warning algorithms.
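As a rough illustration of the truncation idea, the sketch below uses a 1-D Fourier series as a stand-in for the spherical-harmonic decomposition: coefficients above a cutoff degree are discarded to produce a smoothed field. The synthetic "bathymetry" profile and the cutoff are invented for illustration, not data from the study.

```python
import numpy as np

# 1-D Fourier analog of spherical-harmonic truncation: keep modes up to l_max.
def lowpass(field, l_max):
    coeffs = np.fft.rfft(field)
    coeffs[l_max + 1:] = 0.0              # discard fine-scale modes
    return np.fft.irfft(coeffs, n=len(field))

# Synthetic "bathymetry": a large-scale trench plus fine-scale roughness.
n = 1024
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
depth = -4000 + 800 * np.sin(2 * x) + 50 * np.sin(97 * x)

# Analogous to truncating at degree l = 40: the broad trench (wavenumber 2)
# survives, while the wavenumber-97 roughness is removed.
smoothed = lowpass(depth, l_max=40)
```

On the sphere the same operation would zero all spherical-harmonic coefficients with degree l > 40 before synthesizing the smoothed bathymetry.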

In a separate context, an overview of tsunami studies and a report on a study of a meteotsunami are presented. These scenarios illustrate that tsunami science is an interdisciplinary field of research that requires coordinated efforts by investigators from various backgrounds.

*MICDE is co-hosting this seminar with the Earth and Environmental Sciences department.*

**Bio**: Professor Wetzel is an assistant professor in the physics department and in the astrophysics and cosmology group at the University of California, Davis. He is a theoretical/computational astrophysicist and cosmologist. Using the world’s most powerful supercomputers, he generates cosmological simulations to model the formation of cosmic structures, including galaxies and their stars. He uses these simulations as theoretical laboratories to develop and test models of galaxy formation, stellar dynamics, and the nature of dark matter, with emphasis on our own Milky Way galaxy.

### Simulating the Milky Way

The Gaia satellite mission, together with a multitude of ground-based observational surveys, now measures 6-D phase-space coordinates and multi-species elemental abundances for hundreds of millions of stars across the Milky Way. This new era of galactic archeology and near-field cosmology demands a new generation of simulations that achieve the high dynamic range needed to resolve the scales of individual stellar populations within a cosmological context. I will describe the new Latte suite of massively parallelized cosmological zoom-in simulations, run on the nation’s most powerful supercomputers, which model the formation of Milky Way-like galaxies at parsec-scale resolution using the FIRE (Feedback in Realistic Environments) model for star formation and feedback. First, I will discuss the formation of the Milky Way disk, including resolving for the first time the dynamics and lifetimes of giant molecular clouds and star clusters at z = 0. These simulations also self-consistently resolve the formation of satellite dwarf galaxies around each Milky Way-like host. These low-mass galaxies have presented significant challenges to the cold dark matter model, but I will show progress in addressing the “missing satellites” and “too-big-to-fail” problems. Finally, I will discuss publicly available synthetic Milky Way surveys that we have created from the Latte simulations to provide theoretical modeling insight for the era of Gaia.

*Prof. Wetzel is being hosted by Prof. Gnedin (Astronomy). If you would like to meet with him during his visit, please send an email to micde-events@umich.edu. If you are an MICDE graduate student and would like to join Prof. Wetzel for lunch, please RSVP by Thursday, January 23.*

**Bio**: Allen Sanderson, Ph.D., is a Research Scientist at the University of Utah’s Scientific Computing and Imaging Institute. His interests lie in the visualization and analysis of large data from application areas ranging from plasma physics to combustion. Recently he has focused on new ways to utilize in situ data analysis and visualization, which often has him working directly on the science application infrastructure.

### Teasing out Ephemeral Data from HPC Applications for In Situ Visualization and Analysis

It is well known that as HPC applications have grown, I/O has become a bottleneck, which has required scientists to turn to in situ tools for data exploration. The focus of this exploration has typically been on simulation data. However, applications also produce ephemeral data that is optionally written to disk for post hoc analysis but is not otherwise saved or utilized by the application in subsequent time steps. One example of ephemeral data is runtime performance data. In this talk I will present the infrastructure implemented for efficiently collecting this and other data within the Uintah framework, which was coupled to VisIt’s in situ toolkit for analysis and visualization. This collection and coupling allow performance data to be visualized across multiple domains, giving insight previously not possible. As part of this coupling, we take advantage of VisIt’s in situ custom user interface to create a “simulation dashboard” that supports in situ computational steering and visual debugging, allowing for improvements in the development and simulation workflow.

*Dr. Sanderson is being hosted by the Scientific Computing Student Club [SC2]. If you would like to meet with him during his visit, please send an email to micde-events@umich.edu. Limited lunch will be provided.*

**Bio**: Bo Zhu is an assistant professor of Computer Science at Dartmouth College. Prior to that, he was a postdoctoral associate at MIT CSAIL. He received his Ph.D. in Computer Science from Stanford University in 2015. His research interests encompass computer graphics, computational physics, and computational fabrication. In particular, he focuses on building computational approaches to automate the process of exploring complex physical systems.

### Super-Resolution Structural Simulation and Optimization

Complex physical systems exhibiting mixed-dimensional geometry and multi-scale mechanics are ubiquitous. Examples include biological structures such as insect wing exoskeletons, fluid phenomena such as bubbles and jets, and human-made objects such as microrobots. The beauty and complexity of these systems attract efforts from scientists, engineers, and artists in various fields. However, a computational investigation of these systems at super-resolution, with millions to billions of computational elements, is still challenging due to their non-manifold geometric structures, non-linear governing physics, and the tight coupling between the two.

My work tackles these challenges by rethinking the computation pipeline from a perspective that aims to blur the line between discrete geometry and continuous physics. My guiding principle is to study the hidden low-dimensional topological and structural characteristics underpinning these complex systems and to create the most natural geometric analogs in a discrete setting for efficient simulation and optimization. In this talk, I will present two examples to demonstrate this methodology: a super-resolution topology optimization algorithm based on sparse grids that produces biomimetic structures, and a numerical simulation approach based on simplicial complexes that models codimensional fluids. These computational tools enable the investigation, discovery, and development of a broad range of multi-scale, mixed-dimensional physical systems, with applications in computer graphics, computational physics, and additive manufacturing.

*Prof. Zhu is being hosted by Prof. Saitou (ME). If you would like to meet with him during his visit, please send an email to micde-events@umich.edu. If you are an MICDE graduate student and would like to join Prof. Zhu for lunch, please RSVP by Friday, December 6.*

**Bio**: Anna Vainchtein is a professor in the Department of Mathematics at the University of Pittsburgh. She is generally interested in mathematical modeling and analysis of nonlinear phenomena in materials science, physics, and biology. Examples include dynamics of phase boundaries, cracks, and dislocations in crystals, hysteresis in phase-transforming materials, solitary and heteroclinic traveling waves in nonlinear lattices, and DNA overstretching. The resulting mathematical problems typically involve minimization of nonconvex functionals, nonlinear PDEs that change type, dynamical systems with many degrees of freedom, and functional differential equations. Thus, nonstandard analytical and numerical techniques are required.

### Strictly Supersonic Solitary Waves in Lattices

We consider a nonlinear mass-spring chain with first and second-neighbor interactions and show that there is a parameter range where solitary waves in this system are strictly supersonic. In these regimes standard quasicontinuum theories, targeting long-wave limits of lattice models, are not adequate since even weak strictly supersonic solitary waves are of envelope type and crucially involve a microscopic scale in addition to the mesoscopic scale of the envelope. To capture this effect in a continuum setting it is necessary to employ unconventional, higher-order quasicontinuum approximations carrying more than one length scale. This talk is based on recent joint work with Lev Truskinovsky (ESPCI).
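To illustrate the kind of lattice model in question (not the specific system or parameter regime of the talk), the sketch below integrates a periodic chain with nonlinear first-neighbor springs and harmonic second-neighbor springs using velocity Verlet. The force law, stiffnesses, and initial pulse are illustrative assumptions.

```python
import numpy as np

# Chain with nonlinear (FPU-alpha-type) first-neighbor springs and harmonic
# second-neighbor springs, periodic boundary conditions.
def forces(u, beta=0.25, gamma=0.1):
    r1 = np.roll(u, -1) - u                  # first-neighbor strains
    f1 = r1 + beta * r1**2                   # nonlinear spring force f(r)
    r2 = np.roll(u, -2) - u                  # second-neighbor strains
    f2 = gamma * r2                          # harmonic second-neighbor force
    # Force on particle i: f(r_i) - f(r_{i-1}) for each interaction range.
    return (f1 - np.roll(f1, 1)) + (f2 - np.roll(f2, 2))

def step(u, v, dt=0.02):
    # One velocity-Verlet step (forces recomputed at the new positions).
    a = forces(u)
    u_new = u + dt * v + 0.5 * dt**2 * a
    v_new = v + 0.5 * dt * (a + forces(u_new))
    return u_new, v_new

# Launch a localized velocity kick and let the disturbance propagate.
n = 400
u = np.zeros(n)
v = np.zeros(n)
v[n // 2] = 1.0
for _ in range(500):
    u, v = step(u, v)
```

Because the interactions depend only on relative displacements, total momentum is conserved exactly by this scheme; studying wave speeds versus the sound speed of the linearized chain is how one identifies supersonic regimes.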

*This seminar is co-sponsored by the Applied & Interdisciplinary Mathematics program. Prof. Vainchtein is being hosted by Prof. Garikipati (ME). If you would like to meet with her during her visit, please send an email to micde-events@umich.edu.*

**Bio**: Irene J. Beyerlein is a Professor at the University of California at Santa Barbara (UCSB) with a joint appointment in the Mechanical Engineering and Materials Departments. She currently holds the Robert Mehrabian Interdisciplinary Endowed Chair in the College of Engineering. After receiving her Ph.D. degree in Theoretical and Applied Mechanics at Cornell University in 1997, she began a postdoctoral appointment as a J.R. Oppenheimer Fellow at Los Alamos National Laboratory, where she remained on the scientific staff in the Theoretical Division until 2016, when she joined UCSB. She has published one book, nine book chapters, and more than 300 peer-reviewed articles in the field of structural composites, materials processing, multiscale modeling of microstructure/property relationships, deformation mechanisms, and polycrystalline plasticity. She is an Editor for *Acta Materialia* and *Scripta Materialia* and an Associate Editor for *Modelling and Simulation in Materials Science and Engineering*. In recent years, she has been awarded the Los Alamos National Laboratory Fellow’s Prize for Research (2012), the International Plasticity Young Researcher Award (2013), the TMS Distinguished Scientist/Engineering Award (2018), and the Brimacombe Medal (2019).

### A Composite of Superior Properties with Nanostructured Composite Materials

Many future engineering systems will rely on high-performance metallic materials that are several times stronger and tougher than those in use today. In many situations, these superior properties will be desired in harsh environments, such as elevated temperatures, high strain rates, and irradiation. Nanolaminates, built from stacks of crystalline layers with nanoscale individual thicknesses, are proving to exhibit a composite of many of these target properties. Examples span from nanotwinned materials to biphase nanolaminates composed of alternating nano-thick layers that differ in orientation, chemistry, and crystal structure. Studies on these materials report exceptional properties far beyond the volume-average values of their constituents, such as strengths five to ten times higher, hardness values several orders of magnitude higher, and unprecedented microstructural stability in harsh environments such as irradiation, sudden impact, or elevated temperatures. While this combination of properties is clearly attractive, one roadblock to applying the nanolaminate concept to any general composite material system is their complex, highly anisotropic deformation behavior, which makes them less reliable than coarsely structured materials. Critical to designing the material nanostructure for uniformity and reliability is understanding and predicting the strength properties of nanostructured materials from known conditions and measurable variables, such as basic nanostructure size scales and chemical composition. Multiscale models for conventional coarse-grained materials have been in development for several decades, but analogous versions for nanostructured materials require extensions to explicitly account for the overriding dominance of internal boundaries on these microstructure/property relationships.
The computational materials challenge lies in how to represent the discrete and statistical dislocation glide processes in nanostructured materials so that the profound influence of the fine nanoscale crystals can be properly replicated in simulation. In this talk, we will present recent examples of computational techniques and some unanticipated couplings between nanostructural size effects and microstructural evolution and strength that arise from their application.

*Prof. Beyerlein is being hosted by Prof. Fan (ME).*