Bio: John Harlim is a Professor in the Department of Mathematics and the Department of Meteorology and Atmospheric Sciences. Harlim received his undergraduate degree in Mathematics from Universitas Padjadjaran (Indonesia), a master’s in Applied Mathematics from the University of Guelph, and a PhD in Applied Mathematics and Scientific Computation from the University of Maryland at College Park. His research interests in applied mathematics include parameter estimation, machine learning, manifold learning, operator estimation, and data assimilation.
Learning Missing Dynamics through Data
The recent success of machine learning has drawn tremendous interest in applied mathematics and scientific computing. In this talk, I will address the classical closure problem, also known as model error, missing dynamics, or reduced-order modeling in various communities. In particular, I will discuss a general framework to compensate for the model error. The proposed framework reformulates the model error problem into a supervised learning task to approximate a very high-dimensional target function involving the Mori-Zwanzig representation of projected dynamical systems. The connection to traditional parametric approaches will be clarified as specifying the appropriate hypothesis space for the target function. Theoretical convergence and numerical demonstrations on modeling problems arising from PDEs will be discussed.
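To make the closure idea concrete, the following is a minimal, self-contained sketch (illustrative only, not the speaker's actual method): a damped oscillator is simulated, a reduced model that omits the damping term plays the role of the imperfect model, and the missing dynamics are recovered from data by least-squares regression in a chosen hypothesis space.

```python
import numpy as np

# "True" system: a damped oscillator  dx/dt = v,  dv/dt = -x - c*v.
# The reduced (known) model uses only dv/dt = -x, so the damping
# term -c*v is the missing dynamics we want to learn from data.
c = 0.3
dt = 0.01
n = 5000
x, v = 1.0, 0.0
xs, vs = [], []
for _ in range(n):
    xs.append(x)
    vs.append(v)
    a = -x - c * v                     # true acceleration
    x, v = x + dt * v, v + dt * a      # forward-Euler step
xs, vs = np.array(xs), np.array(vs)

# The observed time derivative of v minus the known reduced-model
# term (-x) isolates the residual, i.e. the missing dynamics.
dv = np.gradient(vs, dt)
residual = dv - (-xs)

# Hypothesis space: linear in v (one-parameter closure).
# Fitting by least squares recovers the missing coefficient -c.
theta, *_ = np.linalg.lstsq(vs[:, None], residual, rcond=None)
print(theta[0])   # approximately -0.3
```

Richer hypothesis spaces (e.g. delay-embedded histories of the resolved variables, in the spirit of the Mori-Zwanzig memory terms mentioned above) follow the same pattern: build features from observed data, regress the residual.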
Due to unforeseen circumstances the originally scheduled talk by Professor Brandon Johnson has been cancelled and replaced with the following seminar.
Theoretical and Computational Contributions to the Modeling of Global Tsunamis
The distribution of tsunami amplitudes in the open ocean is controlled by the source mechanism as well as by bathymetry geometry and resolution, with the latter controlling far-field tsunami features. However, large, detailed bathymetry grids result in long computer simulation times for tsunamis. It is therefore of interest to investigate how much physical detail in bathymetric grids controls the most important features of tsunami amplitudes, to assess what constitutes a sufficient level of detail for grids in numerical simulations. By decomposing the Pacific bathymetry using a spherical harmonics approach, one can create “smoothed” versions of the original field. Using these simplified bathymetries to simulate tsunamis from potential ruptures around the Pacific, we find that for large megathrust events (M0 = 10^29 dyn-cm), a resolution of only ~1000 km (equivalent to l = 40), or ~1% surface smoothness of the Pacific, is needed to reproduce the main components of the true distribution of tsunami amplitudes. This would allow simpler simulations and faster computations in the context of tsunami warning algorithms.
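The smoothing operation described above is a spectral truncation. As a hedged illustration of the idea (a 1-D Fourier analog, not the actual spherical-harmonic decomposition of the Pacific), the sketch below low-pass filters a synthetic bathymetry profile by zeroing all modes above a cutoff, the counterpart of truncating the expansion at degree l = 40:

```python
import numpy as np

# Synthetic rough "bathymetry profile" (a random walk stands in
# for real depth data; illustrative only).
rng = np.random.default_rng(0)
n = 1024
profile = np.cumsum(rng.standard_normal(n))

def truncate_spectrum(field, l_max):
    """Keep only the lowest l_max Fourier modes of a periodic field,
    the 1-D analog of truncating a spherical-harmonic expansion."""
    coeffs = np.fft.rfft(field)
    coeffs[l_max + 1:] = 0.0            # discard fine-scale structure
    return np.fft.irfft(coeffs, n=len(field))

smoothed = truncate_spectrum(profile, l_max=40)

# The smoothed field retains the large-scale shape even though most
# modes were removed, which is why coarse grids can suffice.
corr = np.corrcoef(profile, smoothed)[0, 1]
```

Because large-scale structure dominates the spectrum of fields like bathymetry, the correlation between the original and the truncated version remains high, mirroring the finding that ~1% smoothness reproduces the main tsunami-amplitude features.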
In a separate context, an overview of tsunami studies and a report on a study of a meteotsunami are presented. These scenarios illustrate that tsunami research is an interdisciplinary field requiring coordinated efforts by investigators from various backgrounds.
MICDE is co-hosting this seminar with the Earth and Environmental Sciences department.
Bio: Professor Wetzel is an assistant professor in the physics department and in the astrophysics and cosmology group at the University of California, Davis. He is a theoretical/computational astrophysicist and cosmologist. Using the world’s most powerful supercomputers, he generates cosmological simulations to model the formation of cosmic structures, including galaxies and their stars. He uses these simulations as theoretical laboratories to develop and test models of galaxy formation, stellar dynamics, and the nature of dark matter, with emphasis on our own Milky Way galaxy.
Simulating the Milky Way
The Gaia satellite mission, together with a multitude of ground-based observational surveys, now measures 6-D phase-space coordinates and multi-species elemental abundances for hundreds of millions of stars across the Milky Way. This new era of galactic archeology and near-field cosmology demands a new generation of simulations that achieve high dynamic range to resolve scales of individual stellar populations within a cosmological context. I will describe the new Latte suite of massively parallelized cosmological zoom-in simulations, run on the nation’s most powerful supercomputers, that model the formation of Milky Way-like galaxies at parsec-scale resolution, using the FIRE (Feedback in Realistic Environments) model for star formation and feedback. First, I will discuss the formation of the Milky Way disk, including resolving for the first time the dynamics and lifetimes of giant molecular clouds and star clusters at z = 0. These simulations also self-consistently resolve the formation of satellite dwarf galaxies around each Milky Way-like host. These low-mass galaxies have presented significant challenges to the cold dark matter model, but I will show progress in addressing the “missing satellites” and “too-big-to-fail” problems. Finally, I will discuss synthetic Milky Way surveys that we have created from the Latte simulations, which are publicly available, to provide theoretical modeling insight for the era of Gaia.
Prof. Wetzel is being hosted by Prof. Gnedin (Astronomy). If you would like to meet with him during his visit, please send an email to email@example.com. If you are an MICDE graduate student and would like to join Prof. Wetzel for lunch please RSVP by Thursday, January 23.
Bio: Allen Sanderson, Ph.D. is a Research Scientist at the University of Utah’s Scientific Computing and Imaging Institute. His interest lies in visualization and analysis of large data coming from application areas ranging from plasma physics to combustion. Recently he has focused on new ways to utilize in situ data analysis and visualization which often has him working directly on the science application infrastructure.
Teasing out Ephemeral Data from HPC Applications for In Situ Visualization and Analysis
It is well known that as HPC applications have grown, I/O has become a bottleneck, which has required scientists to turn to in situ tools for data exploration. The focus of this exploration has typically been on simulation data. However, applications also produce ephemeral data that is optionally written to disk for post hoc analysis, but not otherwise saved or utilized by the application in subsequent time steps. One example of ephemeral data is runtime performance data. In this talk I will present the infrastructure implemented for efficiently collecting this and other data within the Uintah framework, which was coupled to VisIt’s in situ toolkit for analysis and visualization. This collection and coupling allows performance data to be visualized across multiple domains, giving insights previously not possible. As part of this coupling, we take advantage of VisIt’s in situ custom user interface to create a “simulation dashboard” that enables in situ computational steering and visual debugging, allowing for improvements in the development and simulation workflow.
Dr. Sanderson is being hosted by the Scientific Computing Student Club [SC2]. If you would like to meet with him during his visit, please send an email to firstname.lastname@example.org. Limited lunch will be provided.
Bio: Bo Zhu is an assistant professor of Computer Science at Dartmouth College. Prior to that, he was a postdoctoral associate at MIT CSAIL. He received his Ph.D. in Computer Science from Stanford University in 2015. His research interests encompass computer graphics, computational physics, and computational fabrication. In particular, he focuses on building computational approaches to automate the process of exploring complex physical systems.
Super-Resolution Structural Simulation and Optimization
Complex physical systems exhibiting mixed-dimensional geometry and multi-scale mechanics are ubiquitous. Examples include biological structures, such as insect wing exoskeletons, fluid phenomena, such as bubbles and jets, and human-made objects, such as microrobots. The beauty and complexity of these systems attract efforts from scientists, engineers, and artists in various fields. However, a computational investigation of these systems on the level of super-resolution (with millions to billions of computational elements) is still challenging, due to the non-manifold geometric structures, non-linear governing physics, and the tight coupling between them.
My work tackles these challenges by rethinking the computation pipeline from a perspective that aims to blur the line between discrete geometry and continuous physics. My guiding principle is to study the hidden low-dimensional topological and structural characteristics underpinning these complex systems and to create the most natural geometric analogs in a discrete setting for efficient simulation and optimization. In this talk, I will present two examples to demonstrate this methodology: a super-resolution topology optimization algorithm based on sparse grids that generates biomimetic structures, and a numerical simulation approach based on simplicial complexes to model codimensional fluids. These computational tools enable the investigation, discovery, and development of a broad range of complex physical systems that are multi-scale and mixed-dimensional, with applications in computer graphics, computational physics, and additive manufacturing.
Prof. Zhu is being hosted by Prof. Saitou (ME). If you would like to meet with him during his visit, please send an email to email@example.com. If you are an MICDE graduate student and would like to join Prof. Zhu for lunch please RSVP by Friday, December 6.
Bio: Dr. Pablo Zavattieri is a Professor of Civil Engineering and University Faculty Scholar at Purdue University. Zavattieri received his BS/MS degrees in Nuclear Engineering from the Balseiro Institute (Argentina) and PhD in Aeronautics and Astronautics Engineering from Purdue University. He worked at the General Motors Research and Development Center as a staff researcher for 9 years, where he led research activities in the general areas of computational solid mechanics, smart and biomimetic materials. His current research lies at the interface between solid mechanics and materials engineering. He has focused on the fundamental aspects of how Nature uses elegant and efficient ways to make remarkable materials and their translation to engineering materials. He has contributed to the area of biomimetic materials by investigating the structure-function relationship of naturally-occurring high-performance materials at multiple length-scales, combining state-of-the-art computational techniques and experiments to characterize the properties.
Clever Architectures, Interfaces and Competing Mechanisms in Biological Materials
Nature uses modest constituents to synthesize composite materials with exceptional mechanical properties for structural and impact resistance purposes. In most cases, these materials achieve outstanding mechanical properties while avoiding the trade-offs typical of man-made materials. While these materials require modern microscopy techniques to characterize their complex hierarchical structures, most of our learning comes from the way they mitigate catastrophic damage, revealing the most important mechanisms and features of their inner structure that contribute to energy dissipation and toughening. Considering the current progress in material synthesis and manufacturing, these new concepts have converged on the field of architected materials. In this talk, I will describe some interesting mechanics problems that we encountered as we studied some extraordinary species, and how we can translate these lessons to architected materials. In particular, I will focus on a few examples of how the combination of clever architectures, interfaces, material properties and competing mechanisms can promote delocalization to mitigate catastrophic failure, hence improving toughness and impact resistance without sacrificing other important mechanical properties. Most of this discussion is driven by how we can eventually translate these lessons into the development and manufacturing of architected materials.
Prof. Zavattieri is being hosted by Prof. Evgueni Filipov (CEE). If you would like to meet with him during his visit, please send an email to firstname.lastname@example.org. If you are an MICDE or CEE student and would like to join Prof. Zavattieri for lunch please RSVP by Monday, November 4th.
Bio: Sanjay Govindjee is the Horace, Dorothy, and Katherine Johnson Professor in Engineering. His main interests are in theoretical and computational mechanics with an emphasis on micro-mechanics of nonlinear phenomena in solid materials. He was the winner of the inaugural Zienkiewicz Prize and Medal in 1998 and more recently received a 2018 Alexander von Humboldt Foundation Research Prize in honor of his lifetime achievements. For the last two and a half years, he has been the PI and co-Director of the NSF NHERI SimCenter at Berkeley.
The NSF Natural Hazards Engineering Research Infrastructure (NHERI) Computation and Simulation Center (SimCenter) at Berkeley: An Overview
In October 2016, the National Science Foundation awarded the NHERI SimCenter to Berkeley. The SimCenter is the computational satellite to the eight experimental sites of the NHERI constellation. Its primary goal is to advance natural hazards engineering through the use of simulation. The center develops and stands up open-source software to simulate the effects of seismic, wind, and water loads on structures, with a focus on regional assessments of damage at high resolution under uncertainty. The SimCenter’s work includes both research and educational components.
The SimCenter has just completed Year 3 of its original mandate and now offers a wide selection of user-friendly front-end applications that permit local as well as HPC and cloud-based execution of simulations. Simulations can be of single detailed structural models subjected to a variety of hazards using state-of-the-art and state-of-the-practice loading methodologies. They can also be of a larger regional nature, using simpler models and further coupled to forward uncertainty propagation with Monte Carlo methods, with or without surrogates. Engineering demands can be further propagated into damage and loss, downtime and recovery, using Hazus methodologies, FEMA P58 methods, or user-provided techniques with our hazard-blind framework. All elements of the SimCenter’s software are designed in a plug-and-play fashion to promote detailed research into natural hazard effects with the ability to see impacts on a larger scale.
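The forward uncertainty propagation described above can be sketched in a few lines. The model, parameter values, and threshold below are purely illustrative stand-ins (not the SimCenter's actual workflow or applications): an uncertain ground-motion intensity is pushed through a toy demand model by Monte Carlo sampling, and the probability that the demand exceeds a damage threshold is estimated from the samples.

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples = 100_000

# Uncertain input: lognormal spectral acceleration (illustrative
# units and parameters, chosen only for demonstration).
sa = rng.lognormal(mean=-1.0, sigma=0.6, size=n_samples)

def drift_demand(sa):
    """Toy demand model: drift grows with intensity, with
    multiplicative model uncertainty standing in for a detailed
    structural simulation (or a surrogate of one)."""
    return 0.01 * sa ** 1.2 * rng.lognormal(0.0, 0.3, size=sa.shape)

drift = drift_demand(sa)

# Forward-propagated quantity of interest: probability that the
# drift demand exceeds an (illustrative) damage threshold.
p_exceed = np.mean(drift > 0.005)
```

In practice, the expensive demand model is often replaced by a surrogate so that the Monte Carlo loop remains tractable; the sampling structure above is unchanged, only `drift_demand` is swapped out.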
In this presentation, I will give an overview of the SimCenter’s recent activities, discuss research needs and how researchers can participate in the SimCenter’s activities, and preview upcoming developments anticipated in Year 4.
Prof. Govindjee is being hosted by Prof. Garikipati (ME).