
[SC2 Jobs] Postdoctoral position for Prof. Grosch’s Group

By | General Interest, News, SC2 jobs

Job Description

Prof. Grosch’s Group in the Department of Mechanical Engineering is currently seeking a post-doctoral scholar to work on our NIH-funded cochlear mechanics project. The position involves high-performance, time-domain computations to model the fluid-structure interaction in the cochlea, including electro-mechanical coupling via the biologically piezoelectric outer hair cells. The candidate should have a PhD in Engineering or a related field (e.g., Biophysics, Mathematics, or Computer Science), with experience in scientific computing and an interest in high-performance computing for nonlinear dynamics simulations. Prior experience in cochlear mechanics is not required.

Applying

If you are interested in this position, please send Karl Grosh (grosh@umich.edu) your curriculum vitae and at least two references.

[SC2 Jobs] Job Openings at KLA-Tencor

By | General Interest, News, SC2 jobs

Job Description

KLA-Tencor Corporation is a leading provider of process control and yield management solutions and partners with customers around the world to develop leading edge inspection and metrology technologies. These technologies serve the semiconductor and global electronics industry. With a portfolio of industry-leading products and a team of world-class engineers and scientists, the company has created superior solutions for its customers for more than 40 years.

KLA-Tencor is ramping up an R&D center in Ann Arbor and is looking for brilliant computational and data scientists and machine learning experts to join the company. Please see the list below of job openings and links to job descriptions, where you may apply directly. If you have questions regarding these positions, please email Tina Revels at Tina.Revels@kla-tencor.com.

Job openings (KT opening number, posting title, and link to the job description and application):

  • 119639, Research Scientist: https://careers.kla-tencor.com/jobs/3341509-research-scientist
  • 119638, Research Scientist: https://careers.kla-tencor.com/jobs/3341508-research-scientist
  • 119203, Sr. Software Engineer (Big Data): https://careers.kla-tencor.com/jobs/3259406-sr-software-engineer-big-data
  • 119637, Algorithm Engineer: https://careers.kla-tencor.com/jobs/3341507-algorithm-engineer
  • 119446, Supply Chain Specialist: https://careers.kla-tencor.com/jobs/3259407-supply-chain-specialist
  • 119267, Senior Algorithm/Research Scientist (Machine Learning | Statistical Analysis | Optimization): https://careers.kla-tencor.com/jobs/3265156-senior-algorithm-slash-research-scientist-machine-learning-statistical-analysis-optimization
  • 119268, Senior Algorithm/Research Scientist (Machine Learning | Statistical Analysis | Optimization): https://careers.kla-tencor.com/jobs/3265157-senior-algorithm-slash-research-scientist-machine-learning-statistical-analysis-optimization
  • 119260, Algorithm Engineer: https://careers.kla-tencor.com/jobs/3265069-algorithm-engineer
  • 119262, Algorithm Engineer: https://careers.kla-tencor.com/jobs/3265070-algorithm-engineer
  • 119537, Research Scientist: https://careers.kla-tencor.com/jobs/3310393-research-scientist


Most CSCAR workshops will be free for the U-M community starting in January 2019

By | Educational, General Interest, Happenings, News

Beginning in January of 2019, most of CSCAR’s workshops will be offered free of charge to UM students, faculty, and staff.

CSCAR is able to do this thanks to funding from UM’s Data Science Initiative.  Registration for CSCAR workshops is still required, and seats are limited.

CSCAR requests that participants please cancel their registration if they decide not to attend a workshop for which they have previously registered.

Note that a small number of workshops hosted by CSCAR but taught by non-CSCAR personnel will continue to have a fee, and fees will continue to apply for people who are not UM students, faculty or staff.

Eric Michielssen completes term as Associate Vice President for Research – Advanced Research Computing

By | General Interest, Happenings, News

Eric Michielssen will step down from his position as Associate Vice President for Research – Advanced Research Computing on December 31, 2018, after serving in that leadership role for almost six years. Dr. Michielssen will return to his faculty role in the Department of Electrical Engineering and Computer Science in the College of Engineering.

Under his leadership, Advanced Research Computing has helped empower computational discovery through the Michigan Institute for Computational Discovery and Engineering (MICDE), the Michigan Institute for Data Science (MIDAS), Advanced Research Computing-Technology Services (ARC-TS) and Consulting for Statistics, Computing and Analytics Research (CSCAR).

In 2015, Eric helped launch the university’s $100 million Data Science initiative, which enhances opportunities for researchers across campus to tap into the enormous potential of big data. He also serves as co-director of the university’s Precision Health initiative, launched last year to harness campus-wide research aimed at finding personalized solutions to improve the health and wellness of individuals and communities.

The Office of Research will convene a group to assess the University’s current and emerging needs in the area of research computing and how best to address them.

U-M approves new graduate certificate in computational neuroscience

By | Educational, General Interest, Happenings, News

The new Graduate Certificate in Computational Neuroscience will help bridge the gap between experimentally focused studies and quantitative modeling and analysis, giving graduate students a chance to broaden their skill sets in the diversifying field of brain science.

“The broad, practical training provided in this certificate program will help prepare both quantitatively focused and lab-based students for the increasingly cross-disciplinary job market in neuroscience,” said Victoria Booth, Professor of Mathematics and Associate Professor of Anesthesiology, who will oversee the program.

To earn the certificate, students will be required to take core computational neuroscience courses and cross-disciplinary courses outside of their home departments; participate in a specialized interdisciplinary journal club; and complete a practicum.

Cross-disciplinary courses will depend on a student’s focus: students in experimental neuroscience programs will take quantitative coursework, and students in quantitative science programs such as physics, biophysics, mathematics, and engineering will take neuroscience coursework.

The certificate was approved this fall, and will be jointly administered by the Neuroscience Graduate Program (NGP) and the Michigan Institute for Computational Discovery and Engineering (MICDE).

For more information, visit micde.umich.edu/comput-neuro-certificate. Enrollment is not yet open, but information sessions will be scheduled early next year. Please register for the program’s mailing list if you’re interested.

Along with the Graduate Certificate in Computational Neuroscience, U-M offers several other graduate programs aimed at training students in computational and data-intensive science, including:

  • The Graduate Certificate in Computational Discovery and Engineering, which is focused on quantitative and computing techniques that can be applied broadly to all sciences.
  • The Graduate Certificate in Data Science, which specializes in statistical and computational methods required to analyze large data sets.
  • The Ph.D. in Scientific Computing, intended for students who will make extensive use of large-scale computation, computational methods, or algorithms for advanced computer architectures in their doctoral studies. This degree is awarded jointly with an existing program, so that a student receives, for example, a Ph.D. in Aerospace Engineering and Scientific Computing.


U-M awarded a Clare Boothe Luce grant for fellowships to support women in STEM

By | Educational, General Interest, Happenings, News

The Clare Boothe Luce Program of the Henry Luce Foundation has awarded a $270,000 grant to the University of Michigan. The funding will support women PhD students through the Michigan Institute for Computational Discovery and Engineering (MICDE). The program aims to encourage women “to enter, study, graduate and teach” in science, and the funding will support women PhD students who make use of computational science in their research.

“We’re very excited to be able to promote women in scientific computing,” said Mariana Carrasco-Teja, manager of the grant and Associate Director of MICDE. “These resources generously provided by the Clare Boothe Luce program will make a huge difference in the careers of women pursuing computational science at U-M.”

For details on applying, and fellowship requirements, see the fellowship page at micde.umich.edu/academic-programs/cbl/.

The fellowships carry a $35,000 annual stipend and tuition, among other benefits. They will be awarded to students applying for PhD programs in fall 2019 in the College of Engineering, or several programs in the College of Literature, Science and the Arts (Applied and Interdisciplinary Mathematics, Applied Physics, Astronomy, Chemistry, Earth & Environmental Sciences, Mathematics, Physics, and Statistics).

The CBL program at U-M is funded by the Clare Boothe Luce Program of the Henry Luce Foundation, with additional support from the Rackham School of Graduate Studies, the College of Engineering, the College of Literature, Sciences and the Arts, and MICDE.

Postdoctoral Position in Machine Learning Methods for Computational Physics at U-M

By | General Interest, News, SC2 jobs

Postdoctoral Position

Machine Learning Methods for Computational Physics
University of Michigan
Department of Mechanical Engineering

Applications are invited for a postdoctoral research position in the Computational Physics group in Mechanical Engineering at the University of Michigan, to develop machine learning methods for system identification of partial differential equations.

Qualifications

Applicants should have a doctoral degree in engineering or mathematics with a strong focus on computational science. Some combination of a familiarity with numerical methods for PDEs, high performance computing and machine learning would be ideal.

Compensation

Compensation (salary and benefits) will be offered according to University of Michigan guidelines.

The position is available immediately, but the starting date is negotiable. To apply, please contact Prof. Krishna Garikipati at krishna@umich.edu.

The University of Michigan offers a vibrant computational science community. 

U-M participates in SC18 conference in Dallas

By | General Interest, Happenings, News

University of Michigan researchers and IT staff wrapped up a successful Supercomputing ’18 (SC18) in Dallas, held Nov. 11-16, 2018, taking part in a number of different aspects of the conference.

SC “Perennial” Quentin Stout, U-M professor of Electrical Engineering and Computer Science and one of only 19 people who have been to every Supercomputing conference, co-presented a tutorial titled Parallel Computing 101.

And with the recent announcement of a new HPC cluster on campus called Great Lakes, IT staff from Advanced Research Computing – Technology Services (ARC-TS) made presentations around the conference on the details of the new supercomputer.

U-M once again shared a booth with Michigan State University, highlighting our computational and data-intensive research as well as the comprehensive set of tools and services we provide to our researchers. Representatives from all ARC units were at the booth: ARC-TS, the Michigan Institute for Data Science (MIDAS), the Michigan Institute for Computational Discovery and Engineering (MICDE), and Consulting for Statistics, Computing and Analytics Research (CSCAR).

The booth also featured two demonstrations: one on the Open Storage Research Infrastructure or OSiRIS, the multi-institutional software-defined data storage system, and the Services Layer At The Edge (SLATE) project, both of which are supported by the NSF; the other tested conference-goers’ ability to detect “fake news” stories compared to an artificial intelligence system created by researchers supported by MIDAS.


U-M Activities

  • Tutorial: Parallel Computing 101: Prof. Stout and Associate Professor Christiane Jablonowski of the U-M Department of Climate and Space Sciences and Engineering provided a comprehensive overview of parallel computing.
  • Introduction to Kubernetes. Presented by Bob Killen, Research Cloud Administrator, and Scott Paschke, Research Cloud Solutions Designer, both from ARC-TS. Containers have shifted the way applications are packaged and delivered. Their use in data science and machine learning is skyrocketing, with the beneficial side effect of enabling reproducible research. This rise in use has created a need to explore and adopt better container-centric orchestration tools. Of these tools, Kubernetes, an open-source container platform born within Google, has become the de facto standard. This half-day tutorial introduced researchers and sysadmins who may already be familiar with container concepts to the architecture and fundamental concepts of Kubernetes. Attendees explored these concepts through a series of hands-on exercises, and left with a leg up in continuing their container education and a better understanding of how Kubernetes may be used for research applications.
  • Brock Palen, Director of ARC-TS, spoke about the new Great Lakes HPC cluster:
    • DDN booth (3123)
    • Mellanox booth (3207)
    • Dell booth (3218)
    • SLURM booth (1242)
  • Todd Raeker, Research Technology Consultant for ARC-TS, went to the Globus booth (4201) to talk about U-M researchers’ use of the service.
  • Birds of a Feather: Meeting HPC Container Challenges as a Community. Bob Killen, Research Cloud Administrator at ARC-TS, gave a lightning talk as part of this session that presented, prioritized, and gathered input on top issues and budding solutions around containerization of HPC applications.
  • Sharon Broude Geva, Director of ARC, was live on the SC18 News Desk discussing ARC HPC services, Women in HPC, and the Coalition for Scientific Academic Computation (CASC). The stream was available from the Supercomputing Twitter account: https://twitter.com/Supercomputing
  • Birds of a Feather: Ceph Applications in HPC Environments: Ben Meekhof, HPC Storage Administrator at ARC-TS, gave a lightning talk on Ceph and OSiRIS as part of this session. More details at https://www.msi.umn.edu/ceph-hpc-environments-sc18
  • ARC was a sponsor of the Women in HPC Reception. See the event description for more details and to register. Sharon Broude Geva, Director of ARC, gave a presentation.
  • Birds of a Feather: Cloud Infrastructure Solutions to Run HPC Workloads: Bob Killen, Research Cloud Administrator at ARC-TS, presented at this session aimed at architects, administrators, software engineers, and scientists interested in designing and deploying cloud infrastructure solutions such as OpenStack, Docker, Charliecloud, Singularity, Kubernetes, and Mesos.
  • Jing Liu of the Michigan Institute for Data Science, participated in a panel discussion at the Purdue University booth.

Follow ARC on Twitter at https://twitter.com/ARC_UM for updates.

Beta cluster available for learning Slurm; new scheduler to be part of upcoming cluster updates

By | Flux, General Interest, Happenings, HPC, News

New HPC resources to replace Flux, along with updates to Armis, are coming. They will run a new scheduling system, Slurm. You will need to learn its commands and update your batch files to run jobs successfully. Read on to learn the details and how to get training and adapt your files.

In anticipation of these changes, ARC-TS has created the test cluster “Beta,” which will provide a testing environment for the transition to Slurm. Slurm will be used on Great Lakes; the Armis HIPAA-aligned cluster; and a new cluster called “Lighthouse” which will succeed the Flux Operating Environment in early 2019.

Currently, Flux and Armis use the Torque (PBS) resource manager and the Moab scheduling system; when completed, Great Lakes and Lighthouse will use the Slurm scheduler and resource manager, which will enhance the performance and reliability of the new resources. Armis will transition from Torque to Slurm in early 2019.

The Beta test cluster is available to all Flux users, who can log in via ssh at beta.arc-ts.umich.edu. Beta has its own /home directory, so users will need to create or transfer any files they need via scp/sftp or Globus.
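As a concrete sketch of getting started on Beta (the uniqname and file name below are hypothetical placeholders):

```shell
# Log in to the Beta test cluster
ssh uniqname@beta.arc-ts.umich.edu

# Copy a batch file from your local machine to your Beta /home directory
scp myjob.sbat uniqname@beta.arc-ts.umich.edu:~/
```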

Slurm commands will be needed to submit jobs. For a comparison of Slurm and Torque commands, see our Torque to Slurm migration page. For more information, see the Beta home page.
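To illustrate the kind of batch-file changes involved, here is a rough sketch of how common Torque (PBS) directives map onto their Slurm equivalents; the job name, account string, and resource values are hypothetical placeholders, and the authoritative mapping is the Torque to Slurm migration page above:

```shell
# Torque (PBS) batch file, as used on Flux today:
#PBS -N my_job
#PBS -A example_account                # hypothetical account name
#PBS -l nodes=1:ppn=4,mem=8gb,walltime=02:00:00
#PBS -m abe
#PBS -M uniqname@umich.edu

# Equivalent Slurm batch file for Beta:
#SBATCH --job-name=my_job
#SBATCH --account=example_account      # hypothetical account name
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=4
#SBATCH --mem=8g
#SBATCH --time=02:00:00
#SBATCH --mail-type=BEGIN,END,FAIL
#SBATCH --mail-user=uniqname@umich.edu

# Submit and monitor with Slurm commands instead of qsub/qstat:
#   sbatch myjob.sbat
#   squeue -u uniqname
```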

Support staff from ARC-TS and individual academic units will conduct several in-person and online training sessions to help users become familiar with Slurm. We have been testing Slurm for several months, and believe the performance gains, user communications, and increased reliability will significantly improve the efficiency and effectiveness of the HPC environment at U-M.

The tentative time frame for replacing or transitioning current ARC-TS resources is:

  • Flux to Great Lakes, first half of 2019
  • Armis from Torque to Slurm, January 2019
  • Flux Operating Environment to Lighthouse, first half of 2019
  • Open OnDemand on Beta, which replaces ARC Connect for web-based job submissions, Jupyter Notebooks, Matlab, and additional software packages, fall 2018

U-M selects Dell EMC, Mellanox and DDN to Supply New “Great Lakes” Computing Cluster

By | Flux, General Interest, Happenings, HPC, News

The University of Michigan has selected Dell EMC as lead vendor to supply its new $4.8 million Great Lakes computing cluster, which will serve researchers across campus. Mellanox Technologies will provide networking solutions, and DDN will supply storage hardware.

Great Lakes will be available to the campus community in the first half of 2019, and over time will replace the Flux supercomputer, which serves more than 2,500 active users at U-M for research ranging from aerospace engineering simulations and molecular dynamics modeling to genomics and cell biology to machine learning and artificial intelligence.

Great Lakes will be the first cluster in the world to use the Mellanox HDR 200 gigabit per second InfiniBand networking solution, enabling faster data transfer speeds and increased application performance.

“High-performance research computing is a critical component of the rich computing ecosystem that supports the university’s core mission,” said Ravi Pendse, U-M’s vice president for information technology and chief information officer. “With Great Lakes, researchers in emerging fields like machine learning and precision health will have access to a higher level of computational power. We’re thrilled to be working with Dell EMC, Mellanox, and DDN; the end result will be improved performance, flexibility, and reliability for U-M researchers.”

“Dell EMC is thrilled to collaborate with the University of Michigan and our technology partners to bring this innovative and powerful system to such a strong community of researchers,” said Thierry Pellegrino, vice president, Dell EMC High Performance Computing. “This Great Lakes cluster will offer an exceptional boost in performance, throughput and response to reduce the time needed for U-M researchers to make the next big discovery in a range of disciplines from artificial intelligence to genomics and bioscience.”

The main components of the new cluster are:

  • Dell EMC PowerEdge C6420 compute nodes, PowerEdge R640 high memory nodes, and PowerEdge R740 GPU nodes
  • Mellanox HDR 200Gb/s InfiniBand ConnectX-6 adapters, Quantum switches and LinkX cables, and InfiniBand gateway platforms
  • DDN GRIDScaler® 14KX® and 100 TB of usable IME® (Infinite Memory Engine) memory

“HDR 200G InfiniBand provides the highest data speed and smart In-Network Computing acceleration engines, delivering HPC and AI applications with the best performance, scalability and efficiency,” said Gilad Shainer, vice president of marketing at Mellanox Technologies. “We are excited to collaborate with the University of Michigan, Dell EMC and DataDirect Networks, in building a leading HDR 200G InfiniBand-based supercomputer, serving the growing demands of U-M researchers.”

“DDN has a long history of working with Dell EMC and Mellanox to deliver optimized solutions for our customers. We are happy to be a part of the new Great Lakes cluster, supporting its mission of advanced research and computing. Partnering with forward-looking thought leaders such as these is always enlightening and enriching,” said Dr. James Coomer, SVP Product Marketing and Benchmarks at DDN.

Great Lakes will provide significant improvement in computing performance over Flux. For example, each compute node will have more cores, higher maximum speed capabilities, and increased memory. The cluster will also have improved internet connectivity and file system performance, as well as NVIDIA GPUs with Tensor Cores, which are very powerful for machine learning compared to prior generations of GPUs.

“Users of Great Lakes will have access to more cores, faster cores, faster memory, faster storage, and a more balanced network,” said Brock Palen, Director of Advanced Research Computing – Technology Services (ARC-TS).

The Flux cluster was created approximately 8 years ago, although many of the individual nodes have been added since then. Great Lakes represents an architectural overhaul that will result in better performance and efficiency. Based on extensive input from faculty and other stakeholders across campus, the new Great Lakes cluster will be designed to deliver similar services and capabilities as Flux, including the ability to accommodate faculty purchases of hardware, access to GPUs and large-memory nodes, and improved support for emerging uses such as machine learning and genomics.

ARC-TS will operate and maintain the cluster once it is built. Allocations of computing resources through ARC-TS include access to hundreds of software titles, as well as support and consulting from professional staff with decades of combined experience in research computing.

Updates on the progress of Great Lakes will be available at https://arc-ts.umich.edu/greatlakes/.