Data Visualization & Presentation Specialist


AWEA: Data Visualization & Presentation Specialist

This position calls for someone who thinks in numbers and speaks in interactives, maps, and charts, delivering effective and compelling advocacy materials and content that further the organization’s mission and objectives. The role will also leverage data analysis tools to investigate industry data, market trends, and policy impact assessments in support of AWEA’s policy and regulatory decision-making and advocacy. This position reports to the Vice President, Research & Analytics.

Essential Functions/Major Responsibilities: 

  • Think critically about graphics and data. Employ a variety of methods – interactive graphics, infographics, charts, maps, fact sheets, and more – to deliver compelling advocacy material and visually stunning market reports and presentations.
  • Develop intuitive, attractive, and interactive data visualizations for reports, presentations, and a range of other uses.
  • Distill data to communicate compelling stories to diverse audiences.
  • Produce high-quality and visually appealing presentation content based on AWEA data and secondary sources. Update legacy presentations, fact sheets, and additional data-driven advocacy content.
  • Lead design and production of AWEA market reports, fact sheets, and data visualizations. Work collaboratively with report authors and subject matter experts to develop content and tailor products to the appropriate audience.
  • Build dashboards and similar data summaries for senior management that inform decision-making and planning.
  • Design and perform analyses that support AWEA’s policy agenda at the state, regional, and federal levels on strategic issue areas.
  • Provide rapid response and analysis to inquiries from Government and Public Affairs teams, AWEA members, regional partners, the press, and others. Serve as a technical expert in communications with AWEA members, policy makers, the press, and the general public.
  • Assist in reviewing audience-facing material across the organization.

Knowledge, skills and abilities:

  • Strong quantitative analysis skills. Candidate must possess a strong record working with large, complex datasets.
  • Critical thinking: able to look at numbers, trends, and data, draw new conclusions, and transform them into compelling visuals.
  • Strong written and oral communication, with the ability to tailor information to audiences of varying technical capabilities.
  • Expertise in technologies such as Tableau, SQL, JavaScript/HTML/D3.
  • Experience with Adobe Illustrator, Acrobat, and Photoshop.
  • Experience in ArcGIS or other geospatial visualization software.
  • Ability to respond quickly and professionally to inquiries.
  • Ability to prioritize and complete a variety of tasks efficiently.
  • Interpersonal skills and ability to work in a collaborative team environment.

Education and/or Experience

AWEA is seeking applicants who reflect and understand our core values: We HEART Wind Energy, The Truth Prevails, Together We Succeed, and Ahead of the Curve. Qualified applicants may possess any combination of education and experience enabling them to successfully perform the responsibilities of the position. Qualifications are: a Bachelor’s degree in a quantitative field such as economics, statistics, or engineering (preferred); a minimum of three years of experience conceptualizing and creating information graphics, data visualizations, interactives, and presentations; and knowledge and a working understanding of the U.S. electricity sector (preferred).

To apply

Send a note on how you heard about this job, a cover letter, salary requirement, start date availability, and a resume to ResearchJobs@awea.org (use “[Your Name] - Data Visualization & Presentation Specialist” in the subject line) or mail to AWEA, 1501 M Street NW, Suite 900, Washington, DC 20005; Attention: Human Resources.

On-Site Lead Staff Scientist


ERDC DSRC On-site Lead Staff Scientist

The Engineer Research and Development Center (ERDC) in Vicksburg, MS is looking for candidates for the Site Lead position. The position description is below; a Secret clearance is required.

Responsibilities

  • Works with HPCMP customers to identify potential multidisciplinary efforts.
  • Participates on appropriate teams as part of the multidisciplinary efforts.
  • Facilitates the process by which efforts are presented for consideration to the Government.
  • Supports major efforts and may serve as principal investigator.
  • Serves as the primary interface with Government leadership.

Role/Qualifications

  • PhD required.
  • 10 years of HPC-related experience preferred (flexible for strong candidates).
  • Demonstrated experience in multidisciplinary computational environments.
  • Documented authorship in peer-reviewed publications.

 

Scientists with Data and Decision Analytics (DDA) experience are also sought; these positions could be located at ARL (Aberdeen, MD), AFRL (Dayton, OH), Navy (Stennis, MS), or ERDC (Vicksburg, MS). The specific definition of DDA being used is below.

Data and Decision Analytics (DDA)

DDA covers the entire computational ecosystem (hardware, software, storage, and networks) required to conduct large-scale data analytics. This ecosystem includes how large data is managed, analyzed, and visualized. Capabilities include methods for conducting exploratory (what does the data look like?), descriptive (what happened?), diagnostic (why did it happen?), predictive (what will happen?), and prescriptive (how can we make it happen?) analyses; a minimal sketch illustrating these five analysis types appears after the list below. In this computational area, the Contractor shall:

  • Work with the HPCMP user base to identify requirements for software and hardware in the area of DDA.
  • Provide software engineering support for development and improvement of DDA codes.
  • Provide expert support for the R programming language and RStudio.
  • Provide application-level support for DDA tools including but not limited to Caffe, TensorFlow, Spark, Python, Anaconda, H2O, and container technologies.
  • Provide technical support to DDA users/customers across DoD.
  • Foster and develop collaborations with DoD and non-HPCMP users and customers.
  • Identify opportunities, requirements, and promising research approaches to exploit and leverage emerging data science capabilities. Potential areas include:
    • Cognitive modeling
    • Automated Target Recognition (ATR) (e.g., DARPA Trace program)
    • Test range and training center data streams
    • Autonomy – Supervised and unsupervised machine learning methods
    • High-throughput screening
    • Automated fitting and multiscale approaches
    • Intrusion-resilient cyber systems and cyber vulnerability assessments and reactions
    • Decision-support systems and augmentation of human performance
    • Utilization of relational databases, non-relational databases, polystores, and other data management methodologies
  • Implement algorithmic cores and new work-stream tools and collaborate with PET on-sites to promote the adoption of HPDA within the HPCMP community.
  • Provide leadership to identify, explore and evaluate data science software tools for inclusion within the HPCMP software stack and user community.
  • Collaborate with the HPCMP and Centers team to determine when traditional HPC hardware and middleware are appropriate for HPDA problems and tools, and provide guidance on when specialized hardware, middleware, or system configurations would be a better fit.
  • Provide technical support to SIP customers/users for existing and emerging software in terms of usability, performance optimization, validation/verification and strong/weak scaling.
  • Foster and develop collaborative relationships with DoD and non-DoD communities.
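
The following is a minimal sketch, for illustration only, of the five analysis types described above, using pandas and scikit-learn from the Python/Anaconda stack named in the list; the file name, column names, and the simple linear model are hypothetical placeholders, not part of the contract description.

```python
# Hypothetical sketch of the five DDA analysis types; "jobs.csv" and its
# columns (nodes, runtime_hours, queue_wait_hours) are placeholders.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("jobs.csv")

# Exploratory: what does the data look like?
print(df.describe())

# Descriptive: what happened? (average runtime for each node count)
print(df.groupby("nodes")["runtime_hours"].mean())

# Diagnostic: why did it happen? (is queue wait related to node count?)
print(df[["nodes", "queue_wait_hours"]].corr())

# Predictive: what will happen? (fit a simple runtime model)
model = LinearRegression().fit(df[["nodes"]], df["runtime_hours"])
print(model.predict(pd.DataFrame({"nodes": [128]})))  # predicted runtime at 128 nodes

# Prescriptive: how can we make it happen? (choose the candidate node count
# with the lowest predicted runtime)
candidates = pd.DataFrame({"nodes": [32, 64, 128, 256]})
candidates["predicted_runtime"] = model.predict(candidates[["nodes"]])
print(candidates.sort_values("predicted_runtime").iloc[0])
```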

Research Data Scientist Intermediate for ARC-TS


Advanced Research Computing – Technology Services (ARC-TS) has an exciting opportunity for those who wish to impact our world through science and research by using computational and data tools such as machine learning, statistical analysis, High Performance Computing (HPC), big data frameworks (Hadoop, Spark, DBMS, etc.), cloud computing services (AWS, Azure, GCP), and more.

This position will be part of a team supporting all areas of research that utilize data at the University of Michigan. The primary responsibilities will be consulting on, and providing services in, collecting, discovering, cataloging, manipulating, and transforming data; the role may work closely with multiple projects at once. Other responsibilities include making presentations and providing training to students and researchers on the use, cataloging, and manipulation of data.

The successful candidate should be comfortable with Linux systems and with common data manipulation tools and languages such as Python and SQL, and should be able to pick up new tools quickly as needed for the scope of the currently assigned project.

Note: Technical training will be provided to address specific gaps in desired qualifications.

Responsibilities

  • Data Preparation and Identification: Help users through the lifecycle of their datasets. The position will help users understand the data set that they have, determine programmatic ways to clean the data, prepare the data for analysis, and annotate datasets with descriptions for multiple uses. We also foresee the role helping to identify existing datasets around the University that could be used by courses and for research.
  • Data Collection and Programming: Assist in the creation of tools that collect data from many disparate sources such as SQL and NoSQL databases, APIs, web scraping, flat files, and other file formats. Interaction with research projects may include extending these tools to manipulate data, identify duplicates, remove identifying data, and more (a minimal sketch of this kind of workflow appears after this list).
  • Documentation and Training of Tools: Participate in a larger group to provide workshops on the use of data and data manipulation tools, including creating documentation on how to use tools in our supported environments.
  • Documentation and Cataloging of Data: Document data (metadata, schemas, and more) so that researchers may consume prepared data for use in their own analysis. This documentation will include how the data are manipulated and the assumptions used for any summaries or statistics.
  • Development of Self and Others: Explore new tools and technologies through formal and self-directed learning. Research and advise the team on the latest application technology trends to support ongoing development of existing tools and services.
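
As a rough illustration of the collection-and-cleaning workflow described under Data Collection and Programming, the sketch below combines a SQL table with a flat-file export, de-duplicates the result, and drops an identifying column; the connection string, table, file, and column names are hypothetical placeholders, and the tool choices (pandas and SQLAlchemy) are only one reasonable option.

```python
# Hypothetical collection-and-cleaning step: merge a SQL table with a CSV
# export, de-duplicate, and strip an identifying column. All names below
# (database URL, table, file, columns) are placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:password@dbhost/research")
survey_db = pd.read_sql("SELECT * FROM survey_responses", engine)
survey_csv = pd.read_csv("legacy_survey_export.csv")

combined = pd.concat([survey_db, survey_csv], ignore_index=True)
combined = combined.drop_duplicates(subset=["response_id"])    # identify duplicates
deidentified = combined.drop(columns=["participant_email"])    # remove identifying data

# Annotate the prepared dataset with a short description for later cataloging,
# then write it out in a column-oriented format (requires pyarrow).
deidentified.attrs["description"] = "Merged, de-duplicated, de-identified survey responses"
deidentified.to_parquet("survey_clean.parquet")
```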

Required Qualifications

  • Bachelor’s degree in a related field and/or equivalent combination of education, certification, and experience
  • Two (2) years of experience in collecting, discovering, cataloging, manipulating, and transforming data
  • Proficiency in Python
  • Very basic SQL experience
  • Proficiency in Linux
  • Experience with data from different fields and domains
  • Comfortable supporting a broad range of research (students, researchers, and faculty)
  • Ability to communicate effectively via email, letters, and in person to teams and customers
  • Ability to work independently and collaboratively

Desired Qualifications

  • Masters or PhD in related area
  • Experience working in an academic environment
  • Familiarity with big data tools from the Hadoop ecosystem such as MapReduce, Spark, Hive, Impala, etc.
  • Understanding of any of the following numerical techniques: causal inference, selection bias, dimensionality reduction (singular value decomposition, principal component analysis); a minimal sketch of PCA via SVD appears after this list
  • Understanding of Machine Learning tools such as TensorFlow, PyTorch, scikit-learn, CNTK/Microsoft Cognitive Toolkit, PowerAI, Theano, Caffe, etc.
  • Understanding of Machine Learning/AI methods such as random forest, neural networks, Markov models, etc.
  • Proficiency in any of the following:  R, SAS, SPSS, Tableau, Perl, C/C++, Go, etc.
  • Advanced SQL experience
  • Experience with any of the following: compilers, Makefiles, and common build chains (autoconf/automake, CMake, pip, EasyBuild, Spack)
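
As a small, purely illustrative aside on the dimensionality-reduction techniques listed above, the sketch below computes principal components via a singular value decomposition using NumPy; the data matrix is a random placeholder, not anything tied to this position.

```python
# Minimal PCA-via-SVD sketch; the data matrix is a random placeholder.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))          # 200 samples, 10 features

X_centered = X - X.mean(axis=0)         # center each feature
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

components = Vt[:2]                     # top two principal directions
scores = X_centered @ components.T      # project samples onto them
explained_variance = S**2 / (len(X) - 1)
print(explained_variance[:2], scores.shape)
```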

Diversity, Equity and Inclusion

The University of Michigan Information and Technology Services seeks to recruit and retain a diverse workforce as a reflection of our commitment to serve the diverse people of Michigan, to maintain the excellence of the University and to offer our students richly varied disciplines, perspectives and ways of knowing and learning.

Comprehensive Benefits

The University of Michigan Benefits Office is committed to offering a high-quality benefits package to support faculty, staff, and their families, including 2:1 retirement matching, healthcare plans with nationwide coverage (including prescription drug coverage), three dental plans, a vision plan, a flexible spending account, well-being programs, long-term disability, automatic life insurance, general legal services, three early childhood centers, and time away from work and work-life programs to promote balance. Learn more at hr.umich.edu/benefits-wellness.

Application Procedure

To be considered, you must submit a cover letter and resume. The cover letter must be the leading page of your resume and should:

  • Specifically outline the reasons for your interest in the position, and
  • Outline the particular skills and experience that directly relate to this position.

For more information, and to apply, use the link found here.

Starting salaries will vary depending upon the qualifications and experience of the selected candidate.

Salary: $68,462.00 – $89,000.00

Work Location: Ann Arbor Campus, Ann Arbor, MI

Full-Time Position

 

MICDE Director, Krishna Garikipati, wins USACM Fellow award


Krishna Garikipati, professor of Mechanical Engineering and of Mathematics, and director of MICDE, has been granted a 2019 United States Association for Computational Mechanics (USACM) Fellows award for his work in developing numerical methods applied to strongly nonlinear problems in living and nonliving material systems.

The Fellows Award recognizes individuals with a distinguished record of research, accomplishment and publication in areas of computational mechanics and demonstrated support of the USACM through membership and participation in the Association, its meetings and activities. All recipients shall be members in good standing of USACM. Multiple awards may be given at two-year intervals.

MICDE to host NSF Computational Mechanics Vision workshop


In Fall 2019, MICDE will host the NSF workshop entitled Computational Mechanics Vision Workshop, organized by Boston University, Duke University, and the University of Michigan. The workshop’s aim is to solicit and synthesize directions for computational mechanics research and education in the United States over the next decade and beyond from a diverse cross section of scientists and engineers.