
NERSC Weekly Email, Week of January 10, 2022

Contents

Summary of Upcoming Events and Key Dates

        January 2022
 Su  Mo  Tu  We  Th  Fr  Sa
                          1 
  2   3   4   5   6   7   8   
  9 *10-*11**12*-13*-14* 15   10-14 Jan      BerkeleyGW Events [1]
                              11 Jan         Perlmutter Maintenance [2]
                              12 Jan         CSGF Applications Due [3]
                              12 Jan         ECP Monthly Webinar POSTPONED [4]
                              12-13 Jan      NVIDIA HPC SDK Training [5]
                              14 Jan         Simons Postdoc Apps Due [6]
 16 *17* 18 *19* 20  21  22   17 Jan         Martin Luther King Holiday [7]
                              19 Jan         First Day of AY22 [8]
 23  24 *25* 26  27  28  29   25 Jan         Perlmutter Maintenance [2]
 30  31

        February 2022
 Su  Mo  Tu  We  Th  Fr  Sa
          1   2   3   4   5
  6   7   8   9  10  11  12
 13  14  15  16  17  18  19
 20 *21* 22  23  24  25  26   21 Feb         Presidents Day Holiday [9]
 27  28  

         March 2022
 Su  Mo  Tu  We  Th  Fr  Sa
          1   2   3   4   5
  6   7   8   9  10  11  12
 13  14  15  16  17  18  19
 20  21  22  23  24  25  26   
 27  28  29  30  31
  1. January 10-12 & 13-14, 2022: BerkeleyGW Tutorial Workshop & BESC2022
  2. January 11 & 25, 2022: Perlmutter Maintenance
  3. January 12, 2022: Computational Science Graduate Fellowship Applications Due
  4. January 12, 2022: IDEAS-ECP Monthly Webinar (Postponed)
  5. January 12-13, 2022: NVIDIA HPC SDK Training
  6. January 14, 2022: Simons Postdoctoral Fellowship Applications Due
  7. January 17, 2022: Martin Luther King Jr. Holiday (No Consulting or Account Support)
  8. January 19, 2022: First day of Allocation Year 2022
  9. February 21, 2022: Presidents Day Holiday (No Consulting or Account Support)
  10. All times are in the Pacific time zone
  • Upcoming Planned Outage Dates (see Outages section for more details)

    • Tuesday: Perlmutter
    • Wednesday: HPSS Archive (User), HPSS Regent (Backup)
    • next Wednesday: Iris, Cori, NoMachine, Spin
  • Other Significant Dates

    • May 30, 2022: Memorial Day Holiday (No Consulting or Account Support)
    • June 20, 2022: Juneteenth Holiday (No Consulting or Account Support)
    • July 4, 2022: Independence Day Holiday (No Consulting or Account Support)
    • September 5, 2022: Labor Day Holiday (No Consulting or Account Support)
    • November 24-25, 2022: Thanksgiving Holiday (No Consulting or Account Support)
    • December 23, 2022-January 2, 2023: Winter Shutdown (Limited Consulting and Account Support)

(back to top)


NERSC Status

NERSC Operations Continue, with Minimal Changes

Berkeley Lab, where NERSC is located, is operating under public health restrictions. NERSC remains open while following site-specific protection plans. The majority of NERSC staff are working remotely, with staff essential to operations onsite. We do not expect significant changes to our operations in the next few months.

You can continue to expect regular online consulting and account support as well as schedulable online appointments. Trainings continue to be held online. Regular maintenances on the systems continue to be performed while minimizing onsite staff presence, which could result in longer downtimes than would occur under normal circumstances.

Because onsite staffing is so minimal, we request that you continue to refrain from calling NERSC Operations except to report urgent system issues.

For current NERSC systems status, please see the online MOTD and current known issues webpages.

(back to top)


This Week's Events and Deadlines

BerkeleyGW Events this Week!

BerkeleyGW users and other interested parties are invited to attend the 8th BerkeleyGW Tutorial Workshop and the 3rd Berkeley Excited States Conference, beginning January 10.

The 8th BerkeleyGW Tutorial Workshop

The 8th Annual BerkeleyGW Tutorial Workshop will be held online January 10-12. The workshop will feature hands-on user sessions on the GW and GW-BSE approaches using the BerkeleyGW package. Target participants include graduate students, postdocs, and researchers interested in learning about or sharpening their skills in ab initio calculations of many-electron effects in excited-state properties of condensed matter.

The 3rd Berkeley Excited States Conference (BESC2022)

On January 13-14, the two-day annual Berkeley Excited States Conference 2022 will also be held online. BESC2022 is a general topical conference featuring invited talks by experts on recent progress in the theory and applications of ab initio studies of quantum many-body effects, as well as experiments on excited-state phenomena in materials. The conference showcases forefront research involving advanced many-body approaches, novel experiments, and new science in the excited state.

Registration

For more information and to register, please see https://workshop.berkeleygw.org. There are no registration fees, but space is limited for the hands-on tutorial sessions.

Applications for DOE Computational Science Graduate Fellowship Due Wednesday!

Are you a US citizen or lawful permanent resident planning to embark on your first or second year of PhD study in physical, engineering, computer, mathematical, or life sciences at an accredited US college or university in the fall of 2022? If so, you may be eligible to apply for the Department of Energy's Computational Science Graduate Fellowship. Benefits include a yearly stipend of $38,000, payment of full tuition and fees for up to four years of total support, a 12-week practicum experience at a DOE national laboratory, access to DOE supercomputers, and more!

For more information, please see the CSGF Webpage. Applications are due this Wednesday, January 12.

NVIDIA HPC SDK Training, this Wednesday & Thursday

NVIDIA will present a two-day training for users on the GPU programming models supported by NVIDIA's HPC SDK compilers, including Standard Language Acceleration and Libraries, OpenACC, OpenMP offload, and CUDA. The NVIDIA compilers are the default, recommended compiling environment for Perlmutter. Basic GPU architecture and profiling tools will also be covered. Attendees will have the opportunity to complete hands-on and homework exercises on Perlmutter GPUs.

For more information and to register, please see https://www.nersc.gov/users/training/events/nvidia-hpcsdk-training-jan2022/.

January 12 IDEAS-ECP Webinar Postponed

The next webinar in the Best Practices for HPC Software Developers series, entitled "Wrong Way: Lessons Learned and Possibilities for Using the 'Wrong' Programming Approach on Leadership Computing Facility Systems," has been postponed. We will let you know when it is rescheduled.

In this webinar, Philip C. Roth (Oak Ridge National Laboratory) will discuss the impact of using a "wrong" programming approach for a given system, in the context of the accelerated large-scale computing systems deployed and being deployed at US Department of Energy computing facilities. He will present a few of these "wrong" programming approaches and lessons learned from cases where they have been attempted.

There is no cost to attend, but registration is required. Please register here.

(NEW) Apply Today for Simons Postdoctoral Fellowship at IPAM/UCLA!

The Institute for Pure and Applied Mathematics (IPAM) at UCLA is recruiting up to three Simons Postdoctoral Scholars (SPD) for calendar-year appointments beginning on August 1, 2022.

During the 2022-23 academic year, IPAM will host two long programs: Computational Microscopy and New Mathematics for the Exascale: Applications to Materials Science. The successful candidate will pursue a robust program of mathematical research that connects to at least one of these programs. A PhD in mathematics, statistics, or a related field received in May 2017 or later is required. Women and minorities are especially encouraged to apply.

For more information, please see the IPAM search for postdoctoral scholars web page at http://www.ipam.ucla.edu/about-ipam/ipam-accepting-applications-for-postdoctoral-scholars/. For fullest consideration, apply by this Friday, January 14, 2022.

Martin Luther King, Jr. Day Holiday Next Monday; No Consulting or Account Support

Consulting and account support will be unavailable next Monday, January 17, due to the Berkeley Lab-observed Martin Luther King, Jr. Day holiday. Regular consulting and account support services will resume the next day.

(back to top)


Perlmutter

Perlmutter Machine Status

The initial phase of the Perlmutter supercomputer is in the NERSC machine room, running user jobs.

We have added many early users onto the machine. We hope to add even more users soon. Anyone interested in using Perlmutter may apply using the Perlmutter Access Request Form.

The second phase of the machine, consisting of CPU-only nodes, will arrive early this year. After all the new nodes arrive, all of Perlmutter will be taken out of service and integrated over a period that we anticipate could take up to 8 weeks. We are developing a plan for integration that will reduce the amount of time the entire system is down. We will let you know when this plan is finalized.

This newsletter item will be updated each week with the latest Perlmutter status.

Prepare Your Dotfiles for Perlmutter!

To help ready your account for Perlmutter, please review your dotfiles. The same home file system is mounted across all NERSC systems, so your .bashrc/.cshrc/etc. files (dotfiles) need to work on all systems. The NERSC_HOST variable can help you distinguish between systems and set customizations for each one. NERSC_HOST is set automatically to "perlmutter" on Perlmutter and to "cori" on Cori.

Some users may have older dotfiles that set the NERSC_HOST variable without first checking whether it already has a value, which will cause problems on Perlmutter. Please ensure that this is not the case in your dotfiles; a sketch of the recommended pattern appears below. Feel free to reach out to NERSC consulting with any questions or issues.
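
A minimal sketch of the recommended pattern, assuming a bash-style .bashrc (the customizations themselves are placeholders):

    # Branch on NERSC_HOST rather than overwriting it; NERSC sets it
    # automatically ("cori" on Cori, "perlmutter" on Perlmutter).
    case "$NERSC_HOST" in
        cori)
            # Cori-specific customizations go here
            ;;
        perlmutter)
            # Perlmutter-specific customizations go here
            ;;
    esac

    # If an older dotfile must set NERSC_HOST itself, guard it so an
    # existing value is never overwritten (illustration only):
    # export NERSC_HOST=${NERSC_HOST:-unknown}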

(back to top)


Updates at NERSC

Allocation Year 2021 to 2022 Transition Next Wednesday, January 19

The current Allocation Year (AY21) ends at 7 am Pacific Time on Wednesday, January 19, 2022.

PIs for new and continuing projects have received notification of the status of their project request. Project PIs/Proxies/Membership Managers for continuing projects should select their continuing users before close of business next Tuesday, January 18; this can be done in the project's "Roles" tab. Note that the "alert" triangle will continue to appear at the top of the Iris webpage until the new allocation year begins, even if you have completed this task.

There are a few changes for AY22. Of note:

  • Charge factors have been recalibrated for the new Perlmutter machine. The new currency, "CPU Node Hours", is 1/400th the size of the former "NERSC Hours" currency. One hour on a Cori Haswell node will cost 0.34 CPU Node Hours, and one hour on a Cori KNL node will cost 0.2 CPU Node Hours. When we begin charging for Perlmutter (likely in the second half of the year), one hour on a Perlmutter CPU node will cost 1.0 CPU Node Hours. A worked example appears after this list.
  • Separate allocations for CPU and GPU architectures. Projects will receive separate allocations for CPU and GPU architectures, which cannot be traded or exchanged.
  • New default Python Module at AY Transition. Please see the entry on this topic below.
  • Cori defaults will change in March 2022. Cori defaults will remain the same at the AY transition, but the machine will undergo what we hope will be its final major system upgrade during the March maintenance, at which time the default user environment will be updated.
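
As a worked example of the new charge factors (the 100 node-hour job size below is hypothetical, chosen only for illustration):

    100 node-hours on Cori Haswell   -> 100 x 0.34 = 34 CPU Node Hours
    100 node-hours on Cori KNL       -> 100 x 0.2  = 20 CPU Node Hours
    100 node-hours on Perlmutter CPU -> 100 x 1.0  = 100 CPU Node Hours (once Perlmutter charging begins)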

For more information, please see the Allocation Year Transition page.

New Default Python Module at AY Transition; All Python Modules Will Support conda activate

Python users take note: on January 19, 2022, at the Allocation Year rollover, NERSC will change the default Python and Python3 modules on Cori to python/3.9-anaconda-2021.11. Older Python modules will remain available, but users must specify the full module name to continue using them.

Updates in this module include:

  • Python 3.9
  • Support for conda activate
  • Mamba 0.7.6 (a faster alternative to conda)
  • netcdf4 1.5.8
  • mpi4py 3.1.3
  • authlib 0.15.4 (support for NERSC's Superfacility API)

This module is available via module load python/3.9-anaconda-2021.11, and we encourage you to test it now.
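
A minimal way to try it out on a Cori login node (a sketch; "myenv" is a placeholder for one of your own conda environments):

    # Load the new module and check the interpreter version
    module load python/3.9-anaconda-2021.11
    python --version          # should report Python 3.9.x

    # conda activate is supported by this module; replace "myenv" with
    # the name of one of your existing conda environments
    conda activate myenv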

At the AY transition, we will retroactively change the behavior of all Cori Python modules to support conda activate. Please see these pending updates to our documentation for more details. As always, if you have a question, please contact us via our helpdesk.

Note for pip users: pip packages installed with --user will be placed in $HOME/.local/cori/3.9-anaconda-2021.11 (defined by $PYTHONUSERBASE).
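
For example (the package name here is purely illustrative):

    # Installs under $HOME/.local/cori/3.9-anaconda-2021.11, i.e. $PYTHONUSERBASE
    pip install --user somepackage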

Note that the python/3.9-anaconda-2021.11 module is already the default on Perlmutter and conda activate functionality is already supported there. There are no scheduled Python module changes on Perlmutter.

NERSC Federated Identity Pilot Has Begun!

Berkeley Lab staff can now follow a one-time process to link their Lab identity to their NERSC identity and subsequently use their Lab credentials to log into resources such as Iris, ServiceNow, and the NERSC website.

We anticipate that soon, more than two-thirds of our users will be able to use their institutional login credentials to log into these NERSC services.

The appearance of the NERSC login page for these services has changed: instead of a form requesting your login name and password, you will see a menu where you can choose the institution to use for login.

During this first phase, if you are not Berkeley Lab staff, simply select "NERSC" as the authentication source, and you will be sent to the familiar NERSC authentication form. If you are Lab staff, we encourage you to select the "Berkeley Lab" option and try it out!

(back to top)


Calls for Participation

Please participate in the NERSC Annual User Survey

NERSC has engaged a professional survey company, the National Business Research Institute (NBRI), to conduct our annual user survey. Users can search their inboxes for a reminder email from [email protected], which contains a personalized link to the user survey. Another reminder is coming out tomorrow.

We value your response to the survey, which helps inform future plans for improvements to benefit our users. Please take the survey to let us know what we've done well and how we can better serve you!

(back to top)


Upcoming Training Events

(back to top)


NERSC News

Come Work for NERSC!

NERSC currently has several openings for postdocs, system administrators, and more! If you are looking for new opportunities, please consider the following openings:

  • Scientific Data Architect: Collaborate with scientists to meet their Data, AI, and Analytics needs on NERSC supercomputers.
  • Exascale Computing Postdoctoral Fellow: Collaborate with ECP math library and scientific application teams to enable the solution of deep, meaningful problems targeted by the ECP program and other DOE/Office of Science program areas.
  • Data & Analytics Team Group Lead: Provide vision and guidance and lead a team that provides data management, analytics and AI software, support, and expertise to NERSC users.
  • Cyber Security Engineer: Help protect NERSC from malicious and unauthorized activity.
  • Machine Learning Engineer: Apply machine learning and AI to NERSC systems to improve their ability to deliver productive science output.
  • HPC Performance Engineer: Join a multidisciplinary team of computational and domain scientists to speed up scientific codes on cutting-edge computing architectures.
  • Software Integration Engineer: Develop and support software integration with Continuous Integration in collaboration with ECP.

(Note: You can browse all our job openings by first navigating to https://jobs.lbl.gov/jobs/search/. Under "Business," select "View More" and scroll down to find and select the checkbox for "NE-NERSC".)

We know that NERSC users can make great NERSC employees! We look forward to seeing your application.

Upcoming Outages

  • Cori
    • 01/19/22 07:00-20:00 PST, Scheduled Maintenance
      • Systems will be unavailable during the Allocation Year transition
  • Perlmutter
    • 01/11/22 07:00-20:00 PST, Scheduled Maintenance
    • 01/25/22 07:00-20:00 PST, Scheduled Maintenance
  • HPSS Archive (User)
    • 01/12/22 09:00-13:00 PST, Scheduled Maintenance
  • HPSS Regent (Backup)
    • 01/12/22 09:00-13:00 PST, Scheduled Maintenance
  • Iris
    • 01/19/22 07:00-09:30 PST, Scheduled Maintenance
      • Iris will be unavailable during the Allocation Year transition
  • NoMachine
    • 01/19/22 09:00-10:00 PST, Scheduled Maintenance
      • Engineers will be performing minor internal NoMachine configuration updates.
  • Spin
    • 01/19/22 10:00-11:00 PST, Scheduled Maintenance
      • Rancher 2 workloads and Rancher 1 services using global filesystems may be sluggish or hang while storage servers are upgraded.

Visit http://my.nersc.gov/ for the latest status and outage information.

About this Email

You are receiving this email because you are the owner of an active account at NERSC. This mailing list is automatically populated with the email addresses associated with active NERSC accounts. In order to remove yourself from this mailing list, you must close your account, which can be done by emailing [email protected] with your request.