Biowulf High Performance Computing at the NIH
NIH HPC Systems
HPC systems overview

The High Performance Computing (HPC) group at the National Institutes of Health provides computational resources and support for the NIH intramural research community.

The Biowulf cluster
The Biowulf cluster is a 95,000+ core/30+ PB Linux cluster. Biowulf is designed for the large numbers of simultaneous jobs common in the biosciences, as well as for large-scale distributed-memory tasks such as molecular dynamics. A wide variety of scientific software is installed and maintained on Biowulf, along with scientific databases. See our hardware page for more details. All scientific computation should be run on cluster compute nodes, either as batch jobs or in sinteractive sessions. Compute nodes can access http and ftp sites outside our network via a proxy, so some data transfer jobs can be run on the cluster.
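As a sketch of the batch-job workflow, a minimal Slurm job script might look like the following (the script name, resource values, and module name are illustrative, not Biowulf defaults):

```shell
#!/bin/bash
# myjob.sh -- minimal Slurm batch script (resource values are illustrative)
#SBATCH --job-name=align
#SBATCH --cpus-per-task=4
#SBATCH --mem=8g
#SBATCH --time=04:00:00

# Load an installed application and run it on the compute node
module load bowtie2    # module name assumed for illustration
bowtie2 -p $SLURM_CPUS_PER_TASK -x genome -U reads.fq -S out.sam
```

The script would be submitted from the login node with `sbatch myjob.sh`; for exploratory work, `sinteractive` allocates a compute node for an interactive shell instead.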
The login node
The login node is used to submit jobs to the cluster. Users connect to this system via ssh or NX. No compute-intensive, data-transfer, or large-file-manipulation processes should be run on the login node. This system is for submitting jobs only.
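A typical session looks like this (replace `user` with your NIH username; the hostname is inferred from the cluster's name and should be checked against the HPC documentation):

```shell
# Connect to the Biowulf login node over ssh
ssh user@biowulf.nih.gov

# Once logged in, submit work to the cluster instead of running it here
sbatch myjob.sh           # a batch job
sinteractive --mem=8g     # or an interactive session on a compute node
```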
Helix
Helix is the interactive data transfer and file management node for the NIH HPC Systems. Users should run all such processes (scp, sftp, Aspera transfers, rsync, wget/curl, large file compressions, etc.) on this system. Scientific applications are not available on Helix. Helix is a 48-core (4 x 3.00 GHz 12-core Xeon™ Gold 6136) system with 1.5 TB of main memory running Red Hat Enterprise Linux 7, and has a direct connection to the internet.
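For example, transfers from a local workstation would target Helix rather than the login node (the hostname and remote paths below are illustrative):

```shell
# Copy a single file from your workstation to your HPC data area via Helix
scp reads.fastq.gz user@helix.nih.gov:/data/user/

# Sync a whole directory; on reruns rsync sends only changed files
rsync -av results/ user@helix.nih.gov:/data/user/results/
```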
The helixdrive service allows users on the NIH network to mount their home, data, and shared directories as mapped network drives on their local workstations.
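On a Windows workstation, mapping a drive might look like the following (the server and share names are placeholders, not the real helixdrive address; the helixdrive documentation gives the actual values):

```shell
# Windows Command Prompt: map your HPC home directory to drive letter H:
# (placeholder server/share names -- substitute the real helixdrive path)
net use H: \\hpcserver.example.nih.gov\username
```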
Sciware is a 'software on demand' service that provides scientific software for Windows, Mac, and Linux desktops. Sciware is available to anyone with an HPC account. Software includes Matlab and Mathematica.
Helixweb is a set of web-based scientific tools.
Globus is a file transfer service that makes it easy to move, sync and share large amounts of data within the NIH as well as with other sites.
The http and ftp proxies allow users to fetch data from the internet on compute nodes with tools like wget, curl, and ftp.
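In practice this means exporting the standard proxy environment variables in a job or session before fetching data (the proxy URL below is a placeholder; the HPC documentation lists the real address and port):

```shell
# Route outbound http/ftp traffic through the cluster's proxy
# (placeholder address -- substitute the proxy host from the HPC docs)
export http_proxy=http://proxy.example.nih.gov:3128
export https_proxy=$http_proxy
export ftp_proxy=$http_proxy

# wget and curl read these variables automatically, e.g.:
#   wget https://ftp.example.org/dataset.txt.gz
```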