Infrastructure and Services¶
We provide access to centrally managed computational systems: several High Performance Computing and data analysis clusters. We also offer infrastructure that supports researchers, such as large and secure storage, a high-speed data transfer channel using a demilitarized zone (DMZ), support in the areas of High Performance Computing, parallel programming, and visualization, and training in those areas via workshops and seminars.
Here we summarize the different areas of service for WVU-RC:
High-Performance Computing¶
We operate two High-Performance Computing clusters, and a third has been acquired and should be deployed by the end of the year.
The High Performance Computing facilities are funded by the National Science Foundation EPSCoR Research Infrastructure Improvement Cooperative Agreement #1003907, the state of West Virginia (WVEPSCoR via the Higher Education Policy Commission), the WVU Research Corporation and faculty investments.
The two clusters in operation are called Mountaineer and Spruce Knob. The newest cluster, called Thorny Flat, should be in operation by the end of 2018.
Mountaineer is WVU’s oldest shared cluster. It is a 384-core Intel high-density computing cluster based on Xeon Westmere processors. Each node has 12 cores and 48 GB of RAM, providing an average of 4 GB per core. Storage is provided by a direct-attached SAN unit with 10 TB of formatted disk space, as well as a network-attached storage system with 60 TB of capacity. Mountaineer will stop operation when the new cluster, Thorny Flat, starts.
Spruce Knob is WVU’s current HPC system. It is a 176-node, 3,376-core heterogeneous high-density computing cluster based on Intel Xeon Sandy Bridge, Ivy Bridge, Haswell, and Broadwell processors. Spruce Knob follows a condo model in which faculty members can purchase direct access to nodes on the cluster, making them part owners of the cluster.
Thorny Flat will be the next-generation HPC cluster, with around 111 nodes, more than 4,000 cores, and a number of NVIDIA P6000 GPU cards for extra computing power.
Data Analysis Cluster¶
GoFirst Cluster is a dedicated computing resource for the WVU Business Data Analytics M.S. program, allowing students to gain experience in a controlled, secure cloud-computing environment. GoFirst is built from four compute nodes running a shared HDFS filesystem, used to run Hadoop and Spark jobs with RStudio as a frontend interface.
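As an illustration only, the following is a minimal PySpark sketch of the kind of Hadoop/Spark job GoFirst is built to run: reading a dataset from the shared HDFS filesystem and computing a simple aggregate. The HDFS path, column name, and application name are hypothetical placeholders, and on GoFirst the equivalent workflow is normally driven from RStudio rather than Python.

```python
# Minimal sketch of a Spark job against a shared HDFS filesystem.
# Paths and names below are hypothetical placeholders, not GoFirst-specific.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("gofirst-example")   # hypothetical application name
         .getOrCreate())

# Hypothetical HDFS location of a CSV dataset stored on the cluster.
df = spark.read.csv("hdfs:///data/example/transactions.csv",
                    header=True, inferSchema=True)

# Simple aggregation: count rows per value of a (hypothetical) category column.
df.groupBy("category").count().show()

spark.stop()
```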
Research Exchange¶
REX is a Science DMZ, or demilitarized zone, which is a dedicated “express lane” network for research data traffic within the University’s larger network. It is funded through a nearly $487,000 cyberinfrastructure grant that WVU Research Corp. won in 2014 from the National Science Foundation.
REX gives Information Technology Services the ability to separate research traffic from other Internet traffic, guarantee high-speed Internet2 access for WVU researchers, and facilitate data exchanges with off-campus collaborators. The upgrades also provide WVU researchers with greater access to off-campus resources such as national scientific supercomputing centers. The grant funded the development and deployment of two Data Transfer Nodes, high-performance data transfer “depots” that will improve the ability to move large science data sets. These Data Transfer Nodes have 640TB of raw disk storage, giving researchers a high-speed storage location when transferring large data sets.
HPC storage¶
WVU-RC offers researchers access to two tiers of storage through our DataDirect Networks (DDN) GRIDScaler system. Our standard tier of storage is available free to all users, but users also have the option to purchase dedicated group storage on the system.
The GRIDScaler system provides access to high-speed parallel GPFS storage and currently delivers over 7 GB/s of throughput and 1 PB of raw storage.
All users of HPC systems have access to more than 400 TB of high-speed scratch storage. Scratch storage is for temporary storage of files and gives researchers a place to process large amounts of data. In addition, each user is provided 10 GB of home directory space and 10 GB of group storage space upon request.
Some researchers prefer to have dedicated group storage on the HPC cluster to store large amounts of data for processing without the fear of it being removed. These researchers can purchase dedicated storage for their group at $189/TB per year. This also offers an easy way to share data between researchers in the same group.
Data Depot¶
The WVU Research Data Depot is a centrally managed, reliable, secure, and fast data storage system specifically designed to meet the university’s diverse research storage needs. Designed to handle files of all sizes, from small to very large, it gives researchers who use this service access to their data both on and off campus, and it can also be used to collaborate with researchers outside of WVU.
ITS designed the Data Depot to be easy to use. Researchers can drag and drop files through an interface they are accustomed to using, such as the Windows, OS X, or Linux file managers. Command-line tools such as sftp, and Linux commands such as mount, can also be used to access the files on lab PCs and servers.
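As an illustration of the command-line access mentioned above, the sketch below shows one way to reach Data Depot-style storage over SFTP from a script, using the Python Paramiko library. The hostname, account name, and remote paths are hypothetical placeholders; the actual endpoint and authentication details are provided by WVU-RC.

```python
# Minimal sketch of scripted SFTP access to remote research storage.
# Hostname, username, and paths are hypothetical placeholders.
import paramiko

host = "datadepot.example.wvu.edu"   # hypothetical hostname
username = "my_wvu_username"         # hypothetical account name

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(host, username=username)  # assumes key-based authentication is configured

sftp = client.open_sftp()
sftp.get("/depot/mygroup/results.csv", "results.csv")          # download a file
sftp.put("local_data.csv", "/depot/mygroup/local_data.csv")    # upload a file
sftp.close()
client.close()
```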
More information on: Research Data Depot
Seminars and workshops¶
WVU-RC supports the mission of educating users on High Performance Computing, Parallel Programming, and Data Analysis via seminars and workshops.