Research Servers and Clusters

The Research Computing group helps maintain certain research assets. Researchers may have funds to procure various hardware, and we assist them with specifications, shipping, installation, configuration, and ongoing maintenance.

University of Memphis High Performance Research Computing Facility


Cluster Update 2023

During Fall 2023, we upgraded the cluster with additional, newer compute resources. This upgrade begins a new four-year cycle and came online in late January 2024.

The older Intel partition consists of 88 compute nodes with 3520 total CPU cores (40 cores per node), 20736 GB total RAM, and 12 NVIDIA V100 GPUs.

The new AMD partition consists of 32 nodes with 5632 total CPU cores (192 cores per Epyc 9654 node, 64 cores per Epyc 9354 GPU node), 27648 GB total RAM, and 8 NVIDIA A100 GPUs.

Overall, the cluster has 120 compute nodes with 9152 cores, 48384 GB total RAM, and 20 GPUs.


Intel Processors

  • Intel Thin: 78 PowerEdge C6420 dual socket Intel Skylake Gold 6148 compute nodes with 192 GB DDR4 RAM and EDR Infiniband
  • NVIDIA GPU with Intel processors: 6 PowerEdge R740 dual socket Intel Skylake Gold 6148 GPU nodes with 192 GB DDR4 RAM, 2 x NVIDIA V100 GPU and EDR Infiniband - 5120 GPU cores/V100, 10240 GPU cores/GPU node
  • Intel Fat: 2 PowerEdge R740 dual socket Intel Skylake Gold 6148 Fat Memory Nodes with 768 GB DDR4 RAM and EDR Infiniband
  • Intel Large Fat: 2 PowerEdge R740 dual socket Intel Skylake Gold 6148 Nodes with 1.5 TB DDR4 RAM and EDR Infiniband

AMD Processors (new)

  • AMD Thin: 24 PowerEdge R7625 dual socket AMD Epyc Genoa 9654 compute nodes with 768 GB DDR5 RAM and HDR100 Infiniband
  • NVIDIA GPU with AMD processors: 4 PowerEdge R7625 dual socket AMD Epyc Genoa 9354 compute nodes with 768 GB DDR5 RAM, 2 x NVIDIA A100 GPU and HDR100 Infiniband
  • AMD Fat: 4 PowerEdge R7625 dual socket AMD Epyc Genoa 9654 compute nodes with 1.5 TB DDR5 RAM and HDR100 Infiniband
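The overall totals quoted earlier can be cross-checked against the per-partition node specifications above. The short sketch below recomputes them; the partition names are illustrative labels for this calculation, not actual scheduler partition names.

```python
# Hypothetical sanity check (not part of the cluster tooling): recompute the
# quoted cluster totals from the per-partition node specs listed above.

# (node_count, cpu_cores_per_node, ram_gb_per_node, gpus_per_node)
partitions = {
    "intel_thin":      (78, 40, 192, 0),
    "intel_gpu":       (6, 40, 192, 2),    # 2 x NVIDIA V100 per node
    "intel_fat":       (2, 40, 768, 0),
    "intel_large_fat": (2, 40, 1536, 0),
    "amd_thin":        (24, 192, 768, 0),  # Epyc 9654: 2 x 96 cores
    "amd_gpu":         (4, 64, 768, 2),    # Epyc 9354: 2 x 32 cores; 2 x A100
    "amd_fat":         (4, 192, 1536, 0),
}

nodes  = sum(n for n, _, _, _ in partitions.values())
cores  = sum(n * c for n, c, _, _ in partitions.values())
ram_gb = sum(n * r for n, _, r, _ in partitions.values())
gpus   = sum(n * g for n, _, _, g in partitions.values())

print(nodes, cores, ram_gb, gpus)  # 120 9152 48384 20
```

Note that the GPU nodes in the AMD partition use the 32-core Epyc 9354 rather than the 96-core Epyc 9654, which is why the cluster-wide core count comes to 9152 rather than a uniform 192 cores across all 32 AMD nodes.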

Storage

  • Parallel File System: Arcastream PixStor (GPFS) with 60 x 7.68 TB HDD (460.8 TB total raw storage) providing up to 7.5 GB/s read and 5.5 GB/s write performance, and 8 x 15.36 TB SSD (122.9 TB total raw storage) providing up to 80 GB/s read and write speeds. Total raw storage is 583.7 TB.
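As a quick arithmetic check, the raw-capacity figures above follow directly from the drive counts and sizes; the 15.36 TB SSD size is the nominal capacity implied by the quoted 122.9 TB tier total.

```python
# Hypothetical arithmetic check of the PixStor raw-capacity figures quoted
# above. Drive counts and sizes are taken from the text.
hdd_tb   = 60 * 7.68        # 460.8 TB raw HDD tier
ssd_tb   = 8 * 15.36        # 122.88 TB raw SSD tier (quoted as 122.9 TB)
total_tb = hdd_tb + ssd_tb  # 583.68 TB (quoted as 583.7 TB)
print(round(hdd_tb, 1), round(ssd_tb, 1), round(total_tb, 1))
```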

All compute nodes are connected via HDR100/EDR Infiniband (2:1 blocking) and 1 GbE for host/OOB management. Head and login nodes are connected via HDR100 Infiniband and 10 GbE for host/OOB management.


University of Memphis Communication Networks Supporting Research

The University of Memphis Network Services provides networking services to over 20,000 nodes, with connectivity to the main and regional campuses, facilities, on-campus residences, and the Internet. The University operates a gigapop site for regional connections to Internet2 and is a connector for the State of Tennessee SEGP program for Internet2. In addition, the University developed a citywide research and education network consortium that provides connectivity to St. Jude Children’s Research Hospital, the University of Tennessee Health Science Center, LeMoyne-Owen College, and Southwest Tennessee Community College.

The Network Services’ Network Operations Center produces daily and nightly updates and reports for the University.