University of Pittsburgh

October 14, 2010

New computer cluster benefits researchers

Pitt researchers in need of high-performance computing capacity now have a new resource available. The University has purchased a new computer cluster with approximately $500,000 in Provost funding.

Jointly administered by Computing Services and Systems Development (CSSD) and the Center for Simulation and Modeling (SAM), the cluster is housed at CSSD’s Network Operations Center (NOC) in the RIDC Industrial Park. It has been named Frank, after Henry S. Frank, who chaired the chemistry department from 1951 to 1963.

CSSD director Jinx Walton said many Pitt research groups need high-performance computing, noting that the availability of an expandable University cluster saves individual departments from investing in the hardware and the specialized environment such equipment requires.

Walton said the system will be more cost-efficient, although it’s too early to predict what savings could result.

The cluster is made up of 40 computers (“nodes”): 30 with two 6-core Intel Westmere central processing units (CPUs) and 48 GB of RAM, and 10 with four 12-core AMD Magny-Cours CPUs and 128 GB of RAM.

Cores essentially are CPUs joined on a single chip. Multiple cores on one chip can communicate with one another faster than separate CPUs can, explained SAM co-director Ken Jordan. By comparison, today’s desktop computers typically are dual-core, meaning they contain two processors on one chip, he said.
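The idea of putting multiple cores to work at once can be sketched with a short, illustrative example. The snippet below is not software from the Frank cluster; it is a minimal Python sketch showing how a CPU-bound task can be spread across whatever cores a machine has:

```python
import multiprocessing as mp

def square(n):
    # Stand-in for a CPU-bound unit of work.
    return n * n

if __name__ == "__main__":
    numbers = list(range(8))
    # Distribute the work across all available cores.
    with mp.Pool(processes=mp.cpu_count()) as pool:
        results = pool.map(square, numbers)
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

On a dual-core desktop the pool would use two workers; on one of Frank’s 48-core nodes, the same code could use all 48.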

Four of the Westmere nodes have four Nvidia Fermi general-purpose graphics processing units. GPUs can speed up some applications by as much as a factor of 200 compared to calculations performed on a single core of a standard CPU.

Most of the nodes are connected by a high-speed Infiniband network to improve the performance of parallel calculations using simultaneous CPUs across multiple nodes. Under ideal conditions, Infiniband network speeds could exceed Ethernet network speeds by a factor of 30, Jordan said.

The cluster will mainly benefit researchers who use several CPUs at the same time, enabling them to tackle bigger problems and solve them more quickly, Jordan said. It provides an intermediate level of computing power for users whose jobs need the power of 50-100 computers but typically are too small to merit using supercomputing center resources.
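Why parallel jobs speed up less than the raw core count suggests is captured by Amdahl’s law: any part of a job that must run serially caps the overall speedup. The following back-of-the-envelope sketch uses illustrative numbers, not measurements from Frank:

```python
def amdahl_speedup(parallel_fraction, n_cores):
    # Amdahl's law: speedup = 1 / ((1 - p) + p / n),
    # where p is the fraction of the job that can run in parallel.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

# A job that is 95% parallelizable, run on 100 cores:
print(round(amdahl_speedup(0.95, 100), 1))  # 16.8 -- far below the 100x ideal
```

Even a job that is 95 percent parallel gains only about a 17-fold speedup on 100 cores, which is why fast interconnects and careful job design matter on a shared cluster.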

“This is usable to people in many, many departments,” Jordan said, naming engineering, the sciences, the medical school and public health among the areas in which researchers could benefit from the more powerful computing capacity for such increasingly important research purposes as modeling and simulation.

Some initial jobs are being run as SAM staff finish configuring the software to prepare the cluster for broad usage. Jordan said the cluster should be available for general use starting Oct. 20. Prospective users must submit a proposal to seek time on the cluster. Jordan said he expects demand will outstrip the resource, forcing SAM leaders to decide how best to use the cluster’s capacity.

“If usage grows as fast as I think it will, we would need to make the case for the University to expand the center,” Jordan said.

SAM is in charge of scheduling time on the cluster and taking the lead in assisting users; CSSD is the cluster’s caretaker, providing the facility and support, including maintenance and 24-hour monitoring, Walton said.

The NOC has an 850-square-foot space set up with the proper environment for the cluster, which has heavy power requirements and cooling needs.

The NOC has a 1-megawatt transformer installed specifically to support research computing and a dedicated uninterruptible power supply backed up by a 1-megawatt diesel generator. It also has in-row cooling with 160 kilovolt-amperes of capacity dedicated to the research computers.

All equipment, including the power supply and cooling, is monitored 24/7, and enterprise backup systems are in place to support data stored on the cluster. Centralized monitoring and maintenance lets faculty focus on their research, rather than on the equipment, Walton noted.

A search is under way for an engineer experienced in both high-performance computing and Linux operating systems to manage and maintain the system. While some CSSD staff have Linux expertise, the University is seeking someone with high-performance computing management experience as well. “It’s a very specialized field,” Walton said.

Walton anticipates that additional staff will be needed as use of the cluster grows. That need likely would be met through a reallocation of existing resources rather than additional hiring, she said.

SAM is hosting two seminars on Nov. 4 to familiarize users with the new resources. Information on the seminars, the new computer cluster and how to apply for an allocation on this or other SAM research clusters can be found on SAM’s website.

—Kimberly K. Barlow

Filed under: Feature, Volume 43 Issue 4
