
October 14, 2010

Pitt No. 31 in research performance survey

Pitt ranked 31st overall among the top 500 universities worldwide, according to a first-time study that purports to measure institutions’ research performance over a 10-year period.

The ranking, titled High Impact Universities, was developed under the auspices of the University of Western Australia. The report was not meant to be a traditional ranking of institutions, the authors said. “These results are however a ranking in the sense that the research performance index for institutions has been reported in decreasing order.”

Commenting on the new study, Provost Patricia Beeson said, “The University is fully committed to supporting an environment conducive to high-impact research and to scholarly and creative output from our outstanding faculty. It is very gratifying to have the far-reaching impact of that scholarship recognized in this worldwide assessment.”

According to the study’s authors, the initial study assessed more than 1,000 universities and more than 5,000 academic areas, with the same five broad areas evaluated at each institution.

The No. 1 ranked institution overall in the study was Harvard, followed by Stanford, MIT, UCLA and Berkeley.

In addition to Pitt, Pennsylvania universities cracking the top 50 overall were Penn (No. 8), Penn State (No. 20) and Carnegie Mellon (No. 44).

The highest-ranked non-U.S. institutions in the study were the University of Cambridge (No. 13), the University of Toronto (No. 14), the University of Oxford (No. 17), Imperial College London (No. 27) and the University of Paris (No. 29).

According to materials accompanying the online top 500 list, the study’s methodology produced a research performance index (RPI) for each university and each of five academic areas (called “faculties” in the study’s terminology).

The report explained, “The RPI is an indicator of the ‘comprehensiveness’ of an institution, which is comprised of equal contributions from each of the five faculties. Each faculty is evaluated according to the ‘quality and consistency’ of its research as indicated by its g-index (a numerical measure of the quality and consistency of research output based on a formula that takes into account citations), which is calculated for its publications over a 10-year period (2000-2009).”

The authors said they examined only measurable outputs (publications and citations) using recognized bibliometric tools (the g-index). Publication and citation data were derived from the internationally accepted Elsevier Scopus database, the world’s largest abstract and citation database of peer-reviewed literature.

The result was a set of g-scores for each academic area of each university, along with a corresponding n-score (for normalized g-score) and a final index value (RPI) for each particular institution.
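
The report’s online materials do not spell out the arithmetic behind the n-score and the RPI. The short sketch below shows one plausible reading consistent with the description above (equal contributions from each of the five faculties): the n-score is assumed to be each institution’s g-score scaled against the top g-score in that academic area, and the RPI the equal-weight average of an institution’s five n-scores. The institution names, g-score values and normalization rule are assumptions made for illustration, not figures or formulas from the study.

```python
# Illustrative g-scores for two hypothetical institutions in the five broad areas.
g_scores = {
    "arts":        {"Univ A": 120, "Univ B": 150},
    "engineering": {"Univ A":  90, "Univ B":  60},
    "life":        {"Univ A": 110, "Univ B": 100},
    "medicine":    {"Univ A": 140, "Univ B":  70},
    "sciences":    {"Univ A":  80, "Univ B":  95},
}

# Assumed n-score: each g-score scaled against the best g-score in its area.
best = {area: max(scores.values()) for area, scores in g_scores.items()}
n_scores = {
    area: {inst: g / best[area] for inst, g in scores.items()}
    for area, scores in g_scores.items()
}

def rpi(institution):
    """Assumed RPI: equal-weight average of the five normalized area scores."""
    values = [n_scores[area][institution] for area in n_scores]
    return sum(values) / len(values)

print(round(rpi("Univ A"), 3), round(rpi("Univ B"), 3))  # -> 0.928 0.815
```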

To calculate an academic area’s g-index, the study listed the area’s publications in decreasing order of citations and kept a running average of citations per publication down the list. The g-index is the largest rank at which that running average is still at least equal to the rank itself. For example, an academic area has a g-index of 9 if its nine most-cited publications average at least nine citations per publication, while its ten most-cited publications average fewer than ten.
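
As a concrete illustration of the calculation just described, the sketch below computes a g-index from a list of hypothetical citation counts; the citation numbers are invented for the example and are not drawn from the study.

```python
def g_index(citations):
    """g-index: the largest rank g such that the g most-cited publications
    average at least g citations each (i.e., total at least g * g)."""
    counts = sorted(citations, reverse=True)  # most-cited first
    total, g = 0, 0
    for rank, cites in enumerate(counts, start=1):
        total += cites
        if total >= rank * rank:  # running average still >= rank
            g = rank
    return g

# Ten hypothetical publications and their citation counts.
print(g_index([30, 18, 15, 12, 9, 7, 4, 3, 1, 0]))  # -> 9
```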

The study included research output in 2000-2009. The types of publications were restricted to journal and conference articles and authored and edited books. Citations could be from any source.

Credit was given to the institution where the work was performed (not the current affiliation of the author/s); credit was given to all institutions involved in the published research; and credit also was given to one or more academic areas within an institution, wherever appropriate.

The new study also includes rankings for research performance by general academic area.

In the study, each university is considered to be composed of five broad “faculties”:

Arts, humanities, business and social sciences. In this category, Pitt tied for No. 28 overall with Princeton and the University of Toronto.

Engineering, computing and technology. Pitt tied for No. 108 overall with Ghent University, Belgium; Osaka University, Japan; State University of New York-Buffalo; the University of Connecticut; the University of Erlangen-Nürnberg, Germany; the University of Sydney, Australia; and Vrije Universiteit Amsterdam (Free University), the Netherlands.

Life, biological and agricultural sciences. Pitt was No. 31 overall in this category.

Medicine, dentistry, pharmacology and health sciences. Pitt ranked No. 16 overall in this category.

Pure, natural and mathematical sciences. Pitt was No. 56 overall in this category.

According to the study’s authors, whether an institution performed well on the overall scale as given by the RPI or in the academic areas as given by the g-index “depends on the ‘impact’ — loosely speaking — of its research output, as determined by the publications it produces and the citations these receive.”

The full study is available at www.highimpactuniversities.com/rpi.html.

—Peter Hart

Filed under: Feature, Volume 43 Issue 4
