Measuring Recent Research Performance for Chinese Universities Using Bibliometric Methods

Jia Zhu · Saeed-Ul-Hassan · Peter Haddawy · Qing Xie

Abstract Chinese universities have developed rapidly along with the fast growth of the Chinese economy in recent years. This study measures the academic research performance of Chinese universities using bibliometric methods. The data are extracted from the Scopus database, and our main contribution is to provide meaningful indicators for measuring the academic performance of Chinese universities based on a large number of conference and journal records provided by Scopus. For comparison, we also measure the academic research performance of universities from two other regions, the US and Europe, in order to assess the gap between Chinese universities and the top universities in the world across various subject areas. We first use several popular indicators to measure the research output of universities in seven science subject areas from 2007 to 2010 in terms of the quantity and quality of publications and citations. We then measure the international collaboration, citations, and impact of their research. Based on these indicators, we compute an overall score for each subject category, called the Research Performance Point (RPP), which normalizes research performance so that the research strengths and weaknesses of Chinese universities can be seen more clearly.

Keywords Bibliometrics · Research performance · Chinese universities

J. Zhu: International Institute for Software Technology, United Nations University, Macau. E-mail: [email protected]
S. Hassan: Department of Computer Science, COMSATS Institute of Information Technology, Pakistan
P. Haddawy: International Institute for Software Technology, United Nations University, Macau
Q. Xie: Division of CEMSE, KAUST, Thuwal, Saudi Arabia

1 Introduction

Nowadays, universities strongly influence the society and economy of a country, and this influence spreads worldwide. Chinese universities have developed rapidly along with the fast growth of the Chinese economy in recent years, in terms of both the number of researchers and research funding [15]. As China is one of the leading countries in the fast-growing world [3], it is meaningful to measure the academic research performance of Chinese universities through international comparisons, for the purpose of strengthening

the quality and impact of research. In this paper, we focus on measuring academic research performance using bibliometric techniques and indicators. The data are extracted from the Scopus database (http://www.scopus.com/), a bibliographic database containing abstracts and citations for academic articles. It covers more than 5,000 international publishers and at least 19,500 peer-reviewed journals in the natural, technical, medical, and social sciences [7], and it offers about 20% more coverage than Web of Science (http://portal.isiknowledge.com/) [2]. On the Scopus data, we adopt the Source Normalized Impact per Paper (SNIP) metric to measure research performance; SNIP ensures that citations from journals with a low number of references are weighted properly, which makes comparing journal/conference scores easier [11]. For comparison, we also measure the academic research performance of universities from two other regions, the US and Europe, in order to assess the gap between Chinese universities and the top universities in the world across various subject areas. There are two main contributions of this paper: 1) we adopt a list of existing indicators to measure the recent research performance of Chinese, US, and European universities in terms of the quantity and quality of publications and citations, and analyze the differences; 2) we propose a new score to normalize and evaluate the internationalization of the research performance of Chinese, US, and European universities. The paper is organized as follows: Section 2 presents related work on bibliometric methods, in particular for measuring academic research performance. Sections 3 and 4 present our data and methodology, with results and discussion. Conclusions and future work are outlined in Section 5.

2 Related Work

This section discusses recent bibliometric studies, particularly on measuring research performance. The following review of important previous work is presented in chronological order. Moed et al. [12] first presented the results of a study on the potential of bibliometric data as a tool for university research policy. The authors concluded that bibliometric data can support monitoring for university research management and science policy, even though many problems arise during data collection and handling. Van Raan [13] gave an overview of the potential and limitations of bibliometric methods for assessing strengths and weaknesses in research performance, based on two different approaches: research performance assessment and monitoring of scientific developments. The advanced analysis of publication and citation data in that article provided insight into the position of actors at the research front in terms of
influence and specializations; we apply some of these indicators in our bibliometric study. Zitt et al. [16] focused on determining the internationalized nature of science through the distribution of a journal's authoring and citing countries, as the average profile of science drifts with the level of visibility. The authors proposed experiments on how experimental internationalization indexes and SCI-based science indicators are sampled, which provided useful information for our experiments. Van Leeuwen et al. [8] presented evidence that the value of impact indicators of research activities strongly depends on whether one includes or excludes research publications in SCI-covered journals written in languages other than English. Additional material was gathered to show the distribution of SCI papers among publication languages. In our measurement, we exclude all self-citations and use SNIP values to measure journal level, which avoids the issue that many authors publish articles in SCI journals with low impact factors because the journal's language is Chinese. Liang et al. [9] proposed a model to assess how important the publication-count indicator is in comparison with quality or impact indicators; an extended negative exponential function proved to be an appropriate model for the rank-frequency distribution. The Scopus database we use matches the criteria they used to select databases. Moed [10] focused on using bibliometric techniques to measure research activity in China based on data extracted from the Science Citation Index (SCI). One of the author's main contributions was to split ISI journals into Chinese and non-Chinese journals based on the journal's publishing language in order to measure the research performance of Chinese universities. However, this approach is not comprehensive, as some Chinese journals can also be very good journals, the Journal of Software (http://www.jos.org.cn/ch/index.asp) for example. Therefore, we should measure research performance based on quality, i.e., publications in renowned journals and conferences. Tijssen et al. [14] presented the results of a statistical analysis of the world's major providers of science-based information and services to the business sector. The statistical data were derived from published university-industry co-authored research publications (UICs). The various UIC rankings highlight measurement issues and reveal differences depending on the selected UIC indicator. The UIC indicator offers an interesting source for domestic and international comparisons of research universities. Cao et al. [1] focused on the overall publication activity and the influence of certain disciplines of selected universities based on the Essential Science Indicators (ESI) database. The authors used bibliometric indicators, such as the number of papers and citations, along with three other indicators requiring more complicated calculation.

However, the ESI database is smaller than SCI and Scopus, so it might not fully reflect the performance of universities in China. In addition, their article only discussed the gap between the quality and quantity of research output, not the gap between Chinese universities and world-class universities, which is the main contribution of our paper. Moed [11] explored a new indicator of journal citation impact, denoted source normalized impact per paper (SNIP). It measures a journal's contextual citation impact based on the frequency at which authors cite other papers in their references. SNIP is defined as the ratio of the journal's citation count per paper to the citation potential in its subject field. It aims to allow direct comparison of sources in different subject fields. Citation potentials vary not only between subject categories but also between journals within the same subject category. For instance, basic journals tend to show higher citation potentials than clinical journals, and journals covering emerging topics higher potentials than periodicals in classical subjects or more general journals. SNIP corrects for such differences, which is why we use it in our measurement. Hassan et al. [4] presented case studies based on a system called the Global Research Benchmarking System (GRBS), which provides objective data to benchmark research performance for the purpose of strengthening the quality and impact of research. The system can help universities identify niche areas in which they can make rational strategic decisions. Later, based on the indicators proposed in the GRBS, the authors introduced a new quantitative measure of the international scholarly impact of countries [5]. They presented a case study illustrating the use of the proposed measure in a particular area and found that the international scholarly impact of countries is not necessarily bound to publication output. Most of the indicators introduced in the GRBS are used in our research.

3 Data and Methodology

This study covers seven main research areas in which most universities in the world are currently active: Agricultural & Biological Sciences; Biochemistry, Genetics and Molecular Biology; Chemistry; Computer Science; Engineering; Environmental Sciences; and Physics and Astronomy. The data over the time period 2007-2010 are extracted from the Scopus database, a bibliographic database covering more than 5,000 international publishers and at least 19,500 peer-reviewed journals. We picked the 20 universities from each of China, the US, and Europe with the most publications; to qualify, each university had to publish at least 100 papers in each area. The data snapshots of the selected universities and their publication counts can be found online at http://www.intelligentforecast.com/scim/

The bibliometric indicators used in this study are as follows:

– Total Pubs: Total number of publications during a 4-year time window. If a publication has more than one author from different universities, we count one for each university.
– %Pubs in Top 10% SNIP: Percentage of Total Pubs published in source titles (journals, conference proceedings, and book series) that are within the top 10% of that subject area, based on the source normalized impact per paper (SNIP) of the last year in the time window. Source titles in the top 10% by SNIP value in a given subject category are tier-1, those in the top 11%-25% are tier-2, and the rest are tier-3. For the window 2007-2010, the SNIP values of 2010 are used. The details and definition of SNIP can be found in [11].
– %Pubs in Top 25% SNIP: Percentage of Total Pubs published in source titles that are within the top 25% of that subject area, based on the SNIP value of the last year in the time window.
– Total Cites: Total number of citations within a 4-year time window to papers published in that window. All citation counts in our calculation exclude self-citations.
– %Cites from Top 10% SNIP: Percentage of Total Cites received from publications in source titles that are within the top 10% based on SNIP value.
– %Cites from Top 25% SNIP: Percentage of Total Cites received from publications in source titles that are within the top 25% based on SNIP value.
– 4-year h-index: A university having a 4-year h-index of X means that at least X of its publications (during that 4-year window) have no fewer than X publications citing them (during that window). A 4-year h-index is computed per subject area.
– %International Collaboration: Percentage of Total Pubs with international co-authorship; only publications with at least two authors from two different countries are counted.
– %International Cites: Percentage of International Cites relative to Total Cites, i.e., citations received from papers authored by researchers outside the country in which a given university is located.
– International Impact: A measure of the impact a university's research has outside its country, reflecting the university's ability to attract citations from abroad. It is defined as the ratio of International Cites to the total number of references made by papers in a given field that are authored by researchers from outside the country in which the university is located.
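As an illustration, the %International Collaboration indicator above can be sketched in a few lines; the record format (one list of author countries per publication) is an assumption made for this sketch, not the paper's actual Scopus extraction pipeline.

```python
def pct_international_collaboration(pubs):
    """Percentage of publications whose authors span at least two countries."""
    if not pubs:
        return 0.0
    # A publication counts as international collaboration when its author
    # list covers two or more distinct countries.
    intl = sum(1 for countries in pubs if len(set(countries)) >= 2)
    return 100.0 * intl / len(pubs)

# Hypothetical publications, each listed as its authors' countries.
pubs = [
    ["CN", "CN"],        # domestic collaboration only
    ["CN", "US"],        # international co-authorship
    ["CN"],              # single-author paper
    ["CN", "DE", "CN"],  # international co-authorship
]
print(pct_international_collaboration(pubs))  # 50.0
```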

4 Results and Discussion

We discuss the research performance of Chinese universities in detail in this section. We first measure the quantity and quality of publications and citations, based on various indicators, comparing Chinese universities with US and European universities. We then select several indicators to analyze the internationalization of their research. Finally, a new score is proposed to evaluate overall research performance, with total points for each subject category for Chinese universities.

4.1 Quantity of Publications and Citations 
Since the quantities of publications and citations are simple but important indicators of research performance, this section discusses the number of publications produced by all selected universities and the number of citations those publications received. We first normalized the numbers of publications and citations onto a common scale to enable better comparisons (the original data can be found online at http://www.intelligentforecast.com/scim/). Fig. 1 shows the number of publications in all three regions from 2007 to 2010. As we can see, Chinese universities performed very well in Engineering, publishing nearly three times as many papers as universities in the US and Europe. In Chemistry, Computer Science, and Physics and Astronomy, Chinese universities also outperform the other two regions in terms of the number of publications. However, in Agricultural & Biological Sciences, Biochemistry, Genetics and Molecular Biology, and Environmental Sciences, US universities take the lead without any doubt. In particular, in Biochemistry, Genetics and Molecular Biology, the number of publications from Chinese universities is only around half that of US universities. We also notice that European universities trail US universities in all areas except Chemistry in terms of the number of publications. We propose some intuitive explanations for this phenomenon. Compared to the other regions, China has more researchers and PhD students because of its large population. They tend to choose popular areas, such as Computer Science or Engineering, which are practical and appealing; and in more fundamental, theoretical areas such as Chemistry and Physics, the larger researcher base likewise yields more researchers. Consequently, the numbers of publications in these areas at Chinese universities are naturally larger than in the other regions.
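The normalization step can be sketched as simple max-scaling; this is an assumption for illustration, since the paper does not spell out the exact scaling, though the worked RPP example later in the paper is consistent with dividing each count by the maximum.

```python
def normalize_by_max(values):
    """Scale a list of counts into [0, 1] by dividing each by the maximum."""
    m = max(values)
    return [v / m for v in values]

# Hypothetical publication counts for three regions in one subject area.
counts = [1200, 900, 600]
print(normalize_by_max(counts))  # [1.0, 0.75, 0.5]
```

Max-scaling keeps the region with the largest count at exactly 1.0, so normalized publication and citation counts become directly comparable across subject areas of very different sizes.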

Fig. 1: Normalized number of publications by top 20 Chinese, US and European universities from 2007 to 2010

Though the number of publications is an important factor in measuring a university's research performance, it is also critical to know the impact of those publications, e.g., the number of citations. Fig. 2 shows the normalized number of citations received by the publications of the top 20 Chinese, US, and European universities from 2007 to 2010. Generally speaking, the number of citations should be proportional to the number of publications. However, even in Engineering, the area in which Chinese universities have the most publications, US universities received many more citations than Chinese universities. A similar situation occurs in other areas, e.g., Chemistry and Computer Science. In the remaining areas, e.g., Biochemistry, Genetics and Molecular Biology, US and European universities have extended the advantage they hold in publication counts, receiving nearly 7 and 4 times as many citations as Chinese universities, respectively.

Fig. 2: Normalized number of citations received based on the publications by top 20 Chinese, US and European universities from 2007 to 2010

Obviously, the difference between the number of publications and the number of citations highlights a fundamental gap in the research performance of Chinese universities. To further analyze the results, we apply the city-block distance to calculate the difference for each subject category, as denoted in Equation (1):

d(x, y) = x − y,  for x ≥ y    (1)

where x is the normalized number of citations and y is the normalized number of publications. We are only interested in those subject categories in which universities have a normalized number of citations greater than or equal to the normalized number of publications, as this indicates a positive impact on research performance. Unfortunately, Chinese universities have only one category that meets this condition, while US universities satisfy it in all seven categories. The results are shown in Fig. 3.
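The positive-impact check of Equation (1) can be sketched as follows. The category values below are invented for illustration, except that the Engineering pair happens to reproduce the 0.62 positive impact reported for US universities in Table 1.

```python
def positive_impact(norm_cites, norm_pubs):
    """City-block difference d(x, y) = x - y, reported only when x >= y (Eq. 1)."""
    d = norm_cites - norm_pubs
    # None signals that citations lag publications, i.e. no positive impact.
    return round(d, 2) if d >= 0 else None

# Hypothetical normalized (citations, publications) pairs per subject category.
categories = {
    "Engineering": (0.90, 0.28),
    "Chemistry": (0.75, 0.32),
    "Computer Science": (0.20, 0.45),
}
for name, (x, y) in categories.items():
    print(name, positive_impact(x, y))
```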

Fig. 3: Number of subject categories in which the normalized number of citations is greater than or equal to the normalized number of publications

Table 1 shows, for each subject category, the region with the highest positive impact, calculated using Equation (1). As before, Chinese universities are not listed in any category. A very interesting finding is that US universities have a positive impact of 0.62 in Engineering, which shows that their research performance there is not as weak as the publication counts alone would suggest. European universities are very stable, and their positive impact shows that the number of citations they receive matches their number of publications.

Table 1: Region with the highest positive impact in each subject category

Subject Category                             | Region | Positive Impact
---------------------------------------------|--------|----------------
Agricultural and Biological Sciences         | Europe | 0.02
Biochemistry, Genetics and Molecular Biology | US     | 0.00
Chemistry                                    | US     | 0.43
Computer Science                             | US     | 0.44
Engineering                                  | US     | 0.62
Environmental Sciences                       | Europe | 0.02
Physics and Astronomy                        | US     | 0.09

4.2 Quality of Publications and Citations

In this section, we evaluate the quality of publications and citations in more depth. We first assessed the percentage of publications published in the top 10% of journals/conferences of each subject, based on the source normalized impact per paper (SNIP) value. SNIP was proposed by Prof. Henk F. Moed to normalize the original journal impact factor based on citation potential rather than simply counting citations. It is defined as:

SNIP = RIP / RDCP    (2)

where RIP is the raw impact per paper published in the journal and RDCP is the relative database citation potential. Details about RIP and RDCP can be found in [11]. The results are shown in Fig. 4. Chinese universities are behind US and European universities in all subject areas. For example, only 7.3% of the publications by Chinese universities are in the top 10% by SNIP in Computer Science, roughly one third and one half of the US and European figures, respectively. In addition, in no subject area do Chinese universities have more than 35% of publications in the top 10% by SNIP, whereas the same indicator reaches 58.9% for US universities and 63.3% for European universities, both much higher than for Chinese universities. Comparing this with the publication counts, we observe that although Chinese universities produce large numbers of publications in certain areas, the proportion published in top journals/conferences is relatively low, which also contributes to the low citation counts. The top universities in the US and Europe place a high proportion of their publications in top journals/conferences, reflecting their focus on publication quality. Chinese universities, by contrast, have less than 30% of publications in top journals/conferences, which is the basic reason for their relatively low overall publication quality.
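The top-10%/top-25% tiering behind these indicators can be sketched as a percentile cut over SNIP values; the journal names and SNIP values below are invented for illustration.

```python
def assign_tiers(snip_by_source):
    """Map each source title to a tier by SNIP rank: top 10% -> tier 1,
    top 11-25% -> tier 2, the rest -> tier 3 (the scheme from Section 3)."""
    ranked = sorted(snip_by_source, key=snip_by_source.get, reverse=True)
    n = len(ranked)
    tiers = {}
    for rank, source in enumerate(ranked, start=1):
        pct = rank / n
        tiers[source] = 1 if pct <= 0.10 else (2 if pct <= 0.25 else 3)
    return tiers

# 20 hypothetical sources with strictly decreasing SNIP values.
snips = {f"journal-{i}": 3.0 - 0.1 * i for i in range(20)}
tiers = assign_tiers(snips)
print(tiers["journal-0"], tiers["journal-3"], tiers["journal-10"])  # 1 2 3
```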
A similar situation occurs for the percentage of total cites received from publications in source titles within the top 10% by SNIP of the subject areas from 2007 to 2010, shown in Fig. 5. This indicator denotes the extent of publication recognition. Chinese universities still trail US and European universities in terms of citation quality; however, the gap is not as large as for publication quality. Taking Computer Science again as an example, Chinese universities differ by only 7% from US universities and by 4.1% from European universities in the share of total cites received from publications within the top 10% by SNIP. To complement this analysis, we also measured the percentages of publications and total cites for the top 25% by SNIP. The pattern is similar to the top 10% case, but the gap becomes smaller: in Computer Science, for the share of total cites received from publications within the top 25% by SNIP, Chinese universities match European universities and differ by only 2.7% from US universities.

Fig. 4: Percentage of publications in top 10% SNIP by top 20 Chinese, US and European universities from 2007 to 2010

Fig. 5: Percentage of total cites received from publications that are within top 10% SNIP from 2007 to 2010

Such differences illustrate that publications from Chinese universities mainly fall into the tier-2 and tier-3 levels. We can see that the performance of Chinese universities is developing significantly, but top-level publications still need improvement. The results for the top 25% by SNIP can also be found online at http://www.intelligentforecast.com/scim/. We further assessed the research performance of Chinese universities using the 4-year h-index, as shown in Fig. 6; the h-index is an important figure that measures both the productivity and impact of a scholar's published work. It was first suggested by Hirsch [6] and has been widely adopted, including by Google Scholar (http://scholar.google.com/). The h-index is defined as:

Definition 1 Assume a scientist has Np papers in total. If h of these papers have at least h citations each, and the other (Np − h) papers have no more than h citations each, then his/her h-index is h.

As we can see from Fig. 6, US universities have the highest h-index in all subject areas. For Chinese universities, Chemistry is the area closest to US and European universities, with only around a 10% difference from US universities and a 2.5% difference from European universities.
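Definition 1 translates directly into code; the citation counts below are invented for illustration.

```python
def h_index(citation_counts):
    """h such that at least h papers have >= h citations each (Definition 1)."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        # The i-th most-cited paper must itself have at least i citations.
        if c >= i:
            h = i
        else:
            break
    return h

# Hypothetical citation counts of one university's papers in a 4-year window.
print(h_index([10, 8, 5, 4, 3, 0]))  # 4
print(h_index([]))                   # 0
```

The 4-year variant used in the paper is the same computation restricted to papers published, and citations received, within the 4-year window.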

Fig. 6: Sum of 4-year h-index based on the publications by top 20 Chinese, US and European universities

4.3 Further Measurements

In addition to the existing indicators of research performance, we also use a few indicators representing international collaboration, cites, and impact, since internationalization is an important criterion for research outcomes to be widely recognized. For international collaboration, we adopt the percentage of total publications with international co-authorship to compare universities from China, the US, and Europe; we only count publications with at least two authors from two different countries. For international citations, we only count citations received from papers authored by researchers outside the country in which the given university is located. As shown in Fig. 7 and Fig. 8, European universities clearly lead on these two indicators, thanks to the close collaboration among universities in European countries, while Chinese and US universities are more likely to collaborate with universities in their own country. Generally speaking, International Collaboration and International Cites are important indicators reflecting the extent of research activity and credibility, and they also denote the significance of academic communication. Chinese universities could engage in more collaboration and communication with universities all over the world, especially those in Asia and Oceania.

Fig. 7: Mean percentage of international collaboration of publications by top 20 Chinese, US and European universities

Fig. 8: Mean percentage of international citations received from publications by top 20 Chinese, US and European universities

International impact is an indicator that more precisely describes the quality of research performance, as it indicates the worldwide reputation and credibility of the research. It is defined as the ratio of international cites to the total number of references made by papers in a given field that are authored by researchers from outside the country in which the university is located. From Fig. 9, we can conclude that US universities lead in almost every subject, and that Chinese universities lack worldwide impact. However, in Chemistry, where Chinese universities traditionally perform well, their international impact is similar to that of the other two regions, with a gap of only 0.004 to 0.005.

Fig. 9: Sum of international impact from publications by top 20 Chinese, US and European universities

Next, we summarize the results above and pick the top 3 subject areas for each region from Fig. 7, Fig. 8, and Fig. 9, as shown in Table 2. Several interesting findings can be observed from the table. Firstly, Chinese universities share two subject areas with the other two regions in terms of international contribution: Biochemistry, Genetics and Molecular Biology, and Agricultural and Biological Sciences. In other words, Chinese universities are no different from the others in these two subject categories, and their research there relies strongly on international collaboration. Another interesting finding is that Chemistry is among the top 3 subject categories for Chinese universities in international cites and international impact, but not in international collaboration, which means Chinese universities have very strong local research capability in Chemistry and its outcomes are recognized internationally. Engineering and Computer Science are in a similar situation for Chinese universities. Last but not least, Physics and Astronomy draws our attention because, for Chinese universities, it is not in the top 3 of any international indicator. Given the converse situation at US and European universities, Chinese universities really need to improve internationalization in Physics and Astronomy.

Table 2: Top 3 subject areas for each region by international indicator

Region | International Collaboration | International Cites | International Impact
-------|-----------------------------|---------------------|---------------------
China  | Biochemistry, Genetics and Molecular Biology; Environmental Sciences; Agricultural and Biological Sciences | Biochemistry, Genetics and Molecular Biology; Chemistry; Engineering | Chemistry; Agricultural and Biological Sciences; Computer Science
US     | Physics and Astronomy; Biochemistry, Genetics and Molecular Biology; Agricultural and Biological Sciences | Biochemistry, Genetics and Molecular Biology; Computer Science; Physics and Astronomy | Chemistry; Physics and Astronomy; Agricultural and Biological Sciences
Europe | Biochemistry, Genetics and Molecular Biology; Engineering; Computer Science | Computer Science; Biochemistry, Genetics and Molecular Biology; Engineering | Biochemistry, Genetics and Molecular Biology; Physics and Astronomy; Agricultural and Biological Sciences

Although various indicators exist to evaluate the research performance of different universities, these indicators rank universities in different ways, since each indicator reflects only a limited aspect of research performance. In order to evaluate and rank the overall performance of academic research across universities, we propose a new score, the Research Performance Point (RPP), computed per subject area from all the outcomes of the indicators above. Similar to the methodology of many university rankings, including the Times ranking (http://www.timeshighereducation.co.uk/), RPP accumulates the figures of each indicator, but we assign the same weight to every indicator. The calculation of RPP is:

RPP = Σ_{i=1}^{n} α_i    (3)

where α_i is the normalized score of each indicator. For example, for the number of publications in Computer Science, Chinese universities published the most papers (55,076), while US and European universities published 31,072 and 27,511, respectively. We therefore convert the publication counts to a normalized score of 1 for Chinese universities, and 0.56 and 0.49 for US and European universities, respectively. Since we use 10 indicators in total, the maximum RPP achievable in each subject area is 10. The results given by RPP are shown in Fig. 10.
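Assuming each indicator is max-normalized across the regions before summing, consistent with the worked example above, the RPP calculation can be sketched as follows. The Computer Science publication counts come from the text; the Total Cites figures are invented to complete the example.

```python
def rpp_scores(indicators):
    """Compute RPP (Eq. 3) per region.

    indicators: {indicator_name: {region: raw_value}}.
    Each indicator is normalized by its maximum across regions, then the
    normalized scores are summed with equal weight per indicator."""
    regions = next(iter(indicators.values())).keys()
    scores = {r: 0.0 for r in regions}
    for values in indicators.values():
        m = max(values.values())
        for r, v in values.items():
            scores[r] += v / m
    return scores

indicators = {
    "Total Pubs": {"China": 55076, "US": 31072, "Europe": 27511},
    "Total Cites": {"China": 90000, "US": 180000, "Europe": 120000},  # invented
}
scores = rpp_scores(indicators)
print({r: round(s, 2) for r, s in scores.items()})
```

With two indicators the maximum achievable score is 2; with the paper's 10 indicators the maximum is 10, reached only by a region that leads on every indicator.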

Fig. 10: Sum of RPP for top 20 Chinese, US and European universities

From Fig. 10, we can conclude that, consistent with the other outcomes, US universities dominate nearly all areas except a few, Chemistry for instance. Chinese universities do well in Chemistry; however, they still have a long way to go in the other areas, particularly Agricultural & Biological Sciences and Biochemistry, Genetics and Molecular Biology, where Chinese universities do not even reach half of the total RPP. RPP considers all kinds of factors together, including publication quantity and quality as well as the international reputation of each university, so the overall performance of academic research can be quantified and visualized. Researchers can also more clearly recognize the position and research level of Chinese universities, which may inspire the direction of development of academic research at Chinese universities.

5 Conclusions and Future Work

In this paper, we measure the research performance of Chinese universities using bibliometric methods. We first use several popular indicators to measure the quantity and quality of publications and citations. Compared to US and European universities, Chinese universities have already reached world-class level in terms of the number of publications and citations, particularly in the areas of Engineering and Chemistry. However, we find that the number of citations received by Chinese universities' publications is not in proportion to the number of their publications when compared to US and European universities, which implies that the quality of Chinese universities' publications may not match their quantity. We then investigate the quality of publications and citations using the SNIP value and the h-index. We find that on either SNIP or h-index, Chinese universities still lag well behind US and European universities except in a few subject areas, Chemistry for example. Biochemistry, Genetics and Molecular Biology is the weakest area for Chinese universities, with less than 20% of publications appearing in top journals/conferences. This gap is a fundamental reason for the differences in publication performance and university reputation, so rather than simply producing a high quantity of publications, Chinese universities should focus more on publishing high-level research results. One way to improve the quality of publications is to improve research internationalization. We further measure the international collaboration, cites, and impact of Chinese universities and find that even in their strongest area, Chemistry, their research is not internationalized enough compared to US and European universities, and Physics and Astronomy is particularly weak in this respect. These results suggest that Chinese universities have much room to improve in international communication, and they can seek collaboration with other countries in Asia or Oceania.
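The h-index used in the comparisons above follows Hirsch's standard definition [6]: the largest h such that h of an institution's papers each have at least h citations. A minimal sketch of the computation from a list of per-paper citation counts:

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers have at least h citations each."""
    h = 0
    # Sort citation counts in descending order and find the last rank i
    # at which the i-th most-cited paper still has >= i citations.
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have at least 4 citations each
```

The same routine applies whether the citation list belongs to an individual, a university, or a subject area within a university, which is how it is used for the cross-region comparisons here.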
Last but not least, we introduce and apply the Research Performance Point, which sums the results of all the different indicators to normalize the overall research performance of Chinese universities. We also visualize the comparison results so that other researchers can recognize the position and overall level of Chinese universities in academic research, which can further suggest development directions for Chinese universities. We find that the research level of Chinese universities is very close to that of the US and Europe in the areas of Chemistry and Engineering. In the weaker areas, including the two weakest, Agricultural and Biological Sciences and Biochemistry, Genetics and Molecular Biology, Chinese universities need to learn more from the US and Europe; for example, European universities have the best research in Environmental Sciences and Agricultural and Biological Sciences, even better than US universities. Next, we plan to evaluate the research performance of universities in areas that have become hot in recent years, such as sustainable development and renewable energy. By researching these areas, we aim to identify the top universities so that universities and organizations in fast-growing countries can seek help from them.

References

1. Y. Cao, H. Tong, J. Yu, D. Chen, M. Huang, X. Zhang, Y. Luo, Y. Zhao, and Z. Zhang. Technology management for global economic growth. In Proceedings of PICMET '10, pages 1–8, 2010.
2. M. Falagas, E. Malietzis, and G. Pappas. Comparison of PubMed, Scopus, Web of Science, and Google Scholar: strengths and weaknesses. The FASEB Journal, pages 338–342, 2007.
3. M. Guillén. Multinationals, ideology, and organized labor. The Limits of Convergence, 2003.
4. S. Hassan and P. Haddawy. Global Research Benchmarking System. Responding to the 21st Century Demands for Educational Leadership and Management in Higher Education, SEAMEO RETRAC, 2012.
5. S. Hassan and P. Haddawy. Measuring international knowledge flows and scholarly impact of scientific research. Scientometrics, 94:163–179, 2013.
6. J. E. Hirsch. An index to quantify an individual's scientific research output. PNAS, 102:16569–16572, 2005.
7. Scopus Info. Scopus in detail: What does it cover? Elsevier. Retrieved 2013-01-29.
8. T. van Leeuwen, H. Moed, R. Tijssen, M. Visser, and A. van Raan. Language biases in the coverage of the Science Citation Index and its consequences for international comparisons of national research performance. Scientometrics, 53:249–266, 2001.
9. L. Liang and Y. Wu. Selection of databases, indicators and models for evaluating research performance of Chinese universities. Research Evaluation, 10(2):105–113, 2001.
10. H. Moed. Measuring China's research performance using the Science Citation Index. Scientometrics, 53(3):281–296, 2002.
11. H. Moed. Measuring contextual citation impact of scientific journals. Journal of Informetrics, 4(3):265–277, 2010.
12. H. Moed, W. Burger, J. Frankfort, and A. van Raan. The use of bibliometric data for the measurement of university research performance. Research Policy, 14(3):131–149, 1985.
13. A. van Raan. Advanced bibliometric methods as quantitative core of peer review based evaluation and foresight exercises. Scientometrics, 38:396–420, 1996.
14. R. Tijssen, T. van Leeuwen, and E. van Wijk. Benchmarking university-industry research cooperation worldwide: performance measurements and indicators based on co-authorship data for the world's largest universities. Research Evaluation, 18(1):13–24, 2009.
15. K. Xu. The structure and measurement of Chinese university leadership. University of Colorado, 2008.
16. M. Zitt and E. Bassecoulard. Internationalization of scientific journals: a measurement based on publication and citation scope. Scientometrics, 41:255–271, 1998.
