Quality Australian Journals in the Humanities and Social Sciences

Australian Academic & Research Libraries

ISSN: 0004-8623 (Print) 1839-471X (Online) Journal homepage: http://www.tandfonline.com/loi/uarl20

To cite this article: Gaby Haddow (2008) 'Quality Australian Journals in the Humanities and Social Sciences', Australian Academic & Research Libraries, 39:2, 79-91, DOI: 10.1080/00048623.2008.10721334

To link to this article: http://dx.doi.org/10.1080/00048623.2008.10721334

Published online: 08 Jul 2013.


QUALITY AUSTRALIAN JOURNALS IN THE HUMANITIES AND SOCIAL SCIENCES

Gaby Haddow

A pilot study was undertaken to test the journal diffusion factor (JDF) as an alternative to journal impact factors (JIFs) for ranking journals. Bibliometric research methods were applied to rank Australian architecture, communications and education journals by the JDF, comparing this with the total number of citations they attract in ISI indexed journals, the proportion of published articles that attracted citations, and publishing characteristics. It was found that the JDF does not provide a comparable alternative to the JIF as a journal ranking method, although it may contribute to our understanding of the nature of a journal. Until further research is conducted, a JDF ranking should be considered an independent measure of journal rank. AARL June 2008 vol 39 no 2 pp 79-91.

Gaby Haddow, Faculty Librarian, Humanities, Curtin University Library, GPO Box U1987, Perth 6845. Email: [email protected]

Australian researchers are under increasing pressure to provide evidence of quality research output, an imperative due in part to the research funding model for universities and to plans for the Research Quality Framework (RQF),[1] which was scheduled for implementation in 2008. Research output takes many forms, but journal articles are chief among them. For researchers in science disciplines, ‘quality’ journals have been equated with those indexed by Thomson Scientific ISI (ISI), and particularly journals with high impact factors. Journal impact factors (JIFs) are published by ISI each year and have gained the attention of research funding bodies, higher education institutions, and academics as a means to rank journals in a field. However, ISI indexes less than 50% of the peer-reviewed journals published world-wide[2] and, of these, less than one third are humanities and social science journals. A further limitation to assessing the ‘quality’ of these journals is that JIFs are not published for humanities journals.


Researchers in humanities and social sciences are characterised by an orientation ‘to their social context and are inherently more national’.[3] Australian researchers are limited in their publishing choices, not only by the number of ISI journals in these disciplines, but also because ISI indexes relatively few Australian journals in which one would expect a national discourse to take place. How, then, are Australian researchers in these disciplines to identify ‘quality’ journals?

This paper reports on a pilot study undertaken to test an alternative to JIFs for ranking journals: the journal diffusion factor (JDF).[4] A journal’s JDF is calculated by determining the average number of citing journals for every 100 citations. Ranking journals using the JDF provides information about the extent of exposure Australian journals gain and may offer a useful tool to assess journals when ISI journals are either inappropriate or unattainable as publishing channels. Journals from three subject areas were investigated: architecture, communications, and education. Bibliometric research methods were applied to rank Australian journals by the JDF, comparing this with an impact factor, the total number of citations they attract in ISI indexed journals, the proportion of published articles that attracted citations, and publishing characteristics.

JOURNAL RANKING METHODS

The earliest reported study into ranking journals, in 1927, used total citation numbers as the primary measure.[5] A range of journal ranking methods has been employed since, including surveys of academics (also called perception studies), cognitive mapping, library holdings, page ranking, and article downloads.[6] Bibliometric research methods, which use citation data alongside other measures, continue to attract researchers studying journal ranking.
Rowlands proposed the JDF (the average number of citing journals for every 100 citations), which provides a measure of the extent to which journals are reaching into their field.[7] This tool was developed further by Frandsen as the new journal diffusion factor (NJDF); Frandsen added the number of journal articles to the equation.[8]

Another bibliometric method, the h-index, has gained currency and been applied in a number of studies.[9] The h-index is calculated by listing publications in order of number of citations. An h-value is 20 when 20 articles have at least 20 citations and the rest of the articles have fewer than 20 citations. Although initially proposed as a method to assess an individual’s impact, the h-index has been successfully applied to rank journal publications.[10]

In Australia, RQF panels have contracted discipline-related groups to create ranked lists of research output. It can be expected that, as with any ranked list, the criteria for inclusion and the methods of ranking applied by the discipline groups will attract debate. Journal rankings derived from discipline-specific panel assessment can be seen in recent work by the European Science Foundation (ESF). The ESF has produced the European Reference Index for the Humanities (ERIH), a set of lists of journals ranked in three tiers for selected humanities disciplines.[11] Expert panels developed guidelines for journal inclusion and sought submissions from ESF member organisations. While the ERIH lists are not intended for use as a measure of journal quality, inevitably a tiered system of ranking journals (A, B and C) suggests just that. In a British Academy report, the authors suggest caution when using the ERIH journal lists, noting the criteria used to develop the lists were unreliable.[12] The ESF clearly states the ERIH lists are ‘not a bibliometric tool … [and] does not encourage using the lists as the only basis for assessment of individual candidates for positions or promotions or of applicants for research grants’.

This statement resonates with Eugene Garfield’s commentary on the use of JIFs. Garfield notes ‘the use of journal impacts in evaluating individuals has its inherent dangers. In an ideal world, evaluators would read each article and make personal judgments’.[13] The important argument here is not about ranked journal lists in and of themselves, but the use to which they are put. Generally, peer review is regarded as the most reliable method to assess research output. This is borne out in the guidelines for the RAE 2008 and the RQF.[14] However, metrics and ranking of outputs will play a role in the RQF, and discussions in the UK suggest metrics may be included in the next RAE.[15]
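The h-index calculation described earlier can be sketched in a few lines. This is an illustrative implementation only, not part of the study’s method:

```python
def h_index(citation_counts):
    """Return the h-index: the largest h such that at least h
    publications have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        # While the article at this rank has at least `rank` citations,
        # h can be raised to that rank.
        if cites >= rank:
            h = rank
        else:
            break
    return h

# A journal where 20 articles have at least 20 citations each, and the
# remaining articles have fewer than 20, yields an h-value of 20.
```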

Very few Australian journal ranking studies have been conducted,[16] and none has applied the JDF calculation to examine the diffusion of Australian humanities and social sciences journals. A metrics approach to the RQF means Australian researchers will benefit in their final assessment ratings if they have a good understanding of where Australian journals are positioned. However, journal ranking lists should not be the sole method of assessing research output. They should be used as a guide only, with peer review providing the main form of assessment.

The objectives of this study are to test the JDF to determine if it is a potentially useful method of ranking Australian journals, and to compare the JDF with other methods of journal ranking. To carry out the analyses involved in the study, ISI citation data were used. The reliance on ISI data demands an explanation of the assumptions being made by the author. Firstly, journal quality does not equate with the number of citations a journal attracts. However, the number of citations a journal receives is an indicator of journal impact. That is, citations signify a journal has gained exposure, and in the case of ISI data, that exposure is primarily in international literature. Secondly, all citations are assumed to be equal. This is particularly important when using data from the ‘cited reference search’ of Web of Science because of the limited amount of information available.

METHODS

Journal Sample, Citations and Publication Data

Bibliometric research methods are central to this study, which required total citations, cited articles, citing journals, and number of published articles as the primary data sources. The JDF calculation is based on data for the total number of citations to Australian journals and the number of different journals citing those journals. Impact factors are calculated from total citations data and articles published. To determine the proportion of articles cited, data for the number of articles published over the period and the number of articles cited were needed. Journal characteristics (for example, format and extent of indexing) provided supplementary data to examine factors that might influence a journal’s rank.

Journals included in the study had to meet the following criteria: they had to be refereed, published in Australia, published for the entire period 2001-2006, and with two or more issues published each year. Ulrich’s Periodicals Directory (Ulrich’s) was searched to identify journals for the study, using the ‘refereed’ and ‘active’ status limits in searches for ‘Australia*’ (as country of publication) and subject heading (‘architecture’, ‘communications’, and ‘education’). Titles issued irregularly and those that had not been published continually over the 2001-2006 period were excluded, as were titles with a primary focus in other subject areas (for example, ‘Accounting and Finance’ was excluded from the education set). On the basis of these criteria, three journals comprised the architecture set, five were included in the communications set, and 55 in the education set. Ulrich’s was also used to collect additional journal data, including: extent of indexing; electronic and/or print format; publisher; joint place of publication; and first year of publication.

The ‘Cited Reference Search’ function of the ISI database Web of Science was used to identify citations to articles published in the Australian journals between 2001 and 2006. These citations were from sources published between 2001 and September 2007. Due to inconsistencies in journal title abbreviations in ISI, the searches involved a variety of truncation strategies. The total number of citations was manually counted, as was the number of articles cited.
The ‘finish search’ and ‘refine’ functions were then used to determine the number of different citing journals. Online and printed sources were consulted to locate data relating to the number of articles published in 2001-2006. For most titles, table of contents pages provided adequate information. Editorials, if listed, were included in the article count, but items such as book reviews, obituaries, letters to the editor, and corrections were excluded. Data for some titles were incomplete, and in these cases an estimate of the number of articles published was made by calculating the mean number of articles published each year from available data for at least four years.

A limitation of the JDF calculation is the tendency for journals with fewer citations to get a high JDF and highly cited journals to have a low JDF.[17] To reduce this potential, all journals with fewer than ten citations were excluded from further analysis. This exclusion left one title in the architecture set, three in the communications set, and 14 titles in the education set. Table 1 lists the journals included in the analysis and notes those that were excluded in later analyses.


Table 1: Journals Included in Initial Analysis

Architecture:
- Architectural Science Review

Communications:
- Australian Journal of Communication
- M-C Journal*
- Prometheus

Education:
- Australasian Journal of Educational Technology
- Australian Journal of Education
- Australian Journal of Language & Literacy*
- Australian Journal of Learning Disabilities*
- Babel*
- Discourse
- Distance Education
- English in Australia*
- Higher Education Research & Development
- International Journal of Music Education
- Mathematics Education Research Journal*
- Prospect*
- Research in Science Education
- Teaching Education

* excluded in later analyses

Data Analyses

In common with the JIF, the JDF is calculated over a two-year publication period, but Rowlands states that a different timeframe can be selected.[18] Six publication years, 2001-2006, were chosen for this study for three reasons. Firstly, compared with citations to science journals, citations to humanities and social science journals tend to have a longer time lag from publication date. Secondly, a shorter period may have resulted in too few citations for application of the analyses. Lastly, the period corresponds with the timeframe for published output required by the RQF.

For each Australian journal, the JDF was calculated using the equation:

    JDF = (Number of different citing journals, 2001-Sept 2007) x 100 / (Number of citations to journal, 2001-Sept 2007)

The JDF analysis for the three communications and 14 education journals indicated that the earlier exclusion of titles with fewer than ten citations had not adequately addressed the tendency for journals with very few citations to have high JDFs. A higher cut-off point, journals with fewer than 25 citations, was applied to reduce this effect, leaving eight journals in the education set and two in the communications set.

As JIFs were unavailable from ISI for all but one journal in the sample, a calculation based on the JIF was conducted for the journals’ 2001-2006 publication period. The calculation applied the following equation:

    Impact factor = (Number of citations to journal, 2001-Sept 2007) / (Number of articles published in journal, 2001-2006)

A further analysis calculated the proportion of articles cited over the period. Information about extent of indexing, format, publisher, and tier rank in the ERIH education list was recorded in a spreadsheet. Numeric data for each journal, such as extent of indexing, were reassigned a rank (for example, 1 to 8 in the education set) to indicate place in the set. Data were analysed using the sort and chart functions of Excel.

RESULTS

Titles in the architecture and communications sets required very little analysis because of the small number of journals meeting all criteria. The architecture set included one title only (Architectural Science Review) and no further analyses were conducted. After exclusions, the communications set comprised two titles, Australian Journal of Communication and Prometheus. An initial analysis of the communications and education sets was conducted to determine if there was a tendency for journals with lower citation numbers to receive a comparatively high JDF, as noted by Frandsen.[19] Figure 1 presents this analysis, showing JDF, percentage of articles cited, and total citations for the titles.

Figure 1: Communications Journals: JDF, % Articles Cited and Total Citations

Prometheus is clearly the higher ranked journal in terms of the data shown in Figure 1, and remains in this position when other factors are examined. The title is ranked first for extent of indexing and impact factor, and is published by a large international publisher. Both journals are available in print and electronic format and neither is indexed by ISI.
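The two measures used in these analyses can be reproduced from raw counts. The sketch below is a minimal illustration of the equations given under Data Analyses; the variable names and example figures are invented, not drawn from the study’s data:

```python
def journal_diffusion_factor(num_citing_journals, total_citations):
    """JDF: average number of distinct citing journals per 100 citations."""
    return num_citing_journals * 100 / total_citations

def impact_style_factor(total_citations, articles_published):
    """The JIF-style ratio used in the study: citations received
    (2001-Sept 2007) over articles published (2001-2006)."""
    return total_citations / articles_published

# Hypothetical journal: 120 citations spread across 48 distinct citing
# journals, from 200 published articles.
jdf = journal_diffusion_factor(48, 120)   # 40.0 citing journals per 100 citations
impact = impact_style_factor(120, 200)    # 0.6 citations per article
```

Note the division built into the JDF: a journal with very few citations but several distinct citing journals can score higher than a heavily cited journal, which is the tendency the citation cut-off points were introduced to limit.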


The same factors were examined for the education titles. Figure 2 shows the analysis for JDF, percentage of articles cited, and total citations for the eight education journals.

Figure 2: Education Journals: JDF, % Articles Cited and Total Citations

Figure 2 confirms the tendency for journals with lower citation numbers to receive a comparatively high JDF. With the exception of the International Journal of Music Education, the journals are ranked almost in reverse order when comparing JDF and total citations, as shown in Table 2.

Table 2: Education Journals Ranked by JDF

| Journal | JDF | Total cites | % Articles cited | Indexing | ISI indexed | IF | Publisher |
|---|---|---|---|---|---|---|---|
| Australian Journal of Education | 1 | 6 | 6 | 1 | X | 5 | |
| Australasian Journal of Educational Technology | 2 | 7 | 7 | 8 | | 7 | |
| Distance Education | 3 | 4 | 4 | 4 | | 4 | Routledge |
| Discourse | 4 | 3 | 2 | 2 | | 3 | Routledge |
| Teaching Education | 5 | 5 | 5 | 7 | | 6 | Routledge |
| Higher Education Research & Development | 6 | 2 | 3 | 4 | | 2 | Routledge |
| Research in Science Education | 7 | 1 | 1 | 4 | X | 1 | Springer |
| International Journal of Music Education | 8 | 8 | 8 | 3 | X | 8 | SAGE |


It is worth noting that the International Journal of Music Education was included in the education set because it just met the total citations cut-off point of 25, and was the lowest ranked journal for total citation numbers. The second last ranked journal by total citations attracted twice as many citations (Australasian Journal of Educational Technology, with 57).

The data were sorted to identify associations between the journal rankings for different factors. When JDF rank was compared with the number of indexes a journal is listed in, a positive association was seen only for the journals ranked first and sixth by JDF. The closest associations found were for percentage of articles cited and total citations, and total citations and impact factor (which are in any case associated by the calculation for impact factor), shown in Table 3.

Table 3: Education Journals Ranked by % Articles Cited and Total Citations

| Journal | % Articles cited | Citations | JDF | IF | Indexing | ISI indexed | Publisher |
|---|---|---|---|---|---|---|---|
| Research in Science Education | 1 | 1 | 7 | 1 | 4 | X | Springer |
| Discourse | 2 | 3 | 4 | 3 | 2 | | Routledge |
| Higher Education Research and Development | 3 | 2 | 6 | 2 | 4 | | Routledge |
| Distance Education | 4 | 4 | 3 | 4 | 4 | | Routledge |
| Teaching Education | 5 | 5 | 5 | 6 | 7 | | Routledge |
| Australian Journal of Education | 6 | 6 | 1 | 5 | 1 | X | |
| Australasian Journal of Educational Technology | 7 | 7 | 2 | 7 | 8 | | |
| International Journal of Music Education | 8 | 8 | 8 | 8 | 3 | X | SAGE |

No association was found between total journal articles cited and JDF, with some journals highly cited by a relatively small set of journals. This finding supports the intention of the JDF, which is to identify the diffusion of a journal, not its impact as measured by citations. An association can be seen between the number of articles cited and total citations. Examining the rankings for extent of indexing and percentage of articles cited revealed no clear association. All titles in the set were available in print and electronic format, six were published by international publishers, and six were assigned Tier B (‘standard international publications with a good reputation among researchers of the field in different countries’) ranking in the ERIH education list.
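The association checking described above (comparing rank lists factor by factor) can also be done programmatically. A minimal sketch using Spearman’s rank correlation, with made-up rank lists rather than the study’s data:

```python
def spearman_rho(ranks_a, ranks_b):
    """Spearman's rank correlation for two rank lists with no ties:
    rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1))."""
    n = len(ranks_a)
    d_squared = sum((a - b) ** 2 for a, b in zip(ranks_a, ranks_b))
    return 1 - 6 * d_squared / (n * (n ** 2 - 1))

# Identical rankings correlate perfectly...
print(spearman_rho([1, 2, 3, 4], [1, 2, 3, 4]))  # 1.0
# ...and fully reversed rankings (the pattern seen between JDF and
# total citations in Table 2) correlate perfectly negatively.
print(spearman_rho([1, 2, 3, 4], [4, 3, 2, 1]))  # -1.0
```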


A final ranked list of the education journals was produced by adding the rank number for each factor examined. These factors were: JDF, citations, percentage of articles cited, number of articles cited, extent of indexing, and impact factor. Table 4 displays the titles ranked using this analysis and provides a comparison with the JDF and impact factor ranks.

Table 4: Education Journals Ranked by All Factors

| Journal | Overall rank | JDF | IF |
|---|---|---|---|
| Research in Science Education | 1 | 7 | 1 |
| Discourse | 2 | 4 | 3 |
| Higher Education Research & Development | 3 | 6 | 2 |
| Distance Education | 4 | 3 | 4 |
| Australian Journal of Education | 5 | 1 | 5 |
| Teaching Education | 6 | 5 | 6 |
| Australasian Journal of Educational Technology | 7 | 2 | 7 |
| International Journal of Music Education | 8 | 8 | 8 |
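The rank-summing method behind the overall list can be sketched as follows. The journal names and factor ranks here are illustrative placeholders, not the study’s full six-factor data:

```python
# Each journal maps to its rank on a set of factors (illustrative values).
factor_ranks = {
    "Journal A": [1, 1, 2, 1],
    "Journal B": [2, 3, 1, 2],
    "Journal C": [3, 2, 3, 3],
}

# Sum the ranks per journal; the lowest total gives the top overall rank.
totals = {title: sum(ranks) for title, ranks in factor_ranks.items()}
overall = sorted(totals, key=totals.get)
print(overall)  # ['Journal A', 'Journal B', 'Journal C']
```

A design note: summing ranks weights every factor equally, which is the simplest aggregation but can let one anomalous factor (such as the JDF here) pull a journal away from its position on all other measures.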

DISCUSSION

Due to the small number of journals meeting the criteria for inclusion in the architecture and communications sets, no conclusions can be drawn from those findings. The sole title in the architecture set means that comparisons or ranking are unavailable. When the two titles in the communications set were examined, one title (Prometheus) was ranked first for all factors. However, the set is too small to reach any valid conclusions. The same might be argued for the education set (eight titles), but some findings have the potential to inform future research using the JDF as a ranking method, and these are outlined in the discussion below. Any reading of these findings should be done with caution and on the basis that this is a pilot study which applied relatively simple analyses.

JDF rank, when examined alongside total journal articles cited, supports the notion that some journals are highly cited by a relatively small set of journals. That is, the journal’s diffusion is limited but may be important to a set of specialised journals.[20] In his JDF analyses, Rowlands reported a ‘very general principle’ that journals in the top third of his list of library and information science (LIS) journals are those with a more general academic and practitioner focus.[21] Titles in the lower third of Rowlands’ list tend to be more specialised and scholarly. The results of this study may be interpreted along similar lines, with the top ranked journals by JDF being Australian Journal of Education, Australasian Journal of Educational Technology, and Distance Education. Journals with a low JDF ranking are Higher Education Research & Development, Research in Science Education, and International Journal of Music Education.


Rowlands found a negative correlation between JDF and impact factor, noting that JDF and IF are statistically independent. Although the same statistical analyses were not applied in this study, the general association between JDF and impact factor ranking supports Rowlands’ finding. Rowlands suggested that calculating JDFs over a longer publication period could provide ‘greater stability’.[22] Further research is required to ascertain if the six-year publication period of this study contributed in this way.

It was speculated that a high JDF, which indicates the extent to which a journal is cited in different journals, might be associated with the number of indexes a journal is listed in. However, the findings indicate only two journals (those ranked first and sixth by JDF) were on the same rank for extent of indexing. In fact, extent of indexing appeared to have no positive association with any other factor examined. A high JDF was not associated with a journal having an international publisher, nor with being indexed by ISI.

When the overall ranking of journals was compared with JDF and impact factor (Table 4), it was impact factor that was positively associated with the overall rank. This finding reflects the study results overall and Rowlands’ findings. That is, the JDF calculation appears to rank journals in a very different order to that produced by the JIF.

The closest association found in the study was journal rank for percentage of articles cited and total citations. Similarly, a positive association was found between the number of articles cited and total citations. These findings seem intuitive, as it might be expected that a higher number of articles cited will naturally result in higher numbers of citations. Also, it is these data that ISI uses to calculate the JIF. Although six of the education titles are assigned Tier B in the ERIH list, four journals excluded from this study’s analyses were also ranked as Tier B journals.
It would be necessary to conduct a qualitative assessment of the Australian education journals in the ERIH before reaching a conclusion from these data. However, there are indications that the British Academy’s concerns about the criteria for inclusion in the ERIH lists are well founded.

Lessons and Limitations of the Study

The data required for this study were drawn primarily from published sources such as ISI and Ulrich’s. Many studies have used ISI data, and most acknowledge the difficulties that arise due to inconsistencies across records in the database. When using the ‘cited reference search’ function to retrieve ISI data these inconsistencies are particularly evident. In some cases, catalogues and indexes were searched to identify volume numbers for the period to ensure the citations listed in ISI were, in fact, to the Australian journals.

Ulrich’s presents different challenges. The subject headings used are generally broad and their assignment is limited, which left the author in some doubt about whether all Australian journals in the subject areas had been identified. Also, past searches of Ulrich’s found that the refereed status data in entries were not always reliable. On the basis of this experience, other data drawn from Ulrich’s should be regarded with reservations about their accuracy.

Although each of the above limitations to the retrieval of reliable data presents problems, they can be corrected to some degree by supplementary searches of other sources. The main weakness of the study lies in the small journal publishing base in Australia. It is clear that a study of this kind is not appropriate in some subject areas because too few journals meet the inclusion criteria, and a ranking exercise therefore becomes meaningless. The ranking results for International Journal of Music Education suggest the exclusion cut-off point of 25 total citations was too low. In most of the analyses carried out, this title’s rank appeared anomalous. A higher cut-off point, possibly 50 or more citations, may produce a more reliable ranked list.

Ideally, the study would have collected citation data to the end of 2007. This would ensure the calculations for JDF and impact factor were aligned with the equations proposed by Rowlands[23] and that used by ISI for JIFs, albeit for a longer period of time. However, given the number of citations identified for the Australian journals over six years, it seems unlikely that the study results have been significantly affected by the loss of three months of citations.

CONCLUSION

Journals ranked by JDF produce very different results from any of the other ranking methods applied in this study. Although these findings are based on a small set of journals, with slightly modified methods, they are not dissimilar to Rowlands’ results for LIS journals.[24] It might be argued that JDF ranking is a useful tool when assessing journals for degree of specialisation and scholarliness (those with a lower ranked JDF) and ‘degree of transdisciplinary reception’ (journals with a higher ranked JDF).[25]
However, the scale and methods of this study mean that only tentative conclusions can be reached about the reliability of JDF ranking for this purpose. To test the relationship between the JDF and other journal ranking methods, a much larger study is required. This may be difficult to achieve in studies of Australian humanities and social sciences journals due to the small journal publishing base in Australia. Future research may be better served by studying all journals published in a subject field and by determining the JDF rank of Australian journals within the larger set. In addition, a study could apply the NJDF and h-index calculations to a set of journals to determine if these journal ranking methods provide useful assessment tools.[26]

In conclusion, the JDF does not provide a comparable alternative to the JIF as a journal ranking method. The JDF may contribute to our understanding of the nature of a journal, but until further research is conducted a JDF ranking should be considered a quite independent measure of journal rank.


NOTES

1. Following the presentation and acceptance of this paper, a Federal Government election in November 2007 brought a change of government in Australia. The new Labor Government cancelled the RQF in December 2007 and proposed a replacement, Excellence in Research Australia (ERA), on 28 February 2008 (see http://www.arc.gov.au/era/default.htm). As details of new research measures are not yet known, this article has not been updated and retains the RQF as the context for discussion.

2. C Steele, L Butler & D Kingsley ‘The Publishing Imperative: The Pervasive Influence of Publication Metrics’ Learned Publishing 2006 vol 19 no 4 pp277-290.

3. D Hicks ‘The Difficulty of Achieving Full Coverage of International Social Science Literature and the Bibliometric Consequences’ Scientometrics 1999 vol 44 no 2 p202.

4. I Rowlands ‘Journal Diffusion Factors: A New Approach to Measuring Research Influence’ Aslib Proceedings 2002 vol 54 no 2 pp77-84.

5. B D Cameron ‘Trends in the Usage of ISI Bibliometric Data: Uses, Abuses, and Implications’ Portal: Libraries and the Academy 2005 vol 5 no 1 pp105-125.

6. J Bollen, H van de Sompel, J A Smith & R Luce ‘Toward Alternative Metrics of Journal Impact: A Comparison of Download and Citation Data’ Information Processing & Management 2005 vol 41 no 6 pp1419-1440; Cameron op cit; J W East ‘Ranking Journals in the Humanities: An Australian Case Study’ Australian Academic & Research Libraries 2006 vol 37 no 1 pp3-16; B Lane ‘Ranking System for Journals’ The Australian 25 April 2007 p34; N Martin ‘Keeping Score: CJR Presents a New Paradigm for Rating Journals’ EContent 2007 vol 30 no 3 p14; I Rowlands & D Nicholas ‘The Missing Link: Journal Usage Metrics’ Aslib Proceedings 2007 vol 59 no 3 pp222-228; R M Shewchuk, S J O’Connor, E S Williams & G T Savage ‘Beyond Rankings: Using Cognitive Mapping to Understand What Health Care Journals Represent’ Social Science & Medicine 2006 vol 62 no 5 [abstract only].

7. Rowlands op cit.

8. T F Frandsen ‘Journal Diffusion Factors: A Measure of Diffusion?’ Aslib Proceedings 2004 vol 56 no 1 pp5-11.

9. T Braun, W Glanzel & A Schubert ‘A Hirsch-type Index for Journals’ Scientometrics 2006 vol 69 no 1 pp169-173; J E Hirsch ‘An Index to Quantify an Individual’s Scientific Research Output’ Proceedings of the National Academy of Sciences 2005 vol 102 pp16569-16572; C Oppenheim ‘Using the H-index to Rank Influential British Researchers in Information Science and Librarianship’ Journal of the American Society for Information Science and Technology 2007 vol 58 no 2 pp297-301.

10. Braun, Glanzel & Schubert op cit.

11. European Science Foundation European Reference Index for the Humanities (ERIH) 2007 at http://www.esf.org/fileadmin/be_user/research_areas/HUM/Documents/ERIH/ESF-ERIH.pdf.

12. British Academy Peer Review: The Challenges for the Humanities and Social Sciences 2007 at http://www.britac.ac.uk/reports/peer-review/report.pdf.

13. E Garfield ‘The History and Meaning of the Journal Impact Factor’ JAMA 2006 vol 295 no 1 p92.

14. RAE2008: Research Assessment Exercise 2007 at http://www.rae.ac.uk; Department of Education Science and Training Research Quality Framework: Assessing the Quality and Impact of Research in Australia March 2005 at http://www.dest.gov.au/sectors/research_sector/publications_resources/profiles/research_quality_framework_issues_paper.htm.

15. RAE2008 op cit; Department of Education Science and Training op cit; Steele, Butler & Kingsley op cit.

16. L Butler ‘Identifying “Highly Rated” Journals: An Australian Case Study’ Scientometrics 2002 vol 53 no 2 pp207-227; East op cit; H K Lawrence The Use of Bibliometric Analysis of Citations to Define and Analyze the Core of Australian Legal Literature Masters Thesis University of New South Wales 1979; P Royle ‘A Citation Analysis of Australian Science and Social Science Journals’ Australian Academic & Research Libraries 1994 vol 25 no 3 pp162-171.

17. Frandsen op cit.

18. Rowlands op cit.

19. Frandsen op cit.

20. Rowlands op cit.

21. ibid p82.

22. ibid p83.

23. ibid.

24. ibid.

25. ibid p83.

26. Frandsen op cit; Hirsch op cit.
