Wednesday, September 10, 2014

Waltman, L., Yan, E., & van Eck, N. J. (2011). A recursive field-normalized bibliometric performance indicator: An application to the field of library and information science. Scientometrics, 89(1), 301-314.

When citations are used as the basis for measuring research performance, two approaches are commonly taken. One normalizes citation counts according to a field classification scheme; the other applies recursive citation weighting. The first approach recognizes that fields differ in citation density (the average number of citations per publication) and therefore require adjustment. Within this approach there are two ways to adjust: one based on the field to which a publication is assigned, the other based on the number of references in citing publications or citing journals, which is also known as source normalization. The second, recursive approach holds that citations from influential publications, prestigious journals, and renowned authors are more valuable than others.
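The classification-based normalization idea can be sketched in a few lines of Python. All journals, fields, and citation counts below are made up for illustration; in practice the field means come from the whole database, not from the unit being evaluated.

```python
# Sketch of classification-based field normalization (MNCS-style):
# each publication's citation count is divided by the mean citation count
# of its field, and a unit's score is the average of these ratios.
def mncs(publications, field_means):
    """publications: list of (field, citations) pairs for one unit;
    field_means: field -> mean citations per publication in that field."""
    ratios = [c / field_means[f] for f, c in publications]
    return sum(ratios) / len(ratios)

# Hypothetical journal with three papers in two fields.
pubs = [("LIS", 10), ("LIS", 2), ("CS", 12)]
means = {"LIS": 4.0, "CS": 8.0}
print(mncs(pubs, means))  # (2.5 + 0.5 + 1.5) / 3 = 1.5
```

A score of 1.0 means the unit performs at the world average of its fields, which is what makes scores comparable across fields with different citation densities.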

Previous work can be divided into the following four types, according to whether normalization is based on a field classification scheme or on source normalization, and whether a recursive weighting mechanism is used:


It turns out that no existing method combines classification-based normalization with a recursive mechanism, so this study proposes a citation-impact indicator that does exactly that: normalization by a classification scheme together with recursive weighting.

This study takes library and information science (LIS) as its example, analyzing the citation impact of the journals and research institutes in this field. The LIS field is delineated using the Journal of the American Society for Information Science and Technology (JASIST) as the seed journal: based on co-citation data, the journals most strongly related to JASIST were selected, restricted to those classified under the Web of Science subject category Information Science & Library Science, yielding 47 journals. Together with JASIST, these 48 journals represent the LIS field. Using bibliographic coupling data among these journals and the VOS clustering algorithm, they were grouped into three clusters: Library Science, Information Science, and Scientometrics. The journal titles and their cluster assignments are shown in the table below:


From these 48 LIS journals, all publications of document type Article or Review published between 2000 and 2009 were selected for analysis, 12,202 in total.


First, LIS as a whole is treated as a single field. TABLE 4 lists the top ten LIS journals according to the MNCS indicator, with α set to 1 (i.e., no recursive mechanism) and to 20 (where the recursion has converged). For α = 1, the top ten consist mainly of information science and scientometrics journals; library science accounts for only three (ranks 4, 8, and 10). For α = 20, the top ten are almost entirely information science and scientometrics journals, with only one library science journal left, at rank 9. Evaluating research institutes with the MNCS indicator likewise shows that institutes whose main research activity is scientometrics obtain better rankings at α = 20.


When LIS is instead split into the three subfields for the computation, the top ten LIS journals in TABLE 6 show a much more balanced distribution across the three subfields.



Two commonly used ideas in the development of citation-based research performance indicators are the idea of normalizing citation counts based on a field classification scheme and the idea of recursive citation weighting (as in PageRank-inspired indicators).

Our empirical analysis shows that the proposed indicator is highly sensitive to the field classification scheme that is used. The indicator also has a strong tendency to reinforce biases caused by the classification scheme.

One stream of research focuses on the development of indicators that aim to correct for the fact that the density of citations (i.e., the average number of citations per publication) differs among fields.

One approach is to normalize citation counts for field differences based on a classification scheme that assigns publications to fields (e.g., Braun and Glänzel 1990; Moed et al. 1995; Waltman et al. 2011). The other approach is to normalize citation counts based on the number of references in citing publications or citing journals (e.g., Moed 2010; Zitt and Small 2008). The latter approach, which is sometimes referred to as source normalization (Moed 2010), does not need a field classification scheme.
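A minimal sketch of the source-normalization idea, with hypothetical journals and reference-list lengths. This is not the exact formula of Moed (2010) or Zitt and Small (2008); it only shows the core move of weighting each citation by the citing side's reference density instead of consulting a field classification.

```python
# Sketch of source normalization: each incoming citation is weighted by
# the reciprocal of the citing journal's average reference-list length,
# so fields with long reference lists do not inflate citation counts.
def source_normalized_count(citing_journals, avg_refs):
    """citing_journals: one entry per citation received;
    avg_refs: journal -> average number of references per paper."""
    return sum(1.0 / avg_refs[j] for j in citing_journals)

# A paper cited twice from a reference-dense journal and once from a
# sparse one (all numbers hypothetical).
print(source_normalized_count(["J1", "J1", "J2"], {"J1": 40.0, "J2": 10.0}))
# 1/40 + 1/40 + 1/10 = 0.15
```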

A second stream of research focuses on the development of recursive indicators, typically inspired by the well-known PageRank algorithm (Brin and Page 1998). ...  The underlying idea is that a citation from an influential publication, a prestigious journal, or a renowned author should be regarded as more valuable than a citation from an insignificant publication, an obscure journal, or an unknown author.
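The recursive idea can be illustrated with a standard PageRank-style iteration over a journal-to-journal citation matrix. The 3-journal matrix and the damping factor below are illustrative, not from the paper.

```python
import numpy as np

# Sketch of PageRank-style recursive weighting: C[i][j] is the number of
# citations from journal i to journal j; a journal's weight is fed by the
# weights of the journals citing it, iterated to a fixed point.
def recursive_weights(C, damping=0.85, iters=100):
    C = np.asarray(C, dtype=float)
    P = C / C.sum(axis=1, keepdims=True)   # row-stochastic transition matrix
    n = len(C)
    w = np.full(n, 1.0 / n)                # start from uniform weights
    for _ in range(iters):
        w = (1 - damping) / n + damping * (w @ P)
    return w

C = [[0, 5, 1],
     [3, 0, 2],
     [1, 4, 0]]
print(recursive_weights(C).round(3))  # weights sum to 1
```

A citation from a high-weight journal contributes more to the cited journal's weight than one from a low-weight journal, which is exactly the popularity-versus-prestige distinction mentioned below.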

It is sometimes argued that non-recursive indicators measure popularity while recursive indicators measure prestige (e.g., Bollen et al. 2006; Yan and Ding 2010).

To test our recursive MNCS indicator, we use the indicator to study the citation impact of journals and research institutes in the field of library and information science (LIS).

We focus on the period from 2000 to 2009. Our analysis is based on data from the Web of Science database.

We first needed to delineate the LIS field. We used the Journal of the American Society for Information Science and Technology (JASIST) as the ‘seed’ journal for our delineation. We decided to select the 47 journals that, based on co-citation data, are most strongly related with JASIST. Only journals in the Web of Science subject category Information Science & Library Science were considered. JASIST together with the 47 selected journals constituted our delineation of the LIS field.
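The co-citation-based seed expansion can be sketched as follows. The reference lists and journal names are hypothetical; the actual delineation used Web of Science co-citation data and the subject-category filter described above.

```python
from collections import Counter

# Sketch of co-citation seed expansion: count how often each journal is
# cited together with the seed journal in the same reference list, then
# keep the most strongly related journals.
def cocitation_with_seed(reference_lists, seed):
    counts = Counter()
    for refs in reference_lists:
        journals = set(refs)
        if seed in journals:
            counts.update(journals - {seed})
    return counts

refs = [["JASIST", "Scientometrics", "JOI"],
        ["JASIST", "Scientometrics"],
        ["Library Quarterly", "JOI"]]
print(cocitation_with_seed(refs, "JASIST").most_common())
# [('Scientometrics', 2), ('JOI', 1)]
```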

From the journals within our delineation, we selected all 12,202 publications in the period 2000–2009 that are of the document type ‘article’ or ‘review’.

We first collected bibliographic coupling data for the 48 journals in our analysis. Based on the bibliographic coupling data, we created a clustering of the journals. The VOS clustering technique
(Waltman et al. 2010), available in the VOSviewer software (Van Eck and Waltman 2010),
was used for this purpose. We tried out different numbers of clusters. We found that a solution with three clusters yielded the most satisfactory interpretation in terms of well-known subfields of the LIS field. We therefore decided to use this solution. The three clusters can roughly be interpreted as follows. The largest cluster (27 journals) deals with library science, the smallest cluster (7 journals) deals with scientometrics, and the third cluster (14 journals) deals with general information science topics.
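The bibliographic coupling data underlying the clustering can be sketched in one function. The reference sets are hypothetical, and the VOS clustering step itself (available in VOSviewer) is not reproduced here.

```python
# Sketch of bibliographic coupling: the coupling strength of two journals
# is the number of cited references their publications share; journals
# with strong coupling end up in the same cluster.
def coupling_strength(refs_a, refs_b):
    return len(set(refs_a) & set(refs_b))

a = {"r1", "r2", "r3", "r4"}   # references cited by journal A
b = {"r2", "r4", "r5"}         # references cited by journal B
print(coupling_strength(a, b))  # 2 shared references
```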

We first consider the case of a single integrated LIS field. The recursive MNCS indicator is said to have converged for a certain α if there is virtually no difference between values of the αth-order MNCS indicator and values of the (α + 1)th-order MNCS indicator. For our data, convergence of the recursive MNCS indicator can be observed for α = 20. In our analysis, our main focus therefore is on comparing the first-order MNCS indicator (i.e., the ordinary non-recursive MNCS indicator) with the 20th-order MNCS indicator.
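The convergence behavior can be illustrated with a simplified stand-in for the recursive MNCS: a journal-level fixed-point loop over a toy citation matrix in a single field. This is not the paper's exact αth-order formula (which works at the publication level with field-specific expected values); it only shows how each order re-weights citations by the previous order's scores and how consecutive orders are compared for convergence.

```python
import numpy as np

# Simplified recursive-MNCS loop (hypothetical single-field data):
# C[i][j] = citations from journal i to journal j. At each order, incoming
# citations are weighted by the citers' current scores and re-normalized so
# the field mean is 1; iteration stops when consecutive orders agree.
def recursive_mncs(C, iters=20, tol=1e-9):
    C = np.asarray(C, dtype=float)
    w = np.ones(len(C))                    # order 1: every citation counts 1
    for _ in range(iters):
        weighted = w @ C                   # citations weighted by citer scores
        new = weighted / weighted.mean()   # normalize: field mean = 1
        if np.allclose(new, w, atol=tol):  # convergence test between orders
            break
        w = new
    return w

C = [[0, 5, 1],
     [3, 0, 2],
     [1, 4, 0]]
print(recursive_mncs(C).round(3))
```

As in the paper, the scores of the low orders already resemble the converged ones, and the mean score stays at 1, so the indicator remains comparable to the non-recursive MNCS.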

In the case of the first-order MNCS indicator, the top 10 consists of journals from all three subfields. However, journals from the information science and scientometrics subfields seem to slightly dominate journals from the library science subfield.

Let’s now turn to the top 10 journals according to the 20th-order MNCS indicator. This top 10 provides a much more extreme picture. The top 10 is now almost completely dominated by information science and scientometrics journals. There is only one library science journal left, at rank 9.

The top 10 institutes according to both the first-order MNCS indicator and the 20th-order MNCS indicator are listed in Table 5. Comparing the results of the two MNCS indicators, it is clear that institutes which are mainly active in the scientometrics subfield benefit a lot from the use of a higher-order MNCS indicator.

In Table 6, the top 10 journals according to both the first-order MNCS indicator and the 20th-order MNCS indicator are shown. ...  Comparing Table 6 with Table 4, it can be seen that library science journals now play a much more prominent role, both in the case of the first-order MNCS indicator and in the case of the 20th-order MNCS indicator. As a consequence, the top 10 journals now look much more balanced for both MNCS indicators.
