Properties of a percentile-based rating scale needed in bibliometrics are formulated. Based on these properties, P100 was recently introduced as a new citation-rank approach (Bornmann, Leydesdorff, & Wang, in press). In this paper, we conceptualize P100 and propose an improvement which we call P100_. Advantages and disadvantages of citation-rank indicators are noted.
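As a rough illustration of the idea behind a percentile-based citation rank, the following Python sketch assigns each paper a percentile over the distinct citation values in a reference set, with the lowest distinct value mapped to 0 and the highest to 100. The function name and the exact scaling are illustrative assumptions for this sketch, not the published definition of P100 or P100_:

```python
def percentile_rank(citations):
    """Map each citation count to a percentile over the distinct
    citation values in the set: lowest distinct value -> 0,
    highest distinct value -> 100 (illustrative scaling only)."""
    unique = sorted(set(citations))
    if len(unique) == 1:
        # Degenerate case: all papers equally cited.
        return [100.0 for _ in citations]
    step = 100.0 / (len(unique) - 1)
    rank = {value: i * step for i, value in enumerate(unique)}
    return [rank[c] for c in citations]

# Distinct values 0, 1, 5, 10 are spread evenly over [0, 100];
# ties (the two papers with 1 citation) get the same percentile.
print(percentile_rank([0, 1, 1, 5, 10]))
```

Note that because the ranking runs over distinct citation values rather than over papers, ties do not inflate or deflate anyone's percentile, which is one of the properties such scales are designed around.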
Scholarly usage data provide unique opportunities to address the known shortcomings of citation analysis. However, the collection, processing, and analysis of usage data remain an area of active research. This article reviews the state of the art in usage-based informetrics, i.e., the use of usage data to study the scholarly process.
In research policy, effective measures that lead to improvements in the generation of knowledge must be based on reliable methods of research assessment, but for many countries and institutions this is not the case. Publication and citation analyses can be used to estimate the part played by countries and institutions in the global progress of knowledge, but a concrete method of estimation is far from evident. The challenge arises because publications that report real progress of knowledge form an extremely low proportion of all publications; in most countries and institutions such contributions appear less than once per year. One way to overcome this difficulty is to calculate probabilities instead of counting the rare events on which scientific progress is based. This study reviews and summarizes several recent publications, and adds new results that demonstrate that the citation distribution of normal publications allows the probability of the infrequent events that support the progress of knowledge to be calculated.
Multidisciplinary cooperation is now common in research, since social issues inevitably involve multiple disciplines. In research articles, reference information, especially citation content, is an important representation of communication among disciplines. Analyzing how references from different disciplines are distributed within research articles is fundamental to detecting the sources of cited information and identifying the contributions of different disciplines. This work takes articles published in PLoS as its data and characterizes references from different disciplines based on Citation Content Analysis (CCA). First, we download 210,334 full-text articles from PLoS and extract information on their in-text citations. Then, we identify the discipline of each reference in these academic articles. To characterize the distribution of these references, we analyze three characteristics: the number of citations, the average cited intensity, and the average citation length. Finally, we conclude that the distributions of references from different disciplines differ significantly. Although most references come from the natural sciences, the humanities and social sciences play important roles in the Introduction and Background sections of the articles. Basic disciplines, such as mathematics, mainly provide research methods in the PLoS articles. Citations mentioned in the Results and Discussion sections are mainly in-discipline citations, such as citations from nursing and medicine in PLoS.
Accessibility research sits at the junction of several disciplines, drawing influence from HCI, disability studies, psychology, education, and more. To characterize the influences and extensions of accessibility research, we undertake a study of citation trends for accessibility and related HCI communities. We assess the diversity of venues and fields of study represented among the referenced and citing papers of 836 accessibility research papers from ASSETS and CHI, finding that though publications in computer science dominate these citation relationships, the relative proportion of citations from papers on psychology and medicine has grown over time. Though ASSETS is a more niche venue than CHI in terms of citational diversity, both conferences display standard levels of diversity among their incoming and outgoing citations when analyzed in the context of 53K papers from 13 accessibility and HCI conference venues.
The past year has seen movement on several fronts for improving software citation, including the Center for Open Science's Transparency and Openness Promotion (TOP) Guidelines; the Software Publishing Special Interest Group, started at January's AAS meeting in Seattle at the request of that organization's Working Group on Astronomical Software; a Sloan-sponsored meeting at GitHub in San Francisco to begin work on a cohesive research-software citation-enabling platform; the work of Force11 to transform and improve research communication; and WSSSPE's ongoing efforts on software publication, citation, credit, and sustainability. Brief reports on these efforts were shared at the BoF, after which participants discussed ideas for improving software citation, generating a list of recommendations to the community of software authors, journal publishers, ADS, and research authors. The discussion, recommendations, and feedback will help form recommendations for software citation to the publishers represented in the Software Publishing Special Interest Group and the broader community.