Showing posts with label h-index.

Thursday, December 11, 2008

Impact (humour) factor



by Jorge Cham at PHD Comics.

Now we need something similar for the h-index....

Thursday, June 12, 2008

Assessing Scientific Research and Citation Analysis

A thoughtful report "about the use and misuse of citation data in the assessment of scientific research."

"We do not dismiss citation statistics as a tool for assessing the quality of research—citation data and statistics can provide some valuable information. We recognize that assessment must be practical, and for this reason easily‐derived citation statistics almost surely will be part of the process. But citation data provide only a limited and incomplete view of research quality, and the statistics derived from citation data are sometimes poorly understood and misused. Research is too important to measure its value with only a single coarse tool."

Saturday, November 10, 2007

Bibliometrics and research quality measurement

Anyone involved with impact factors, peer review, citation searching, etc. should peruse this report from Universities UK. Here is the Guardian's brief on it.

Here are some highlighted points from the report:

  • "Bibliometrics are probably the most useful of a number of variables that could feasibly be used to measure research performance.
  • There is evidence that bibliometric indices do correlate with other, quasi-independent measures of research quality - such as RAE grades - across a range of fields in science and engineering.
  • There is a range of bibliometric variables as possible quality indicators. There are strong arguments against the use of (i) output volume (ii) citation volume (iii) journal impact and (iv) frequency of uncited papers.
  • 'Citations per paper' is a widely accepted index in international evaluation. Highly-cited papers are recognised as identifying exceptional research activity.
  • Accuracy and appropriateness of citation counts are a critical factor.
  • There are differences in citation behaviour among STEM and non-STEM as well as different subject disciplines.
  • Metrics do not take into account contextual information about individuals, which may be relevant. They also do not always take into account research from across a number of disciplines.
  • The definition of the broad subject groups and the assignment of staff and activity to them will need careful consideration.
  • Bibliometric indicators will need to be linked to other metrics on research funding and on research postgraduate training.
  • There are potential behavioural effects of using bibliometrics which may not be picked up for some years.
  • There are data limitations where researchers' outputs are not comprehensively catalogued in bibliometrics databases." via Universities UK
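Since these posts keep returning to citation indices, here is a minimal sketch of how two of them are computed from a researcher's list of per-paper citation counts: the 'citations per paper' average mentioned in the report, and the h-index that gives this label its name (the largest h such that h papers each have at least h citations). The function names and the sample data are illustrative, not from any of the sources above.

```python
def citations_per_paper(citations):
    """Average citations per paper (the report's 'widely accepted index')."""
    if not citations:
        return 0.0
    return sum(citations) / len(citations)

def h_index(citations):
    """h-index: the largest h such that h papers have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break  # sorted descending, so no later paper can qualify
    return h

# Hypothetical publication record: citation counts for five papers.
record = [10, 8, 5, 4, 3]
print(citations_per_paper(record))  # 6.0
print(h_index(record))              # 4
```

Note how the two measures can disagree: a single blockbuster paper inflates citations per paper but barely moves the h-index, which is part of why no single coarse tool suffices.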