Transcript indicators

3. Challenges of bibliometrics
DATA: many problems linked to the collection of data
FIELDS – DISCIPLINES: various classifications, none satisfactory
INDICATORS: which ones are valid/needed to quantify
research performance?
MAPPING: to assess potential rather than past
performance (metrics)
3. Challenges
FIELDS – SCIENTIFIC DISCIPLINES
Various classifications (OECD, national, ISI, Scopus, …)
Problems of coverage
Academic structure might be quite different from the classification
of the database
Context: a research assessment exercise (ULB, school of
economics and business)
Comparison between bibliometric and peer-review
approaches to measuring publications in economics
Issue 1: retrieval of all publications relevant to economics
from the ISI-Thomson database (Web of Science)
Issue 2 : matching the knowledge-based sorting with the
academic delimitation of disciplines in this university
What does the ISI database show about the scientific production in economics of a Belgian
university (ULB)?
Economics
Management
Business, finance
Business
Industrial relations & labor
Planning & development
Social sciences, mathematical methods
History of social sciences
Transportation
International relations
Public administration
Psychology, applied
Health policy & services
Law
Environmental studies
Geography
Operations research & management science
Information science & library science
Social issues
Ergonomics
Urban studies
Political science
Ethics
Hospitality, leisure, sport & tourism
Social sciences, interdisciplinary
Statistics & probability
Health care sciences & services
Public, environmental & occupational health
Engineering, industrial
History
Mathematics, interdisciplinary applications
ARTICLE's SUBJECT
Economics
Other
Retrieval of ULB publications (2001-2008) for all ISI
categories related to economics (JEL classification)
For each category, manual
sorting of publications
4 categories with 100% articles in
economics
26 categories with at least 10%
articles in economics
20 other relevant categories with
less than 10% articles
Energy & fuels
Ecology
And also...
[Bar chart: number of ULB articles per ISI category, scale 0–200]
Agricultural economics & policy - Fisheries - Geosciences - Water resources - Engineering (environmental)
Area studies - Demography - Gerontology - Nursing - Psychology (3 sub-domains) - Ethnic studies - Family
studies - Computer science (3 sub-domains) - Communication - Education & educational research
3. Challenges of scientometrics
INDICATORS: choice is critical!
Repeatable, based on transparent methodology (allow the
institutions to reproduce the methodology in-house)
Capable of identifying comparable levels of research quantity
and/or quality across disciplines
Quantitative (scientific production) vs qualitative (citations)
Impact on researchers behaviour
Australian case study (Harzing, in Economics & Business, 2005)
Indicators used by the government for funding: until 1993,
competitive research grants; since 1994, number of publications
Consequence: high publication quantity (number of papers) but low
publication quality (citations per paper)
Main drawbacks
no control for the size of the institution
databases not complete for the social sciences and humanities (SSH)
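The tension above (quantity vs quality, plus the lack of size control) can be made concrete with a small sketch. This is an illustrative helper, not part of the study: the function name and the "papers per researcher" normalization are assumptions chosen to show how the three simple indicators differ.

```python
def indicator_profile(papers: int, citations: int, staff: int) -> dict:
    """Contrast three simple indicators for one institution:
    - papers: raw output (quantity, the post-1994 Australian criterion)
    - citations_per_paper: a crude quality proxy
    - papers_per_researcher: output normalized by institution size
    (hypothetical helper for illustration only)."""
    return {
        "papers": papers,
        "citations_per_paper": citations / papers if papers else 0.0,
        "papers_per_researcher": papers / staff if staff else 0.0,
    }

# A large institution can rank first on raw papers while
# trailing on both normalized indicators.
print(indicator_profile(200, 400, 50))
```

Ranking the same institutions by each key typically yields different orderings, which is why the choice of indicator is critical.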
3. Challenges of scientometrics
INDICATORS
Which indicator is a valid and robust measure of research
quality for a discipline?
h-index: measures scientific productivity and apparent scientific impact
m-index: measures the quality of the h "best" papers
FNRS study of co-publications (2005-2008)
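The two indicators above can be sketched in a few lines. The h-index definition is standard (largest h such that h papers have at least h citations each); for the m-index, definitions vary in the literature, so the median-of-the-h-core version used below is an assumption, not necessarily the one meant in the slide.

```python
import statistics

def h_index(citations: list[int]) -> int:
    """Largest h such that h papers have >= h citations each."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def m_index(citations: list[int]) -> float:
    """Median citation count of the h-core papers
    (one common definition of the m-index; an assumption here)."""
    h = h_index(citations)
    if h == 0:
        return 0.0
    return statistics.median(sorted(citations, reverse=True)[:h])

# Example: five papers cited 10, 8, 5, 4, 3 times.
print(h_index([10, 8, 5, 4, 3]))  # h = 4
print(m_index([10, 8, 5, 4, 3]))  # median of {10, 8, 5, 4}
```

Note that the h-index mixes quantity and impact: two researchers with very different citation distributions can share the same h, which is what the m-index tries to disentangle.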
3. Challenges
Indicators (a tool for improvement)
To feed research policy and improve performance
Size versus production
Production versus quality
[Scatter plots (Physics & Astronomy): left, institution size vs number
of papers; right, number of papers vs number of citations per paper,
for universities A–K]
6. Pending questions
Research evaluation is useful
But
Needs to be carefully handled
And
Raises several questions
Impact of evaluation on research activity, strategy (researchers,
universities)
Impact of evaluation on research funding (disciplines?)
Risk of missing the emergence of teams and themes (bibliometrics assesses the past)
Choice of methodology (peer review? bottom-up?)
How to feed the research component of the (probably unavoidable)
rankings with indicators developed in support of research policy
Possible role of EUROHORCs