Library and Bibliometrics Services as a Shared Service 2015


Do the Less Economically Developed (LED) countries have the knowledge to drive a knowledge organization?
A perspective of the National System of Innovation (NSI), June 2016
Daisy Selematsela (PhD)
email: [email protected]
[email protected]
National System of Innovation
• The construct 'national system of innovation' (NSI) is used to characterise a country's collective efforts towards fostering technological innovation (Sibusiso T. Manzini, Sept/Oct 2012).
• Since appearing in the 1996 White Paper on
Science and Technology, the term has been
used widely in South African policy discourses.
National System of Innovation (NSI)
“The broader collective comprising
all public and private sector research
and technology organisations”
Knowledge generation influencers
• SA government transformation objectives - a
knowledge society that competes globally:
• National R&D strategy
• National Plan for Higher Education (NPHE)
• Human Resources Development Strategy for SA
• DST Ten Year Innovation Plan
• National Development Plan
• DoE Research Funding Framework
Knowledge society indicators
• Amount spent on R&D as a percentage of GDP
– Government target 1.5% of GDP by 2019 – current level 0.73%
– 15% of annual investment in R&D performed in SA comes from international
partners.
• Qualitative measurement of use of/access to ICTs
• Ability to produce high-technology exports
• HE internationalization
– “Process to integrate international, intercultural & global dimension into all
aspects of higher education – to enhance the quality of education and
research for all students and staff – to make a meaningful contribution to
society”
• Number of scientists in the country
• Number of patents filed
• Number/impact of articles published in ranked (impact) journals
[Chart] South Africa: international collaboration, 2011-2015 (SciVal)
[Chart] South Africa: international collaboration, 2011-2015 (InCites)
[Chart] South Africa: % industry collaboration, 2011-2015 (InCites)
Knowledge Ecosystem
expectations
Are KM practitioners positioned to
support the knowledge ecosystem?
Emergence of a revised social contract
between science and the state
• Expectations - in return for public funds, scientists
and universities must address the needs of ‘users’
in the economy and society. Furthermore, they
are subject to much more explicit accountability
for the money they receive.
• Implicit is a much more complex model of
innovation than the previous linear model, making it much harder to persuade politicians
of the merits of increasing public spending on
research!
Revised social contract…
• The expectations of governments from science are more direct and concrete… Trust in the mechanisms of scientific self-regulation and in the linear model of innovation has been replaced by
– "benchmarking practices, performance measures and indicators of quality".
Research Evaluation
• “the emerging methods, experiences and
lessons for appraising and evaluating
research”
Levels of Research Evaluation
System level
Institutional level
Programme level
Individual level
Properties of Evaluandi - Systems Level
• Output/ Volume (nr. of papers, doctoral graduates,
patents, etc.)
• Efficiency (nr of ISI papers per researcher, nr of patents per million of population, nr of papers per Rand invested)
• Effectiveness (achievement in terms of national goals as
explicated in science policy statements)
• Relevance (alignment between NSI and country or MDG
goals)
• Comparative performance (i.t.o. some benchmarking indicators, e.g. R&D intensity, papers in ISI per 1000 of labour force, R&D workers per million of population, PhDs in SET per 1000 of age cohort)
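These comparative system-level indicators are simple ratios. A minimal sketch of how they are computed, using illustrative, made-up figures (none of these numbers are actual South African statistics):

```python
# Illustrative, made-up national figures (hypothetical values, not real data).
gerd = 32.0e9          # gross expenditure on R&D, in Rand
gdp = 4.4e12           # GDP, in Rand
papers = 15_000        # indexed papers published in a year
labour_force = 22.0e6  # people in the labour force
rd_workers = 90_000    # full-time-equivalent R&D workers
population = 55.0e6    # total population

# R&D intensity: R&D spend as a percentage of GDP.
rd_intensity = gerd / gdp * 100
# Papers per 1000 of the labour force.
papers_per_1000_labour = papers / (labour_force / 1_000)
# R&D workers per million of population.
rd_workers_per_million = rd_workers / (population / 1_000_000)

print(f"R&D intensity: {rd_intensity:.2f}% of GDP")
print(f"Papers per 1000 of labour force: {papers_per_1000_labour:.2f}")
print(f"R&D workers per million population: {rd_workers_per_million:.0f}")
```

The same division-by-a-denominator pattern covers all of the benchmarking ratios listed above; only the numerator and the scaling base change.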
Institution Level
• Quality/Excellence (quality of research produced) –
also in comparative perspective
• Visibility and recognition (how visible is the university
internationally – usually measured through citation
impact measures)
• Productivity (output/ input e.g. nr of research papers
per researcher/ staff member)
• Efficiency (output per Dollar spent/investment)
• Comparative performance (benchmarking or ranking
measures) i.t.o. output measures, citation measures.
Programme/Project Level
• Quality/Excellence (quality of research produced) –
also in comparative perspective
• Visibility and recognition (how visible is the programme/project internationally – usually measured through citation impact measures)
• Productivity (output/ input e.g. nr of research papers
per researcher/ staff member)
• Efficiency (output per Dollar spent/investment)
• Comparative performance (benchmarking or ranking
measures) i.t.o. output measures, citation measures.
Individual Level
• Productivity (nr of papers per individual scientist, nr of graduates per supervisor, etc.)
• Quality (usually based on peer review/relative rating within NRF system)
• Recognition/visibility (citation measures such as nr of citations per paper, h-index, g-index)
• Influence/ Reputation (measured by invitations to
present keynote addresses/ prizes/membership of
boards/ academies)
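The h-index and g-index mentioned above are both computed from a researcher's per-paper citation counts. A minimal sketch, using hypothetical citation data:

```python
def h_index(citations):
    """h-index: largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def g_index(citations):
    """g-index: largest g such that the top g papers have >= g**2 citations in total."""
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, cites in enumerate(ranked, start=1):
        total += cites
        if total >= rank * rank:
            g = rank
    return g

# Hypothetical citation counts for one researcher's seven papers.
papers = [10, 8, 5, 4, 3, 2, 1]
print(h_index(papers))  # prints 4: four papers have at least 4 citations each
print(g_index(papers))  # prints 5: the top 5 papers have 30 >= 25 citations
```

Because the g-index counts total citations of the top papers, it rewards a few highly cited works more than the h-index does, which is why the two are often reported together.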
State of KM readiness – country to become a preferred
global science & technology investment destination?
• National S&T comparisons – research evaluation
• Scientific fields and disciplines
• Institutional level
• University (Corporate)
• Dept, Institute, Centres (Divisional)
• Research programmes
• Funding instruments
• Collaborators
• Evaluation and rating of researchers
• Grant proposals
• Identification of reviewers
• Interpretation of indicators
• Reliable measures of impact and competitiveness
• Business/competitive intelligence
State of KM readiness (continued)
• Bibliometrics/scientometrics/webometrics/informetrics/altmetrics & other alternative methods of tracking & presenting impact
• Systemic reviews/audits
• Field reviews
• Research assessment exercise
• Departmental reviews
• Performance appraisals/ratings
• Research integrity
• Predatory/deceptive journals
• Pre-evaluation of outputs (bibliographic info)
• Appropriately completed submissions (DHET for subsidy)
• Liaison – Research Offices – Researchers
State of KM readiness (continued)
• Communities of Practice
• Research data management
• Open Science
• Open Data
• Records and document management
• Digitisation/digital preservation
• Intellectual Property
• Copyright licenses – Creative Commons
• Procurement processes
• Contract design
• Terms of Reference/Memorandum of Understanding design
• Protection of Personal Information Act implementation
• Promotion of Access to Information Act
• Policy development, e.g. Data Classification & Handling
• ICT Service Continuity plans
What does this tell us?
• How do we organize KM to better support the knowledge economy?
• How do we demonstrate KM value through impact and usage measurements?
• How do we sell good KM practices?
• How do we capture & communicate the value of KM within the NSI?
We live in the 21st Century – we face financial and climate-change crises and ecological overshoot! If we do not adapt our KM processes, we will not be able to contribute to a sustainable knowledge economy.
Thank you
References
• Department of Science & Technology Budget Vote 2016. The
Star, Pretoria News, The Sunday Independent advertising
feature: May 2016. www.dst.gov.za
• International links key for research and development. Sunday
Independent: 8 May 2016
• Eva Egron-Polak. Higher education internationalization: global
trends and African opportunities. International Association of
Universities. SARIMA 2016 Annual Conference: May 2016
• J Mouton. Centre for Research on Evaluation, Science & Technology (CREST). SARIMA Workshop – Introduction to Bibliometrics: 13 July 2013
• Extracted from InCites™ (Thomson Reuters, 2016)
• Extracted from SciVal ®(2016 Elsevier B.V., 2016)