
New Frontiers in Evaluation
Evaluating the RTD policy portfolio:
the Austrian experience
Leonhard Jörg
Andreas Schibany
24 April 2006
Road map
• Why portfolio evaluation?
• Basic dimensions for evaluating RTD policy portfolios
• Some observations from Austria
• Limitations and practical problems
Why should we look more systematically at RTD policy portfolios?
... without a portfolio manager
... as long as budgets keep expanding
• The end of the catching-up process is in sight
• Attention may shift again from “how much we spend” to “how we spend”
• There might be quite some room for increasing the effectiveness of the funding system
Some remarks on the context
• Portfolios are not designed on the drawing board but are the result of
• Changing perceptions of needs and problems
• Changing ways of how R&D is undertaken (mode 1 → mode 2)
• Policy making in competitive environments
• There is no optimal portfolio
• Portfolios are usually messy, with single instruments addressing multiple goals
We are looking for improvements rather than for THE optimal portfolio
Road map
• Why portfolio evaluation?
• Basic dimensions for evaluating RTD policy portfolios
• Some observations from Austria
• Limitations and practical problems
Basic dimensions for evaluating RTD policy portfolios (i)
• Coverage:
• What policy goals are covered?
• Are there gaps?
• Proportions:
• Follow the money: How do the financial proportions fit the policy agenda?
• Follow the debate: Does the amount of attention devoted to single instruments correspond to their “importance”?
Basic dimensions for evaluating RTD policy portfolios (ii)
• Appearance/visibility:
• Are differentiations between neighbouring instruments/brands clear to the clients?
• How many brands does the funding system communicate?
• Take the perspective of beneficiaries/clients:
• How many schemes/programmes are available for specific RTD activities of specific groups: one? More than one? None?
• Patterns of usage:
• Which instruments are used in parallel?
• Are there migration patterns between instruments?
What indicates quality?
• Overall R&D performance of the innovation system (hopefully)
• Responsiveness to changing environments and needs
• Interrelation between instruments (supporting complementarity vs. interference and overlapping/competition)
• Interrelation between different levels of RTD policy (regional, national, international)
• Entry rules and conditions for new instruments/programmes
• Exit strategies
Road map
• Why portfolio evaluation?
• Basic dimensions for evaluating RTD policy portfolios
• Some observations from Austria
• Limitations and practical problems
Growing budgets
[Figure: Austria, R&D expenditure by financing sector (Federal and States, enterprise sector, abroad, other), 1990–2006; left axis in million EUR (0 to 7,000), right axis GERD/GDP (0.00 to 3.00)]
Catching-up
[Figure: GERD/GDP (R&D ratio), 1991–2004, for AUT, FIN, DEU, IRL, NLD, the EU-15 and the OECD total; y-axis 0 to 4]
Expanding policy portfolio
[Timeline, 1965–2005, of instruments entering the portfolio: funding of institutions (universities, CROs); bottom-up project funding (ERP, FWF, FFF); research infrastructure and investments; education; first thematic programmes (energy) run by ministries; technology centres; soft measures (coaching, information, IPR); more thematic programmes (transport, Flex-Cim, ...); fiscal measures; programmes ... programmes (Kplus, Kind/net, Fhplus, NW, NANO ...). Associated policy themes: industry structure, critical masses, high-tech sectors, excellence, diffusion, leverage effects, clusters, science-industry linkages]
[Diagram: the Austrian RTD funding system. Policy level: Parliament (committee for science, industry and economic affairs), Government with the ministries BMF, BMWA, BMVIT and BMBWK, and the Austrian Science Council. Programmes/agencies level: ERP Fund, fiscal measures, FFG (structural programmes, thematic programmes, mobility/talent), National Research Fund, Anniversary Fund, CD-L., LB-S, KFI. Performers level: firms (R&D projects, research projects, start-up/IPR/PE/VC support), universities, ARC, AoS, Polyt. Legend: bottom-up project funding, institutional funding (colour of funding ministry), programme funding, catalytic financial measures]
Financial and Fiscal Measures: Objectives and Instruments
[Matrix mapping the instruments (RTD programmes, thematic and functional; institutional funding; fiscal measures; bottom-up project funding) to their primary goals: keeping the baseline; increasing private R&D investment; enhancing entrepreneurship; improving science-industry linkages; creation of excellence poles; improving the innovation support infrastructure; exploiting specific new technology options; broadening the innovation base; improving the quality and relevance of scientific research]
Financial Resources for main funding instruments
[Stacked bar chart, 2000–2003, in million EUR (up to about 1,600): institutional funding (63–71% of the total), direct funding (19–21%) and fiscal measures (10–16%)]
Focus: direct funding
[Stacked bar chart, 2000–2003, in million EUR (up to about 350): direct funding split into bottom-up project funding (falling from 71% in 2000 to 63% in 2003), functional programmes, thematic programmes, mobility/talent, contributions to international research bodies and other]
Observations on the Austrian policy portfolio
• High level of diversification
• Strong in mobilising communities
• Significant improvements in management and evaluation standards
• Fragmentation: a tendency to establish new programmes for ever smaller target groups
• Increasing competition between programmes, competing for beneficiaries
• Lack of portfolio management
Road map
• Why portfolio evaluation?
• Basic dimensions for evaluating RTD policy portfolios
• Some observations from Austria
• Limitations and practical problems
Limitations and practical problems
• International benchmarking:
• New collections of “good practice” examples usually remain vague on the portfolio side (“it’s the recipe, not the ingredients”)
• Information base is dispersed and messy:
• Monitoring routines at programme level can rarely be combined/matched
• Evaluations at programme level usually address the question of external coherence; however, the big picture remains a patchwork
• Where is the customer?
Thank you for your attention!