Transcript: Julian Ellis

Future and Emerging Technologies
INNOVATION WORKSHOP
Exploitation of Neuromorphic Computing (NMC) Technologies
Julian Ellis
3 February 2017
Directorate-General for Communications Networks, Content and Technology
Neuromorphic research has long been funded in FET
• FET Open
• FET Proactive
  o LPS (FP5); Bio-I3 (FP6)
  o Bio-ICT, Brain-ICT, NBIS, EVLIT (FP7)
• FET Flagships (HBP)
• Good example of interdisciplinary research
• Few FET projects focus only on NMC
• Many FET projects include NMC work in Annex 1
Role of NMC in FET projects
• Check models of information processing in the brain
• Electrical interfaces to neurons (PNS/CNS/in vitro)
• Machine vision systems
• New information processing systems (eg learning)
NMC for validating models
• SENSEMAKER (FP5)
Martin McGinnity, U. Ulster, UK
• DAISY (FP6), SECO (FP7)
Henry Kennedy, INSERM, FR
• FACETS (FP6), BRAINSCALES (FP7)
Karlheinz Meier, U. Heidelberg, DE
• SI-ELEGANS (FP7)
Axel Blau, IIT, IT
• HBP (H2020)
EPFL, CH
These projects developed neuromorphic VLSI systems to test
hypotheses from neuroscience experiments and gain insight into
using ("programming") NMC.
NMC for interfacing to neurons
• CORONET (FP7)
Jochen Braun, U. Magdeburg, DE
• RAMP (FP7)
Stefano Vassanelli, U. Padova, IT
• NEBIAS (FP7)
Silvestro Micera, SSSA, IT
• SI-CODE (FP7)
Stefano Panzeri, IIT, IT
• BRAINBOW (FP7)
Michela Chiappalone, IIT, IT
These projects developed neuromorphic interfaces to natural neurons.
NMC for machine vision
• CAVIAR (FP5)
Bernabé Linares-Barranco, IMSE, ES
• EMORPH (FP7)
Chiara Bartolozzi, IIT, IT
• SEEBETTER (FP7)
David San Segundo Bello, IMEC, BE
• VISUALISE (FP7)
Martin McGinnity, U. Ulster, UK
New low-power, high-performance vision sensors
• SCANDLE (FP7-ICT) – sound processing
Sue Denham, U. Plymouth, UK
NMC for machine learning
• ALAVLSI (FP5)
Jochen Braun, U. Magdeburg, DE
• Brain-I-NETS (FP7)
Wolfgang Maass, TU Graz, AT
• NEURAM3 (H2020 - LEIT)
Carlo Reiter, CEA, FR
New ways of processing information
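A central theme in this line of work is spike-based learning, for example spike-timing-dependent plasticity (STDP). The sketch below is a minimal, assumption-laden illustration of a pair-based STDP weight update; the time constant and learning rates are placeholders, not values from the projects above.

```python
# Illustrative only: pair-based STDP weight update (placeholder parameters).
import numpy as np

def stdp_weight_change(pre_spike_times, post_spike_times,
                       a_plus=0.01, a_minus=0.012, tau=20e-3):
    """Sum exponential STDP contributions over all pre/post spike pairs."""
    dw = 0.0
    for t_pre in pre_spike_times:
        for t_post in post_spike_times:
            dt = t_post - t_pre
            if dt > 0:
                # Pre fires before post: potentiation
                dw += a_plus * np.exp(-dt / tau)
            elif dt < 0:
                # Post fires before pre: depression
                dw -= a_minus * np.exp(dt / tau)
    return dw

# Pre spikes slightly before post spikes -> net potentiation (dw > 0)
print(stdp_weight_change([0.010, 0.050], [0.015, 0.055]))
```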
Thank you for your attention!
Any questions?