Karlheinz Meier

Local learning capabilities will lead to a breakthrough of brain-inspired computing

Brain-inspired computer architectures have been explored for many decades. They have demonstrated several computational principles but have never shown the dramatic superiority over conventional computers that was initially expected.

The recent revival of classical convolutional neural networks has shown that spectacular performance can be reached with network sizes and depths that can now be trained with traditional learning algorithms because of the availability of very high-performance computer systems, graphics cards in particular. On close inspection it is clear, however, that the training is extremely costly in energy, but even more importantly in time.

I will argue that spiking neurons offer computational advantages over non-spiking ones and that local learning capabilities on the computing substrate will accelerate learning by substantial factors, leading to systems that can efficiently perform computations according to various principles of neural computing. Deep learning is just one example.
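To make the notion of local learning on a spiking substrate concrete, the following is a minimal sketch, not taken from the talk: a single leaky integrate-and-fire (LIF) neuron whose input synapse adapts via a pair-based spike-timing-dependent plasticity (STDP) rule. All parameter names and values are illustrative assumptions; the point is only that the weight update uses nothing but locally available quantities (pre- and post-synaptic spike traces), so no global error signal or training loop is needed.

```python
# Illustrative sketch only: LIF neuron with one plastic synapse (pair-based STDP).
# All parameters are arbitrary assumptions, not values from the talk.

def simulate(pre_spike_times, t_max=100.0, dt=1.0,
             tau_m=20.0, v_thresh=1.0, v_reset=0.0,
             w0=0.5, a_plus=0.05, a_minus=0.06, tau_stdp=20.0):
    """Return (post_spike_times, final_weight)."""
    v = 0.0        # membrane potential
    w = w0         # synaptic weight
    x_pre = 0.0    # pre-synaptic spike trace (decays with tau_stdp)
    x_post = 0.0   # post-synaptic spike trace
    pre = set(pre_spike_times)
    post_spikes = []
    t = 0.0
    while t < t_max:
        # leak: membrane potential and both plasticity traces decay
        v -= (v / tau_m) * dt
        x_pre -= (x_pre / tau_stdp) * dt
        x_post -= (x_post / tau_stdp) * dt
        if t in pre:
            v += w                   # synaptic input
            x_pre += 1.0
            w -= a_minus * x_post    # pre after post: depression (local)
        if v >= v_thresh:
            post_spikes.append(t)
            v = v_reset
            x_post += 1.0
            w += a_plus * x_pre      # post after pre: potentiation (local)
        w = min(max(w, 0.0), 2.0)    # keep the weight bounded
        t += dt
    return post_spikes, w
```

Because each weight change depends only on the traces stored at the synapse itself, a rule of this form can be computed directly on the hardware substrate, which is what makes the acceleration argument plausible.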

About the panel speaker:

Karlheinz Meier is a professor of experimental physics at Heidelberg University in Germany. He received his PhD in 1984 from Hamburg University. For more than 30 years he worked in experimental particle physics, contributing to several experiments at the CERN and DESY laboratories. He designed and implemented a large-scale data selection system for an LHC experiment at CERN. Since 2005 he has shifted his interest towards custom hardware implementations of neural circuits. He has initiated and led two major European initiatives in the field (FACETS and BrainScaleS) and is currently co-director of the Human Brain Project.