Life-long learning using dynamical systems

Living systems continuously process and respond to non-stationary signals on multiple scales, from single cells to neuronal networks. To interpret these signals accurately and generate coordinated responses, the computations must occur in real time and exhibit distinct features, including specificity of responses as well as generalisation across signals, flexibility in responding to novel signals, and tolerance to noise, among others. Current paradigms describing natural computation focus mainly on Turing-like computations, which, from a dynamical-systems perspective, correspond to computations with stable steady states. However, switching between steady states is not generally compatible with efficient processing of non-stationary signals. I propose to investigate computation with non-asymptotic transients in networks characterised by “ghost” or attractor-ruin sets. By formulating a mathematical framework for studying attractor ruins, we will investigate the generality of ghost-network computations as a basis for natural computation and for life-long and on-the-fly learning.
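
The “ghost” invoked here is the slow region that remains after a saddle-node bifurcation annihilates a pair of fixed points. As a minimal illustrative sketch (using the one-dimensional normal form dx/dt = μ + x², chosen for exposition and not the network model of this project), trajectories passing the ghost are held in a long transient whose duration is set by the distance to the bifurcation, so information can be carried by the transient itself rather than by convergence to a stable state:

```python
# Minimal sketch of a saddle-node "ghost": dx/dt = mu + x**2.
# For small mu > 0 the fixed points are gone, but their ghost near x = 0
# traps trajectories in a slow transient whose duration scales like
# pi / sqrt(mu) -- a computation-relevant timescale without any stable state.
import math

def ghost_passage_time(mu, x0=-2.0, x_escape=2.0, dt=1e-3, t_max=1e5):
    """Euler-integrate dx/dt = mu + x**2 and return the time spent
    before the trajectory escapes past x_escape (capped at t_max)."""
    x, t = x0, 0.0
    while x < x_escape and t < t_max:
        x += dt * (mu + x * x)
        t += dt
    return t

for mu in (1e-1, 1e-2, 1e-3):
    t_pass = ghost_passage_time(mu)
    print(f"mu = {mu:.0e}: transient ≈ {t_pass:7.1f} "
          f"(scaling ~ pi/sqrt(mu) = {math.pi / math.sqrt(mu):7.1f})")
```

The quadratic normal form is the standard local picture of a saddle-node ghost; in networks the slow sets are higher-dimensional, but the same slow-down mechanism underlies computation with non-asymptotic transients.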
