Publications from MINDS

Journals

H. Jaeger, B. Noheda, W.G. van der Wiel (2023): Toward a formal theory for computing machines made out of whatever physics offers. Nature Communications 14, 4911 (open access article)

C. P. Lawrence (2023): Simple machine learning methods work surprisingly well for Ramanomics. Journal of Raman Spectroscopy 54(8), 887-889 (doi.org/10.1002/jrs.6555)

C. P. Lawrence (2022): Compact Modeling of Nanocluster Functionality as a Higher-Order Neuron. IEEE Transactions on Electron Devices 69(9), 5373-5376 (preprint)

M. Cucchi, S. Abreu, G. Ciccone, D. Brunner, H. Kleemann (2022): Hands-on reservoir computing: a tutorial for practical implementation. Neuromorphic Computing and Engineering (online early access) (open access article)

H. Jaeger (2021): Toward a Generalized Theory Comprising Digital, Neuromorphic, and Unconventional Computing. Neuromorphic Computing and Engineering 1(1) (open access article)

F. Hadaeghi, H. Jaeger (2019): Computing optimal discrete readout weights in reservoir computing is NP-hard. Neurocomputing 338 (April 2019), 233-236 (arXiv preprint)

H. Jaeger (2017): Using Conceptors to Manage Neural Long-Term Memories for Temporal Patterns. Journal of Machine Learning Research 18, 1-43 (pdf at JMLR) (Matlab code [10 MB])

H. Jaeger (2016): Deep Neural Reasoning. Nature 538 (27 Oct), 467-468 (News & Views Intro to A. Graves et al: Hybrid computing using a neural network with dynamic external memory, same issue) (pdf)

M. Thon, H. Jaeger (2015): Links Between Multiplicity Automata, Observable Operator Models and Predictive State Representations -- a Unified Learning Framework. Journal of Machine Learning Research 16, 103-147 (pdf)

M. Galtier (2015): Ideomotor feedback control in a recurrent neural network. Biological Cybernetics 109(3), 363-375. (arXiv preprint)

M. Galtier, C. Marini, G. Wainrib, H. Jaeger (2014): Relative entropy minimizing noisy non-linear neural network to approximate stochastic processes.  Neural Networks 56, 10-21. (Preprint pdf).

F. wyffels, J. Li, T. Waegeman, B. Schrauwen, H. Jaeger (2014): Frequency Modulation of Large Oscillatory Neural Networks. Biological Cybernetics 108, 145-157 (Preprint pdf)

G. Manjunath, H. Jaeger (2014): The Dynamics of Random Difference Equations is Remodeled by Closed Relations. SIAM Journal on Mathematical Analysis 46(1), 459-483 (pdf)

M. Galtier, J. Touboul (2013): Macroscopic Equations Governing Noisy Spiking Neuronal Populations with Linear Synapses. PLoS ONE (open article)

M. Galtier, G. Wainrib (2013): A biological gradient descent for prediction through a combination of STDP and homeostatic plasticity. Neural Computation 25(11), 2815-2832 (Preprint pdf)

G. Manjunath, H. Jaeger (2013): Echo State Property Linked to an Input: Exploring a Fundamental Characteristic of Recurrent Neural Networks. Neural Computation 25(3), 671-696 (Preprint pdf)

I. B. Yildiz, H. Jaeger, S. J. Kiebel (2012): Re-Visiting the Echo State Property. Neural Networks 35, 1-20 (Preprint pdf)

M. Lukoševičius, H. Jaeger, B. Schrauwen (2012): Reservoir Computing Trends. KI - Künstliche Intelligenz, 1-7 (Preprint pdf)

R. Pascanu, H. Jaeger (2011): A Neurodynamical Model for Working Memory. Neural Networks 24(2), 199-207 (Preprint pdf)

M. Zhao, H. Jaeger (2010): Norm Observable Operator Models. Neural Computation 22(7), 1927-1959 (Preprint pdf)

M. Zhao, H. Jaeger, M. Thon (2009): Making The Error Controlling Algorithm of Observable Operator Models Constructive. Neural Computation 21(12), 3460-3486 (Preprint pdf)

M. Lukoševičius, H. Jaeger (2009): Reservoir Computing Approaches to Recurrent Neural Network Training. Computer Science Review 3(3), 127-149.  (Preprint pdf)

A. Schönhuth, H. Jaeger (2009): Characterization of ergodic hidden Markov sources. IEEE Transactions on Information Theory, 55(5), 2107-2118. (Preprint pdf)

M. Zhao, H. Jaeger, M. Thon (2009): A Bound on Modeling Error in Observable Operator Models and an Associated Learning Algorithm. Neural Computation 21(9), 2687-2712. (Preprint pdf)

H. Jaeger, M. Lukoševičius, D. Popovici (2007): Optimization and Applications of Echo State Networks with Leaky Integrator Neurons. Neural Networks 20(3), 335-352 (Preprint pdf) (Matlab 6.5 code for Japanese Vowel study) (Matlab 6.5 code for parameter optimization by stochastic gradient descent)

H. Jaeger, H. Haas (2004): Harnessing Nonlinearity: Predicting Chaotic Systems and Saving Energy in Wireless Communication. Science 304 (April 2, 2004), 78-80. Preprint including supplementary online materials (pdf)

H. Jaeger (2000): Observable operator models for discrete stochastic time series. Neural Computation 12(6), 2000, 1371-1398. (pdf) Addendum (2012): a brief note on an improved "probability clock" design.

H. Jaeger, T. Christaller (1998): Dual Dynamics: Designing Behavior Systems for Autonomous Robots. Artificial Life and Robotics 2 (1998), 108-112 (pdf)

H. Jaeger (1998): Today's dynamical systems are too simple. Commentary to Tim van Gelder's BBS target article "The dynamical hypothesis in cognitive science". Behavioral and Brain Sciences 21(5), 1998, p. 643 (pdf)

H. Jaeger (1996): Dynamische Systeme in der Kognitionswissenschaft. (In German.) Kognitionswissenschaft 5(4), 1996, 151-174 (pdf)

 

Proceedings, Collections

S. Abreu, M. Gouda, A. Lugnan, P. Bienstman (2023): Flow Cytometry With Event-Based Vision and Spiking Neuromorphic Hardware. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, pp. 4138-4146 (pdf)

H. Jaeger (2021): Foreword to the book Reservoir Computing: Theory, Physical Implementations, and Applications (K. Nakajima and I. Fischer, eds.), Springer Nature Singapore (preprint pdf)

F. Hadaeghi (2021): Neuromorphic Electronic Systems for Reservoir Computing. In: K. Nakajima, I. Fischer (eds): Reservoir Computing: Theory, Physical Implementations, and Applications. Springer (arXiv preprint)

X. He, M. Lin (2020): Continual Learning from the Perspective of Compression. Lifelong Learning Workshop (ICML 2020) (pdf)

X. He, J. Sygnowski, A. Galashov, A. A. Rusu, Y. W. Teh, and R. Pascanu (2020): Task Agnostic Continual Learning via Meta Learning. Lifelong Learning Workshop (ICML 2020) (pdf)

T. Liu, L. Ungar, J. Sedoc (2019): Continual Learning for Sentence Representations Using Conceptors. 2019 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL-HLT 2019) (arXiv:1904.09187v1)

X. He, T. Liu, F. Hadaeghi, H. Jaeger (2019): Reservoir Transfer on Analog Neuromorphic Hardware. Int. IEEE EMBS Conf. on Neural Engineering (NER '19), 3rd best paper award (pdf)

X. He (2018): Continual Learning by Conceptor Regularization. Continual Learning Workshop at NeurIPS 2018 (pdf)

X. He and H. Jaeger (2018): Overcoming Catastrophic Interference using Conceptor-Aided Backpropagation. International Conference on Learning Representations (ICLR 2018) (pdf)

X. He and H. Jaeger (2017): Overcoming Catastrophic Forgetting by Conceptors. Workshop on Dynamical Systems and Brain-Inspired Information Processing, Konstanz, October 2017 (best poster award, abstract pdf)

D. Bahdanau, K. Cho, Y. Bengio (2015): Neural Machine Translation by Jointly Learning to Align and Translate. International Conference on Learning Representations (ICLR) (arXiv)

K. Cho, B. van Merriënboer, C. Gulcehre, D. Bahdanau, F. Bougares, H. Schwenk, Y. Bengio (2014): Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation. Proc. Empirical Methods in Natural Language Processing (EMNLP), 1724-1734 (pdf from EMNLP)

G. Manjunath, P. Tino, and H. Jaeger (2012): Theory of Input-Driven Dynamical Systems. In: Proc. ESANN 2012,  ES2012-6 (pdf)

M. Lukoševičius (2012): A Practical Guide to Applying Echo State Networks. In: G. Montavon, G. B. Orr, and K.-R. Müller (eds.) Neural Networks: Tricks of the Trade, 2nd ed. Springer LNCS 7700, pp 659-686 (preprint.pdf) (code samples)

M. Lukoševičius (2012): Self-organized reservoirs and their hierarchies. In: Proceedings of the 22nd International Conference on Artificial Neural Networks (ICANN 2012). Springer-Verlag LNCS 7552, pp 587-595 (preprint pdf)

H. Jaeger, A. Baronchelli, T. Briscoe, M. H. Christiansen, T. Griffiths, G. Jäger, S. Kirby, N. Komarova, P. J. Richerson, L. Steels, J. Triesch (2009): What can mathematical, computational and robotic models tell us about the origins of syntax? In: D. Bickerton and E. Szathmary (eds.): Biological Foundations and Origin of Syntax, p. 385-410. Strüngmann Forum Reports 3, MIT Press (preprint pdf)

H. Jaeger, D. Eck (2008): Can't Get You Out of My Head: A Connectionist Model of Cyclic Rehearsal. In: I. Wachsmuth, G. Knoblich (eds.): Modeling Communication for Robots and Virtual Humans. Springer LNAI 4930, 310-335 (http://dx.doi.org/10.1007/978-3-540-79037-2_17) (pdf)

H. Jaeger (2007): Echo State Networks. Scholarpedia, 2(9):2330

H. Jaeger, M. Zhao, K. Kretzschmar, T. Oberstein, D. Popovici, A. Kolling (2006): Learning observable operator models via the ES algorithm. In: S. Haykin, J. Principe, T. Sejnowski, J. McWhirter (eds.), New Directions in Statistical Signal Processing: from Systems to Brain. MIT Press, Cambridge, MA, 417-464 (draft version, pdf) (Matlab code)

H. Jaeger, M. Zhao, A. Kolling (2005): Efficient training of OOMs. Advances in Neural Information Processing Systems 18 (Y. Weiss, B. Schölkopf, J. Platt, eds.), MIT Press, Cambridge, MA., 555-562 (draft version, pdf)

H. Jaeger (2005): Reservoir Riddles: Suggestions for Echo State Network Research (Extended Abstract of Invited Talk). Proceedings of IJCNN 2005, 1460-1462 (paper, pdf) (slides, pdf)

H.-U. Kobialka, H. Jaeger (2003): Experiences Using the Dynamical System Paradigm for Programming RoboCup Robots. In Proc. AMiRE 2003 (2nd International Symposium on Autonomous Minirobots for Research and Edutainment) / U. Rueckert, J. Sitte (Eds), pp 193-202, 2003 (pdf)

H. Jaeger (2002): Adaptive nonlinear system identification with echo state networks. In Advances in Neural Information Processing Systems 15, S. Becker, S. Thrun, K. Obermayer (Eds), (MIT Press, Cambridge, MA, 2003) pp. 593-600 (draft, pdf)

J. Hertzberg, H. Jaeger, F. Schoenherr (2002): Learning to Ground Fact Symbols in Behavior-Based Robots. In: ECAI 2002. Proceedings of the 15th European Conference on Artificial Intelligence (F. van Harmelen, ed.), IOS Press, Amsterdam, pp. 708-712, ISBN 1-58603-257-7, Lyon, France, July 2002 (pdf)

H. Jaeger (1999): Action selection for delayed, stochastic reward. Proc. 4th Annual Conf. of the German Cognitive Science Society (KogWis99), Infix Verlag, 213-219 (pdf)

H. Jaeger (1999): From continuous dynamics to symbols. In: W. Tschacher, J.-P. Dauwalder: Dynamics, Synergetics, Autonomous Agents. Studies of Nonlinear Phenomena in Life Science Vol. 8, World Scientific 1999, 29-48 (pdf)

A. Bredenfeld, W. Goehring, H. Guenter, H. Jaeger, H.-U. Kobialka, P.-G. Ploeger, P. Schoell, A. Siegberg, A. Streit, C. Verbeek, J. Wilberg (1999): Behavior engineering with "dual dynamics" models and design tools. In: M.M. Veloso, (ed.), Proc. 3rd Int. Workshop on RoboCup at IJCAI-99, IJCAI Press, 57-62 (pdf)

H. Jaeger (1998): A short introduction to observable operator models for stochastic processes. In: Trappl, R. (ed.), Proceedings of the Cybernetics and Systems '98 conference, Vol. 1, Austrian Society for Cybernetic Studies, 38-43 (pdf)

J. Hertzberg, H. Jaeger, Ph. Morignot, U.R. Zimmer (1998): A Framework for Plan Execution in Behavior-Based Robots. Proc. Joint ISIC/ISAS'98 conference on "Intelligent Control of Complex Systems" (pdf)

H. Jaeger (1998): Multifunctionality: a fundamental property of behavior mechanisms based on dynamical systems. In: R. Pfeifer, B. Blumberg, J.-A. Meyer, and S.W. Wilson (eds.): From animals to animats 5: Proc. of the fifth int. conf. on simulation of adaptive behavior. MIT Press 1998, 286-290 (pdf)

A. Wismueller, H. Jaeger, D.R. Dersch, H. Ritter, G. Palm (1998): Self-organization of feature detectors in time sequences (SOFT) - a neural network approach to multidimensional signal analysis. Proceedings of IEEE World Congress on Computational Intelligence 1998 (WCCI 98) Vol. 1, IEEE Computer Society Press 1998, 575-580 (pdf)

H. Jaeger (1997): An Introduction to Dynamic Symbol Systems. In: Hallam, J. (ed.): Hybrid Problems, Hybrid Solutions. Proceedings of the AISB-95. IOS Press/Ohmsha, Amsterdam 1995, 109-120 (pdf)

 

Technical Reports

J. E. Pedersen, S. Abreu, M. Jobst, G. Lenz, V. Fra, F. C. Bauer, D. R. Muir, P. Zhou, B. Vogginger, K. Heckel, G. Urgese, S. Shankar, T. C. Stewart, J. K. Eshraghian, S. Sheik (2023): Neuromorphic Intermediate Representation: A Unified Instruction Set for Interoperable Brain-Inspired Computing. (arXiv:2311.14641)

H. Jaeger, B. Noheda, W.G. van der Wiel (2023): Toward a formal theory for computing machines made out of whatever physics offers: extended version. (arXiv:2307.15408)

M. Gouda, S. Abreu, A. Lugnan, P. Bienstman (2023): Training a spiking neural network on an event-based label-free flow cytometry dataset. (arXiv:2303.10632)

H. Jaeger, F. Catthoor (2023): Timescales: the choreography of classical and unconventional computing. (arXiv:2301.00893)

H. Jaeger, D. Doorakkers, C. Lawrence, G. Indiveri (2021): Dimensions of “Timescales” in Neuromorphic Computing Systems. (arXiv:2102.10648)

H. Jaeger (2020): Exploring the landscapes of “computing”: digital, neuromorphic, unconventional - and beyond. (arXiv:2011.12013)

F. Hadaeghi, H. Jaeger (2018): Computing optimal discrete readout weights in reservoir computing is NP-hard. (arxiv.org/abs/1809.01021)

T. Liu (2018): A Consistent Method for Learning OOMs from Asymptotically Stationary Time Series Data Containing Missing Values. (arXiv)

F. Hadaeghi, X. He, H. Jaeger (2017): Unconventional Information Processing Systems, Novel Hardware: A Tour d’Horizon. Jacobs University Technical Report Nr 36 (pdf)

X. He, H. Jaeger (2017): Overcoming Catastrophic Interference by Conceptors. Jacobs University Technical Report Nr 35 (arXiv)

H. Jaeger (2014): Conceptors: an easy introduction. (arXiv)

K. Cho, B. van Merriënboer, D. Bahdanau, Y. Bengio (2014): On the Properties of Neural Machine Translation: Encoder-Decoder Approaches. (arXiv)

D. Bahdanau, H. Jaeger (2014): Smart Decisions by Small Adjustments: Iterating Denoising Autoencoders. Jacobs University Technical Report Nr 31 (pdf)

H. Jaeger (2014): Controlling Recurrent Neural Networks by Conceptors. Jacobs University technical report Nr 31 (arXiv)  (Matlab code)

J. Ivanchev (2013): Fast time scale modulation of pattern generators realized by Echo State Networks. Jacobs University technical report Nr. 28 (pdf)

H. Jaeger (2012): Long Short-Term Memory in Echo State Networks: Details of a Simulation Study. Jacobs University technical report Nr. 27 (pdf) (Matlab code)

J. Li, H. Jaeger (2011): Minimal energy control of an ESN pattern generator. Jacobs University technical report Nr. 26 (pdf)

M. Lukoševičius (2010): On self-organizing reservoirs and their hierarchies. Jacobs University technical report Nr. 25 (pdf)

V. Šakėnas (2010): Distortion Invariant Feature Extraction with Echo State Networks. Jacobs University technical report Nr. 24 (pdf)

H. Jaeger (2010): Reservoir Self-Control for Achieving Invariance Against Slow Input Distortions. Jacobs University technical report Nr. 23 (pdf)

M. Thon (2008): Input-Output OOMs. Jacobs University technical report Nr. 16 (pdf)

I. Ilies, H. Jaeger, O. Kosuchinas, M. Rincon, V. Šakėnas, N. Vaškevičius (2007): Stepping forward through echoes of the past: forecasting with Echo State Networks. Report on our winning entry in the 2007 financial time series competition. Available online at the competition website and here (pdf).

M. Lukoševičius, H. Jaeger (2007): Overview of Reservoir Recipes. Jacobs University technical report Nr. 11 (pdf)

H. Jaeger (2007): Discovering multiscale dynamical features with hierarchical Echo State Networks. Jacobs University technical report Nr. 10 (pdf)

M. Zhao, H. Jaeger (2007): Norm observable operator models. Jacobs University technical report Nr. 8 (39 pp.) (pdf)

M. Zhao, H. Jaeger (2007): The Error Controlling Algorithm for Learning OOMs. Jacobs University technical report Nr. 6 (42 pp.) (pdf)

M. Lukoševičius (2007): Echo State Networks with Trained Feedbacks. IUB technical report Nr. 4 (38 pp.)  (pdf)

H. Jaeger (2006): Generating exponentially many periodic attractors with linearly growing Echo State Networks. IUB technical report Nr. 3 (26 pp.) (pdf)  (Matlab code)

M. Lukoševičius, D. Popovici, H. Jaeger, U. Siewert (2006): Time Warping Invariant Echo State Networks. IUB technical report Nr. 2 (15 pp.) (pdf)

H. Jaeger (2002): Tutorial on training recurrent neural networks, covering BPTT, RTRL, EKF and the "echo state network" approach. GMD Report 159, German National Research Center for Information Technology, 2002 (48 pp.) (revised version from July 2013, pdf)

H. Jaeger (2001): Short term memory in echo state networks. GMD Report 152, German National Research Center for Information Technology, 2001 (60 pp.) (pdf)

H. Jaeger (2001): The "echo state" approach to analysing and training recurrent neural networks. GMD Report 148, German National Research Center for Information Technology, 2001 (43 pp.) (pdf) Erratum note

H. Jaeger (2001): Modeling and learning continuous-valued stochastic processes with OOMs. GMD Report 102, German National Research Center for Information Technology, 2001 (30 pp.) (pdf)

H. Jaeger (1999): Characterizing distributions of stochastic processes by linear operators. GMD Report 62, German National Research Center for Information Technology 1999 (27 pp.) (pdf)

H. Jaeger (1998): Observable operator models of stochastic processes: a tutorial. GMD Report 42, German National Research Center for Information Technology 1998. (i) Original version from December 23, 1998 (72 pp., pdf) (ii) Update from April 2012 (84 pp., pdf)

H. Jaeger (1997): Observable Operator Models II: Interpretable models and model induction. Arbeitspapiere der GMD 1083, GMD, St. Augustin 1997 (33 pp) (pdf) (Erratum)

H. Jaeger (1997): Observable Operator Models and Conditioned Continuation Representations. Arbeitspapiere der GMD 1043, GMD, St. Augustin 1997 (38 pp) (pdf)

H. Jaeger (1996): The Dual Dynamics Design Scheme for Behavior-based Robots: a Tutorial. Arbeitspapiere der GMD 966, GMD, St. Augustin 1996 (23 pp.) (pdf)

H. Jaeger (1995): Dynamische Systeme in der KI und ihren Nachbarwissenschaften. (In German.) Arbeitspapiere der GMD 925, GMD, St. Augustin 1995 (75 pp.) (pdf)

H. Jaeger (1995): Modulated Modules: Designing Behaviors as Dynamical Systems. Arbeitspapiere der GMD 927, GMD, St. Augustin 1995 (23 pp.) (pdf)

H. Jaeger (1995): Identification of Behaviors in an Agent's Phase Space. Arbeitspapiere der GMD 951, GMD, St. Augustin 1995 (37 pp.) (pdf)

 

Theses

X. He (2023): Continual lifelong learning in neural systems: overcoming catastrophic forgetting and transferring knowledge for future learning. PhD thesis, Dpt. of AI, University of Groningen. (pdf)

J. De Jong (2021): Controlling Recurrent Neural Networks by Diagonal Conceptors. Master thesis, Dpt. of AI, University of Groningen. (pdf)

T. Liu (2019): Harnessing Slow Dynamics in Neuromorphic Computation. Master thesis, Dpt. of EE and CS, Jacobs University. (arXiv copy)

M. Thon (2017): Spectral Learning of Sequential Systems. PhD thesis, Dpt. of EE and CS, Jacobs University. 167 pp. (pdf)

M. Lukoševičius (2012): Reservoir Computing and Self-Organized Neural Hierarchies. PhD thesis, School of Engineering and Science, Jacobs University. 146 pp. (pdf)

H. Jaeger (1994): Dynamical Symbol Systems. Ph.D. thesis, Faculty of Technology, University of Bielefeld. 170 pp. (pdf)

 

Patents

H. Jaeger (inventor, 2001): A Method for Supervised Teaching of a Recurrent Neural Network. WO 2002031764 A2. European and international patents held by Fraunhofer Gesellschaft.

 

Popular

H. Jaeger (2001): Ihre Zahnbürste, der Urknall und Sie. (In German.) In: von Randow, G. (Hrsg.), Wieviel Körper braucht der Mensch? Edition Koerber-Stiftung, Hamburg 2001, 22-40. (pdf)

H. Jaeger (1989): Komplexe Systeme - eine Schule der Bescheidenheit. (In German.) Kursbuch 98 "Das Chaos", Kursbuch/Rotbuch Verlag 1989, 149-164