I am a PhD student in the Intelligent Systems group at the Bernoulli Institute for Mathematics, Computer Science and Artificial Intelligence. In this project, we use techniques from statistical physics to analyze the learning behavior of machine learning models, gaining insight into practically relevant phenomena that can be exploited to improve machine learning algorithms.
We propose and evaluate an information-preserving complex-valued embedding for general non-positive-semidefinite (non-psd) proximity data.
A statistical-physics-based modelling framework is developed in which we study standard training algorithms (stochastic gradient descent, LVQ1) under concept drift and, among other things, compare ReLU and sigmoidal activation functions.
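For the LVQ1 algorithm mentioned above, the standard update rule moves the closest prototype toward a training sample when their class labels agree and away from it otherwise. The following is a minimal sketch of one such update step in numpy; the function name and interface are illustrative, not taken from the project's code.

```python
import numpy as np

def lvq1_step(prototypes, proto_labels, x, y, lr=0.01):
    """One LVQ1 update: move the winning (closest) prototype
    toward the sample x if the labels match, away otherwise."""
    d = np.linalg.norm(prototypes - x, axis=1)   # Euclidean distances
    w = np.argmin(d)                             # index of the winner
    sign = 1.0 if proto_labels[w] == y else -1.0
    prototypes[w] += sign * lr * (x - prototypes[w])
    return prototypes
```

Under concept drift, the class-conditional distribution of `x` changes over time, so the prototypes must continually track the moving cluster centers; the learning rate `lr` controls how quickly they can follow.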
Our systematic comparison of networks with ReLU and sigmoidal units in model situations reveals surprising differences in their training and generalization behavior.
In this contribution, we consider the classification of time series and similar functional data which can be represented in complex Fourier and wavelet coefficient space. We apply versions of learning vector quantization (LVQ) that are suitable for complex-valued data, based on the so-called Wirtinger calculus, which allows for the formulation of gradient-based update rules in the framework of cost-function-based generalized matrix relevance LVQ (GMLVQ). Alternatively, we consider the concatenation of the real and imaginary parts of the Fourier coefficients into a real-valued feature vector, as well as the classification of time-domain representations by means of conventional GMLVQ. In addition, we apply the method in combination with wavelet-space features to heartbeat classification.
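The alternative feature construction described above, concatenating real and imaginary parts of leading Fourier coefficients into a real-valued vector, can be sketched as follows; this is a minimal illustration assuming numpy, with an arbitrary choice of how many coefficients to keep.

```python
import numpy as np

def fourier_features(signal, n_coeff=16):
    """Build a real-valued feature vector from a 1-D time series by
    concatenating the real and imaginary parts of its leading
    Fourier coefficients (suitable for conventional GMLVQ)."""
    c = np.fft.rfft(signal)[:n_coeff]          # leading complex coefficients
    return np.concatenate([c.real, c.imag])    # length 2 * n_coeff
```

Truncating to the leading coefficients acts as a smoothing, dimension-reducing step; the resulting real vectors can be fed to any real-valued classifier, whereas the complex coefficients themselves would require Wirtinger-calculus-based updates.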