I am a PhD student in the Intelligent Systems group at the Bernoulli Institute for Mathematics, Computer Science and Artificial Intelligence. In my project, we use techniques from statistical physics to analyze the learning behavior of machine learning models, with the aim of gaining insight into practically relevant phenomena that can be used to improve machine learning algorithms. In addition, my work focuses on making industrial processes more intelligent.
We present a typical Industry 4.0 case study: the real-time quality estimation of steel in a high-throughput production line using machine learning methods.
We develop a modelling framework based on statistical physics in which we study standard training algorithms (SGD, LVQ1) under concept drift and, in addition, compare ReLU and sigmoidal activation functions.
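The effect of concept drift on online learning can be illustrated with a minimal teacher-student sketch: a student perceptron is trained by online SGD while the teacher rule it tries to learn slowly rotates. All names and parameter values here are illustrative assumptions, not the setup used in the actual study.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100          # input dimension
eta = 0.5        # learning rate
drift = 0.01     # per-step drift strength of the teacher vector

# teacher (drifting target rule) and student weight vectors
B = rng.standard_normal(N)
B /= np.linalg.norm(B)
w = np.zeros(N)

overlaps = []
for t in range(5000):
    # random drift: perturb the teacher slightly, keep ||B|| = 1
    B = (1 - drift) * B + np.sqrt(2 * drift / N) * rng.standard_normal(N)
    B /= np.linalg.norm(B)

    # one online example, labeled by the current teacher
    x = rng.standard_normal(N)
    y = np.sign(B @ x)

    # perceptron-style SGD: update only on misclassification
    if np.sign(w @ x) != y:
        w += eta * y * x / np.sqrt(N)

    overlaps.append((w @ B) / (np.linalg.norm(w) + 1e-12))

# under persistent drift the student tracks the teacher
# but never aligns with it perfectly
print(f"final overlap with teacher: {overlaps[-1]:.2f}")
```

The overlap between student and teacher is the natural order parameter in statistical-physics analyses of such systems: it saturates below 1 as long as the drift persists.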
Our systematic comparison of networks with ReLU and sigmoidal units in model situations reveals surprising differences in their training and generalization behavior.
We propose and evaluate an information-preserving complex-valued embedding for general non-positive-semidefinite (non-psd) proximity data.
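One way such an embedding can preserve all information is to assign imaginary coordinates to the directions with negative eigenvalues, so that the original indefinite similarities are recovered exactly by a non-conjugated bilinear form. The following NumPy sketch illustrates this idea on a random indefinite matrix; it is a minimal illustration, not the evaluated method itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# an indefinite (non-psd) symmetric similarity matrix
A = rng.standard_normal((6, 6))
S = (A + A.T) / 2
lam, V = np.linalg.eigh(S)          # S = V diag(lam) V^T, some lam < 0

# complex embedding: negative eigenvalues yield imaginary coordinates
X = V * np.sqrt(lam.astype(complex))   # row i embeds item i

# the plain (non-conjugated) bilinear form X X^T recovers S exactly,
# so no information about the proximities is lost
S_rec = (X @ X.T).real
print(np.allclose(S_rec, S))
```

Note that the conjugated inner product X X^H would instead recover |lam|, i.e. the "flipped" psd surrogate; the non-conjugated form is what keeps the embedding information-preserving.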
In this contribution, we consider the classification of time series and similar functional data that can be represented in complex Fourier or wavelet coefficient space. We apply versions of learning vector quantization (LVQ) suitable for complex-valued data, based on the so-called Wirtinger calculus, which allows for the formulation of gradient-based update rules in the framework of cost-function-based generalized matrix relevance LVQ (GMLVQ). Alternatively, we consider the concatenation of the real and imaginary parts of the Fourier coefficients into a real-valued feature vector, as well as the classification of time-domain representations by means of conventional GMLVQ. In addition, we apply the method in combination with wavelet-space features to heartbeat classification.
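The concatenation variant can be sketched in a few lines: time series are mapped to the real and imaginary parts of their leading Fourier coefficients, and a basic LVQ1 scheme (a simplified stand-in for GMLVQ, without relevance matrices) is trained on the resulting real-valued vectors. Class definitions, frequencies, and parameters below are toy assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)

def fourier_features(x, k=8):
    """Concatenate real and imaginary parts of the first k FFT coefficients."""
    c = np.fft.rfft(x)[:k]
    return np.concatenate([c.real, c.imag])

def sample(cls, n=64):
    """Toy time series: sinusoids of class-dependent frequency plus noise."""
    t = np.arange(n)
    freq = 3 if cls == 0 else 5
    return np.sin(2 * np.pi * freq * t / n) + 0.1 * rng.standard_normal(n)

X = np.array([fourier_features(sample(c)) for c in (0, 1) for _ in range(50)])
y = np.repeat([0, 1], 50)

# LVQ1 with one prototype per class, initialized at the class means:
# attract the winning prototype on correct labels, repel it on wrong ones
protos = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
eta = 0.05
for epoch in range(20):
    for i in rng.permutation(len(X)):
        d = np.linalg.norm(protos - X[i], axis=1)
        j = d.argmin()
        sign = 1.0 if j == y[i] else -1.0
        protos[j] += sign * eta * (X[i] - protos[j])

pred = np.linalg.norm(X[:, None] - protos[None], axis=2).argmin(axis=1)
print(f"training accuracy: {(pred == y).mean():.2f}")
```

Full GMLVQ additionally learns a relevance matrix inside the distance measure; for complex-valued features, the Wirtinger calculus supplies the corresponding gradient-based updates directly in coefficient space.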
In order to compute weight updates, backpropagation uses complete knowledge of the downstream weights in the network. In  it is …