Research

My work currently focuses on neural networks and, in particular, their structure. This sits at a complex intersection of Functional Analysis, Graph Theory, Learning Theory, Topology, Optimization and Machine Learning. As a computer scientist, I mostly approach it from a computational and evidence-based perspective. More generally, I am interested in Machine Learning, Philosophy, Artificial General Intelligence and Psychology.

Projects

  • deepstruct: blending graph theory and neural networks

  • deepgg: A generative model for graphs (an active and challenging topic with applications in many domains)

  • eddy: Artificial Landscapes for Optimization. A nice little visualization and experimentation project I am currently working on. Basically, I am collecting test functions for optimization, providing simple visualization functionality for them, and testing various optimization strategies over them (a minimal sketch of such a test function follows this list).

  • pyklopp: repeatable model training. The field of machine learning has a huge reproducibility problem, and I am confronted with it constantly when designing new Neural Architecture Search experiments, so I developed a simple binary which I use for invoking model trainings. It is far from finalized, and much of the current development effort goes into making experiments repeatable (a generic seeding sketch is shown below the list).
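
To give an idea of what an artificial landscape looks like in code, here is a minimal sketch of one classic test function, the Himmelblau function, together with a simple contour visualization. It uses plain NumPy and Matplotlib and is purely illustrative; it does not reflect eddy's actual API.

    # Minimal sketch of an artificial landscape: the Himmelblau test function
    # plus a simple contour visualization (illustrative only, not eddy's API).
    import numpy as np
    import matplotlib.pyplot as plt


    def himmelblau(x, y):
        # Classic 2D test function with four identical local minima.
        return (x ** 2 + y - 11) ** 2 + (x + y ** 2 - 7) ** 2


    # Evaluate the landscape on a grid and draw filled contours.
    xs = np.linspace(-6, 6, 400)
    ys = np.linspace(-6, 6, 400)
    X, Y = np.meshgrid(xs, ys)
    Z = himmelblau(X, Y)

    plt.contourf(X, Y, Z, levels=50, cmap="viridis")
    plt.colorbar(label="f(x, y)")
    plt.title("Himmelblau function")
    plt.show()

Optimization strategies can then be compared by how quickly they locate any of the four minima of such a landscape.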
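
In the same spirit, repeatable trainings largely come down to pinning every source of randomness before a run starts. The following snippet is a generic sketch of such seeding, assuming a PyTorch-based training loop; it is not pyklopp's actual interface, just the kind of setup such a tool wraps.

    # Generic sketch of seeding a training run for repeatability.
    # Not pyklopp's interface; just the usual sources of randomness in PyTorch.
    import random

    import numpy as np
    import torch


    def seed_everything(seed: int) -> None:
        # Pin all common sources of randomness to one fixed seed.
        random.seed(seed)
        np.random.seed(seed)
        torch.manual_seed(seed)
        torch.cuda.manual_seed_all(seed)
        # Trade some speed for deterministic cuDNN kernels.
        torch.backends.cudnn.deterministic = True
        torch.backends.cudnn.benchmark = False


    seed_everything(42)  # the seed value itself is arbitrary here

The seed used for a run is exactly the kind of metadata that has to be stored alongside the resulting model to make an experiment repeatable.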

Theses

  • Investigating Sparsity in Recurrent Neural Networks, Harshil Darji, 2021
  • A Scalable Distributed Training Ecosystem, Marouene Zouauoui, 2020
  • Evolutionary Neural Architecture Search with graph-based Performance Estimation, Jerome Würf, 2020
  • A comparative evaluation of pruning techniques for Artificial Neural Networks, Paul Häusner, 2019
  • Evaluation of Sparse Neural Networks robustness to Adversarial examples, Mehdi Ben Amor, 2019
  • Evofficient: Reproducing and Enhancing a Cartesian Genetic Programming Method, Lorenz Wendlinger, 2019
  • Analysis of Neural Networks from a Network Science Perspective, Hann Holze, 2019