Publications

  • deepstruct – linking deep learning and graph theory is a short technical paper summarizing implementations of the idea of a round-trip transformation between neural networks and their graph structure. Such a back-and-forth transform can be used for pruning, structure analysis, or neural architecture search, e.g. with genetic algorithms that encode networks as graphs; a minimal sketch of the round trip follows below.
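
The following is a minimal sketch of the round-trip idea, not the deepstruct API: helper names such as network_to_graph and graph_to_masks are hypothetical, and the mapping (non-zero weights become edges) is one plausible convention.

# Hypothetical sketch of the round trip between a sparse feed-forward
# network and a graph (names and mapping are illustrative, not deepstruct's API).
import networkx as nx
import torch
import torch.nn as nn

def network_to_graph(layers):
    """Map non-zero weights of consecutive linear layers to edges of a DAG."""
    g = nx.DiGraph()
    offset = 0
    for layer in layers:
        w = layer.weight.detach()              # shape: (out_features, in_features)
        n_in, n_out = w.shape[1], w.shape[0]
        for j in range(n_out):
            for i in range(n_in):
                if w[j, i] != 0:
                    g.add_edge(offset + i, offset + n_in + j)
        offset += n_in
    return g

def graph_to_masks(g, layer_sizes):
    """Inverse direction: rebuild binary weight masks from the graph."""
    masks, offset = [], 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        m = torch.zeros(n_out, n_in)
        for i in range(n_in):
            for j in range(n_out):
                if g.has_edge(offset + i, offset + n_in + j):
                    m[j, i] = 1.0
        masks.append(m)
        offset += n_in
    return masks

layers = [nn.Linear(4, 3), nn.Linear(3, 2)]
with torch.no_grad():
    layers[0].weight[0, 0] = 0.0               # introduce sparsity to make it visible
masks = graph_to_masks(network_to_graph(layers), [4, 3, 2])
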
@article{stier2022deepstruct,
  title={deepstruct -- linking deep learning and graph theory},
  author={Stier, Julian and Granitzer, Michael},
  journal={Software Impacts},
  volume={11},
  pages={100193},
  year={2022},
  publisher={Elsevier}
}
@inproceedings{stier2021experiments,
  title={Experiments on properties of hidden structures of sparse neural networks},
  author={Stier, Julian and Darji, Harshil and Granitzer, Michael},
  booktitle={International Conference on Machine Learning, Optimization, and Data Science},
  pages={380--394},
  year={2021},
  organization={Springer}
}
@article{havas2021spatio,
  title={Spatio-temporal machine learning analysis of social media data and refugee movement statistics},
  author={Havas, Clemens and Wendlinger, Lorenz and Stier, Julian and Julka, Sahib and Krieger, Veronika and Ferner, Cornelia and Petutschnig, Andreas and Granitzer, Michael and Wegenkittl, Stefan and Resch, Bernd},
  journal={ISPRS International Journal of Geo-Information},
  volume={10},
  number={8},
  pages={498},
  year={2021},
  publisher={MDPI}
}
@inproceedings{wendlinger2021evofficient,
  title={Evofficient: Reproducing a Cartesian Genetic Programming Method},
  author={Wendlinger, Lorenz and Stier, Julian and Granitzer, Michael},
  booktitle={EuroGP},
  pages={162--178},
  year={2021}
}
@article{amor2021correlation,
  title={Correlation analysis between the robustness of sparse neural networks and their random hidden structural priors},
  author={Amor, Mehdi Ben and Stier, Julian and Granitzer, Michael},
  journal={Procedia Computer Science},
  volume={192},
  pages={4073--4082},
  year={2021},
  publisher={Elsevier}
}
  • DeepGG: a Deep Graph Generator contains the results of our studies on building a generative model of graphs. It generalizes and analyses a model called DGMG (Deep Generative Models of Graphs). Continuing the line of our previous publications, we try to improve Neural Architecture Search with search techniques in continuous domains, leveraging graph embeddings for this purpose; the idea gained popularity within the NAS community around 2019/2020. A sketch of the sequential generation scheme follows below.
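
A minimal sketch of the DGMG-style decision loop that DeepGG builds on: in the actual models the three decisions below are made by learned modules conditioned on an embedding of the partial graph; here they are fixed coin flips, just to show the state machine.

# DGMG-style sequential graph generation with fixed, unlearned propensities.
import random
import networkx as nx

def generate_graph(p_add_node=0.8, p_add_edge=0.5, max_nodes=20, seed=0):
    rng = random.Random(seed)
    g = nx.Graph()
    g.add_node(0)
    while g.number_of_nodes() < max_nodes and rng.random() < p_add_node:
        v = g.number_of_nodes()
        g.add_node(v)                        # decision 1: add another node?
        while rng.random() < p_add_edge:     # decision 2: add an edge to it?
            u = rng.randrange(v)             # decision 3: connect to which node?
            g.add_edge(u, v)
    return g

g = generate_graph()
print(g.number_of_nodes(), g.number_of_edges())
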
@inproceedings{stier2020deep,
  title={DeepGG: a Deep Graph Generator},
  author={Stier, Julian and Granitzer, Michael},
  booktitle={Advances in Intelligent Data Analysis XIX: 19th International Symposium on Intelligent Data Analysis, IDA 2021, Porto, Portugal, April 26--28, 2021, Proceedings},
  pages={325},
  year={2021},
  organization={Springer Nature}
}
  • Structural Analysis of Sparse Neural Networks would be better titled Analysis of Neural Networks of Random Structure; it contains the results of our first investigations into the influence of structure alone. The motivation is biologically inspired and fuses results from network science with Neural Architecture Search. Since 2018 we have been working on formalizing this as a structural prior hypothesis. One plausible construction of such a random structural prior is sketched below.
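
A minimal sketch of how a random graph can impose a structural prior on a layer; the Watts-Strogatz generator and the block mapping are assumptions for illustration, not the paper's exact construction.

# Sketch: derive a sparse connectivity mask from a random graph
# (assumed mapping; the paper's exact construction differs).
import networkx as nx
import torch

def random_structure_mask(n_in, n_out, k=4, p=0.3, seed=0):
    """Build a binary (n_out, n_in) mask from a Watts-Strogatz graph."""
    g = nx.watts_strogatz_graph(n_in + n_out, k, p, seed=seed)
    mask = torch.zeros(n_out, n_in)
    for u, v in g.edges():
        # keep only edges crossing from the input block to the output block
        if u < n_in <= v:
            mask[v - n_in, u] = 1.0
        elif v < n_in <= u:
            mask[u - n_in, v] = 1.0
    return mask

mask = random_structure_mask(8, 4)
layer = torch.nn.Linear(8, 4)
with torch.no_grad():
    layer.weight *= mask                     # impose the random structural prior
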
@article{stier2019structural,
  title={Structural Analysis of Sparse Neural Networks},
  author={Stier, Julian and Granitzer, Michael},
  journal={Procedia Computer Science},
  volume={159},
  pages={107--116},
  year={2019},
  publisher={Elsevier}
}
  • Analysing neural network topologies: a game theoretic approach We were among the first to use the Shapley value (a game-theoretic solution concept) to prune neurons of neural network models, and we investigated its theoretical limitations. Applying the Shapley value in feature space later became quite popular among researchers from Google as a way to understand neural networks. A Monte Carlo sketch of the pruning idea follows below.
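
A minimal Monte Carlo sketch of Shapley-value pruning: evaluate is a hypothetical stand-in for validation accuracy with only the neurons in the coalition left unmasked, and permutation sampling is the standard estimator, not necessarily the paper's exact procedure.

# Monte Carlo estimate of per-neuron Shapley values (permutation sampling).
# `evaluate(coalition)` is a hypothetical stand-in for validation accuracy
# of the network with only the neurons in `coalition` active.
import random

def shapley_values(neurons, evaluate, n_samples=200, seed=0):
    rng = random.Random(seed)
    phi = {n: 0.0 for n in neurons}
    for _ in range(n_samples):
        order = list(neurons)
        rng.shuffle(order)
        coalition, prev = set(), evaluate(set())
        for n in order:
            coalition.add(n)
            value = evaluate(coalition)
            phi[n] += (value - prev) / n_samples   # average marginal contribution
            prev = value
    return phi

# Toy payoff: neuron 0 is essential, the others are largely redundant.
payoff = lambda c: 1.0 if 0 in c else 0.1 * len(c)
phi = shapley_values(range(4), payoff)
to_prune = sorted(phi, key=phi.get)[:2]            # prune lowest-valued neurons
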
@article{stier2018analysing,
  title={Analysing neural network topologies: a game theoretic approach},
  author={Stier, Julian and Gianini, Gabriele and Granitzer, Michael and Ziegler, Konstantin},
  journal={Procedia Computer Science},
  volume={126},
  pages={234--243},
  year={2018},
  publisher={Elsevier}
}