
Publications

Upcoming journal articles

  • [j7] Lukas Galke, Yoav Ram, Limor Raviv (accepted). What makes a language easy to deep-learn? preprint code
  • [j8] Lukas Galke, Andor Diera, Bao Xin Lin, Bhakti Khera, Tim Meuser, Tushar Singal, Fabian Karl, Ansgar Scherp (in revision). Are We Really Making Much Progress? Bag-of-Words vs. Sequence vs. Graph vs. Hierarchy for Single-label and Multi-label Text Classification. preprint.

Journal articles

  • [j6] Lukas Galke, Limor Raviv (2024). Learning and communication pressures in neural networks: Lessons from emergent communication. Language Development Research 5(1). paper preprint
  • [j5] Eva Seidlmayer, Tetyana Melnychuk, Lukas Galke, Lisa Kühnel, Klaus Tochtermann, Carsten Schultz, Konrad Förstner (2024). Research Topic Displacement and the Lack of Interdisciplinarity: Lessons from the Scientific Response to COVID-19. Scientometrics. paper
  • [j4] Tetyana Melnychuk, Lukas Galke, Eva Seidlmayer, Stefanie Bröring, Konrad U. Förstner, Klaus Tochtermann, Carsten Schultz (2023). Development of Similarity Measures from Graph-Structured Bibliographic Metadata: An Application to Identify Scientific Convergence. IEEE Transactions on Engineering Management. paper
  • [j3] Lukas Galke, Iacopo Vagliano, Benedikt Franke, Tobias Zielke, Marcel Hoffmann, Ansgar Scherp (2023). Lifelong Learning on Evolving Graphs Under the Constraints of Imbalanced Classes and New Classes. Neural Networks 164, 156-176. paper code
  • [j2] Iacopo Vagliano, Lukas Galke, Ansgar Scherp (2022). Recommendations for item set completion: on the semantics of item co-occurrence with data sparsity, input size, and input modalities. Inf Retrieval J 25, 269–305. paper code
  • [j1] Tetyana Melnychuk, Lukas Galke, Eva Seidlmayer, Konrad Ulrich Förstner, Klaus Tochtermann, Carsten Schultz (2021). Früherkennung wissenschaftlicher Konvergenz im Hochschulmanagement [translated: Early detection of scientific convergence in university management]. Hochschulmanagement 16(1). complete issue

Conference papers

  • [c13] Yousef Younes, Lukas Galke, and Ansgar Scherp (2024). RADAr: A Transformer-based Autoregressive Decoder Architecture for Hierarchical Text Classification. In 27th European Conference on Artificial Intelligence 2024 (ECAI 2024). paper code
  • [c12] Marcel Hoffmann, Lukas Galke, Ansgar Scherp (2024). POWN: Prototypical Open-world Node Classification. In Conference on Lifelong Learning Agents. paper code
  • [c11] Marcel Hoffmann, Lukas Galke, Ansgar Scherp (2023). Open-World Lifelong Graph Learning. In International Joint Conference on Neural Networks (IJCNN). IEEE. paper code
  • [c10] Lukas Galke, Isabelle Cuber, Christoph Meyer, Henrik Ferdinand Nölscher, Angelina Sonderecker, Ansgar Scherp (2022). General Cross-Architecture Distillation of Pretrained Language Models into Matrix Embeddings. In Proceedings of the 2022 International Joint Conference on Neural Networks (IJCNN). IEEE. paper
  • [c9] Lukas Galke, Ansgar Scherp (2022). Bag-of-Words vs. Graph vs. Sequence in Text Classification: Questioning the Necessity of Text-Graphs and the Surprising Strength of a Wide MLP. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 4038–4051, Dublin, Ireland. Association for Computational Linguistics. paper code
  • [c8] Lukas Galke, Benedikt Franke, Tobias Zielke, Ansgar Scherp (2021). Lifelong Learning of Graph Neural Networks for Open-World Node Classification. In Proceedings of the 2021 International Joint Conference on Neural Networks (IJCNN). IEEE. paper code
  • [c7] Lukas Galke, Tetyana Melnychuk, Eva Seidlmayer, Steffen Trog, Konrad U. Förstner, Carsten Schultz, Klaus Tochtermann (2019). Inductive Learning of Concept Representations from Library-Scale Bibliographic Corpora. GI. paper
  • [c6] Florian Mai, Lukas Galke, Ansgar Scherp (2019). CBOW Is Not All You Need: Combining CBOW with the Compositional Matrix Space Model. In Proceedings of the Seventh International Conference on Learning Representations (ICLR). OpenReview.net. paper code
  • [c5] Ahmed Saleh, Tilman Beck, Lukas Galke, Ansgar Scherp (2018). Performance of Ad-Hoc Retrieval Models over Full-Text vs. Titles of Documents. In Proceedings of the International Conference on Asian Digital Libraries (ICADL). paper
  • [c4] Lukas Galke, Florian Mai, Iacopo Vagliano, Ansgar Scherp (2018). Multi-Modal Adversarial Autoencoders for Recommendation of Citations and Subject Labels. In Proceedings of the 26th Conference on User Modeling, Adaptation and Personalization (UMAP). ACM. paper
  • [c3] Anne Lauscher, Kai Eckert, Lukas Galke, Ansgar Scherp, Syed Tassen Raza Rizvi, Sheraz Ahmed, Andreas Dengel, Philipp Zumstein, Annette Klein (2018). Linked Open Citation Database: Enabling Libraries to Contribute to an Open and Interconnected Citation Graph. In Proceedings of the 18th ACM/IEEE Joint Conference on Digital Libraries (JCDL). ACM. paper
  • [c2] Florian Mai, Lukas Galke, Ansgar Scherp (2018). Using Deep Learning for Title-Based Semantic Subject Indexing to Reach Competitive Performance to Full-Text. In Proceedings of the 18th ACM/IEEE Joint Conference on Digital Libraries (JCDL). ACM. paper
  • [c1] Lukas Galke, Florian Mai, Alan Schelten, Dennis Brunsch, Ansgar Scherp (2017). Using Titles vs. Full-text as Source for Automated Semantic Document Annotation. In Knowledge Capture Conference (K-CAP). ACM. paper

Workshop papers

  • [w10] Anh Dang, Limor Raviv, Lukas Galke (2024). Morphology Matters: Probing the Cross-linguistic Morphological Generalization Abilities of Large Language Models through a Wug Test. In Cognitive Modeling and Computational Linguistics Workshop at ACL. paper
  • [w9] Andor Diera, Abdelhalim Dahou, Lukas Galke, Fabian Karl, Florian Sihler, Ansgar Scherp (2023). GenCodeSearchNet: A Benchmark Test Suite for Evaluating Generalization in Programming Language Understanding. GenBench Workshop @ EMNLP 2023. paper
  • [w8] Lukas Galke, Yoav Ram, Limor Raviv (2022). Emergent communication for understanding human language evolution: What’s missing? Emergent Communication workshop at the Tenth International Conference on Learning Representations (ICLR 2022). OpenReview.net. paper
  • [w7] Lukas Galke, Eva Seidlmayer, Gavin Lüdemann, Lisa Langnickel, Tetyana Melnychuk, Konrad U Förstner, Klaus Tochtermann, Carsten Schultz (2021). COVID-19++: A Citation-Aware Covid-19 Dataset for the Analysis of Research Dynamics. In 2021 IEEE International Conference on Big Data (Big Data). IEEE. paper
  • [w6] Eva Seidlmayer, Jakob Voß, Tetyana Melnychuk, Lukas Galke, Klaus Tochtermann, Carsten Schultz, Konrad U. Förstner (2020). ORCID for Wikidata — Data enrichment for scientometric applications. Wikidata workshop @ ISWC 2020. paper
  • [w5] Eva Seidlmayer, Lukas Galke, Tetyana Melnychuk, Carsten Schultz, Klaus Tochtermann, Konrad U. Förstner (2019). Take it Personally — A Python library for enrichment in informetrical applications. Posters & Demos @ SEMANTICS 2019. paper
  • [w4] Lukas Galke, Iacopo Vagliano, Ansgar Scherp (2019). Can Graph Neural Networks Go "Online"? An Analysis of Pretraining and Inference. Representation Learning on Graphs and Manifolds workshop @ ICLR 2019. paper
  • [w3] Iacopo Vagliano, Lukas Galke, Florian Mai, Ansgar Scherp (2018). Using Adversarial Autoencoders for Multi-Modal Automatic Playlist Continuation. RecSys Challenge workshop @ RecSys’18. paper
  • [w2] Lukas Galke, Gunnar Gerstenkorn, Ansgar Scherp (2018). A Case Study of Closed-Domain Response Suggestion with Limited Training Data. Text-based Information Retrieval workshop @ DEXA’18. paper code
  • [w1] Lukas Galke, Ahmed Saleh, Ansgar Scherp (2017). Word Embeddings for Practical Information Retrieval. In INFORMATIK 2017. Gesellschaft für Informatik, Bonn. (pp. 2155-2167). paper code (>200 stars, >40 forks)

(Extended) abstracts

  • [a3] Lukas Galke, Yoav Ram, Limor Raviv (2023). What makes a language easy to deep-learn? [1-page abstract]. Protolang 8, 2023.
  • [a2] Lukas Galke (2022). Representation Learning for Texts and Graphs: A Unified Perspective On Efficiency, Multimodality, and Adaptability [selected PhD thesis abstract]. IEEE Intelligent Informatics Bulletin, 22(1), 52. complete issue
  • [a1] Lukas Galke, Florian Mai, Ansgar Scherp (2019). What If We Encoded Words as Matrices and Used Matrix Multiplication as Composition Function [extended abstract]. INFORMATIK 2019. GI. paper

Thesis

  • Lukas Galke (2023). Representation Learning for Texts and Graphs: A Unified Perspective On Efficiency, Multimodality, and Adaptability. Number 2023/1 in Kiel Computer Science Series. Department of Computer Science, 2023. Dissertation, Faculty of Engineering, Kiel University. pdf

Project reports

  • [r3] Iacopo Vagliano, Till Blume, Lukas Galke, Florian Mai, Ahmed Saleh, Alexandros Pournaras, Nikolaos Gkalelis, Damianos Galanopoulos, Vasileios Mezaris, Ilija Šimić, Vedran Sabol, Aitor Apaolaza, Markel Vigo, Andrea Zielinski, Peter Mutschke (2019). Deliverable 3.3: Technologies for MOVING data processing and visualisation v3.0. report
  • [r2] Iacopo Vagliano, Mohammad Abdel-Qader, Till Blume, Falk Böschen, Lukas Galke, Ahmed Saleh, Ansgar Scherp, Vasileios Mezaris, Alexandros Pournaras, Christos Tzelepis, Ilija Šimić, Cecilia di Sciascio, Vedran Sabol, Aitor Apaolaza, Markel Vigo, Tobias Backes, Peter Mutschke (2018). Technologies for MOVING data processing and visualisation v2.0. report
  • [r1] Till Blume, Falk Böschen, Lukas Galke, Ahmed Saleh, Ansgar Scherp, Matthias Schulte-Althoff, Chrysa Collyda, Vasileios Mezaris, Alexandros Pournaras, Christos Tzelepis, Peter Hasitschka, Vedran Sabol, Aitor Apaolaza, Markel Vigo, Tobias Backes, Peter Mutschke, Thomas Gottron (2017). Technologies for MOVING data processing and visualisation v1.0. report

Contact: lukas 'at' lpag.de