diff --git a/_data/seminar.yml b/_data/seminar.yml
index c56cb5e47fd7e5bcdd50462aa3a5dc277f8fc0ff..caa781fb661bb3ccc5c1abdaf658a9b3a8f2666e 100644
--- a/_data/seminar.yml
+++ b/_data/seminar.yml
@@ -1,3 +1,15 @@
+- date: 2025-04-01 10:30
+  team: Alco
+  room: Henri Poincaré
+  speaker: Simon Wietheger
+  picture: https://www.ac.tuwien.ac.at/wp/wp-content/uploads/2024_04_23_1-200x300.jpg
+  website: https://www.ac.tuwien.ac.at/people/swietheger/
+  lab: Technische Universität Wien
+  title: Training One-Dimensional Graph Neural Networks is NP-Hard
+  abstract: |
+    We initiate the study of the computational complexity of training graph neural networks (GNNs). We consider the classical node classification setting; there, the intractability of training multidimensional GNNs immediately follows from known lower bounds for training classical neural networks (and holds even for trivial GNNs). However, one-dimensional GNNs form a crucial case of interest: the computational complexity of training such networks depends on both the graphical structure of the network and the properties of the involved activation and aggregation functions. As our main result, we establish the NP-hardness of training ReLU-activated one-dimensional GNNs via a highly non-trivial reduction. We complement this result with algorithmic upper bounds for the training problem in the ReLU-activated and linearly-activated settings.
+
+    Joint work with Robert Ganian and Mathis Rocton, to appear at ICML 2025.
 - date: 2025-04-11 14:00
   team: PARTOUT
   room: Grace Hopper