The research group on Geometry, Algebra and Physics in Deep Neural Networks (GAPinDNNs) is based at the Department of Mathematical Sciences at Chalmers University of Technology and the University of Gothenburg. Our vision is to develop a mathematical foundation for deep learning that elevates the field into a theoretically well-grounded science.
Philipp’s internship at Genentech
Today, Philipp is starting his 10-month internship at Genentech (Roche) in Switzerland. Under the supervision of Pan Kessel, he will explore new approaches to generative protein design. We wish him a successful start!
Welcome to our new group members Miaowen and Longde
We are happy to announce that our group welcomes two new PhD students: Miaowen Dong and Longde Huang. They will both work under the supervision of Jan. Their respective research areas can be found at their linked profiles.
Group excursion
With the Swedish summer about to end, we came together for a memorable group excursion. Thanks to the surprisingly good weather, exploring Gothenburg’s archipelago by kayak was a pleasant and exciting experience. A few capsizes added to the adventure, and are part of the learning process. To cap off the day, Daniel treated us to an authentic Mexican-themed BBQ evening.
GDL Workshop in Umeå
ICML 2025
Philipp Misof is presenting our paper Equivariant Neural Tangent Kernels written together with Pan Kessel and Jan Gerken at this year’s ICML in Vancouver. In this work, we extend the neural tangent kernel to equivariant neural networks and use it to draw an interesting connection between equivariant neural networks and data-augmented networks.
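For readers unfamiliar with the object at the heart of the paper: the empirical neural tangent kernel of a network f with parameters θ is the Gram matrix Θ(x, x′) = ∇θ f(x) · ∇θ f(x′). The following NumPy sketch computes it by finite differences for a toy two-layer network. This is a hedged, minimal illustration of the general NTK definition, not the paper's implementation; all names (empirical_ntk, net) are hypothetical.

```python
import numpy as np

def empirical_ntk(f, params, x1, x2, eps=1e-6):
    """Theta(x1, x2) = grad_f(x1) . grad_f(x2), with Jacobians
    approximated by central finite differences (illustration only)."""
    def grad(x):
        g = np.zeros_like(params)
        for i in range(params.size):
            dp = np.zeros_like(params)
            dp.flat[i] = eps
            g.flat[i] = (f(params + dp, x) - f(params - dp, x)) / (2 * eps)
        return g
    return float(grad(x1) @ grad(x2))

# Toy scalar network f(x) = v . tanh(W x) with all parameters flattened.
rng = np.random.default_rng(0)
d, h = 3, 4
params = rng.normal(size=h * d + h)

def net(p, x):
    W, v = p[: h * d].reshape(h, d), p[h * d :]
    return v @ np.tanh(W @ x)

x1, x2 = rng.normal(size=d), rng.normal(size=d)
k12 = empirical_ntk(net, params, x1, x2)  # kernel value Theta(x1, x2)
k11 = empirical_ntk(net, params, x1, x1)  # diagonal entry, always >= 0
```

Symmetry Θ(x1, x2) = Θ(x2, x1) and positive semi-definiteness follow from the Gram structure; the paper studies how such kernels inherit the symmetry constraints of equivariant architectures.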
LOGML Summer School 2025
Elias Nyholm and Philipp Misof are attending the LOGML Summer School at Imperial College London this week. They are both working in small groups on projects related to geometric deep learning, which they will summarize in a short presentation at the end of the week.
New Article on Global Weather Forecasting
Our work PEAR: Equal Area Weather Forecasting on the Sphere by Hampus Linander, Christoffer Petersson, Daniel Persson and Jan Gerken is now available on the arXiv. We use an equal-area gridding (HEALPix) of the sphere to perform global weather forecasting with a volumetric transformer architecture, outperforming the same architecture on the standard Driscoll-Healy grid.
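For context on the gridding, here is a short sketch of the general HEALPix property that motivates its use (not the paper's code): a HEALPix grid with resolution parameter nside has exactly 12·nside² pixels, each covering the same solid angle, so every pixel carries equal weight in spherical averages. The function names below are hypothetical.

```python
import math

def healpix_npix(nside):
    # A HEALPix grid always has 12 * nside^2 pixels.
    return 12 * nside ** 2

def healpix_pixel_area(nside):
    # Equal-area property: every pixel covers 4*pi / npix steradians.
    return 4 * math.pi / healpix_npix(nside)

nside = 64                        # a typical resolution for global fields
print(healpix_npix(nside))        # 49152 pixels
print(healpix_pixel_area(nside))  # same solid angle for every pixel
```

By contrast, on a Driscoll-Healy latitude-longitude grid, cells shrink toward the poles, which distorts per-pixel statistics.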
New Article on Non-linear Equivariant Neural Networks
The preprint Equivariant Non-linear Maps for Neural Networks on Homogeneous Spaces by Elias Nyholm, Oscar Carlsson and Daniel Persson, in collaboration with Maurice Weiler, is now available on the arXiv. In this paper we define and study a family of equivariant neural network layers which unify convolution-based and attention-based architectures. We derive the generalised equivariance condition and show how it specialises in individual cases.
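As a toy illustration of an equivariance condition (not the paper's general construction on homogeneous spaces), a circular convolution satisfies f(g·x) = g·f(x) for cyclic shifts g, the simplest instance of the convolution-type layers such frameworks generalise. The names below are hypothetical.

```python
import numpy as np

def circ_conv(x, kernel):
    """Circular (cyclic) convolution of a signal x with a small kernel."""
    n = len(x)
    return np.array([
        sum(kernel[j] * x[(i - j) % n] for j in range(len(kernel)))
        for i in range(n)
    ])

rng = np.random.default_rng(0)
x = rng.normal(size=8)   # signal on the cyclic group Z/8
k = rng.normal(size=3)   # convolution kernel

# Equivariance check: shifting the input by 2 then convolving
# equals convolving first and then shifting the output by 2.
lhs = circ_conv(np.roll(x, 2), k)
rhs = np.roll(circ_conv(x, k), 2)
```

The two sides agree exactly; attention-type layers satisfy an analogous condition with respect to the group action, which is what allows the paper to treat both families in one framework.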
Open PhD Position
We have an open PhD position with Jan Gerken about symmetries in neural networks.
We seek a PhD student for a project at the intersection of mathematics and deep learning, working on theoretical aspects of geometric deep learning. The question of whether more data and compute are sufficient to improve neural networks is currently the subject of intense debate across all areas of deep learning. In this project, you will work on a theoretical framework which will help to better understand these questions in the context of geometric deep learning and add rigorous arguments to a debate driven by empirical results.
The application deadline is 21 May 2025.