Geometry, Algebra and Physics in Deep Neural Networks

The research group on Geometry, Algebra and Physics in Deep Neural Networks (GAPinDNNs) is based at the Department of Mathematical Sciences at Chalmers University of Technology and the University of Gothenburg. Our vision is to develop a mathematical foundation for deep learning which elevates the field into a theoretically well-grounded science.

News

New Article on Global Weather Forecasting

28 May 2025

Our work PEAR: Equal Area Weather Forecasting on the Sphere by Hampus Linander, Christoffer Petersson, Daniel Persson and Jan Gerken is now available on the arXiv. We use an equal-area gridding of the sphere (HEALPix) to perform global weather forecasting with a volumetric transformer architecture, outperforming the same architecture on the standard Driscoll-Healy grid.
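The equal-area property that motivates HEALPix can be seen in a short numpy sketch contrasting it with an equiangular latitude-longitude grid of the Driscoll-Healy type (a minimal illustration with arbitrary grid sizes, not the PEAR code):

```python
import numpy as np

# On an equiangular latitude-longitude grid, cell area is proportional to
# the sine-difference of the latitude band edges, so cells shrink towards
# the poles. An equal-area scheme such as HEALPix avoids this distortion.

def lonlat_cell_areas(n_lat: int, n_lon: int) -> np.ndarray:
    """Per-band cell areas (steradians) of an n_lat x n_lon equiangular grid."""
    lat_edges = np.linspace(-np.pi / 2, np.pi / 2, n_lat + 1)
    band_areas = 2 * np.pi * np.diff(np.sin(lat_edges))  # area of each latitude band
    return band_areas / n_lon  # each band is split into n_lon equal cells

areas = lonlat_cell_areas(32, 64)
print(f"max/min cell area ratio: {areas.max() / areas.min():.1f}")

# A HEALPix grid with resolution parameter nside instead has 12 * nside**2
# pixels, every one with identical area 4*pi / (12 * nside**2).
nside = 16
npix = 12 * nside**2
print(f"HEALPix nside={nside}: {npix} pixels of equal area {4 * np.pi / npix:.2e} sr")
```

The polar cells of the latitude-longitude grid are more than an order of magnitude smaller than the equatorial ones, which is the non-uniformity an equal-area grid removes.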

New Article on Non-linear Equivariant Neural Networks

29 Apr 2025

The preprint Equivariant Non-linear Maps for Neural Networks on Homogeneous Spaces by Elias Nyholm, Oscar Carlsson and Daniel Persson, in collaboration with Maurice Weiler, is now available on the arXiv. In this paper we define and study a family of equivariant neural network layers which unifies convolution-based and attention-based architectures. We derive the generalised equivariance condition and show how it specialises in individual cases.
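The equivariance condition f(g·x) = g·f(x) can be checked numerically in a toy setting, here with the cyclic translation group acting on 1D signals and a circular convolution as the layer (a hypothetical example, far simpler than the homogeneous-space setting of the paper):

```python
import numpy as np

# Numerical check of the equivariance condition f(g.x) = g.f(x) for a
# translation-equivariant map: a circular convolution, with the group
# action given by cyclic shifts.

rng = np.random.default_rng(0)
x = rng.normal(size=16)  # input signal
k = rng.normal(size=16)  # convolution kernel

def circ_conv(x, k):
    """Circular convolution via the FFT (translation-equivariant by construction)."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(k)))

def shift(x, s):
    """Group action: cyclic translation by s steps."""
    return np.roll(x, s)

lhs = circ_conv(shift(x, 3), k)  # f(g.x)
rhs = shift(circ_conv(x, k), 3)  # g.f(x)
print("equivariant:", np.allclose(lhs, rhs))
```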

Open PhD Position

22 Apr 2025

We have an open PhD position with Jan Gerken about symmetries in neural networks.

We seek a PhD student for a project at the intersection of mathematics and deep learning to work on theoretical aspects of geometric deep learning. The question of whether more data and compute are sufficient to improve neural networks is currently debated intensely in all areas of deep learning. In this project, you will work on a theoretical framework which will help to better understand these questions in the context of geometric deep learning and add rigorous arguments to a debate driven by empirical results.

The application deadline is 21 May 2025.

New Paper on Gauge Equivariant Networks for Topological Insulators

24 Feb 2025

Our work on Learning Chern Numbers of Topological Insulators with Gauge Equivariant Neural Networks is now on the arXiv! In this paper, we combine lattice gauge equivariant networks with a novel training mechanism to learn topological invariants (Chern numbers) of topological insulators. This paper combines several beautiful topics in machine learning, physics and mathematics.

The first author is our master’s student Longde Huang. Congratulations to him on his first publication! From our group, Hampus Linander, Daniel Persson and Jan Gerken were also involved. Thanks to our physics collaborators Oleksandr Balabanov (then at Stockholm University) and Mats Granath (University of Gothenburg) for their expertise and a fun collaboration!
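For readers curious what a Chern number actually is numerically, here is the standard Fukui-Hatsugai-Suzuki lattice computation for the two-band Qi-Wu-Zhang model. This is the kind of topological invariant such a network learns as a label; it is a textbook calculation, not the paper's gauge equivariant architecture:

```python
import numpy as np

# Chern number of the lower band of the Qi-Wu-Zhang model, computed with
# the Fukui-Hatsugai-Suzuki algorithm: build U(1) link variables between
# neighbouring points of a discretised Brillouin zone and sum the lattice
# field strength over all plaquettes.

def qwz_lower_band(kx, ky, m):
    """Lower-band eigenvector of H(k) = sin(kx) sx + sin(ky) sy + (m + cos(kx) + cos(ky)) sz."""
    dx, dy, dz = np.sin(kx), np.sin(ky), m + np.cos(kx) + np.cos(ky)
    H = np.array([[dz, dx - 1j * dy],
                  [dx + 1j * dy, -dz]])
    vals, vecs = np.linalg.eigh(H)  # eigenvalues in ascending order
    return vecs[:, 0]

def chern_number(m, N=24):
    ks = 2 * np.pi * np.arange(N) / N
    u = np.array([[qwz_lower_band(kx, ky, m) for ky in ks] for kx in ks])
    # Link variables between neighbouring k-points (a U(1) lattice gauge field)
    Ux = np.einsum('xyc,xyc->xy', u.conj(), np.roll(u, -1, axis=0))
    Uy = np.einsum('xyc,xyc->xy', u.conj(), np.roll(u, -1, axis=1))
    # Field strength per plaquette; the total flux divided by 2*pi is the Chern number
    F = np.angle(Ux * np.roll(Uy, -1, axis=0) / np.roll(Ux, -1, axis=1) / Uy)
    return int(round(F.sum() / (2 * np.pi)))

print("0 < m < 2 :", chern_number(m=1.0))  # topological phase, |C| = 1
print("m > 2     :", chern_number(m=3.0))  # trivial phase, C = 0
```

The result is gauge invariant, so the arbitrary phases returned by the eigensolver drop out, which is the same structural reason gauge equivariance is the natural symmetry for networks processing such data.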

Learning on Graphs and Geometry Meetup Sweden

11 Feb 2025

Elias Nyholm is one of the organisers of the LoG Meetup Sweden in Uppsala this year. This two-day workshop is the official local meetup of the online LoG Conference and will consist of keynote talks, contributed talks and a poster session.

WASP Winter Conference 2025

15 Jan 2025

The group had a strong presence at the WASP Winter Conference in Norrköping this year, including poster presentations by Elias, Philipp and Oscar.

Equivariant Neural Networks

Theory and applications of equivariant neural networks and geometric deep learning

Read more
Spherical Computer Vision

Computer vision models for spherical data like fisheye images

Read more
Wide Neural Networks

Theory and applications of wide neural networks, using the neural tangent kernel

Read more
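As a small illustration of the neural tangent kernel mentioned above: for a finite network, the empirical NTK is just the Gram matrix of parameter gradients, Θ(x, x') = ∇θf(x)·∇θf(x'). A minimal hypothetical sketch for a one-hidden-layer ReLU network, unrelated to the group's codebase:

```python
import numpy as np

# Empirical neural tangent kernel of a small one-hidden-layer ReLU network
# f(x) = v . relu(W x), computed from explicit parameter gradients.

rng = np.random.default_rng(0)
n_in, n_hid = 3, 512
W = rng.normal(size=(n_hid, n_in))
v = rng.normal(size=n_hid)

def grad_f(x):
    """Gradient of f(x) = v . relu(W x) with respect to all parameters, flattened."""
    h = W @ x
    dv = np.maximum(h, 0)          # df/dv = relu(W x)
    dW = np.outer(v * (h > 0), x)  # df/dW_ij = v_i * 1[h_i > 0] * x_j
    return np.concatenate([dW.ravel(), dv])

def ntk(x1, x2):
    """Empirical NTK entry: inner product of parameter gradients."""
    return grad_f(x1) @ grad_f(x2)

xs = rng.normal(size=(4, n_in))
K = np.array([[ntk(a, b) for b in xs] for a in xs])
print("symmetric:", np.allclose(K, K.T))
```

Being a Gram matrix, K is symmetric and positive semi-definite; in the infinite-width limit it becomes deterministic and constant during training, which is what makes the NTK a tractable theoretical tool for wide networks.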