From Clifford Neural Layers to Clifford Group Equivariant Networks
David Ruhe, Johannes Brandstetter, Patrick Forré -- University of Amsterdam

Recently, we have seen a surge of interest in the application of Clifford algebras, also known as geometric algebras, in machine learning. They have marked a significant shift in how complex spatial and geometric relationships can be encoded and processed in neural networks. In this talk, I will discuss these developments, outlined below.

Clifford algebras, with their rich geometric interpretation and their ability to generalise complex numbers, quaternions, and other algebraic systems, offer a unique mathematical framework for addressing the intricacies of physical space. Brandstetter et al. (2022) introduced Clifford Neural Layers for PDE (partial differential equation) Modeling, laying the groundwork for employing these algebras to capture the dynamical behaviour of physical systems. Their work demonstrated how the geometric structure of Clifford algebras, together with Clifford Fourier transforms, can be harnessed to model phenomena in Euclidean 2D and 3D spaces efficiently and with high fidelity.

Building on this foundation, Ruhe et al. (2023) made two further contributions: "Geometric Clifford Algebra Networks" and "Clifford Group Equivariant Neural Networks." These studies explored how Clifford algebras can encode geometric transformations and symmetries within neural network architectures. Notably, this research ventured into modelling the Lorentz symmetries of space-time, a leap towards leveraging machine learning in relativistic physics.

The exploration of Clifford algebras in machine learning offers a new direction for equipping models with an understanding of the physical world. As several follow-up works have already demonstrated, these developments have opened up new avenues for machine learning research and could, in the long term, contribute to new breakthroughs in physics modelling.
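
To make the algebraic ingredients above a little more concrete, the following is a minimal sketch in plain Python (names such as geometric_product, rotor, and toy_layer are illustrative and are not taken from the cited papers or from any geometric-algebra library). It implements the geometric product of the 2D Clifford algebra Cl(2,0), verifies that its even part reproduces complex-number multiplication, and checks that a toy map built only from scalar weights and geometric products commutes with rotations, a small-scale analogue of the equivariance the works above exploit.

```python
# Illustrative sketch only: a from-scratch Cl(2,0) geometric algebra in plain Python.
# Function names and the toy layer are hypothetical, not code from the cited papers.
import math

# A multivector in Cl(2,0) is stored as 4 coefficients over the basis (1, e1, e2, e12).

def geometric_product(a, b):
    """Geometric (Clifford) product of two multivectors in Cl(2,0)."""
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return (
        a0 * b0 + a1 * b1 + a2 * b2 - a3 * b3,  # scalar part
        a0 * b1 + a1 * b0 - a2 * b3 + a3 * b2,  # e1 part
        a0 * b2 + a2 * b0 + a1 * b3 - a3 * b1,  # e2 part
        a0 * b3 + a3 * b0 + a1 * b2 - a2 * b1,  # e12 (bivector) part
    )

def reverse(a):
    """Reversion: flips the sign of the bivector component."""
    return (a[0], a[1], a[2], -a[3])

def rotor(theta):
    """Unit rotor encoding a rotation by theta in the e1-e2 plane."""
    return (math.cos(theta / 2), 0.0, 0.0, -math.sin(theta / 2))

def rotate(x, R):
    """Rotate a multivector with the sandwich product R x R~."""
    return geometric_product(geometric_product(R, x), reverse(R))

def toy_layer(x, w1=0.7, w2=0.3):
    """Toy multivector map: scalar-weighted mix of x and its geometric square.
    Because it uses only scalar weights and geometric products, it commutes
    with rotor rotations (a simple instance of the equivariance discussed above)."""
    quad = geometric_product(x, x)
    return tuple(w1 * xi + w2 * qi for xi, qi in zip(x, quad))

# 1) The even part (scalar + e12) behaves exactly like the complex numbers,
#    with the bivector e12 playing the role of the imaginary unit i:
z = (1.0, 0.0, 0.0, 2.0)   # 1 + 2i
w = (3.0, 0.0, 0.0, 4.0)   # 3 + 4i
print(geometric_product(z, w))   # (-5.0, 0.0, 0.0, 10.0), i.e. (1+2i)(3+4i) = -5 + 10i

# 2) Rotation equivariance: rotating and then applying the layer matches
#    applying the layer and then rotating.
x = (0.5, 1.0, -2.0, 0.25)
R = rotor(math.pi / 3)
lhs = toy_layer(rotate(x, R))
rhs = rotate(toy_layer(x), R)
print(all(abs(l - r) < 1e-9 for l, r in zip(lhs, rhs)))  # True
```

Real implementations operate on batched, multi-channel multivector features in higher-dimensional (and, for the Lorentzian case, non-Euclidean) algebras, but the two checks above capture the essential ideas: the algebra subsumes familiar number systems, and maps built from geometric products respect geometric symmetries.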