November 30, 2021

12:00 pm - 1:00 pm

Venue

https://wse.zoom.us/j/99567504456?pwd=WkI2UlpGT3p6MldLS05VNkdmcGxiZz09

 Recorded Seminar: 

https://wse.zoom.us/rec/share/q_6G-xD7kXGwRGh7te_ZKo_0WMKN-xgSmgBhdSEHtmshAtXY1KHWiMKSHMcZ33C0.QPBxe4RuQi4IdGrF?startTime=1638291563000

“Equivariant Machine Learning, Structured Like Classical Physics”


Soledad Villar, PhD

Assistant Professor

Applied Mathematics and Statistics

Johns Hopkins University

Abstract: There has been enormous progress in the last few years in designing neural networks that respect the fundamental symmetries and coordinate freedoms of physical law. Some of these frameworks make use of irreducible representations, some make use of high-order tensor objects, and some apply symmetry-enforcing constraints. Different physical laws obey different combinations of fundamental symmetries, but a large fraction (possibly all) of classical physics is equivariant to translation, rotation, reflection (parity), boost (relativity), and permutations. Here we show that it is simple to parameterize universally approximating polynomial functions that are equivariant under these symmetries, or under the Euclidean, Lorentz, and Poincaré groups, at any dimensionality d. The key observation is that nonlinear O(d)-equivariant (and related-group-equivariant) functions can be universally expressed in terms of a lightweight collection of scalars: scalar products and scalar contractions of the scalar, vector, and tensor inputs. We complement our theory with numerical examples that show that the scalar-based method is simple, efficient, and scalable.
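As a rough illustration of the scalar-based idea described in the abstract (a minimal sketch, not the speaker's code: the coefficient functions, shapes, and names below are placeholder assumptions), an O(d)-equivariant vector-valued function can be written as a sum of the input vectors weighted by functions of their pairwise inner products:

```python
import numpy as np

def invariant_scalars(vectors):
    """All pairwise inner products <v_j, v_k>; these are O(d)-invariant."""
    V = np.stack(vectors)            # shape (n, d)
    return (V @ V.T).flatten()       # shape (n*n,)

def equivariant_fn(vectors):
    """Equivariant output: input vectors weighted by functions of the invariants."""
    s = invariant_scalars(vectors)
    # Placeholder coefficient functions of the invariant scalars;
    # any choice of these keeps the output equivariant.
    coeffs = [np.tanh(s.sum() + i) for i in range(len(vectors))]
    return sum(c * v for c, v in zip(coeffs, vectors))

# Check equivariance under a random orthogonal transformation Q in O(d):
d, rng = 3, np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))
vs = [rng.normal(size=d) for _ in range(4)]
lhs = equivariant_fn([Q @ v for v in vs])   # transform inputs, then apply
rhs = Q @ equivariant_fn(vs)                # apply, then transform output
assert np.allclose(lhs, rhs)
```

Because the inner products are unchanged by any orthogonal transformation, the coefficients are invariant and the output transforms exactly like the inputs.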

 

Biography: Soledad Villar is an Assistant Professor in the Applied Mathematics and Statistics department at Johns Hopkins University. She co-organizes the MINDS/CIS seminar as well as the AMS seminar. If you want to suggest (and host) a speaker in 2022, you can contact her at [email protected]. We are hoping to have in-person talks soon.