Session Type
Student Technical Talks
Date & Time
Wednesday, November 9, 2022, 5:15 PM - 5:30 PM
Name
LAGrad: Leveraging the MLIR Ecosystem for Efficient Differentiable Programming
Speakers
Abstract/s
Automatic differentiation (AD) is a central algorithm in machine learning and optimization. This talk introduces LAGrad, a reverse-mode source-to-source AD system that differentiates tensor operations in the linalg, scf, and tensor dialects of MLIR. LAGrad leverages the value semantics of linalg-on-tensors in MLIR to simplify the analyses required to generate adjoint code that is efficient in both run time and memory consumption. LAGrad also combines AD with MLIR’s type system to exploit structured sparsity patterns such as lower triangular tensors. We compare performance against Enzyme, a state-of-the-art AD system, on Microsoft’s ADBench suite. Our results show speedups of up to 2x relative to Enzyme, while in some cases using 30x less memory.
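For readers unfamiliar with linalg-on-tensors, the sketch below (not taken from the talk) illustrates the kind of value-semantic IR the abstract refers to: an elementwise product written as a linalg.generic over tensor operands, which returns a new tensor rather than mutating a buffer. Function and value names are hypothetical, and exact syntax varies across MLIR versions (for example, newer releases use tensor.empty in place of linalg.init_tensor).

```mlir
// Illustrative sketch only; not code from the talk or from LAGrad.
// Elementwise multiply on tensors: the op consumes tensor values and
// yields a fresh result tensor (value semantics), which is what makes
// adjoint-code analyses simpler than with in-place buffer updates.
#map = affine_map<(d0) -> (d0)>

func.func @mul(%a: tensor<4xf32>, %b: tensor<4xf32>) -> tensor<4xf32> {
  // Allocate the output tensor value (newer MLIR: tensor.empty).
  %init = linalg.init_tensor [4] : tensor<4xf32>
  %0 = linalg.generic
         {indexing_maps = [#map, #map, #map],
          iterator_types = ["parallel"]}
         ins(%a, %b : tensor<4xf32>, tensor<4xf32>)
         outs(%init : tensor<4xf32>) {
  ^bb0(%x: f32, %y: f32, %out: f32):
    %p = arith.mulf %x, %y : f32
    linalg.yield %p : f32
  } -> tensor<4xf32>
  return %0 : tensor<4xf32>
}
```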
Location Name
Hayes Ballroom - Main Level