Date & Time
Friday, November 19, 2021, 10:40 AM - 11:30 AM
Name
How to Use Enzyme to Automatically Differentiate Any LLVM-based Language for CPU, GPU, and More
Description

Derivatives are key to algorithms in scientific computing and machine learning such as training neural networks, optimization, uncertainty quantification, and stability analysis. While machine learning frameworks rely on differentiable domain-specific languages (DSLs), the automatic differentiation (AD) of general-purpose programs poses significant challenges with regard to the programming languages being differentiated and the differentiability of language features such as parallelism and heterogeneity. Enzyme is an LLVM-based compiler plugin for automatic differentiation of statically analyzable programs expressed in the LLVM intermediate representation (IR), thus generating fast gradients of programs in a variety of languages (C/C++, Fortran, Julia, Rust, Swift, etc.) and architectures (CPU, CUDA, ROCm). While existing tools operate at the source level, Enzyme differentiates after the application of compile-time optimizations, which allows for asymptotically faster gradients. The need to optimize first is especially pronounced when differentiating parallel and especially GPU programs, where data races and complex memory hierarchies can dramatically alter runtimes. This tutorial is aimed at both potential users of automatic differentiation and compiler writers who may want to enable automatic differentiation in their compilers. Participants will be given an interactive introduction to automatic differentiation with Enzyme. Along the way, we will cover the foundations of automatic differentiation, how to use Enzyme to differentiate programs, parallel and GPU-specific differentiation, and all the tools necessary to enable Enzyme for your choice of compiler.
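
As a flavor of the workflow covered in the tutorial, below is a minimal sketch of differentiating a C function with Enzyme via its __enzyme_autodiff entry point. The file name, exact plugin path, and compiler flags are assumptions that vary by installation and LLVM version.

    #include <stdio.h>

    // Declaration of Enzyme's autodiff entry point; the Enzyme plugin
    // replaces calls to it with generated derivative code at compile time.
    extern double __enzyme_autodiff(void *, double);

    double square(double x) { return x * x; }

    int main(void) {
        double x = 3.0;
        // Reverse-mode derivative of square at x: d/dx x^2 = 2x.
        double dx = __enzyme_autodiff((void *) square, x);
        printf("square(%f) = %f, dsquare(%f) = %f\n", x, square(x), x, dx);
        return 0;
    }

A typical build loads the Enzyme plugin into Clang, e.g. with -Xclang -load -Xclang /path/to/ClangEnzyme-<version>.so (or -fplugin= on newer LLVM versions) together with optimization enabled, so that Enzyme runs after the usual compile-time optimizations.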

Session Type
Tutorial