- title: "Advanced optimizations for source transformation based
    automatic differentiation"
  description: |
    Clad is an LLVM/Clang plugin designed to provide automatic differentiation (AD)
    for C++ mathematical functions. It generates derivative code by modifying the
    abstract syntax tree (AST) using LLVM compiler infrastructure. Clad supports
    forward- and reverse-mode differentiation, which can be applied effectively to
    a wide range of functions. The typical AD approach in machine learning tools
    records and flattens the compute graph at runtime, whereas Clad can perform
    more advanced optimizations at compile time using the rich program
    representation provided by the Clang AST. These optimizations determine which
    parts of the computation graph are relevant to the AD rules.
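
    As a rough illustration (not code from the talk; the function f below is a
    made-up example), Clad's forward and reverse modes are typically invoked
    through clad::differentiate and clad::gradient:

    ```cpp
    #include "clad/Differentiator/Differentiator.h"

    // Hypothetical example function; any differentiable C++ function works.
    double f(double x, double y) { return x * x * y + y * y; }

    int main() {
      // Forward mode: derivative of f with respect to x only.
      auto df_dx = clad::differentiate(f, "x");
      double v = df_dx.execute(3.0, 2.0);   // df/dx at (3, 2)

      // Reverse (adjoint) mode: the full gradient in one pass; this is the
      // mode where the tape and the To-Be-Recorded analysis matter.
      auto grad_f = clad::gradient(f);
      double dx = 0, dy = 0;
      grad_f.execute(3.0, 2.0, &dx, &dy);
    }
    ```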

    One such technique is the “To-Be-Recorded” optimization, which reduces memory
    pressure on the Clad tape data structure in the adjoint (reverse) mode. Another
    is activity analysis, which discards derivative statements that are not
    relevant to the generated code. In this talk we explain compiler-level
    optimizations specific to AD and show concrete examples of how these analyses
    have impacted Clad applications.
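
    A small, hand-written sketch of the two ideas (not Clad's actual generated
    code):

    ```cpp
    double g(double x) {
      double t = x * x;   // "active": depends on the input and affects the
                          // result, so its derivative must be propagated.
      double c = 42.0;    // "inactive": activity analysis can drop the
                          // corresponding zero-derivative statements.
      t = t + 1.0;        // t is overwritten, but the adjoint of "t + 1.0"
                          // does not need the old value of t, so To-Be-Recorded
                          // analysis avoids pushing it onto the tape.
      return t * c;
    }
    ```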
  location: "[MODE 2024](https://indico.cern.ch/event/1380163/)"
  date: 2024-09-25
  speaker: Maksym Andriichuk
  id: "VVMODE2024"
  artifacts: |
    [Link to Slides](/assets/presentations/Maksym_Andriichuk_MODE2024_Optimizations.pdf)
  highlight: 1

- title: "Improving BioDynamo's Performance using ROOT C++ Modules"
  description: |
    Poster presented at the Fourth MODE Workshop on Differentiable Programming for Experiment Design