
Commit

split up old markdown a bit more
ZuseZ4 committed Dec 19, 2023
1 parent 9f780d9 commit 1e0c3be
Showing 7 changed files with 21 additions and 16 deletions.
8 changes: 5 additions & 3 deletions src/SUMMARY.md
@@ -2,12 +2,14 @@

# User Guide
- [What is Autodiff?](./chapter_1.md)
- [What do we use it for?](./chapter_2.md)
- [Motivation](./motivation.md)
- [Prior Art](./prior_art.md)
- [user facing design](./user_design.md)
- [User facing design](./user_design.md)
- [rustc internal design](./rustc_design.md)
- [Other Enzyme frontneds](./other_Frontends.md)
- [Other Enzyme frontends](./other_Frontends.md)
# Reference Guide
- [Forward Mode](./fwd.md)
- [Reverse Mode](./rev.md)
- [Current Limitations](./limitations.md)

- [Acknowledgments](./acknowledgments.md)
1 change: 1 addition & 0 deletions src/acknowledgments.md
@@ -0,0 +1 @@
# Acknowledgments
1 change: 0 additions & 1 deletion src/chapter_prior_art.md

This file was deleted.

6 changes: 6 additions & 0 deletions src/fwd.md
@@ -1 +1,7 @@
# Forward Mode

In forward mode we are only allowed to mark input arguments as Duplicated or Const.
The return value of a forward-mode function with a Duplicated return is a tuple whose first element is the primal return value and whose second element is the derivative.

In forward mode, Duplicated(x, 0.0) is equivalent to Const(x), except that Const allows us to perform more optimizations.
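
A minimal sketch of how this could look in practice, assuming the experimental `#[autodiff]` attribute described in this guide is available on a nightly toolchain; the exact attribute path, activity spelling, and generated signature are illustrative assumptions rather than the final API:

```rust
// Illustrative sketch only: assumes the proposed `#[autodiff]` attribute is
// available; the exact syntax and generated signature may differ.

// Ask the compiler to generate `df` as the forward-mode derivative of `f`.
// Both the input and the return value are marked Duplicated.
#[autodiff(df, Forward, Duplicated, Duplicated)]
fn f(x: f32) -> f32 {
    x * x
}

fn main() {
    // A Duplicated input takes a tangent (seed) alongside the primal value.
    // With a Duplicated return, `df` yields (primal, derivative).
    let (primal, derivative) = df(3.0, 1.0);
    assert_eq!(primal, 9.0);
    assert_eq!(derivative, 6.0); // d(x*x)/dx at x = 3.0
}
```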

7 changes: 4 additions & 3 deletions src/chapter_2.md → src/motivation.md
@@ -5,13 +5,14 @@ We propose to add automatic differentiation to Rust. This would allow Rust user
A few voices of autodiff users.

Jan Hückelheim (Argonne National Lab, US):
_Automatic differentiation (AD,also known as autodiff or bakc-propagation) has been used at Argonne and other national laboratories, at least, since the 1980s. For example, we have used AD to obtain gradients of computational fluid dynamics applications for shape-optimization, which allows the automated design of aircraft wings or turbine blades to minimze drag or fuel consumption. AD is used extensively in many other applications including seismic imaging, climate modeling, quantum computing, or software verification._

_Besides the aformentionend "conventional" uses of AD, it is also a cornerstone for the development of ML methods that incorporate physical models. The 2022 department of energy report on Advanced Research Directions on AI for Science, Energy, and Security states that "End-to-end differentiability for composing simulation and inference in a virtuous loop is required to integrate first-principles calculations and avanced AI training and inference". It is therefore conceivable that ADusage and development will become even more important in the near future._
_Automatic differentiation (AD, also known as autodiff or back-propagation) has been used at Argonne and other national laboratories, at least, since the 1980s. For example, we have used AD to obtain gradients of computational fluid dynamics applications for shape-optimization, which allows the automated design of aircraft wings or turbine blades to minimize drag or fuel consumption. AD is used extensively in many other applications including seismic imaging, climate modeling, quantum computing, or software verification._

_Besides the aforementioned “conventional” uses of AD, it is also a cornerstone for the development of ML methods that incorporate physical models. The 2022 Department of Energy report on Advanced Research Directions on AI for Science, Energy, and Security states that “End-to-end differentiability for composing simulation and inference in a virtuous loop is required to integrate first-principles calculations and advanced AI training and inference”. It is therefore conceivable that AD usage and development will become even more important in the near future._
[1](https://www.anl.gov/sites/www/files/2023-05/AI4SESReport-2023.pdf)

Prof. Jed Brown (UC Boulder, US):

_My primary applications are in computational mechanics (constitutive modeling and calibration), where it'll enable us to give a far better user experience than commercial packages, but differentiable programming is a key enabler for a lot of scientific computing and ML research and production._

# Background

6 changes: 5 additions & 1 deletion src/other_Frontends.md
@@ -1,12 +1,16 @@
# Other Enzyme frontends

Enzyme currently has experimental frontends for C/C++, Julia, Fortran, Numba, and Rust.
Enzyme currently has experimental frontends for C/C++, Julia, Fortran, Numba, some MLIR dialects, and Rust.

C/C++/CUDA documentation is available at [https://enzyme.mit.edu](https://enzyme.mit.edu)
Julia documentation is available at [https://enzyme.mit.edu/julia](https://enzyme.mit.edu/julia)
Rust documentation is available at [https://enzyme.mit.edu/rust](https://enzyme.mit.edu/rust)
Enzyme-JAX interop is available at [https://github.com/EnzymeAD/Enzyme-JAX](https://github.com/EnzymeAD/Enzyme-JAX)
MLIR (Reverse Mode) documentation is available at [https://github.com/EnzymeAD/Enzyme-MLIR-Reverse](https://github.com/EnzymeAD/Enzyme-MLIR-Reverse)
Numba documentation is TBA.
Fortran documentation is TBA.

We have a compiler-explorer fork with support for autodiff in C/C++/CUDA, Julia, and MLIR [here](https://enzyme.mit.edu/explorer).

Developer documentation is available at [https://enzyme.mit.edu/doxygen](https://enzyme.mit.edu/doxygen)
Please reach out if you would like to see support for additional languages.
8 changes: 0 additions & 8 deletions src/rev.md
@@ -9,14 +9,6 @@ TODO: Talk about what makes this approach special and a good fit for Rust concep
TODO: Talk about the new attributes and define their semantics. Give examples.


#### Forward mode
WIP

In Forward mode we are only allowed to mark input arguments
The return value of forward mode with a Duplicated return is a tuple containing as the first value the primal return value and as the second value the derivative.

In forward mode Duplicated(x, 0.0) is equivalent to Const(x), except that we can perform more optimizations for Const.


#### Reverse Mode
Both the in-place and the "normal" variant compute the gradient. The difference is that with Active the gradient is returned by value, while with Duplicated it is accumulated in place into the caller-provided shadow argument.
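
A hedged sketch of the difference, again assuming the experimental `#[autodiff]` attribute; the calling conventions shown (seed argument, shadow argument) are assumptions for illustration, not the final API:

```rust
// Illustrative sketch only: attribute syntax and generated signatures are
// assumptions, not the final API.

// Active scalar input: the gradient is returned by value,
// e.g. `df(x, seed)` -> (primal, d/dx).
#[autodiff(df, Reverse, Active, Active)]
fn f(x: f32) -> f32 {
    x * x
}

// Duplicated reference input: the gradient is accumulated in place into the
// caller-provided shadow `dx`, so repeated calls add up their contributions.
#[autodiff(dg, Reverse, Duplicated, Active)]
fn g(x: &f32) -> f32 {
    x * x
}

fn main() {
    let (y, dy) = df(3.0, 1.0); // dy == 6.0, returned by value
    let x = 3.0_f32;
    let mut dx = 0.0_f32;
    let _ = dg(&x, &mut dx, 1.0); // dx += 6.0, accumulated in place
    let _ = dg(&x, &mut dx, 1.0); // dx is now 12.0
    println!("{y} {dy} {dx}");
}
```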
