16 changes: 16 additions & 0 deletions README.md
@@ -93,6 +93,20 @@ in parallel, respectively:
fpm build --compiler caf --profile release --flag "-cpp -DPARALLEL"
```

An experimental capability exists for parallel runs when building with LLVM `flang-new`
version 22 or later and [Caffeine](https://go.lbl.gov/caffeine). Steps for installing
LLVM 22.0.0git (the llvm-project main branch as of this writing) and Caffeine are
outlined in [parallel-testing-with-flang.md]. Once installed, an `fpm` command of the
following form should launch the neural-fortran test suite with two executing images:

```
GASNET_PSHM_NODES=2 \
fpm test \
--compiler flang-new \
--flag "-O3 -fcoarray -DPARALLEL" \
--link-flag "-lcaffeine -lgasnet-smp-seq -L<caffeine-install-prefix>/lib -L<gasnet-install-prefix>/lib"
```
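
The example above runs with two images because `GASNET_PSHM_NODES=2` appears to set the number of processes launched by GASNet's shared-memory (smp) conduit, which Caffeine uses as coarray images. Presumably only that value needs to change to vary the image count; a four-image run would look like this (a sketch under that assumption, not verified against the linked instructions):

```
# Hypothetical four-image run; only GASNET_PSHM_NODES differs from the example above.
GASNET_PSHM_NODES=4 \
fpm test \
--compiler flang-new \
--flag "-O3 -fcoarray -DPARALLEL" \
--link-flag "-lcaffeine -lgasnet-smp-seq -L<caffeine-install-prefix>/lib -L<gasnet-install-prefix>/lib"
```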

#### Testing with fpm

```
@@ -305,3 +319,5 @@ group.
Neural-fortran has been used successfully in over a dozen published studies.
See all papers that cite it
[here](https://scholar.google.com/scholar?cites=7315840714744905948).

[parallel-testing-with-flang.md]: https://github.com/BerkeleyLab/julienne/blob/e9f7ea8069206bfc4abf6a9e6dbbd7d07bda075a/doc/parallel-testing-with-flang.md
Review comment from a project member on the line above: Did you mean to put this here? It seems out of context.

2 changes: 1 addition & 1 deletion src/nf/nf_activation.f90
@@ -733,7 +733,7 @@ pure function eval_3d_celu_prime(self, x) result(res)
end function eval_3d_celu_prime

! Utility Functions
pure function get_activation_by_name(activation_name) result(res)
function get_activation_by_name(activation_name) result(res)
character(len=*), intent(in) :: activation_name
class(activation_function), allocatable :: res

2 changes: 1 addition & 1 deletion src/nf/nf_optimizers.f90
@@ -316,7 +316,7 @@ end subroutine minimize_adagrad

! Utility Functions
!! Returns the default optimizer corresponding to the provided name
pure function get_optimizer_by_name(optimizer_name) result(res)
function get_optimizer_by_name(optimizer_name) result(res)
character(len=*), intent(in) :: optimizer_name
class(optimizer_base_type), allocatable :: res
