V239 changes (#5031)

* Save statmap significance fits information (#4951)

* Report significance fit info in sngls_findtrigs

* Report significance calculation info in statmap jobs

* Fix typo, fix test

* TD comments, some tidying up

* neaten comments

* comment fix

---------

Co-authored-by: Thomas Dent <[email protected]>

* index issue (#5018)

* Bump to v2.3.9

* Updating pegasus_sites

* Ensure v2.3.8 pycbc_inspiral

* Missing " in updating singularity image

* Fix [hopefully] documentation failure (#4901)

* try macos latest version (#4922)

* try macos latest version

* move to update macos on build as well

* Fix mac tests with conda (#4946)

* tox: fix tox integration with conda

use setup-miniconda github action and specify more packages in conda_deps for each testenv

* tox: install ligo-segments and python-ligo-lw with conda

these packages don't install cleanly with pypi, but the conda packages have patches

* test: use numpy.longdouble instead of float128

float128 isn't available on macOS ARM64

* tox: clean up duplicate package lists

* ci: unpin tox

* tox: further simplify duplicate configuration

* Use old stashcp

* Undo hardcode singularity image

* Issue in rebasing(?)

* Don't run on 3.8

* Used the wrong version ... somehow

* Don't build for 3.12

* Try not building mac wheels

* Don't try and support BBHx

* Change optimization

* Avoid lalsuite 7.25

---------

Co-authored-by: Gareth S Cabourn Davies <[email protected]>
Co-authored-by: Thomas Dent <[email protected]>
Co-authored-by: Alex Nitz <[email protected]>
Co-authored-by: Duncan Macleod <[email protected]>
5 people authored Feb 11, 2025
1 parent d6e94f5 commit 5593465
Showing 19 changed files with 169 additions and 93 deletions.
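The common thread in the statmap changes below is that `significance.get_far` now returns a third value alongside the background and foreground false-alarm rates: a dictionary describing how the significance fit/calculation was done, which each executable writes into the HDF5 attributes of its `foreground` group (with an `_exc` suffix for the exclusive-background results). A minimal sketch of that pattern follows; it is not taken verbatim from any one script, and the import path and the `method` keyword are assumptions based on how these scripts call the module:

```python
import h5py
import numpy
# Assumed import path; the executables in this diff refer to the module
# simply as `significance`.
from pycbc.events import significance

# Hypothetical inputs: background/foreground ranking statistics, decimation
# factors, and the background livetime in seconds.
back_stat = numpy.array([5.0, 5.5, 6.2, 7.1])
fore_stat = numpy.array([8.4])
dec_facs = numpy.ones_like(back_stat)
bg_time = 1.0e7

# New calling convention: a third return value, sig_info, reports how the
# significance was computed.  The real scripts pass per-detector options via
# **significance_dict[ifo] from digest_significance_options; 'n_louder' is
# used here as an assumed method name.
bg_far, fg_far, sig_info = significance.get_far(
    back_stat, fore_stat, dec_facs, bg_time, method='n_louder')

with h5py.File('example_statmap.hdf', 'w') as f:
    f['foreground/ifar'] = 1.0 / fg_far  # the scripts also convert to years
    # Store the fit information next to the results, as the statmap jobs now do.
    for key, value in sig_info.items():
        f['foreground'].attrs[key] = value
```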
4 changes: 2 additions & 2 deletions .github/workflows/basic-tests.yml
@@ -13,7 +13,7 @@ jobs:
max-parallel: 60
matrix:
os: [ubuntu-20.04]
python-version: [3.8, 3.9, '3.10', '3.11']
python-version: [3.9, '3.10', '3.11']
test-type: [unittest, search, docs]
steps:
- uses: actions/checkout@v4
@@ -25,7 +25,7 @@ jobs:
run: |
sudo apt-get -o Acquire::Retries=3 update
sudo apt-get -o Acquire::Retries=3 install *fftw3* mpi intel-mkl* git-lfs graphviz
pip install "tox<4.0.0" pip setuptools --upgrade
pip install tox pip setuptools --upgrade
- name: installing auxiliary data files
run: |
GIT_CLONE_PROTECTION_ACTIVE=false GIT_LFS_SKIP_SMUDGE=1 git clone https://git.ligo.org/lscsoft/lalsuite-extra
2 changes: 1 addition & 1 deletion .github/workflows/distribution.yml
@@ -12,7 +12,7 @@ jobs:
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [ubuntu-20.04, macos-12]
os: [ubuntu-20.04]

steps:
- uses: actions/checkout@v4
46 changes: 37 additions & 9 deletions .github/workflows/mac-test.yml
@@ -12,18 +12,46 @@ jobs:
strategy:
max-parallel: 4
matrix:
os: [macos-12]
python-version: [3.8, 3.9, '3.10', '3.11']
os: [macos-latest]
python-version:
- '3.10'
- '3.11'

# this is needed for conda environments to activate automatically
defaults:
run:
shell: bash -el {0}

steps:
- uses: actions/checkout@v1
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5

- name: Cache conda packages
uses: actions/cache@v4
env:
# increment to reset cache
CACHE_NUMBER: 0
with:
path: ~/conda_pkgs_dir
key: ${{ runner.os }}-conda-${{ matrix.python-version}}-${{ env.CACHE_NUMBER }}

- name: Configure conda
uses: conda-incubator/setup-miniconda@v3
with:
activate-environment: test
channels: conda-forge
miniforge-version: latest
python-version: ${{ matrix.python-version }}
- run: |
brew install fftw openssl gsl
pip install --upgrade pip setuptools "tox<4.0.0"
- name: run basic pycbc test suite

- name: Conda info
run: conda info --all

- name: Install tox
run: |
conda install \
pip \
setuptools \
tox
- name: Run basic pycbc test suite
run: |
sudo chmod -R 777 /usr/local/miniconda/
tox -e py-unittest
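One of the test fixes bundled here (noted in the commit message as "use numpy.longdouble instead of float128") is needed because NumPy does not define the `float128` alias on macOS ARM64. A small standalone illustration of the portable spelling; the array values are arbitrary:

```python
import numpy

# numpy.longdouble always exists and maps to the widest native float on the
# platform (e.g. 80-bit extended precision on x86-64 Linux, plain 64-bit
# double on Apple Silicon), whereas the numpy.float128 alias is only defined
# where a 128-bit storage format is available.
x = numpy.arange(4, dtype=numpy.longdouble)
print(x.dtype, numpy.finfo(numpy.longdouble).precision)
```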
4 changes: 2 additions & 2 deletions Dockerfile
@@ -10,8 +10,8 @@ ADD docker/etc/cvmfs/config-osg.opensciencegrid.org.conf /etc/cvmfs/config-osg.o
RUN dnf -y install https://ecsft.cern.ch/dist/cvmfs/cvmfs-release/cvmfs-release-latest.noarch.rpm && dnf -y install cvmfs cvmfs-config-default && dnf clean all && dnf makecache && \
dnf -y groupinstall "Development Tools" \
"Scientific Support" && \
rpm -e --nodeps git perl-Git && dnf -y install @python39 rsync zlib-devel libpng-devel libjpeg-devel sqlite-devel openssl-devel fftw-libs-single fftw-devel fftw fftw-libs-long fftw-libs fftw-libs-double gsl gsl-devel hdf5 hdf5-devel python39-devel swig which osg-ca-certs && python3.9 -m pip install --upgrade pip setuptools wheel cython && python3.9 -m pip install mkl ipython jupyter jupyterhub jupyterlab lalsuite && \
dnf -y install https://repo.opensciencegrid.org/osg/3.5/el8/testing/x86_64/osg-wn-client-3.5-5.osg35.el8.noarch.rpm && dnf clean all
rpm -e --nodeps git perl-Git && dnf -y install @python39 rsync zlib-devel libpng-devel libjpeg-devel sqlite-devel openssl-devel fftw-libs-single fftw-devel fftw fftw-libs-long fftw-libs fftw-libs-double gsl gsl-devel hdf5 hdf5-devel python39-devel swig which osg-ca-certs && python3.9 -m pip install --upgrade pip setuptools wheel cython && python3.9 -m pip install mkl ipython jupyter jupyterhub jupyterlab lalsuite==7.24 && \
dnf -y install https://repo.opensciencegrid.org/osg/3.5/el8/testing/x86_64/osg-wn-client-3.5-5.osg35.el8.noarch.rpm && dnf -y install pelican-osdf-compat-7.10.11-1.x86_64 && dnf -y install pelican-7.10.11-1.x86_64 && dnf clean all

# set up environment
RUN cd / && \
12 changes: 6 additions & 6 deletions bin/all_sky_search/pycbc_add_statmap
@@ -310,14 +310,14 @@ if injection_style:
for bg_fname in args.background_files:
bg_f = h5py.File(bg_fname, 'r')
ifo_combo_key = bg_f.attrs['ifos'].replace(' ','')
_, far[ifo_combo_key] = significance.get_far(
_, far[ifo_combo_key], _ = significance.get_far(
bg_f['background/stat'][:],
f['foreground/stat'][:],
bg_f['background/decimation_factor'][:],
bg_f.attrs['background_time'],
**significance_dict[ifo_combo_key])

_, far_exc[ifo_combo_key] = \
_, far_exc[ifo_combo_key], _ = \
significance.get_far(
bg_f['background_exc/stat'][:],
f['foreground/stat'][:],
@@ -329,15 +329,15 @@ else:
# background included
for f_in in files:
ifo_combo_key = get_ifo_string(f_in).replace(' ','')
_, far[ifo_combo_key] = \
_, far[ifo_combo_key], _ = \
significance.get_far(
f_in['background/stat'][:],
f['foreground/stat'][:],
f_in['background/decimation_factor'][:],
f_in.attrs['background_time'],
**significance_dict[ifo_combo_key])

_, far_exc[ifo_combo_key] = \
_, far_exc[ifo_combo_key], _ = \
significance.get_far(
f_in['background_exc/stat'][:],
f['foreground/stat'][:],
@@ -608,7 +608,7 @@ while True:
fg_time_ct[key] -= args.cluster_window
bg_t_y = conv.sec_to_year(bg_time_ct[key])
fg_t_y = conv.sec_to_year(fg_time_ct[key])
bg_far, fg_far = significance.get_far(
bg_far, fg_far, _ = significance.get_far(
sep_bg_data[key].data['stat'],
sep_fg_data[key].data['stat'],
sep_bg_data[key].data['decimation_factor'],
@@ -632,7 +632,7 @@

logging.info("Recalculating combined IFARs")
for key in all_ifo_combos:
_, far[key] = significance.get_far(
_, far[key], _ = significance.get_far(
sep_bg_data[key].data['stat'],
combined_fg_data.data['stat'],
sep_bg_data[key].data['decimation_factor'],
2 changes: 1 addition & 1 deletion bin/all_sky_search/pycbc_coinc_findtrigs
@@ -411,7 +411,7 @@ def process_template(tnum):
# with any trigger in the fixed network
tidx = len(threshes)
for i in range(1, len(threshes)):
if pivot_stat[-1] >= pivot_lower[kidx]:
if pivot_stat[-1] >= pivot_lower[i]:
tidx = i
break

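The one-line `pycbc_coinc_findtrigs` change above is the "index issue (#5018)" fix: the threshold search was indexing `pivot_lower` with a stale variable (`kidx`) rather than the loop variable. A standalone sketch of the corrected search, with made-up arrays standing in for the real per-template data:

```python
import numpy

# Hypothetical stand-ins for the quantities used inside process_template().
threshes = numpy.array([0.0, 1.0, 2.0, 3.0])
pivot_lower = numpy.array([4.0, 5.0, 6.0, 7.0])
pivot_stat = numpy.array([3.5, 6.2])

# Find the first index whose pivot_lower value is exceeded by the last pivot
# statistic; the loop variable i is now used both for the comparison and for
# recording the match.
tidx = len(threshes)
for i in range(1, len(threshes)):
    if pivot_stat[-1] >= pivot_lower[i]:
        tidx = i
        break
print(tidx)  # -> 1 for these example values
```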
14 changes: 10 additions & 4 deletions bin/all_sky_search/pycbc_coinc_statmap
@@ -241,7 +241,7 @@ fore_stat = all_trigs.stat[fore_locs]

# Cumulative array of inclusive background triggers and the number of
# inclusive background triggers louder than each foreground trigger
bg_far, fg_far = significance.get_far(
bg_far, fg_far, sig_info = significance.get_far(
back_stat,
fore_stat,
all_trigs.decimation_factor[back_locs],
@@ -250,7 +250,7 @@ bg_far, fg_far = significance.get_far(

# Cumulative array of exclusive background triggers and the number
# of exclusive background triggers louder than each foreground trigger
bg_far_exc, fg_far_exc = significance.get_far(
bg_far_exc, fg_far_exc, exc_sig_info = significance.get_far(
exc_zero_trigs.stat,
fore_stat,
exc_zero_trigs.decimation_factor,
@@ -288,10 +288,14 @@ if fore_locs.sum() > 0:
fap = 1 - numpy.exp(- coinc_time / ifar)
f['foreground/ifar'] = conv.sec_to_year(ifar)
f['foreground/fap'] = fap
for key, value in sig_info.items():
f['foreground'].attrs[key] = value
ifar_exc = 1. / fg_far_exc
fap_exc = 1 - numpy.exp(- coinc_time_exc / ifar_exc)
f['foreground/ifar_exc'] = conv.sec_to_year(ifar_exc)
f['foreground/fap_exc'] = fap_exc
for key, value in exc_sig_info.items():
f['foreground'].attrs[key + '_exc'] = value
else:
f['foreground/ifar'] = numpy.array([])
f['foreground/fap'] = numpy.array([])
@@ -425,7 +429,7 @@ while numpy.any(ifar_foreground >= background_time):
logging.info("Calculating FAN from background statistic values")
back_stat = all_trigs.stat[back_locs]
fore_stat = all_trigs.stat[fore_locs]
bg_far, fg_far = significance.get_far(
bg_far, fg_far, sig_info = significance.get_far(
back_stat,
fore_stat,
all_trigs.decimation_factor[back_locs],
@@ -452,7 +456,7 @@
# Exclusive background doesn't change when removing foreground triggers.
# So we don't have to take background ifar, just repopulate ifar_foreground
else :
_, fg_far_exc = significance.get_far(
_, fg_far_exc, _ = significance.get_far(
exc_zero_trigs.stat,
fore_stat,
exc_zero_trigs.decimation_factor,
@@ -479,6 +483,8 @@
fap = 1 - numpy.exp(- coinc_time / ifar)
f['foreground_h%s/ifar' % h_iterations] = conv.sec_to_year(ifar)
f['foreground_h%s/fap' % h_iterations] = fap
for key, value in sig_info.items():
f['foreground_h%s' % h_iterations].attrs[key] = value

# Update ifar and fap for other foreground triggers
for i in range(len(ifar)):
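Because the new fit information lands in ordinary HDF5 attributes on the `foreground` group (and, with an `_exc` suffix, for the exclusive-background results), it can be inspected downstream without any PyCBC imports. A hypothetical reader, where `STATMAP_FILE.hdf` stands in for a statmap output produced with these changes and the exact attribute names depend on the configured significance method:

```python
import h5py

with h5py.File('STATMAP_FILE.hdf', 'r') as f:
    # Collect everything the statmap job recorded about its significance fit.
    sig_info = dict(f['foreground'].attrs)
    for key, value in sorted(sig_info.items()):
        print(key, value)
```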
4 changes: 3 additions & 1 deletion bin/all_sky_search/pycbc_coinc_statmap_inj
@@ -91,7 +91,7 @@ f.attrs['foreground_time'] = coinc_time

if len(zdata) > 0:

_, fg_far_exc = significance.get_far(
_, fg_far_exc, exc_sig_info = significance.get_far(
back_stat,
zdata.stat,
dec_fac,
@@ -108,6 +108,8 @@ if len(zdata) > 0:
fap_exc = 1 - numpy.exp(- coinc_time / ifar_exc)
f['foreground/ifar_exc'] = conv.sec_to_year(ifar_exc)
f['foreground/fap_exc'] = fap_exc
for key, value in exc_sig_info.items():
f['foreground'].attrs[key + '_exc'] = value

else:
f['foreground/ifar_exc'] = numpy.array([])
4 changes: 3 additions & 1 deletion bin/all_sky_search/pycbc_exclude_zerolag
@@ -94,7 +94,7 @@ for k in filtered_trigs.data:
f_out['background_exc/%s' % k] = filtered_trigs.data[k]

logging.info('Recalculating IFARs')
bg_far, fg_far = significance.get_far(
bg_far, fg_far, sig_info = significance.get_far(
filtered_trigs.data['stat'],
f_in['foreground/stat'][:],
filtered_trigs.data['decimation_factor'],
@@ -110,6 +110,8 @@ bg_ifar_exc = 1. / bg_far
logging.info('Writing updated ifars to file')
f_out['foreground/ifar_exc'][:] = conv.sec_to_year(fg_ifar_exc)
f_out['background_exc/ifar'][:] = conv.sec_to_year(bg_ifar_exc)
for key, value in sig_info.items():
f_out['foreground'].attrs[key + '_exc'] = value

fg_time_exc = conv.sec_to_year(f_in.attrs['foreground_time_exc'])
f_out['foreground/fap_exc'][:] = 1 - np.exp(-fg_time_exc / fg_ifar_exc)
25 changes: 20 additions & 5 deletions bin/all_sky_search/pycbc_sngls_statmap
@@ -109,6 +109,7 @@ assert ifo + '/time' in all_trigs.data
logging.info("We have %s triggers" % len(all_trigs.stat))
logging.info("Clustering triggers")
all_trigs = all_trigs.cluster(args.cluster_window)
logging.info("%s triggers remain" % len(all_trigs.stat))

fg_time = float(all_trigs.attrs['foreground_time'])

@@ -139,12 +140,13 @@ significance_dict = significance.digest_significance_options([ifo], args)

# Cumulative array of inclusive background triggers and the number of
# inclusive background triggers louder than each foreground trigger
bg_far, fg_far = significance.get_far(
bg_far, fg_far, sig_info = significance.get_far(
back_stat,
fore_stat,
bkg_dec_facs,
fg_time,
**significance_dict[ifo])
**significance_dict[ifo]
)

fg_far = significance.apply_far_limit(
fg_far,
@@ -192,7 +194,7 @@ back_exc_locs = back_exc_locs[to_keep]

# Cumulative array of exclusive background triggers and the number
# of exclusive background triggers louder than each foreground trigger
bg_far_exc, fg_far_exc = significance.get_far(
bg_far_exc, fg_far_exc, exc_sig_info = significance.get_far(
back_stat_exc,
fore_stat,
bkg_exc_dec_facs,
Expand Down Expand Up @@ -231,6 +233,10 @@ f['foreground/fap'] = fap
fap_exc = 1 - numpy.exp(- fg_time_exc / fg_ifar_exc)
f['foreground/ifar_exc'] = conv.sec_to_year(fg_ifar_exc)
f['foreground/fap_exc'] = fap_exc
for key, value in sig_info.items():
f['foreground'].attrs[key] = value
for key, value in exc_sig_info.items():
f['foreground'].attrs[f'{key}_exc'] = value

if 'name' in all_trigs.attrs:
f.attrs['name'] = all_trigs.attrs['name']
@@ -290,6 +296,10 @@ while numpy.any(ifar_louder > hier_ifar_thresh_s):
f['foreground_h%s/ifar' % h_iterations] = conv.sec_to_year(fg_ifar)
f['foreground_h%s/ifar_exc' % h_iterations] = conv.sec_to_year(fg_ifar_exc)
f['foreground_h%s/fap' % h_iterations] = fap
for key, value in sig_info.items():
f['foreground_h%s' % h_iterations].attrs[key] = value
for key, value in exc_sig_info.items():
f['foreground_h%s' % h_iterations].attrs[key + "_exc"] = value
for k in all_trigs.data:
f['foreground_h%s/' % h_iterations + k] = all_trigs.data[k]
# Add the iteration number of hierarchical removals done.
@@ -342,7 +352,7 @@ while numpy.any(ifar_louder > hier_ifar_thresh_s):
logging.info("Calculating FAN from background statistic values")
back_stat = fore_stat = all_trigs.stat

bg_far, fg_far = significance.get_far(
bg_far, fg_far, sig_info = significance.get_far(
back_stat,
fore_stat,
numpy.ones_like(back_stat),
@@ -368,11 +378,12 @@ while numpy.any(ifar_louder > hier_ifar_thresh_s):
# triggers are being removed via inclusive or exclusive background.
if is_bkg_inc:
ifar_louder = fg_ifar
exc_sig_info = {}

# Exclusive background doesn't change when removing foreground triggers.
# So we don't have to take bg_far_exc, just repopulate fg_ifar_exc
else:
_, fg_far_exc = significance.get_far(
_, fg_far_exc, exc_sig_info = significance.get_far(
back_stat_exc,
fore_stat,
numpy.ones_like(back_stat_exc),
@@ -400,6 +411,10 @@ while numpy.any(ifar_louder > hier_ifar_thresh_s):
# Write ranking statistic to file just for downstream plotting code
f['foreground_h%s/stat' % h_iterations] = fore_stat

for key, value in sig_info.items():
f['foreground_h%s' % h_iterations].attrs[key] = value
for key, value in exc_sig_info.items():
f['foreground_h%s' % h_iterations].attrs[key + "_exc"] = value
fap = 1 - numpy.exp(- fg_time / fg_ifar)
f['foreground_h%s/ifar' % h_iterations] = conv.sec_to_year(fg_ifar)
f['foreground_h%s/fap' % h_iterations] = fap
5 changes: 4 additions & 1 deletion bin/all_sky_search/pycbc_sngls_statmap_inj
@@ -110,7 +110,7 @@ significance_dict = significance.digest_significance_options([ifo], args)

# Cumulative array of exclusive background triggers and the number
# of exclusive background triggers louder than each foreground trigger
bg_far_exc, fg_far_exc = significance.get_far(
bg_far_exc, fg_far_exc, sig_info = significance.get_far(
back_stat_exc,
fore_stat,
bkg_exc_dec_facs,
@@ -137,6 +137,9 @@ fap_exc = 1 - numpy.exp(- fg_time_exc / fg_ifar_exc)
f['foreground/ifar_exc'] = conv.sec_to_year(fg_ifar_exc)
f['foreground/fap_exc'] = fap_exc

for key, value in sig_info.items():
f['foreground'].attrs[key + '_exc'] = value

if 'name' in all_trigs.attrs:
f.attrs['name'] = all_trigs.attrs['name']

4 changes: 0 additions & 4 deletions docs/conf.py
@@ -108,7 +108,6 @@
# a list of builtin themes.
import sphinx_rtd_theme
html_theme = 'sphinx_rtd_theme'
html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]

# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
@@ -117,9 +116,6 @@
'logo_only':True,
}

# Add any paths that contain custom themes here, relative to this directory.
#html_theme_path = []

html_context = {
'display_github': True,
'github_user': 'gwastro',
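The `docs/conf.py` hunk above drops the explicit `html_theme_path`: recent `sphinx_rtd_theme` releases register themselves with Sphinx and no longer provide `get_html_theme_path()`, which lines up with the "Fix [hopefully] documentation failure" note in the commit message. A minimal sketch of the surviving theme configuration, assuming a current Sphinx and sphinx_rtd_theme:

```python
# docs/conf.py (theme section only) -- a sketch, not the full PyCBC config.
import sphinx_rtd_theme  # noqa: F401  -- kept so a missing theme fails loudly

html_theme = 'sphinx_rtd_theme'

# Theme options are theme-specific; this mirrors the option kept in the diff.
html_theme_options = {
    'logo_only': True,
}
```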