Merged

33 commits
9abfa21
Created new notebooks for examples
eduardojsbarroso Dec 8, 2025
f61da3b
Improving examples
eduardojsbarroso Dec 8, 2025
f559fa2
Updated
eduardojsbarroso Jan 5, 2026
7fe0539
Started documenting code
eduardojsbarroso Jan 5, 2026
dae4e00
Removed generated files
eduardojsbarroso Jan 5, 2026
958634d
Merge branch 'main' into issue/58/documentation
caioolivv Feb 19, 2026
7e89100
Merge branch 'notebook_examples' into issue/58/documentation
eduardojsbarroso Mar 27, 2026
836f5ea
Merge branch 'main' into issue/58/documentation
eduardojsbarroso Mar 27, 2026
dc3fa0e
Notebooks Checked
eduardojsbarroso Mar 27, 2026
b69d8d6
Bump CI version
eduardojsbarroso Mar 27, 2026
75575da
Added documentation to grid recipe
eduardojsbarroso Mar 27, 2026
a9c5e52
Improved documentation
eduardojsbarroso Apr 1, 2026
01443c0
Added instructions to readme
eduardojsbarroso Apr 2, 2026
f16c5bf
Bump version number
eduardojsbarroso Apr 2, 2026
0c95b3a
Right version number
eduardojsbarroso Apr 3, 2026
f45f79a
Read the docs
eduardojsbarroso Apr 3, 2026
90ae6d4
Fix readthedocs
eduardojsbarroso Apr 3, 2026
0c168e5
Readthedocs update
eduardojsbarroso Apr 3, 2026
fb6d6b3
Remove duplicated config in readthedocs
eduardojsbarroso Apr 3, 2026
129b46f
Change
eduardojsbarroso Apr 3, 2026
56c1ba1
Minimal doc conda env
eduardojsbarroso Apr 3, 2026
fa7e745
Automatic documentation
eduardojsbarroso Apr 3, 2026
1d816f4
Changed requirements file
eduardojsbarroso Apr 3, 2026
d78e1ec
Read the docs update
eduardojsbarroso Apr 3, 2026
5be4a04
Remove package install for documentation
eduardojsbarroso Apr 3, 2026
3394eff
Add numpy as req
eduardojsbarroso Apr 3, 2026
bc54579
Fix doc
eduardojsbarroso Apr 3, 2026
4bbdfe7
FIx conf
eduardojsbarroso Apr 3, 2026
c0f4993
COntinuing
eduardojsbarroso Apr 3, 2026
49a8399
Trying to not load API folder
eduardojsbarroso Apr 3, 2026
d418b17
Ignore crow/api
eduardojsbarroso Apr 3, 2026
77a6694
Added codecov threshold of 1% to avoid bugs
eduardojsbarroso Apr 3, 2026
1a440a5
FIx typo
eduardojsbarroso Apr 3, 2026
14 changes: 14 additions & 0 deletions .github/workflows/build_check.yml
Original file line number Diff line number Diff line change
@@ -215,3 +215,17 @@ jobs:
use_oidc: false
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
  doc-valid:
    name: Documentation build
    runs-on: ubuntu-latest
    needs: quick-validation
    steps:
      - uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install doc dependencies
        run: pip install -r docs/requirements.txt numpy scipy
      - name: Build the docs
        run: make -C docs/ html
10 changes: 9 additions & 1 deletion .gitignore
@@ -12,4 +12,12 @@ lsstdesc_crow.egg-info/
.vscode/

# hypothesis
.hypothesis/
.hypothesis/


# Documentation
docs/api/
docs/api.rst
docs/index.rst
docs/compiled-examples/
docs/_build/
84 changes: 84 additions & 0 deletions README.md
@@ -1,2 +1,86 @@
# CROW
**C**luster **R**econstruction of **O**bservables **W**orkbench: **CROW**

The LSST-DESC Cluster Reconstruction of Observables Workbench (CROW) is a DESC tool: a Python library for predicting galaxy cluster observables.

## Table of contents
1. [Installing CROW](#installing)
2. [Using CROW](#using)
3. [Contributing to CROW](#contributing)
4. [Contact](#contact)

# Installing Crow <a name="installing"></a>

Crow can be installed with `pip` or `conda`.

For a `pip` installation, run:

```bash
pip install lsstdesc-crow
```

For a `conda` installation, run:

```bash
conda install -c conda-forge lsstdesc-crow
```
After that, to use it in your code, just do

```python
import crow
```

## Requirements <a name="requirements"></a>

Crow requires Python version 3.11 or later.

### Dependencies <a name="dependencies"></a>

Crow has the following dependencies:

- [NumPy](https://www.numpy.org/) (v2 or later)
- [SciPy](https://scipy.org/) (v1.12 or later)
- [pyccl](https://ccl.readthedocs.io/en/latest/index.html)
- [CLMM](https://lsstdesc.org/CLMM/)
- [NumCosmo](https://numcosmo.readthedocs.io/en/latest/)

# Using Crow <a name="using"></a>

This code has been released by DESC, although it is still under active
development. You are welcome to re-use the code, which is open source and available under
terms consistent with our
[LICENSE](https://github.com/LSSTDESC/crow/blob/main/LICENSE) ([BSD
3-Clause](https://opensource.org/licenses/BSD-3-Clause)).

Example usage can be found in the `notebooks` folder.

**DESC Projects**: External contributors and DESC members wishing to
use Crow for DESC projects should consult with the DESC Clusters analysis
working group (CL WG) conveners, ideally before the work has started, but
definitely before any publication or posting of the work to the arXiv.

**Non-DESC Projects by DESC members**: If you are in the DESC
community, but planning to use Crow in a non-DESC project, it would be
good practice to contact the CL WG co-conveners and/or the Crow
Team leads as well (see Contact section). A desired outcome would be for your
non-DESC project concept and progress to be presented to the working group,
so working group members can help co-identify tools and/or ongoing development
that might mutually benefit your non-DESC project and ongoing DESC projects.

**External Projects by Non-DESC members**: If you are not from the DESC
community, you are also welcome to contact Crow Team leads to introduce
your project and share feedback.


# Contributing to Crow <a name="contributing"></a>

You are welcome to contribute to the code. To do so, please run `isort` and
`black` on your code and make sure you provide unit tests.

# Contact <a name="contact"></a>

If you have comments, questions, or feedback, please contact the current leads
of the LSST DESC Crow Team: Michel Aguena
(m-aguena, aguena@inaf.it) and Eduardo Barroso (eduardojsbarroso,
barroso@lapp.in2p3.fr).

12 changes: 12 additions & 0 deletions codecov.yml
@@ -0,0 +1,12 @@
coverage:
  status:
    project:
      default:
        threshold: 1% # allow up to 1% drop without failing
    patch:
      default:
        target: auto # compare patch coverage to base, not project total
        threshold: 1%

ignore:
  - "docs/**"
2 changes: 1 addition & 1 deletion crow/__init__.py
@@ -8,4 +8,4 @@
from .recipes.binned_grid import GridBinnedClusterRecipe
from .recipes.binned_parent import BinnedClusterRecipe

__version__ = "1.0.3"
__version__ = "1.0.4"
30 changes: 30 additions & 0 deletions crow/cluster_modules/__init__.py
@@ -0,0 +1,30 @@
"""Cluster modules package.

Keep this module lightweight: do not import the submodules or classes eagerly
here because several submodules import the top-level ``crow`` package and that
can produce circular imports when the top-level ``crow.__init__`` imports
things from ``cluster_modules``.

To import classes or modules use the explicit submodule path, for example::

from crow.cluster_modules.abundance import ClusterAbundance
from crow.cluster_modules.shear_profile import ClusterShearProfile

Sphinx/apidoc will still find and document the submodules even if they are not
imported here; this file only needs to exist so Python treats the directory as
a package.
"""

# Expose the expected subpackage/module names for convenience. Do NOT import
# the modules at package import time to avoid circular import problems.
__all__ = [
"abundance",
"completeness_models",
"kernel",
"parameters",
"purity_models",
"_clmm_patches",
"shear_profile",
"shear_profile_parallel",
"mass_proxy",
]
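The lazy layout this `__init__.py` adopts — listing names without importing them — can also be achieved with PEP 562's module-level `__getattr__`, which defers each submodule import to first attribute access. A minimal sketch under hypothetical names (`demo_pkg` and its `abundance` submodule with a `VALUE` constant are invented for illustration; this is an alternative pattern, not what this PR does):

```python
import importlib
import sys
import tempfile
from pathlib import Path

# PEP 562: a module-level __getattr__ in the package __init__ imports a
# submodule only when its attribute is first touched, so the package
# import itself stays cheap and circular-import-safe.
pkg_init = '''
import importlib

__all__ = ["abundance"]

def __getattr__(name):
    if name in __all__:
        return importlib.import_module("." + name, __name__)
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")
'''

# Build a throwaway package on disk to demonstrate.
tmp = Path(tempfile.mkdtemp())
pkg = tmp / "demo_pkg"
pkg.mkdir()
(pkg / "__init__.py").write_text(pkg_init)
(pkg / "abundance.py").write_text("VALUE = 42\n")

sys.path.insert(0, str(tmp))
demo = importlib.import_module("demo_pkg")
# The submodule has not been loaded yet...
lazy = "demo_pkg.abundance" not in sys.modules
value = demo.abundance.VALUE  # ...until this attribute access triggers it
```

Whether the plain `__all__` listing (as in the PR) or a lazy `__getattr__` is preferable mostly depends on whether tools like Sphinx apidoc need to discover the submodules without executing them.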
58 changes: 54 additions & 4 deletions crow/cluster_modules/abundance.py
@@ -21,6 +21,18 @@ class ClusterAbundance:
an area on the sky, a halo mass function, as well as multiple kernels, where
each kernel represents a different distribution involved in the final cluster
abundance integrand.

Attributes
----------
cosmo : pyccl.cosmology.Cosmology or None
Cosmology object used for predictions. Set via the `cosmo` property.
halo_mass_function : callable
Halo mass function object compatible with PyCCL's MassFunc interface.
parameters : Parameters
Container for optional model parameters.
_hmf_cache : dict
Cache for previously computed halo mass function evaluations keyed by
(log_mass, scale_factor).
"""

@property
@@ -47,9 +59,28 @@ def __init__(
def comoving_volume(
self, z: npt.NDArray[np.float64], sky_area: float = 0
) -> npt.NDArray[np.float64]:
"""The differential comoving volume given area sky_area at redshift z.

:param sky_area: The area of the survey on the sky in square degrees.
"""Differential comoving volume for a given sky area.

Parameters
----------
z : array_like
Redshift or array of redshifts at which to compute the differential
comoving volume (dV/dz per steradian).
sky_area : float, optional
Survey area in square degrees. Default 0 returns per-steradian volume.

Returns
-------
numpy.ndarray
Differential comoving volume (same shape as `z`) multiplied by the
survey area (converted to steradians). Units: [Mpc/h]^3 (consistent
with the internal PyCCL conventions used here).

Notes
-----
This uses PyCCL background helpers to evaluate the angular diameter
distance and h(z) factor. If `sky_area` is zero the returned array is
the per-steradian dV/dz.
"""
assert self.cosmo is not None
scale_factor = 1.0 / (1.0 + z)
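The square-degree-to-steradian conversion the docstring mentions can be sketched in isolation (the function name `volume_for_area` is hypothetical; only the conversion factor is implied by the source):

```python
import numpy as np

# (pi / 180)^2 steradians per square degree
DEG2_TO_SR = (np.pi / 180.0) ** 2

def volume_for_area(dv_dz_per_sr, sky_area_deg2):
    """Scale a per-steradian dV/dz by a survey area given in deg^2."""
    return np.asarray(dv_dz_per_sr) * sky_area_deg2 * DEG2_TO_SR

# Sanity check: the full sky (4*pi sr) corresponds to about 41253 deg^2,
# so scaling a per-steradian volume by the full-sky area recovers 4*pi.
full_sky_deg2 = 4.0 * np.pi / DEG2_TO_SR
```

With `sky_area = 0` the scaling would zero the result, which is consistent with the docstring's convention that the default returns the per-steradian dV/dz before any area scaling is applied.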
@@ -73,7 +104,26 @@ def mass_function(
log_mass: npt.NDArray[np.float64],
z: npt.NDArray[np.float64],
) -> npt.NDArray[np.float64]:
"""The mass function at z and mass."""
"""Evaluate the halo mass function at given log-mass and redshift.

Parameters
----------
log_mass : array_like
Array of log10 halo masses (M_sun).
z : array_like
Array of redshifts matching `log_mass`.

Returns
-------
numpy.ndarray
Array with mass function values (dn/dlnM or as provided by the
configured `halo_mass_function`) evaluated at each (mass, z).

Notes
-----
Results are cached in `_hmf_cache` keyed by (log_mass, scale_factor)
to avoid repeated expensive evaluations for identical inputs.
"""
scale_factor = 1.0 / (1.0 + z)
return_vals = []

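The `_hmf_cache` behaviour the docstring describes — memoizing evaluations keyed by `(log_mass, scale_factor)` — can be sketched generically (the `CachedEval` class and the toy integrand are hypothetical stand-ins, not CROW code):

```python
import numpy as np

class CachedEval:
    """Memoize an expensive two-argument evaluation, keyed like _hmf_cache."""

    def __init__(self, func):
        self.func = func
        self.cache = {}  # (log_mass, scale_factor) -> value
        self.hits = 0

    def __call__(self, log_mass, scale_factor):
        key = (float(log_mass), float(scale_factor))
        if key in self.cache:
            self.hits += 1          # repeated input: no recomputation
        else:
            self.cache[key] = self.func(*key)
        return self.cache[key]

# Toy stand-in for a halo mass function evaluation.
hmf = CachedEval(lambda log_m, a: np.exp(-0.5 * log_m) / a)
first = hmf(14.0, 0.5)
second = hmf(14.0, 0.5)  # served from the cache
```

Keying on exact float pairs only pays off when identical grids are re-evaluated, which is precisely the binned-recipe use case this PR documents.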