There are currently different implementations for constructing the density contrast field delta ~ data / randoms - 1 (smoothing of the data and randoms fields before or after computing delta; threshold on the number of randoms per cell), which follow the prescriptions used in the original Bautista and White codes, e.g.:

```python
def set_density_contrast(self, ran_min=0.01, smoothing_radius=15.):
```

vs.

```python
def set_density_contrast(self, ran_min=0.75, smoothing_radius=15., **kwargs):
```
It would be good to converge on a common prescription. Gaussian mocks with a selection function in them (e.g. https://github.com/cosmodesi/pyrecon/blob/main/nb/e2e_examples.ipynb) may assist the choice.
Within the current implementation, if the randoms density is too small, many delta points are set to 0, hence a highly inefficient reconstruction.
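To make the two knobs concrete, here is a minimal numpy sketch of the "smooth both fields before computing delta" prescription with a threshold on the randoms per cell. This is not pyrecon's actual implementation: the function names, the `cellsize` parameter, and the interpretation of `ran_min` as a fraction of the mean randoms density are assumptions for illustration only.

```python
import numpy as np

def gaussian_smooth(mesh, smoothing_radius, cellsize):
    # Smooth a 3D mesh with an isotropic Gaussian kernel applied in Fourier space.
    k = [2. * np.pi * np.fft.fftfreq(n, d=cellsize) for n in mesh.shape]
    k2 = sum(ki**2 for ki in np.meshgrid(*k, indexing='ij'))
    kernel = np.exp(-0.5 * k2 * smoothing_radius**2)  # unity at k = 0: mean preserved
    return np.fft.ifftn(np.fft.fftn(mesh) * kernel).real

def density_contrast(data, randoms, ran_min=0.01, smoothing_radius=15., cellsize=5.):
    # "Smooth before" prescription: smooth data and randoms meshes first,
    # then form delta = data / (alpha * randoms) - 1 only where there are
    # enough randoms; cells below the threshold are left at delta = 0.
    d = gaussian_smooth(data, smoothing_radius, cellsize)
    r = gaussian_smooth(randoms, smoothing_radius, cellsize)
    alpha = d.sum() / r.sum()            # normalize data to randoms
    mask = r > ran_min * r.mean()        # hypothetical threshold convention
    delta = np.zeros_like(d)
    delta[mask] = d[mask] / (alpha * r[mask]) - 1.
    return delta
```

The sketch also makes the inefficiency visible: with a large `ran_min` relative to the randoms density, most cells fail the mask and stay at delta = 0, contributing nothing to the reconstruction.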
The relevant code locations:

- `pyrecon/pyrecon/iterative_fft_particle.py`, line 50 (commit 9549890)
- `pyrecon/pyrecon/recon.py`, line 192 (commit 9549890)