Optimize tide-predictor for predicting multiple stations at a time #222

@joeberkovitz

Description

It is often valuable to efficiently predict tides and currents for many stations at a time, say, in the 100 to 1000 station range.

For example, a useful class of visual interfaces for understanding tides and currents displays predictions for many stations at once, within some region of interest, at a single time. In this approach, plots against time are not used; instead, predictions for a time of interest are represented graphically over a base map. The prediction time can be changed smoothly by the user, yielding corresponding smooth changes in the graphics. One example of this is the website floatingtrails.com.

The present tide prediction approach in neaps optimizes the cost of node corrections for a single station over a period of time. That's great, but when dealing with a large number of stations, it appears to recompute the node corrections for each station, even though those corrections are identical for all stations at any single prediction time.

It's not clear yet how much of a problem this is, but since the correction computations are already optimized for a single station over a time series (node corrections are cached per station within a 24-hour interval), I figure the redundancy is likely to matter in the many-stations use case too:

```typescript
const prepareParams = (correctionTime: Date): ConstituentParam[] => {
  const correctionAstro = astro(correctionTime);
  const params: ConstituentParam[] = [];
  for (const constituent of constituents) {
    if (constituent.amplitude === 0) continue;
    const model = constituentModels[constituent.name];
    if (!model) continue;
    const V0 = d2r * model.value(baseAstro);
    const speed = d2r * model.speed;
    const correction = model.correction(correctionAstro, fundamentals);
    params.push({
      A: constituent.amplitude * correction.f,
      w: speed,
      phi: V0 + d2r * correction.u - constituent.phase,
    });
  }
  return params;
};

/**
 * Create a function that returns constituent params with node corrections
 * recomputed at CORRECTION_INTERVAL_HOURS. Returns a new array reference
 * when corrections are recomputed, so callers can detect changes via `!==`.
 */
const correctedParams = (): ((hour: number) => ConstituentParam[]) => {
  const firstChunkEnd = Math.min(CORRECTION_INTERVAL_HOURS, endHour);
  let params = prepareParams(new Date(startMs + (firstChunkEnd / 2) * 3600000));
  let nextCorrectionAt = CORRECTION_INTERVAL_HOURS;
  return (hour: number): ConstituentParam[] => {
    if (hour >= nextCorrectionAt) {
      const chunkEnd = Math.min(nextCorrectionAt + CORRECTION_INTERVAL_HOURS, endHour);
      params = prepareParams(new Date(startMs + ((nextCorrectionAt + chunkEnd) / 2) * 3600000));
      nextCorrectionAt += CORRECTION_INTERVAL_HOURS;
    }
    return params;
  };
};
```
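To make the redundancy concrete, here's a rough back-of-the-envelope sketch (a hypothetical counter, not neaps code). If each `prepareParams`-style call is cached per station per 24-hour chunk, the total number of correction computations scales with the station count, even though the corrections themselves depend only on time:

```typescript
// Hypothetical cost model: with per-station caching at `intervalHours`,
// predicting `hours` of data for `stations` stations performs
// stations * ceil(hours / intervalHours) correction computations,
// where 1 * ceil(hours / intervalHours) would suffice if shared.
const correctionCalls = (stations: number, hours: number, intervalHours = 24): number =>
  stations * Math.max(1, Math.ceil(hours / intervalHours));

// e.g. 1000 stations rendered at a single time of interest:
correctionCalls(1000, 1); // 1000 computations where 1 would do
```

So for the map-style interface described above (many stations, one time), essentially all of the correction work beyond the first station is redundant.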

One fix (I'm sure there are others; this is just an example) would be to allow the prediction machinery to accept a set of precomputed corrections, passed in via one of the various options interfaces. The caller could then use some other function to obtain those corrections, which would not yet be combined with any station's base amplitudes or phases for the constituents.
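As a minimal sketch of what that could look like (all names here — `NodeCorrection`, `computeCorrections`, `predictAt`, `Station` — are illustrative, not the actual neaps API): corrections are computed once per prediction time, keyed by constituent name, and every station's prediction reuses the same map.

```typescript
// Hypothetical types: a nodal correction per constituent, shared across stations.
interface NodeCorrection {
  f: number; // node factor (amplitude scaling)
  u: number; // nodal phase correction, radians
}
type CorrectionMap = Map<string, NodeCorrection>;

interface Constituent {
  name: string;
  amplitude: number;
  phase: number; // radians
  speed: number; // radians per hour
}
interface Station {
  constituents: Constituent[];
}

// Stand-in for the real astronomical computation; in practice this would
// delegate to the existing constituent models. It depends only on time.
const computeCorrections = (time: Date, names: string[]): CorrectionMap => {
  const map: CorrectionMap = new Map();
  for (const name of names) map.set(name, { f: 1, u: 0 }); // stub values
  return map;
};

// Height at `hour` for one station, combining its base amplitudes/phases
// with corrections that were computed elsewhere and passed in.
const predictAt = (station: Station, hour: number, corrections: CorrectionMap): number => {
  let height = 0;
  for (const c of station.constituents) {
    const corr = corrections.get(c.name);
    if (!corr) continue;
    height += c.amplitude * corr.f * Math.cos(c.speed * hour + corr.u - c.phase);
  }
  return height;
};

// Usage: one correction computation, shared by every station at this time.
const stations: Station[] = [
  { constituents: [{ name: 'M2', amplitude: 1.2, phase: 0, speed: 0.5059 }] },
  { constituents: [{ name: 'M2', amplitude: 0.8, phase: 0.3, speed: 0.5059 }] },
];
const corrections = computeCorrections(new Date(), ['M2']);
const heights = stations.map((s) => predictAt(s, 0, corrections));
```

The key property is that `computeCorrections` takes no station-specific input, so moving a time slider over 1000 stations costs one correction pass plus 1000 cheap harmonic sums, instead of 1000 correction passes.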
