Merged

122 commits
c65ce52
dual sampler, queue, and batch handler with obs. modifying Sup3rDatas…
bnb32 Dec 20, 2024
16ca65c
training with obs test
bnb32 Dec 20, 2024
8630629
split up interface and abstact model
bnb32 Dec 21, 2024
9e35ce0
made dual batch queue flexible enough to account for additional obs m…
bnb32 Dec 22, 2024
7c52bf5
tensorboard mixin moved to model utilities. dual queue completely abs…
bnb32 Dec 22, 2024
709792f
integrated dual sampler with obs into base dual sampler.
bnb32 Dec 23, 2024
019d022
examples added to DataHandler doc string. Some instructions on sup3rw…
bnb32 Dec 23, 2024
7ff2fba
removed namedtuple from Sup3rDataset to make Sup3rDataset picklable.
bnb32 Dec 26, 2024
5a483c4
parallel batch queue test added.
bnb32 Dec 27, 2024
53bdf46
namedtuple -> DsetTuple missing attr fix
bnb32 Dec 27, 2024
f25beb0
gust added to era download variables. len dunder added to ``Container…
bnb32 Dec 27, 2024
5e10120
computing before reshaping is 2x faster.
bnb32 Dec 28, 2024
f01f136
obs_index fix - sampler needs to use hr_out_features for the obs member.
bnb32 Dec 28, 2024
01d9e60
split up ``calc_loss`` and ``calc_loss_obs``
bnb32 Dec 29, 2024
507b158
Optional run_qa flag in ``DualRasterizer``. Queue shape fix for queue…
bnb32 Dec 29, 2024
af047da
``run_qa=True`` default for ``DualRasterizer``
bnb32 Dec 29, 2024
1388cd7
better tracking of batch counting. (this can be tricky for parallel q…
bnb32 Dec 29, 2024
950ca9e
missed compute call for slow batching. this was hidden by queueing an…
bnb32 Dec 29, 2024
0b489b6
Included convert to tensor in ``sample_batch``. Test for training wit…
bnb32 Dec 30, 2024
d7ca6cd
cc batch handler test fix
bnb32 Dec 31, 2024
07251bb
added test for new disc with "valid" padding
bnb32 Dec 31, 2024
0b1a43d
parallel sampling batch sampling test.
bnb32 Jan 1, 2025
81bad5c
removed workers tests. max_workers > 1 still not consistently faster.…
bnb32 Jan 2, 2025
ed2cc21
``Sup3rGanWithObs`` model subclass. Other misc model refactoring.
bnb32 Jan 3, 2025
1ee7e31
moved ``_run`` method to bias correction interface ``AbstractBiasCorr…
bnb32 Jan 5, 2025
e353ebc
moved ``_run`` method to bias correction interface ``AbstractBiasCorr…
bnb32 Jan 5, 2025
707a624
fix: tensorboard issue with loss obs details
bnb32 Jan 8, 2025
2816b8a
Adding obs loss to logging of loss gen
bnb32 Jan 19, 2025
9a30089
Adding ``loss_obs`` to ``loss_gen`` so the total loss shows in log ou…
bnb32 Jan 21, 2025
1a66955
generalized min pad width for padding slices so that this can accomod…
bnb32 Jan 11, 2025
a784290
min padding depends on the ``.paddings`` attribute of the ``FlexibleP…
bnb32 Jan 15, 2025
827640a
`max_paddings` method in `interface` instead of in `strategy.py`.
bnb32 Jan 21, 2025
a9a84c9
conditioning gan on observations with obs included through ``Sup3Fixe…
bnb32 Jan 22, 2025
ff4d764
simplified conditioning on obs with masked concat layer, so we don't …
bnb32 Jan 23, 2025
7e43ae9
Cleaned up ``Sup3rGanFixedObs`` to rely more on parent methods. Remov…
bnb32 Jan 25, 2025
81b3a76
removed additional gen loss term from obs loss. Need loss to be unifo…
bnb32 Jan 28, 2025
543a751
spatial frac uniform sampling
bnb32 Jan 29, 2025
e1dd305
new custom layers, which require special treatment based on naming, i…
bnb32 Feb 4, 2025
2f87f50
fix: epoch always started with disc_loss = 0, resulting in disc never…
bnb32 Feb 5, 2025
335d0b7
load check for disc training fix
bnb32 Feb 5, 2025
d838135
added option to add extra content loss term for obs sites, with speci…
bnb32 Feb 6, 2025
8d519e1
added gen loss to previous epoch details, as part of running means
bnb32 Feb 6, 2025
8639ff9
Using running dataframe records of training and validation batch loss…
bnb32 Feb 6, 2025
ad0ef04
use index to append loss details record
bnb32 Feb 7, 2025
5e055eb
changed trained_frac naming - doesn't make sense to prefix these with…
bnb32 Feb 7, 2025
ecbde08
added `loss_mean_window` arg, material derivative loss with extremes,…
bnb32 Feb 12, 2025
8fa47f0
material derivative loss test fix
bnb32 Feb 12, 2025
13a8fe0
typo
bnb32 Feb 12, 2025
e145aaa
removing bespoke multi term loss functions which can now be created b…
bnb32 Feb 13, 2025
bed8609
Removing old loss functions in tests
bnb32 Feb 13, 2025
ab2f4e4
adding obs loss to gen content in loss details
bnb32 Feb 13, 2025
ec54317
gen_loss_content functions now return total loss value and dictionary…
bnb32 Feb 17, 2025
5a83f55
Starting fwp integration of conditional obs models
bnb32 Feb 18, 2025
c9606de
test fixes and missed function rename
bnb32 Feb 18, 2025
7aa6339
fix: multi-term loss for dc models and multiterm loss test
bnb32 Feb 19, 2025
afda2b9
``min_width`` as optional kwarg instead of determined from generator …
bnb32 Feb 19, 2025
5758a97
renaming `_get_batch_loss_details`. moved weight into to start of `tr…
bnb32 Feb 19, 2025
a31163d
don't need this `max_paddings` method since `min_width` is a user arg…
bnb32 Feb 19, 2025
78eaacf
`if not self.generator_weights` condition added to obs model `init_we…
bnb32 Feb 22, 2025
3c703bf
Removing `Sup3rGanWithObs` model. Keeping just `Sup3rGanFixedObs` mod…
bnb32 Mar 3, 2025
e0968ea
Removing tf.function decorator from some top level functions to preve…
bnb32 Mar 3, 2025
b806691
Removing unused experimental layers: `Sup3rImpute`, `Sup3rConcatObsBl…
bnb32 Mar 4, 2025
7dc4e40
Added relativistic discriminator loss used in ESRGAN paper.
bnb32 Mar 6, 2025
020278d
Concatenating less / more realistic terms to get "mean" - disc gettin…
bnb32 Mar 6, 2025
57a5c4f
Changed `Sup3rGanFixedObs` to Sup3rGanWithObs`
bnb32 Mar 18, 2025
e99d7d8
Initial commit with `ObsRasterizer` for integrating sparse observatio…
bnb32 Mar 19, 2025
8130f52
moved `obs_features` to interface to use base `init_weights` method. …
bnb32 Mar 26, 2025
bff8b71
Clipped negative lower limit for observation mask sampling. Era downl…
bnb32 Apr 1, 2025
dbae7ff
xarray future warning - changed ds.dims to ds.sizes
bnb32 Apr 1, 2025
f01d8b4
example notebook for running sup3rwind models. added skin temp to era…
bnb32 Apr 9, 2025
2c534c6
lin interp threshold - was previously getting excessively large wind …
bnb32 Apr 11, 2025
901baa0
Combined ExoRasterizerNC / H5. Added functionality for 3d regridding …
bnb32 Apr 11, 2025
29231ca
Added some preproc to xr_open_mfdataset for handling problematic era …
bnb32 Apr 11, 2025
5eec440
Added check for correct number of model outputs
bnb32 Apr 11, 2025
7b68f16
Adding time index to coords of 3d exo data
bnb32 Apr 11, 2025
c5c6078
simpler running sup3r models notebook
bnb32 Apr 12, 2025
925f448
netcdf version cap comment in pyproject.toml
bnb32 Apr 12, 2025
fd0ac66
dont need to copy inputs in np.meshgrid. set copy=False in `get_lat_lon`
bnb32 Apr 12, 2025
6310777
reverting netcdf version cap
bnb32 Apr 12, 2025
2f14707
numpy < 2.0 - need this to use modules compiled with np 1.x
bnb32 Apr 12, 2025
6126d3e
Update pyproject.toml
bnb32 Apr 12, 2025
8050d1d
time dependent lat / lon features - need the time dimension for dual …
bnb32 Apr 16, 2025
9bd96bf
updated sup3rwind example readme with example notebook.
bnb32 Apr 16, 2025
889f01c
Added `compute_disc` to solar model `calc_loss`. Split up the sub day…
bnb32 Apr 17, 2025
0d4f681
fix: wasn't updating batch loss details correctly when both disc and …
bnb32 Apr 17, 2025
29d15a9
NaN checks for fwp input and output and some QOL adds.
bnb32 Apr 19, 2025
ac96bd1
add some log info on output values outside of good range.
bnb32 Apr 20, 2025
07b86ea
feat: add nn_fill option to handle data outside limits in forward pas…
bnb32 Apr 21, 2025
543f464
refactor: update nn_fill_array import path and remove unused function…
bnb32 Apr 21, 2025
d676594
feat: enabled forward passes with obs models to run without obs data;…
bnb32 Apr 22, 2025
489b2f7
refactor: remove weighted observation layers and update related confi…
bnb32 Apr 23, 2025
f628a91
feat: enhance output handling by ensuring float32 data type for lat/l…
bnb32 Apr 24, 2025
33e4305
fix: remove duplicate compute_disc parameter in SolarCC class constru…
bnb32 Apr 24, 2025
c37619f
Update pyproject.toml
bnb32 Apr 24, 2025
2f9ec3f
fix: only compute obs loss calc when training the generator; typo in …
bnb32 Apr 25, 2025
fbcb039
more robust height interpolation which uses lower and higher levels i…
bnb32 Apr 30, 2025
1f57348
Interp method to use closest levels if there are no levels lower or h…
bnb32 May 1, 2025
1de8cb3
remove redundant calcs in Sup3rGanDC val loss
bnb32 May 1, 2025
78c1987
feat: previously using time_slice=slice(None) when initializing the i…
bnb32 Apr 30, 2025
e2b452f
Adding cli entry points to pyproject.toml
bnb32 May 5, 2025
5383593
`_write_single` as classmethod in cacher
bnb32 May 5, 2025
93735f3
using _write_single instead of write_netcdf - absorbs handling of tmp…
bnb32 May 6, 2025
5046c7b
using _write_single instead of write_netcdf - absorbs handling of tmp…
bnb32 May 6, 2025
7082c87
ensuring correct ordering of input features to generator
bnb32 May 7, 2025
d2e7692
less verbose logging
bnb32 May 7, 2025
87254ef
using _write_single instead of write_netcdf - absorbs handling of tmp…
bnb32 May 6, 2025
8741fb9
`_write_single` feature parsing fix
bnb32 May 7, 2025
17d02ee
fix: already have file paths in input_handler_kwargs
bnb32 May 7, 2025
d7276a2
feat: sample obs frac for each batch element to build obs mask
bnb32 May 18, 2025
264fc47
removed embedded obs layer and fix tf.gather calls.
bnb32 May 18, 2025
b0bda95
test fix - obs mask to numpy from tensor.
bnb32 May 18, 2025
db1be8e
feat: enhance loss function handling in Sup3rGan and Sup3rGanWithObs …
bnb32 May 25, 2025
7de5183
bug fix with temporal_pad = 0
bnb32 May 28, 2025
b06b92c
updated docs for obs model use info. corrected some old doc strings. …
bnb32 May 29, 2025
4482ea4
test fix -> renamed get_obs_mask to _get_full_obs_mask to make privat…
bnb32 May 30, 2025
6ed7795
asserts instead of checks in ExoData init.
bnb32 May 30, 2025
4134834
bump required phygnn version
bnb32 May 30, 2025
20c4eb3
enabling loss_obs functions which require 2d or 3d tensors
bnb32 May 30, 2025
86ea30b
changed notebook to use all keys required for ExoData
bnb32 May 30, 2025
a5002b0
run_exo_layer methods for handling sup3r obs model layers
bnb32 May 30, 2025
f79983e
fix: check for means is None before norm in run exo layer
bnb32 May 31, 2025
c76fea1
refactor: use tf methods for obs loss calc and gpu splitting
bnb32 Jun 2, 2025
2 changes: 1 addition & 1 deletion README.rst
@@ -78,4 +78,4 @@ Brandon Benton, Grant Buster, Guilherme Pimenta Castelao, Malik Hassanaly, Pavlo
Acknowledgments
===============

This work was authored by the National Renewable Energy Laboratory, operated by Alliance for Sustainable Energy, LLC, for the U.S. Department of Energy (DOE) under Contract No. DE-AC36-08GO28308. This research was supported by the Grid Modernization Initiative of the U.S. Department of Energy (DOE) as part of its Grid Modernization Laboratory Consortium, a strategic partnership between DOE and the national laboratories to bring together leading experts, technologies, and resources to collaborate on the goal of modernizing the nation’s grid. Funding provided by the DOE Office of Energy Efficiency and Renewable Energy (EERE), the DOE Office of Electricity (OE), DOE Grid Deployment Office (GDO), the DOE Office of Fossil Energy and Carbon Management (FECM), and the DOE Office of Cybersecurity, Energy Security, and Emergency Response (CESER), the DOE Advanced Scientific Computing Research (ASCR) program, the DOE Solar Energy Technologies Office (SETO), the DOE Wind Energy Technologies Office (WETO), the United States Agency for International Development (USAID), and the Laboratory Directed Research and Development (LDRD) program at the National Renewable Energy Laboratory. The research was performed using computational resources sponsored by the Department of Energy's Office of Energy Efficiency and Renewable Energy and located at the National Renewable Energy Laboratory. The views expressed in the article do not necessarily represent the views of the DOE or the U.S. Government. The U.S. Government retains and the publisher, by accepting the article for publication, acknowledges that the U.S. Government retains a nonexclusive, paid-up, irrevocable, worldwide license to publish or reproduce the published form of this work, or allow others to do so, for U.S. Government purposes.
41 changes: 38 additions & 3 deletions examples/sup3rwind/README.rst
@@ -2,7 +2,7 @@
Sup3rWind Examples
###################

Super-Resolution for Renewable Energy Resource Data with Wind from Reanalysis Data (Sup3rWind) is one application of the sup3r software. In this work, we train generative models to create high-resolution (2km 5-minute) wind data based on coarse (30km hourly) ERA5 data. The generative models, high-resolution output data, and training data are publicly available via the `Open Energy Data Initiative (OEDI) <https://data.openei.org/s3_viewer?bucket=nrel-pds-wtk&prefix=sup3rwind%2F>`__ and via HSDS at the bucket ``nrel-pds-hsds`` and path ``/nrel/wtk/sup3rwind``. This data covers recent historical time periods for an expanding selection of countries.

Sup3rWind Data Access
----------------------
@@ -11,8 +11,8 @@ The Sup3rWind data and models are available in a public AWS S3 bucket.

The Sup3rWind data is also loaded into `HSDS <https://www.hdfgroup.org/solutions/highly-scalable-data-service-hsds/>`__ so that you may stream the data via the `NREL developer API <https://developer.nrel.gov/signup/>`__ or your own HSDS server. This is the best option if you don't need a full annual dataset. See these `rex instructions <https://nrel.github.io/rex/misc/examples.hsds.html>`__ for more details on how to access this data with HSDS and rex.

Sup3rWind Data Usage
---------------------

Sup3rWind data can be used in much the same way as `Sup3rCC <https://nrel.github.io/sup3r/examples/sup3rcc.html>`__ data, with the caveat that Sup3rWind includes only wind data and ancillary variables for modeling wind energy generation. Refer to the Sup3rCC `example notebook <https://github.com/NREL/sup3r/tree/main/examples/sup3rcc/using_the_data.ipynb>`__ for usage patterns.

@@ -32,6 +32,41 @@ The process for running the Sup3rWind models is much the same as for `Sup3rCC <h
#. If you're running on a slurm cluster, this will kick off a number of jobs that you can see with the ``squeue`` command. If you're running locally, your terminal should now be running the Sup3rWind models. The software will create a ``./logs/`` directory in which you can monitor the progress of your jobs.
#. The ``sup3r-pipeline`` is designed to run several modules in serial, with each module running multiple chunks in parallel. Once the first module (forward-pass) finishes, you'll want to run ``python -m sup3r.cli -c config_pipeline.json pipeline`` again. This will clean up status files and kick off the next step in the pipeline (if the current step was successful).

You can also check out the `example notebook <https://github.com/NREL/sup3r/tree/main/examples/sup3rwind/running_sup3r_models.ipynb>`__ to see how to run the models without config files.

Training from scratch
---------------------

To train Sup3rWind models from scratch, use the public training `data <https://data.openei.org/s3_viewer?bucket=nrel-pds-wtk&prefix=sup3rwind%2Ftraining_data%2F>`__. This data is for training the spatial enhancement models only. The 2024-01 `models <https://data.openei.org/s3_viewer?bucket=nrel-pds-wtk&prefix=sup3rwind%2Fmodels%2Fsup3rwind_models_202401%2F>`__ perform spatial enhancement in two steps: 3x from ERA5 to coarsened WTK, and 5x from coarsened WTK to uncoarsened WTK. The currently used approach performs spatial enhancement in a single 15x step.
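The two-step and single-step configurations produce the same overall enhancement. A minimal sketch of the arithmetic (the helper names here are illustrative, not part of sup3r):

```python
# Illustrative only: the arithmetic behind multi-step spatial enhancement.
# A 3x step followed by a 5x step enhances each spatial dimension 15x
# overall, matching the single-step 15x configuration.

def composed_enhancement(factors):
    """Total spatial enhancement from a sequence of per-step factors."""
    total = 1
    for f in factors:
        total *= f
    return total

def output_grid_shape(lr_shape, factors):
    """Spatial shape of the final output for a given low-res input grid."""
    s = composed_enhancement(factors)
    return tuple(dim * s for dim in lr_shape)

print(composed_enhancement([3, 5]))         # 15
print(output_grid_shape((20, 20), [3, 5]))  # (300, 300)
```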

For a given year and training domain, initialize low-resolution and high-resolution data handlers and wrap these in a dual rasterizer object. Do this for as many years and training regions as desired, and use these containers to initialize a batch handler. To train models for 3x spatial enhancement use ``hr_spatial_coarsen=5`` in the ``hr_dh``. To train models for 15x (the currently used approach) use ``hr_spatial_coarsen=1``. (Refer to the tests and docs for information on the additional arguments denoted by ellipses)::

    from sup3r.preprocessing import DataHandler, DualBatchHandler, DualRasterizer

    containers = []
    for tdir in training_dirs:
        lr_dh = DataHandler(f"{tdir}/lr_*.h5", ...)
        hr_dh = DataHandler(f"{tdir}/hr_*.h5", hr_spatial_coarsen=...)
        container = DualRasterizer({'low_res': lr_dh, 'high_res': hr_dh}, ...)
        containers.append(container)
    bh = DualBatchHandler(train_containers=containers, ...)

To train a 5x model, use the ``hr_*.h5`` files for both the ``lr_dh`` and the ``hr_dh``. Use ``hr_spatial_coarsen=3`` in the ``lr_dh`` and ``hr_spatial_coarsen=1`` in the ``hr_dh``::

    containers = []
    for tdir in training_dirs:
        lr_dh = DataHandler(f"{tdir}/hr_*.h5", hr_spatial_coarsen=3, ...)
        hr_dh = DataHandler(f"{tdir}/hr_*.h5", hr_spatial_coarsen=1, ...)
        container = DualRasterizer({'low_res': lr_dh, 'high_res': hr_dh}, ...)
        containers.append(container)
    bh = DualBatchHandler(train_containers=containers, ...)
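``hr_spatial_coarsen`` builds the low-res side of each training pair by block-averaging the high-res grid. A minimal pure-Python sketch of the idea (for intuition only; sup3r's actual implementation operates on xarray/dask data):

```python
# Rough sketch of spatial coarsening: average non-overlapping
# factor x factor blocks of a 2D grid. Illustrative only; not sup3r code.

def spatial_coarsen(grid, factor):
    """Block-average a 2D grid (list of lists) by the given factor."""
    rows, cols = len(grid), len(grid[0])
    assert rows % factor == 0 and cols % factor == 0
    out = []
    for i in range(0, rows, factor):
        row = []
        for j in range(0, cols, factor):
            block = [grid[i + di][j + dj]
                     for di in range(factor) for dj in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

grid = [[1, 2, 3, 4],
        [5, 6, 7, 8],
        [9, 10, 11, 12],
        [13, 14, 15, 16]]
print(spatial_coarsen(grid, 2))  # [[3.5, 5.5], [11.5, 13.5]]
```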


Initialize a 3x, 5x, or 15x spatial enhancement model with 14 output channels and train for the desired number of epochs. The 3x and 5x generator configs can be copied from the ``model_params.json`` files in each OEDI model `directory <https://data.openei.org/s3_viewer?bucket=nrel-pds-wtk&prefix=sup3rwind%2Fmodels%2Fsup3rwind_models_202401%2F>`__. The 15x generator config can be created from the OEDI model configs by changing the spatial enhancement factor, or from the configs in this repo by changing both the enhancement factor and the number of output channels::

    from sup3r.models import Sup3rGan

    model = Sup3rGan(gen_layers="./gen_config.json", disc_layers="./disc_config.json", ...)
    model.train(batch_handler, ...)
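Editing a config to change the enhancement factor can be done with a few lines of JSON manipulation. The layer structure below is an assumption for illustration (layer class names and the ``spatial_mult`` key are hypothetical here); check the actual ``model_params.json`` from OEDI for the real layout:

```python
import json

# Hypothetical sketch: bump the spatial enhancement factor in a generator
# config. The layer dicts below are assumed for illustration only.
gen_config = [
    {"class": "Conv2D", "filters": 64, "kernel_size": 3},
    {"class": "SpatialExpansion", "spatial_mult": 5},
    {"class": "Conv2D", "filters": 14, "kernel_size": 3},
]

# Change the enhancement factor from 5x to 15x
for layer in gen_config:
    if layer.get("class") == "SpatialExpansion":
        layer["spatial_mult"] = 15

with open("gen_config_15x.json", "w") as f:
    json.dump(gen_config, f, indent=2)
```

The written file can then be passed as ``gen_layers`` when constructing the model.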


Sup3rWind Versions
-------------------

706 changes: 706 additions & 0 deletions examples/sup3rwind/running_sup3r_models.ipynb

Large diffs are not rendered by default.

27 changes: 24 additions & 3 deletions pyproject.toml
@@ -28,7 +28,7 @@ classifiers=[
]
dependencies = [
"NREL-rex>=0.2.91",
"NREL-phygnn>=0.0.31",
"NREL-gaps>=0.6.13",
"NREL-farms>=1.0.4",
"dask>=2022.0",
@@ -42,10 +42,21 @@ dependencies = [
"pytest>=5.2",
"scipy>=1.0.0",
"sphinx>=7.0",
"tensorflow>2.4,<2.16",
"xarray>=2023.0"
]

# If used, these cause a glibc conflict
# [tool.pixi.target.linux-64.dependencies]
# cuda = ">=11.8"
# cudnn = {version = ">=8.6.0", channel = "conda-forge"}
# # 8.9.7

[tool.pixi.target.linux-64.pypi-dependencies]
tensorflow = {version = "~=2.15.1", extras = ["and-cuda"] }

[tool.pixi.target.osx-arm64.dependencies]
tensorflow = {version = "~=2.15.0", channel = "conda-forge"}

[project.optional-dependencies]
dev = [
"build>=0.5",
@@ -64,6 +75,16 @@ test = [
"pytest-env"
]

[project.scripts]
sup3r = "sup3r.cli:main"
sup3r-pipeline = "sup3r.pipeline.pipeline_cli:main"
sup3r-batch = "sup3r.batch.batch_cli:main"
sup3r-qa = "sup3r.qa.qa_cli:main"
sup3r-bias-calc = "sup3r.bias.bias_calc_cli:main"
sup3r-solar = "sup3r.solar.solar_cli:main"
sup3r-forward-pass = "sup3r.pipeline.forward_pass_cli:main"
sup3r-collect = "sup3r.postprocessing.data_collect_cli:main"

[project.urls]
homepage = "https://github.com/NREL/sup3r"
documentation = "https://nrel.github.io/sup3r/"
@@ -272,7 +293,6 @@ matplotlib = ">=3.1"
numpy = "~=1.7"
pandas = ">=2.0"
scipy = ">=1.0.0"
tensorflow = ">2.4,<2.16"
xarray = ">=2023.0"

[tool.pixi.pypi-dependencies]
@@ -284,6 +304,7 @@ NREL-farms = { version = ">=1.0.4" }

[tool.pixi.environments]
default = { solve-group = "default" }
kestrel = { features = ["kestrel"], solve-group = "default" }
dev = { features = ["dev", "doc", "test"], solve-group = "default" }
doc = { features = ["doc"], solve-group = "default" }
test = { features = ["test"], solve-group = "default" }
1 change: 1 addition & 0 deletions sup3r/models/__init__.py
Original file line number Diff line number Diff line change
@@ -6,6 +6,7 @@
from .multi_step import MultiStepGan, MultiStepSurfaceMetGan, SolarMultiStepGan
from .solar_cc import SolarCC
from .surface import SurfaceSpatialMetModel
from .with_obs import Sup3rGanWithObs

SPATIAL_FIRST_MODELS = (MultiStepSurfaceMetGan,
SolarMultiStepGan)