- class foxes.input.states.DatasetStates(foxes.core.States)[source]
Abstract base class for heterogeneous ambient states that are based on data from NetCDF files or an xarray Dataset.
Attributes
- data_source: str or xarray.Dataset
The data, or a file search pattern ending with the suffix ‘.nc’ (one or many files).
- ovars: list of str
The output variables
- var2ncvar: dict
Mapping from output variable names to the variable names in the NetCDF file.
- fixed_vars: dict
Uniform values for output variables, instead of reading from data
- load_mode: str
The load mode. Choices: preload loads all data during initialization; lazy lazy-loads the data using dask; fly reads only the states index and weights during initialization and then re-opens the relevant files within the chunk calculation.
- time_format: str
The datetime parsing format string
- sel: dict, optional
Subset selection via xr.Dataset.sel()
- isel: dict, optional
Subset selection via xr.Dataset.isel()
- weight_factor: float
The factor to multiply the weights with
- check_times: bool
Whether to check the time coordinates for consistency
- check_input_nans: bool
Whether to check input data for NaNs
- preprocess_nc: callable, optional
A function to preprocess the netcdf Dataset before use
- interp_pars: dict
Additional parameters for the interpolation.
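The interplay between var2ncvar and fixed_vars can be sketched as follows. This is a minimal, illustrative sketch of the documented behavior (fixed_vars supplies uniform values instead of reading from data; var2ncvar maps output names to NetCDF variable names); the function `resolve_variable` and all names here are assumptions for illustration, not part of the foxes API:

```python
def resolve_variable(var, var2ncvar, fixed_vars, dataset_vars):
    """Hypothetical resolution of one output variable.

    Returns ("fixed", value) if the variable has a uniform value,
    or ("data", ncvar_name) if it should be read from the dataset.
    """
    if var in fixed_vars:
        # fixed_vars short-circuits reading from the data source
        return ("fixed", fixed_vars[var])
    # fall back to the mapped NetCDF name, defaulting to the name itself
    ncvar = var2ncvar.get(var, var)
    if ncvar not in dataset_vars:
        raise KeyError(f"Variable '{ncvar}' not found in dataset")
    return ("data", ncvar)

# read wind speed from an nc variable named "wind_speed":
print(resolve_variable("WS", {"WS": "wind_speed"}, {}, {"wind_speed"}))
# impose a uniform turbulence intensity instead of reading it:
print(resolve_variable("TI", {}, {"TI": 0.05}, {"wind_speed"}))
```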
Public members
- DatasetStates(data_source, output_vars, var2ncvar={}, ...)[source]
Constructor.
- property data_source
The data source.
- preproc_first(algo, data, cmap, vars, bounds_extra_space, ...)[source]
Preprocesses the first file.
- gen_states_split_size()[source]
Generator for suggested states split sizes for output writing.
- load_data(algo, cmap, variables, bounds_extra_space=None, ...)[source]
Load and/or create all model data that is subject to chunking.
- set_running(algo, data_stash, sel=None, isel=None, verbosity=0)[source]
Sets this model status to running, and moves all large data to stash.
- unset_running(algo, data_stash, sel=None, isel=None, verbosity=0)[source]
Sets this model status to not running, recovering large data from stash.
- output_point_vars(algo)[source]
The variables which are being modified by the model.
- get_calc_data(mdata, cmap, variables)[source]
Gathers data for calculations.
- reset(algo=None, states_sel=None, states_loc=None, verbosity=0)[source]
Resets the states, optionally selecting a subset of states.
- classmethod new(states_type, *args, **kwargs)[source]
Run-time states factory.
- output_coords()[source]
Gets the coordinates of all output arrays.
- ensure_output_vars(algo, tdata)[source]
Ensures that the output variables are present in the target data.
- run_calculation(algo, *data, out_vars, **calc_pars)[source]
Starts the model calculation in parallel.
- property model_id
Unique id based on the model type.
- property initialized
Initialization flag.
- sub_models()[source]
List of all sub-models.
- initialize(algo, verbosity=0, force=False)[source]
Initializes the model.
- property running
Flag for currently running models.
- get_data(variable, target, lookup='smfp', mdata=None, ...)[source]
Getter for a data entry in the model object or provided data sources.
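The set_running/unset_running pair described above follows a stash pattern: large in-memory data is moved into a stash while the model runs, then recovered afterwards. A minimal, self-contained sketch of that pattern (the class `StatesSketch` and its members are illustrative assumptions, not foxes code):

```python
class StatesSketch:
    """Hypothetical illustration of the set_running/unset_running stash pattern."""

    def __init__(self, data):
        self._data = data        # stands in for potentially large data
        self._running = False

    @property
    def running(self):
        return self._running

    def set_running(self, data_stash, verbosity=0):
        # move the large data into the stash and mark the model as running
        data_stash["states_data"] = self._data
        self._data = None
        self._running = True

    def unset_running(self, data_stash, verbosity=0):
        # recover the large data from the stash and clear the running flag
        self._data = data_stash.pop("states_data")
        self._running = False

stash = {}
s = StatesSketch(data=[1.0, 2.0, 3.0])
s.set_running(stash)
assert s.running and "states_data" in stash
s.unset_running(stash)
assert not s.running and stash == {}
```

The advantage of this design is that the heavy data lives in exactly one place at a time, so worker processes spawned during the run do not inherit a copy of it.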