class foxes.input.states.NEWAStates(foxes.input.states.DatasetStates)[source]

Heterogeneous ambient states in NEWA-WRF format.

Attributes

states_coord: str

The states coordinate name in the data

x_coord: str

The x coordinate name in the data

y_coord: str

The y coordinate name in the data

h_coord: str

The height coordinate name in the data

weight_ncvar: str

Name of the weight data variable in the nc file(s)

interpn_pars: dict, optional

Additional parameters for scipy.interpolate.interpn

bounds_extra_space: float or str

The extra space, either a float in m, or a str in units of the rotor diameter D, e.g. '2.5D'

height_bounds: tuple, optional

The (h_min, h_max) height bounds in m. Defaults to H +/- 0.5*D, where H is the hub height and D the rotor diameter

Examples

Example of one of the NetCDF input files in NEWA format:

>>> Dimensions:      (time: 144, south_north: 165, west_east: 234, height: 15)
>>> Coordinates:
>>>   * time         (time) datetime64[ns] 1kB 2006-01-04 ... 2006-01-04T23:54:00
>>>   * south_north  (south_north) float32 660B -1.79e+05 -1.77e+05 ... 1.49e+05
>>>   * west_east    (west_east) float32 936B -2.48e+05 -2.46e+05 ... 2.18e+05
>>>   * height       (height) float32 60B 25.0 50.0 75.0 90.0 ... 400.0 500.0 1e+03
>>>     XLAT         (south_north, west_east) float32 154kB ...
>>>     XLON         (south_north, west_east) float32 154kB ...
>>> Data variables: (12/24)
>>>     WS           (time, height, south_north, west_east) float32 334MB ...
>>>     ...
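
A minimal usage sketch, assuming such files are available on disk. Apart from input_files_nc and time_coord, which appear in the constructor signature below, the keyword names follow the attribute list above and are assumptions:

>>> import foxes
>>> # Sketch only: keyword names other than input_files_nc and time_coord
>>> # are assumptions based on the attributes listed above.
>>> states = foxes.input.states.NEWAStates(
...     input_files_nc="newa_wrf_*.nc",   # NEWA NetCDF file(s); a glob pattern is assumed here
...     time_coord="time",                # name of the time/states coordinate in the data
...     bounds_extra_space="2.5D",        # assumed keyword: extra space of 2.5 rotor diameters
... )

The resulting object can then be passed as the ambient states to a foxes algorithm, like any other states class.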

Public members

NEWAStates(input_files_nc, time_coord='time', ...)[source]

Constructor.

preproc_first(algo, data, cmap, vars, bounds_extra_space, ...)[source]

Preprocesses the first file.

load_data(algo, verbosity=0)[source]

Load and/or create all model data that is subject to chunking.

interpolate_data(idims, icrds, d, pts, vrs, times)[source]

Interpolates data to points.

property data_source

The data source
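
A brief, hedged illustration; whether the data source is the input file list before loading or an xarray Dataset after load_data is an assumption here:

>>> # Illustration only; the concrete type of data_source is an assumption.
>>> print(states.data_source)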

gen_states_split_size()[source]

Generator for suggested states split sizes for output writing.

set_running(algo, data_stash, sel=None, isel=None, verbosity=0)[source]

Sets this model's status to running and moves all large data to the stash.

unset_running(algo, data_stash, sel=None, isel=None, verbosity=0)[source]

Sets this model's status to not running, recovering large data from the stash.

output_point_vars(algo)[source]

The variables which are being modified by the model.

size()[source]

The total number of states.

index()[source]

The index list
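
Once the states have been initialized, e.g. by an algorithm, size() and index() describe the time dimension; a hedged illustration, with values depending on the input files:

>>> # Illustration only; requires initialized states.
>>> states.size()       # total number of states, e.g. 144 for the file shown above
>>> states.index()[:3]  # index list, e.g. the first three time stamps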

get_calc_data(mdata, cmap, variables)[source]

Gathers data for calculations.

calculate(algo, mdata, fdata, tdata)[source]

The main model calculation.

reset(algo=None, states_sel=None, states_loc=None, verbosity=0)[source]

Resets the states, optionally selecting a subset of states.

classmethod new(states_type, *args, **kwargs)[source]

Run-time states factory.
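
The factory creates states objects by type name at run time; a hedged sketch, assuming the class name serves as the states_type key:

>>> from foxes.input.states import NEWAStates
>>> # Assumption: the run-time type key equals the class name.
>>> states = NEWAStates.new("NEWAStates", input_files_nc="newa_wrf_*.nc")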

output_coords()[source]

Gets the coordinates of all output arrays

ensure_output_vars(algo, tdata)[source]

Ensures that the output variables are present in the target data.

run_calculation(algo, *data, out_vars, **calc_pars)[source]

Starts the model calculation in parallel.

__repr__()[source]

Return repr(self).

property model_id

Unique id based on the model type.

var(v)[source]

Creates a model-specific variable name.

unvar(vnm)[source]

Translates a model-specific variable name back to the original variable name.
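
var and unvar translate between original and model-specific variable names; a hedged illustration, since the concrete naming scheme (e.g. prefixing with the model name) is an assumption:

>>> # Illustration only; the concrete naming scheme is an assumption.
>>> v = states.var("WS")  # model-specific name derived from "WS"
>>> states.unvar(v)       # translates back to "WS"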

property initialized

Initialization flag.

sub_models()[source]

List of all sub-models

initialize(algo, verbosity=0, force=False)[source]

Initializes the model.

property running

Flag indicating whether the model is currently running.

finalize(algo, verbosity=0)[source]

Finalizes the model.

get_data(variable, target, lookup='smfp', mdata=None, ...)[source]

Getter for a data entry in the model object or in provided data sources.