API reference

This page provides an auto-generated summary of ESEm's API. For more details and examples, refer to the relevant chapters in the main part of the documentation.

Top-level functions

This provides the main interface for ESEm and should be the starting point for most users.

gp_model

Create a Gaussian process (GP) based emulator from the provided training_params (X) and training_data (Y), assuming independent inputs (and outputs).

cnn_model

Create a simple two-layer Convolutional Neural Network Emulator using Keras.

rf_model

Create a simple Random Forest Emulator using sklearn.
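
The following minimal sketch shows how these constructors are typically used, assuming the package is importable as esem and using synthetic numpy arrays for the training parameters (X) and training data (Y):

    import numpy as np
    from esem import gp_model, cnn_model, rf_model

    # Synthetic training set: 30 samples of 3 parameters, each mapping
    # to 100 output values
    X = np.random.rand(30, 3)
    Y = np.random.rand(30, 100)

    gp = gp_model(X, Y)    # Gaussian process (GPFlow) emulator
    rf = rf_model(X, Y)    # random forest (scikit-learn) emulator
    cnn = cnn_model(X, Y)  # two-layer convolutional (Keras) emulator

Each constructor returns an (untrained) Emulator wrapping the chosen engine, described below.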

Emulator

Emulator

A class wrapping a statistical emulator

Emulator.train

Train on the training data

Emulator.predict

Make a prediction using a trained emulator

Emulator._predict

The (internal) predict interface used by e.g., a sampler.

Emulator.batch_stats

Return the mean and standard deviation of model predictions over the provided samples without storing the intermediate predictions in memory, enabling large models to be evaluated over more samples than could otherwise fit in memory.
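
A minimal sketch of this workflow, continuing with the synthetic data above (the batch_size keyword shown for batch_stats is illustrative):

    model = gp_model(X, Y)
    model.train()

    # predict returns both a mean and an uncertainty (variance) estimate
    new_X = np.random.rand(5, 3)
    mean, variance = model.predict(new_X)

    # batch_stats evaluates many sample points without keeping every
    # intermediate prediction in memory
    sample_mean, sample_std = model.batch_stats(new_X, batch_size=1)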

Sampler

This class defines the sampling interface currently used by the ABC and MCMC sampling implementations.

Sampler

A class that efficiently samples a Model object for posterior inference

Sampler.sample

This is the call that does the actual inference.

MCMCSampler

MCMCSampler

Sample from the posterior using the TensorFlow Markov-Chain Monte-Carlo (MCMC) sampling tools.

MCMCSampler.sample

This is the call that does the actual inference.
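
As an illustrative sketch only (the module path and keyword arguments are assumptions, and the observations should match the shape of the emulated output), MCMC sampling of a trained emulator might look like:

    from esem.sampler import MCMCSampler

    obs = np.random.rand(100)  # stand-in observations

    sampler = MCMCSampler(model, obs)
    posterior_samples = sampler.sample(n_samples=8000)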

ABCSampler

ABCSampler

Sample from the posterior using Approximate Bayesian Computation (ABC).

ABCSampler.sample

Sample the emulator over prior_x and compare with the observations, returning n_samples of the posterior distribution (those points for which the model is compatible with the observations).

ABCSampler.get_implausibility

Calculate the implausibility of the provided sample points, optionally in batches.

ABCSampler.batch_constrain

Constrain the supplied sample points based on the tolerance threshold, optionally in batches.
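
A corresponding ABC sketch, reusing model and obs from above (the module path, prior-sampling arguments, tolerance and batch_size values are assumptions for illustration):

    from esem.abc_sampler import ABCSampler
    from esem.utils import get_random_params

    sampler = ABCSampler(model, obs)

    # Draw candidate points from a uniform prior and keep those
    # compatible with the observations
    prior_x = get_random_params(3, n_samples=10000)
    posterior = sampler.sample(prior_x, n_samples=1000)

    # Or work with the implausibility directly, optionally in batches
    implausibility = sampler.get_implausibility(prior_x, batch_size=1000)
    valid = sampler.batch_constrain(prior_x, tolerance=0.0, batch_size=1000)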

Wrappers

ProcessWrapper

This class handles applying data pre- and post-processing using any provided DataProcessors.

DataWrapper

Provide a unified interface for numpy arrays, Iris Cubes and xarray DataArrays.

DataWrapper.name

DataWrapper.data

DataWrapper.dtype

DataWrapper.wrap

Wrap back in a cube if one was provided

CubeWrapper

DataArrayWrapper

ModelAdaptor

ModelAdaptor

Provides a unified interface for all emulation engines within ESEm.

SKLearnModel

A wrapper around scikit-learn models.

KerasModel

A wrapper around Keras models

GPFlowModel

A wrapper around GPFlow regression models
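
These adaptors are mostly used internally by the top-level constructors, but an arbitrary model can also be wrapped by hand. A sketch, assuming the adaptors are importable from esem.model_adaptor and that Emulator accepts a wrapped model together with the training data:

    from sklearn.ensemble import GradientBoostingRegressor
    from esem import Emulator
    from esem.model_adaptor import SKLearnModel

    # Wrap a scikit-learn regressor so it exposes the interface
    # Emulator expects, then build the emulator around it
    wrapped = SKLearnModel(GradientBoostingRegressor())
    emulator = Emulator(wrapped, X, Y)
    emulator.train()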

DataProcessor

DataProcessor

A utility class for transparently processing (transforming) numpy arrays and un-processing TensorFlow Tensors to aid in emulation.

Log

Return log(x + c) where c can be specified.

Whiten

Scale the data to have zero mean and unit variance

Normalise

Linearly scale the data to lie in the interval [0, 1]

Flatten

Flatten all dimensions except the leading one

Reshape

Ensure the training data is the right shape for the ConvNet

Recast

Cast the data to a given type
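
Processors are typically passed to the top-level constructors, which apply them to the training data and invert them on prediction. A sketch, assuming the processors are importable from esem.data_processors and that the constructors accept a data_processors keyword:

    from esem import gp_model
    from esem.data_processors import Log, Whiten, Flatten

    # Emulate log-transformed, whitened and flattened data; predictions
    # are transformed back to the original space automatically
    model = gp_model(X, Y, data_processors=[Log(constant=0.1), Whiten(), Flatten()])
    model.train()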

Utilities

A collection of associated utilities which might be of use when performing typical ESEm workflows.

plot_results

Validation plot for LeaveOneOut

validation_plot

plot_parameter_space

get_uniform_params

Slightly convoluted method for getting a flat set of points evenly spaced across a (unit) N-dimensional space

get_random_params

Get points randomly sampling a (unit) N-dimensional space

ensemble_collocate

Efficiently collocate (interpolate) many ensemble members onto a set of (un-gridded) observations

leave_one_out

Function to perform LeaveOneOut cross-validation with different models.

get_param_mask

Determine the most relevant parameters in the input space using a regularised linear model and either the Akaike or Bayesian Information Criterion.
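
A brief sketch combining some of these utilities for emulator design and validation, reusing X and Y from the earlier sketches (the get_uniform_params, leave_one_out and get_param_mask signatures shown here are assumptions based on the descriptions above):

    from esem.utils import get_uniform_params, get_random_params, leave_one_out, get_param_mask

    # Evenly spaced design over three (unit) parameters, plus a random sample
    X_design = get_uniform_params(3)
    X_random = get_random_params(3, n_samples=100)

    # Leave-one-out cross-validation of an emulator over the ensemble
    results = leave_one_out(X, Y, model='GPModel')

    # Identify the most informative parameters before emulation
    mask = get_param_mask(X, Y)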