API reference¶
This page provides an auto-generated summary of ESEm's API. For more details and examples, refer to the relevant chapters in the main part of the documentation.
Top-level functions¶
This provides the main interface for ESEm and should be the starting point for most users.
- Create a Gaussian process (GP) based emulator with the provided training_params (X) and training_data (Y), which assumes independent inputs (and outputs).
- Create a simple two-layer Convolutional Neural Network emulator using Keras.
- Create a simple Random Forest emulator using sklearn.
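The three constructors above share one calling pattern: pass the training parameters X with shape (n_samples, n_params) and the training data Y, then train and predict through the returned Emulator. The sketch below assumes the constructors are exposed as gp_model, cnn_model and rf_model (names not shown in the table above); treat it as an illustrative outline rather than runnable reference code.

```
# Sketch only: constructor names (gp_model etc.) are assumed,
# and training_params / training_data must be supplied by the user.
from esem import gp_model

model = gp_model(training_params, training_data)  # X: (n_samples, n_params)
model.train()
mean, variance = model.predict(new_params)
```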
Emulator¶
- A class wrapping a statistical emulator.
- Train on the training data.
- Make a prediction using a trained emulator.
- The (internal) predict interface used by e.g. a sampler.
- Return the mean and standard deviation of model predictions over samples, without storing the intermediate predictions in memory, so that large models can be evaluated over more samples than would fit in memory.
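The last method above amounts to a streaming calculation: accumulate running sums of the predictions and their squares over batches, so only one batch is ever held in memory. This is an illustrative numpy sketch, not the ESEm implementation; here `predict` is any callable returning one prediction per sample.

```python
import numpy as np

def batch_stats(predict, sample_points, batch_size=1000):
    """Mean and std of predict(x) over sample_points, one batch at a time.

    Only the running sum and sum-of-squares are kept, so memory use is
    independent of the total number of samples.
    """
    n = 0
    total = 0.0
    total_sq = 0.0
    for start in range(0, len(sample_points), batch_size):
        batch = sample_points[start:start + batch_size]
        preds = predict(batch)                   # shape: (batch, ...)
        n += preds.shape[0]
        total = total + preds.sum(axis=0)
        total_sq = total_sq + (preds ** 2).sum(axis=0)
    mean = total / n
    std = np.sqrt(total_sq / n - mean ** 2)      # population std
    return mean, std
```

Note that the sum-of-squares form can lose precision for large, tightly clustered values; a Welford-style running update is the numerically safer alternative.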
Sampler¶
This class defines the sampling interface currently used by the ABC and MCMC sampling implementations.
- A class that efficiently samples a Model object for posterior inference.
- This is the call that does the actual inference.
MCMCSampler¶
- Sample from the posterior using the TensorFlow Markov chain Monte Carlo (MCMC) sampling tools.
- This is the call that does the actual inference.
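ESEm delegates the actual MCMC machinery to TensorFlow Probability. As a self-contained illustration of the underlying idea, here is a minimal random-walk Metropolis sampler in plain numpy targeting an arbitrary log-posterior; it is a sketch of the technique, not ESEm's sampler.

```python
import numpy as np

def metropolis_sample(log_prob, x0, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis: propose x' = x + step * N(0, I) and
    accept with probability min(1, p(x') / p(x))."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_samples,) + x.shape)
    logp = log_prob(x)
    for i in range(n_samples):
        proposal = x + step * rng.standard_normal(x.shape)
        logp_prop = log_prob(proposal)
        # Accept/reject in log space to avoid under/overflow
        if np.log(rng.uniform()) < logp_prop - logp:
            x, logp = proposal, logp_prop
        samples[i] = x
    return samples

# Example: draw from a standard normal "posterior"
draws = metropolis_sample(lambda x: -0.5 * np.sum(x ** 2), np.zeros(1), 5000)
```

In practice the TensorFlow Probability kernels add step-size adaptation, gradient-based proposals and burn-in handling on top of this basic accept/reject loop.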
ABCSampler¶
- Sample from the posterior using Approximate Bayesian Computation (ABC).
- Sample the emulator over prior_x and compare with the observations, returning n_samples from the posterior distribution (those points for which the model is compatible with the observations).
- Calculate the implausibility of the provided sample points, optionally in batches.
- Constrain the supplied sample points based on the tolerance threshold, optionally in batches.
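The implausibility metric behind these methods compares the emulated output with the observations, normalised by the combined uncertainty; points whose implausibility exceeds a tolerance are ruled out. A hedged numpy sketch of that logic (the variance terms and the default threshold here are illustrative, not ESEm's exact defaults):

```python
import numpy as np

def implausibility(pred_mean, pred_var, obs, obs_var):
    """I(x) = |E[f(x)] - obs| / sqrt(Var[f(x)] + observational variance)."""
    return np.abs(pred_mean - obs) / np.sqrt(pred_var + obs_var)

def constrain(pred_mean, pred_var, obs, obs_var, tolerance=3.0):
    """Boolean mask of the sample points compatible with the observations."""
    return implausibility(pred_mean, pred_var, obs, obs_var) <= tolerance
```

For large sample sets both functions vectorise naturally over batches of points, which is what the "optionally in batches" variants above provide.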
Wrappers¶
- This class handles applying any data pre- and post-processing by any provided DataProcessor.
- Provide a unified interface for numpy arrays, Iris Cubes and xarray DataArrays.
- Wrap the result back in a cube if one was provided.
ModelAdaptor¶
- Provides a unified interface for all emulation engines within ESEm.
- A wrapper around scikit-learn models.
- A wrapper around Keras models.
- A wrapper around GPFlow regression models.
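All three adaptors expose the same train/predict surface over different backends, so the Emulator never needs to know which engine it is driving. A minimal sketch of the pattern, with illustrative class and method names rather than ESEm's exact interface:

```python
import numpy as np

class ModelAdaptor:
    """Uniform train/predict interface over a backend model."""
    def __init__(self, model):
        self.model = model

    def train(self, X, Y):
        raise NotImplementedError

    def predict(self, X):
        raise NotImplementedError

class SKLearnModel(ModelAdaptor):
    """Adaptor for scikit-learn style estimators (fit/predict)."""
    def train(self, X, Y):
        self.model.fit(X, Y)

    def predict(self, X):
        # sklearn estimators give point predictions only, so report zero
        # predictive variance to keep a uniform (mean, variance) return.
        return self.model.predict(X), np.zeros(len(X))
```

A GP-backed adaptor would instead return the model's own predictive variance, which is what makes the downstream uncertainty-aware sampling possible.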
DataProcessor¶
- A utility class for transparently processing (transforming) numpy arrays and un-processing TensorFlow Tensors to aid in emulation.
- Return log(x + c), where c can be specified.
- Scale the data to have zero mean and unit variance.
- Linearly scale the data to lie in [0, 1].
- Flatten all dimensions except the leading one.
- Ensure the training data is the right shape for the ConvNet.
- Cast the data to a given type.
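Each processor applies a transform before training and inverts it after prediction, so the emulator works in a well-scaled space while users see data in the original units. Illustrative numpy sketches of the whitening and min-max processors (the process/unprocess method names are assumptions, and real processors must also handle Tensors):

```python
import numpy as np

class Whiten:
    """Scale data to zero mean and unit variance; invertible."""
    def process(self, data):
        self.mean = data.mean(axis=0)
        self.std = data.std(axis=0)
        return (data - self.mean) / self.std

    def unprocess(self, data):
        return data * self.std + self.mean

class MinMax:
    """Linearly scale data into [0, 1]; invertible."""
    def process(self, data):
        self.min = data.min(axis=0)
        self.range = data.max(axis=0) - self.min
        return (data - self.min) / self.range

    def unprocess(self, data):
        return data * self.range + self.min
```

The key property is the round trip: `unprocess(process(x))` recovers `x`, so predictions can be mapped back to physical units exactly.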
Utilities¶
A collection of associated utilities which might be of use when performing typical ESEm workflows.
- Validation plot for LeaveOneOut.
- Get a flat set of points evenly sampling a (unit) N-dimensional space.
- Get points randomly sampling a (unit) N-dimensional space.
- Efficiently collocate (interpolate) many ensemble members onto a set of (un-gridded) observations.
- Perform LeaveOneOut cross-validation with different models.
- Determine the most relevant parameters in the input space using a regularised linear model and either the Akaike or Bayesian Information Criterion.
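The two sampling helpers above can be sketched in a few lines of numpy: an even design flattens a meshgrid over the unit hypercube into one point per row, while the random design simply draws uniform points. Function names here are illustrative, not ESEm's.

```python
import numpy as np

def uniform_points(n_params, n_points_per_dim):
    """Flat (n_points_per_dim ** n_params, n_params) array of points
    evenly spaced over the unit hypercube."""
    axes = [np.linspace(0.0, 1.0, n_points_per_dim)] * n_params
    grid = np.meshgrid(*axes, indexing="ij")
    return np.stack([g.ravel() for g in grid], axis=-1)

def random_points(n_params, n_points, seed=None):
    """(n_points, n_params) array of uniform random points in [0, 1)."""
    rng = np.random.default_rng(seed)
    return rng.uniform(size=(n_points, n_params))
```

Note the grid design grows exponentially with n_params, which is why random (or Latin hypercube) sampling is usually preferred in higher dimensions.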