Using low-level forecast/analysis steps

Work in progress

This section should be expanded with a complete snippet.

AugKF

We present here how to use the forecast and analysis steps of the AugKF algorithm.

Forecast/Analyse a CoreState

Principle

An algorithm is implemented through an AugkfAlgo object, which contains all the information needed by the algorithm (configuration, priors, correlation matrices, drift_matrices). Its forecast_step method forecasts CoreStates; under the hood, this work is delegated to an AugkfForecaster object created at the initialisation of the AugkfAlgo.

Similarly, the AugkfAlgo implements an analysis_step method that is delegated to an AugkfAnalyser object, which also stores the observations needed for the analysis step.
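
To make this delegation explicit, here is a purely schematic sketch of the structure described above (the Sketch* names are ours and not part of pygeodyn; the real objects are created through create_augkf_algo as shown below):

class SketchForecaster:
    def forecast_step(self, core_state):
        # Placeholder for the real forecast computation
        return core_state

class SketchAnalyser:
    def __init__(self, observations):
        # The analyser stores the observations needed by the analysis step
        self.observations = observations

    def analysis_step(self, core_states, date):
        # Placeholder for the real analysis computation
        return core_states

class SketchAlgo:
    def __init__(self, config, observations):
        self.config = config
        self.forecaster = SketchForecaster()          # created at initialisation
        self.analyser = SketchAnalyser(observations)

    def forecast_step(self, core_state):
        return self.forecaster.forecast_step(core_state)        # delegated

    def analysis_step(self, core_states, date):
        return self.analyser.analysis_step(core_states, date)   # delegated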

The easiest way to set up these objects is to use the create_augkf_algo function with a Config, which takes care of initialising all the objects and data needed:

from pygeodyn.inout.config import ComputationConfig
from pygeodyn.augkf.algo import create_augkf_algo

config = ComputationConfig('augkf.conf') # Or whatever config path
nb_realisations = 10
algo = create_augkf_algo(config, nb_realisations)

Then, we can create CoreState objects on which to perform the forecast and analysis steps.

forecast_step takes a 1D CoreState as input (dim: nb_coeffs), as it is applied to a single realisation. In contrast, analysis_step takes a 2D CoreState as input (dim: nb_realisations x nb_coeffs), as all realisations are needed.
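
As a minimal numpy illustration of these two dimensionalities (the values below are arbitrary; the actual inputs are CoreState slices, as in the example that follows):

import numpy as np

nb_realisations, nb_coeffs = 10, 224                        # arbitrary example values

one_realisation = np.zeros(nb_coeffs)                       # 1D: what forecast_step expects
all_realisations = np.zeros((nb_realisations, nb_coeffs))   # 2D: what analysis_step expects

print(one_realisation.shape, all_realisations.shape)        # (224,) (10, 224)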

Example

The following code shows an example of a forecast step followed by an analysis step:

# Build a complete CoreState with 10 realisations, 3 times and the number of coefficients given in the config.
# NB: requires numpy and the CoreState class from pygeodyn to be imported beforehand.
import numpy as np

nb_reals = 10
nb_times = 3
cs = CoreState()

# np.zeros expects the shape as a single tuple
cs.addMeasure('MF', np.zeros((nb_reals, nb_times, config.Nb)))
cs.addMeasure('SV', np.zeros((nb_reals, nb_times, config.Nsv)))
cs.addMeasure('ER', np.zeros((nb_reals, nb_times, config.Nsv)))
cs.addMeasure('U',  np.zeros((nb_reals, nb_times, config.Nu2)))

# Initialise the CoreState at t=0 as needed (e.g. set all realisations of B according to a magnetic model)
cs.B[:, 0] = my_awesome_magnetic_model

# Do a forecast step on all realisations
for i_real in range(nb_reals):
    # The input is the core state of a realisation at t=0
    forecast_result = algo.forecast_step(cs[i_real, 0])
    # The result is a realisation CoreState that we can use to update our CoreState at t=1
    cs[i_real, 1] = forecast_result

# Now do an analysis step on the forecast CoreState at t=1
obs_date = np.datetime64('1980-01') # Will use the observation data in Jan 1980
# The input is the core state at t=1 (with all realisations) and the date of observations
analysis_result = algo.analysis_step(cs[:, 1], obs_date)
# The result is a CoreState containing all analysed realisations that we can again use to update our CoreState (stored here at t=2)
cs[:, 2] = analysis_result

# Then we can access the data as previously:
cs.B[:, 0, 0] # Initial value of g10 for all realisations
cs.B[:, 1, 0] # Forecast value of g10 for all realisations
cs.U[0, 2] # Analysed values of all core flow coefficients for realisation 0
# and so on...
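
For longer runs, the two steps above can be wrapped in small helper functions. This is only a sketch reusing the calls shown in the example; the helper names and the time-index convention are ours, not part of pygeodyn:

def forecast_all(algo, cs, nb_reals, t_in, t_out):
    # Forecast every realisation stored at time index t_in and store the result at t_out
    for i_real in range(nb_reals):
        cs[i_real, t_out] = algo.forecast_step(cs[i_real, t_in])

def analyse_all(algo, cs, t_in, t_out, obs_date):
    # Analyse all realisations at time index t_in at once and store the result at t_out
    cs[:, t_out] = algo.analysis_step(cs[:, t_in], obs_date)

# Equivalent to the example above:
# forecast_all(algo, cs, nb_reals, t_in=0, t_out=1)
# analyse_all(algo, cs, t_in=1, t_out=2, obs_date=np.datetime64('1980-01'))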

More information

See the API reference of the Augkf objects (AugkfAlgo, AugkfForecaster, AugkfAnalyser).

Common operations are implemented in the parent class Computer, from which AugkfForecaster and AugkfAnalyser inherit. The version of the algorithm with a PCA is implemented in PCA-specific objects (AugkfAlgoPCA, AugkfForecasterPCA, AugkfAnalyserPCA).

ZonKF

The principle is exactly the same, but use the create_zonkf_algo function instead:

from pygeodyn.inout.config import ComputationConfig
from pygeodyn.zonkf.algo import create_zonkf_algo

config = ComputationConfig('zonkf.conf') # Or whatever config path
nb_realisations = 10
algo = create_zonkf_algo(config, nb_realisations)

More information

See the API reference of the Zonkf objects, as well as the PCA-specific objects.