stochastic_CODA

This project studies stochastic extensions of the original (deterministic) CODA model:

Zinchenko, Vadim, and David S. Greenberg. "Combined Optimization of Dynamics and Assimilation with End-to-End Learning on Sparse Observations." arXiv preprint arXiv:2409.07137 (2024).

The code is based on that of the original work, available at https://codebase.helmholtz.cloud/m-dml/hidden-process-learning

Usage:

One can generate a Lorenz-96 dataset using the mdml-tools repository, or directly download pre-generated data from here.
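For readers without access to mdml-tools, the Lorenz-96 dynamics themselves are standard and easy to simulate. Below is a minimal, illustrative sketch (not the mdml-tools pipeline): it integrates dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F with the usual forcing F = 8 and writes the trajectory to HDF5. The dataset key and layout are assumptions and may not match what main.py expects.

```python
# Illustrative Lorenz-96 simulation saved to HDF5; not the mdml-tools pipeline.
import os
import numpy as np
import h5py
from scipy.integrate import solve_ivp

def lorenz96(t, x, F=8.0):
    # dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indices
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

K = 40                                   # number of state variables
rng = np.random.default_rng(111)
x0 = 8.0 + rng.standard_normal(K)        # perturbed equilibrium as initial state
t_eval = np.arange(0.0, 20.0, 0.05)      # sampling times
sol = solve_ivp(lorenz96, (0.0, t_eval[-1]), x0, t_eval=t_eval, method="RK45")

os.makedirs("data", exist_ok=True)
with h5py.File("data/L96_small.h5", "w") as f:    # file name taken from the commands below
    f.create_dataset("trajectory", data=sol.y.T)  # shape (time, K); key name is a guess
```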

Afterwards, stochastic models can be trained by running the "main.py" script with the appropriate arguments.

For example, one can train a single model with dropout probability p=0.2 with:

python main.py +experiment=data_assimilation output_dir_base_path="." datamodule.path_to_load_data="data/L96_small.h5" rollout_length=25 input_window_extend=25 loss_alpha=0.5 random_seed=111 assimilation_network.dropout=0.2
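The dropout argument is what makes this model stochastic: if dropout is kept active at inference time (Monte Carlo dropout), repeated forward passes yield an ensemble of assimilation outputs. A minimal sketch of this idea, assuming a generic PyTorch model and input rather than the repository's actual classes:

```python
# Illustrative Monte Carlo dropout: keep dropout active at inference and
# sample repeatedly to obtain a stochastic ensemble of outputs.
# `model` and `obs_window` are placeholders, not the repository's API.
import torch

def mc_dropout_ensemble(model, obs_window, n_samples=50):
    model.train()  # leaves dropout layers active (unlike model.eval())
    with torch.no_grad():
        samples = torch.stack([model(obs_window) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)  # ensemble mean and spread
```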

A model parameterizing a diagonal Gaussian distribution conditioned on a window of observations can be trained with:

python main.py +experiment=data_assimilation_gaussian output_dir_base_path="." datamodule.path_to_load_data="data/L96_small.h5" rollout_length=20 input_window_extend=25 loss_alpha=0.4 random_seed=111
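Conceptually, such a model outputs a mean and a per-variable variance, and differentiable samples are drawn with the reparameterization trick. A minimal sketch of a diagonal Gaussian output head, with illustrative layer names and sizes that are not taken from the repository:

```python
# Sketch of a diagonal Gaussian output head: the network predicts a mean and a
# log-variance per state variable; sampling uses the reparameterization trick.
import torch
import torch.nn as nn

class DiagonalGaussianHead(nn.Module):
    def __init__(self, hidden_dim, state_dim):
        super().__init__()
        self.mean = nn.Linear(hidden_dim, state_dim)
        self.log_var = nn.Linear(hidden_dim, state_dim)

    def forward(self, h):
        mu, log_var = self.mean(h), self.log_var(h)
        std = torch.exp(0.5 * log_var)               # positive standard deviation
        return torch.distributions.Normal(mu, std)   # independent per-variable Gaussians

# dist = head(features); sample = dist.rsample()    # differentiable sample
```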

Similarly, a Gaussian distribution with a diagonal-plus-low-rank covariance matrix can be parameterized by running:

python main.py +experiment=data_assimilation_gaussian_LR output_dir_base_path="." datamodule.path_to_load_data="data/L96_small.h5" rollout_length=20 input_window_extend=25 loss_alpha=0.4 random_seed=111
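Here the covariance has the form Sigma = diag(d) + U U^T, where U has a small number of columns (the rank), so correlations between state variables can be captured at a cost linear in the state dimension. A minimal sketch using torch.distributions.LowRankMultivariateNormal, with illustrative dimensions and rank rather than the repository's settings:

```python
# Sketch of a Gaussian head with diagonal-plus-low-rank covariance,
# Sigma = diag(d) + U U^T, via torch.distributions.LowRankMultivariateNormal.
import torch
import torch.nn as nn

class LowRankGaussianHead(nn.Module):
    def __init__(self, hidden_dim, state_dim, rank=4):
        super().__init__()
        self.mean = nn.Linear(hidden_dim, state_dim)
        self.log_diag = nn.Linear(hidden_dim, state_dim)
        self.factor = nn.Linear(hidden_dim, state_dim * rank)
        self.state_dim, self.rank = state_dim, rank

    def forward(self, h):
        loc = self.mean(h)
        cov_diag = torch.exp(self.log_diag(h))  # positive diagonal term d
        cov_factor = self.factor(h).view(*h.shape[:-1], self.state_dim, self.rank)  # U
        return torch.distributions.LowRankMultivariateNormal(loc, cov_factor, cov_diag)
```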
