Distributed Calibration Tutorial Using Julia Workers

This example will teach you how to use ClimaCalibrate to parallelize your calibration with workers. Workers are additional processes spun up to run code in a distributed fashion. In this tutorial, we will run ensemble members' forward models on different workers.

The example calibration uses CliMA's atmosphere model, ClimaAtmos.jl, in a column spatial configuration for 30 days to simulate outgoing radiative fluxes. Radiative fluxes are used in the observation map to calibrate the astronomical unit.

First, we load in some necessary packages.

using Distributed
import ClimaCalibrate as CAL
import ClimaAnalysis: SimDir, get, slice, average_xy
using ClimaUtilities.ClimaArtifacts
import EnsembleKalmanProcesses: I, ParameterDistributions.constrained_gaussian

Next, we add workers. Workers can be added either with Distributed.addprocs or by starting Julia with multiple processes: julia -p <nprocs>.

addprocs initializes the workers and registers them with the main Julia process, but there are multiple ways to call it. The simplest is addprocs(nprocs), which creates new local processes on your machine. Another is to use ClimaCalibrate.SlurmManager, which acquires and starts workers on Slurm resources. You can use keyword arguments to specify the Slurm resources:

addprocs(ClimaCalibrate.SlurmManager(nprocs), gpus_per_task = 1, time = "01:00:00")

For this example, we would add a single worker (the call is not executed here, since adding workers is not compatible with Documenter.jl):

addprocs(1)

We can see the number of workers and their ID numbers:

nworkers()
1
workers()
1-element Vector{Int64}:
 1

We can call functions on a worker using remotecall_fetch, passing in the function and the worker ID, followed by the function arguments.

remotecall_fetch(*, 1, 4, 4)
16
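remotecall_fetch blocks until the result is returned. If you prefer to launch the call and collect the result later, remotecall returns a Future that can be fetched explicitly. A minimal sketch using the same arguments:

future = remotecall(*, 1, 4, 4)
fetch(future) # 16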

ClimaCalibrate uses this functionality to run the forward model on workers.

Since the workers start in their own Julia sessions, we need to import packages and declare variables. Distributed.@everywhere executes code on all workers, allowing us to load the code that they need.

@everywhere begin
    output_dir = joinpath("output", "climaatmos_calibration")
    import ClimaCalibrate as CAL
    import ClimaAtmos as CA
    import ClimaComms
end
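To confirm that the @everywhere block ran on the workers, you can check for one of the imported bindings remotely. This is just a sanity check; first(workers()) picks whichever worker ID is available:

remotecall_fetch(() -> isdefined(Main, :CAL), first(workers()))
# true if the @everywhere block ran on that process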
output_dir = joinpath("output", "climaatmos_calibration")
mkpath(output_dir)
"output/climaatmos_calibration"

First, we need to set up the forward model, which takes in the sampled parameters, runs the model, and saves diagnostic output that can be processed and compared to observations. The forward model must override ClimaCalibrate.forward_model(iteration, member), since the workers will run this function in parallel.

Since forward_model(iteration, member) only takes in the iteration and member numbers, we need to use these as hooks to set the model parameters and output directory. Two useful functions for this are CAL.path_to_ensemble_member, which returns the member's output directory, and CAL.parameter_path, which returns the member's parameter file, as sketched below.
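For illustration, here is how these helpers are used for a hypothetical iteration 1, member 3 (the exact directory layout is determined by ClimaCalibrate; the path in the comment mirrors the output shown later in this tutorial):

member_path = CAL.path_to_ensemble_member(output_dir, 1, 3)
# e.g. "output/climaatmos_calibration/iteration_001/member_003"
param_file = CAL.parameter_path(output_dir, 1, 3)
# path to the TOML file holding member 3's sampled parameters for iteration 1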

The forward model below runs ClimaAtmos.jl in a minimal column spatial configuration.

@everywhere function CAL.forward_model(iteration, member)
    config_dict = Dict(
        "dt" => "2000secs",
        "t_end" => "30days",
        "config" => "column",
        "h_elem" => 1,
        "insolation" => "timevarying",
        "output_dir" => output_dir,
        "output_default_diagnostics" => false,
        "dt_rad" => "6hours",
        "rad" => "clearsky",
        "co2_model" => "fixed",
        "log_progress" => false,
        "diagnostics" => [
            Dict(
                "reduction_time" => "average",
                "short_name" => "rsut",
                "period" => "30days",
                "writer" => "nc",
            ),
        ],
    )
    # Set the output path for the current member
    member_path = CAL.path_to_ensemble_member(output_dir, iteration, member)
    config_dict["output_dir"] = member_path

    # Set the parameters for the current member
    parameter_path = CAL.parameter_path(output_dir, iteration, member)
    if haskey(config_dict, "toml")
        push!(config_dict["toml"], parameter_path)
    else
        config_dict["toml"] = [parameter_path]
    end

    # Turn off default diagnostics
    config_dict["output_default_diagnostics"] = false

    comms_ctx = ClimaComms.SingletonCommsContext()
    atmos_config = CA.AtmosConfig(config_dict; comms_ctx)
    simulation = CA.get_simulation(atmos_config)
    CA.solve_atmos!(simulation)
    return simulation
end

Next, the observation map is required to process a full ensemble of model output for the ensemble update step. The observation map takes in the iteration number and always outputs an array. For observation map output G_ensemble, G_ensemble[:, m] must be the output of ensemble member m. This is needed for compatibility with EnsembleKalmanProcesses.jl.

const days = 86_400
function CAL.observation_map(iteration)
    single_member_dims = (1,)
    G_ensemble = Array{Float64}(undef, single_member_dims..., ensemble_size)

    for m in 1:ensemble_size
        member_path = CAL.path_to_ensemble_member(output_dir, iteration, m)
        simdir_path = joinpath(member_path, "output_active")
        if isdir(simdir_path)
            simdir = SimDir(simdir_path)
            G_ensemble[:, m] .= process_member_data(simdir)
        else
            G_ensemble[:, m] .= NaN
        end
    end
    return G_ensemble
end

Separating out the individual ensemble member output processing often results in more readable code.

function process_member_data(simdir::SimDir)
    isempty(simdir.vars) && return NaN
    rsut =
        get(simdir; short_name = "rsut", reduction = "average", period = "30d")
    return slice(average_xy(rsut); time = 30days).data
end
process_member_data (generic function with 1 method)

Now, we can set up the remaining experiment details:

  • ensemble size, number of iterations
  • the prior distribution
  • the observational data
ensemble_size = 30
n_iterations = 7
noise = 0.1 * I
prior = constrained_gaussian("astronomical_unit", 6e10, 1e11, 2e5, Inf)
ParameterDistribution with 1 entries: 
'astronomical_unit' with EnsembleKalmanProcesses.ParameterDistributions.Constraint{EnsembleKalmanProcesses.ParameterDistributions.BoundedBelow}[Bounds: (200000.0, ∞)] over distribution EnsembleKalmanProcesses.ParameterDistributions.Parameterized(Distributions.Normal{Float64}(μ=24.153036641203013, σ=1.1528837102037748)) 

For a perfect model, we generate observations from the forward model itself. This is most easily done by creating an empty parameter file and running the 0th ensemble member:

@info "Generating observations"
parameter_file = CAL.parameter_path(output_dir, 0, 0)
mkpath(dirname(parameter_file))
touch(parameter_file)
simulation = CAL.forward_model(0, 0)
Simulation 
├── Running on: CPUSingleThreaded
├── Output folder: output/climaatmos_calibration/iteration_000/member_000/output_0000
├── Start date: 2010-01-01T00:00:00
├── Current time: 2.592e6 seconds
└── Stop time: 2.592e6 seconds

Lastly, we process the simulation output with the same function used in the observation map to generate the observations.

observations = Vector{Float64}(undef, 1)
observations .= process_member_data(SimDir(simulation.output_dir))
1-element Vector{Float64}:
 126.61571502685547

Now we are ready to run our calibration, putting it all together using the calibrate function. The WorkerBackend will automatically use all workers available to the main Julia process. Other backends are available for forward models that can't use workers or need to be parallelized internally. The simplest backend is the JuliaBackend, which runs all ensemble members sequentially and does not require Distributed.jl. For more information, see the Backends page.

eki = CAL.calibrate(
    CAL.WorkerBackend,
    ensemble_size,
    n_iterations,
    observations,
    noise,
    prior,
    output_dir,
)
EnsembleKalmanProcesses.EnsembleKalmanProcess{Float64, Int64, EnsembleKalmanProcesses.Inversion{Float64, Nothing, Nothing}, EnsembleKalmanProcesses.DataMisfitController{Float64, String}, EnsembleKalmanProcesses.NesterovAccelerator{Float64}, Vector{EnsembleKalmanProcesses.UpdateGroup}, Nothing}(...)
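If your forward model cannot use workers, the same calibration can be run sequentially by swapping the backend; this sketch assumes the backend type is the only argument that changes:

eki = CAL.calibrate(
    CAL.JuliaBackend,
    ensemble_size,
    n_iterations,
    observations,
    noise,
    prior,
    output_dir,
)

In either case, the returned EnsembleKalmanProcess can be inspected with EnsembleKalmanProcesses.jl utilities, for example get_ϕ_mean_final(prior, eki) to retrieve the final ensemble mean in constrained (physical) space.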

This page was generated using Literate.jl.