API Reference

MLMC Estimation

MultilevelMonteCarlo.mlmc_estimate — Function
mlmc_estimate(levels, qoi_functions, samples_per_level, draw_parameters;
              parallel=false) -> Vector{Float64}

Compute MLMC estimates of E[qⱼ] for each quantity of interest, using pre-specified sample counts per level. Statistics are accumulated online.

The estimator for each QoI qⱼ is the telescoping sum:

\[\hat{Q}_j = \frac{1}{N_1}\sum_{i=1}^{N_1} q_j\bigl(L_1(p^{(i)})\bigr) + \sum_{l=2}^{L} \frac{1}{N_l}\sum_{i=1}^{N_l} \Bigl[q_j\bigl(L_l(p^{(i)})\bigr) - q_j\bigl(L_{l-1}(p^{(i)})\bigr)\Bigr]\]

For a single level this reduces to standard Monte Carlo.

Arguments

  • levels: [L₁, …, Lₗ] — model evaluators, one per fidelity level. Each Lₗ(params) -> model_output.
  • qoi_functions: [q₁, …, qQ] — QoI extractors. Each qⱼ(model_output) -> scalar.
  • samples_per_level: [N₁, …, Nₗ] — sample counts at each level.
  • draw_parameters: () -> params.

Keyword Arguments

  • parallel::Bool = false: Use threading within each level.

Returns

Vector{Float64} of length Q with estimates [E[q₁], …, E[qQ]].
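A minimal usage sketch. The toy models, QoI extractors, and sample counts below are illustrative, not part of the package:

```julia
using MultilevelMonteCarlo

# Two fidelity levels of a toy model: a cheap noisy solver and a
# more accurate one approximating the same map params -> output.
coarse_model(p) = p^2 + 0.1 * randn()
fine_model(p)   = p^2 + 0.01 * randn()

levels = [coarse_model, fine_model]
qois   = [out -> out, out -> out^2]     # estimate E[output] and E[output²]
draw_parameters() = randn()

# Many cheap coarse samples, few expensive fine corrections.
estimates = mlmc_estimate(levels, qois, [1000, 100], draw_parameters)
```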

MultilevelMonteCarlo.mlmc_estimate_adaptive — Function
mlmc_estimate_adaptive(levels, qoi_functions, draw_parameters,
                       cost_per_level, tolerance;
                       variance_qoi=nothing, initial_samples=100,
                       parallel=false, max_iterations=20)
                       -> (estimates, samples_per_level)

Adaptively compute MLMC estimates by running pilot samples, computing optimal allocation, and iteratively adding samples until the budget is met.

All QoI means and the allocation variance are maintained as online accumulators, so new batches are merged without reprocessing earlier data.

Algorithm

  1. Run initial_samples at every level to obtain pilot variance estimates.
  2. Use optimal_samples_per_level to compute target sample counts.
  3. Draw additional samples at any level where the current count is below target.
  4. Re-estimate variances from all accumulated samples and repeat from step 2.
  5. Converge when no level needs more samples.

Arguments

  • levels: Model evaluators [L₁, …, Lₗ], one per fidelity level.
  • qoi_functions: QoI extractors [q₁, …, qQ], each q(model_output) -> scalar.
  • draw_parameters: () -> params.
  • cost_per_level: Cost per sample at each level.
  • tolerance: Target RMSE.

Keyword Arguments

  • variance_qoi: Scalar v(model_output) -> scalar for allocation. Default: v(out) = Σᵢ qᵢ(out)².
  • initial_samples::Int = 100: Pilot samples per level (≥ 2).
  • parallel::Bool = false: Thread samples within each level.
  • max_iterations::Int = 20: Safety limit on refinement iterations.

Returns

(estimates, samples_per_level) where estimates::Vector{Float64} has length Q and samples_per_level::Vector{Int} is the final allocation.
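A usage sketch under assumed names: `coarse_solve`, `fine_solve`, and `mean_temperature` stand in for user-supplied model evaluators and a QoI extractor.

```julia
using MultilevelMonteCarlo

levels = [p -> coarse_solve(p), p -> fine_solve(p)]   # hypothetical solvers
qois   = [out -> mean_temperature(out)]               # hypothetical extractor

estimates, n_per_level = mlmc_estimate_adaptive(
    levels, qois, () -> rand(3),
    [1.0, 16.0],          # cost_per_level: fine samples cost 16x more
    1e-3;                 # tolerance: target RMSE
    initial_samples = 50,
    parallel = true)
# n_per_level reports the final allocation the refinement loop settled on.
```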

MultilevelMonteCarlo.mlmc_estimate_vector_qoi — Function
mlmc_estimate_vector_qoi(levels, qoi_function, samples_per_level, draw_parameters;
                          parallel=false) -> Vector{Float64}

MLMC estimator for a single vector-valued QoI function. The function must return a vector of the same length for every evaluation. Statistics are accumulated online per component using the standard telescoping sum.

Arguments

  • levels: [L₁, …, Lₗ] — model evaluators, one per fidelity level.
  • qoi_function: q(model_output) -> AbstractVector — a single QoI that returns a fixed-length vector.
  • samples_per_level: [N₁, …, Nₗ] — sample counts at each level.
  • draw_parameters: () -> params.

Keyword Arguments

  • parallel::Bool = false: Use threading within each level.

Returns

Vector{Float64} of the same length as qoi_function's output — the component-wise MLMC mean estimate.
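A usage sketch; the solvers are hypothetical and the QoI returns a fixed-length vector (here, a 10-point slice of the model output):

```julia
using MultilevelMonteCarlo

# One QoI returning a length-10 profile for every evaluation.
profile_qoi(out) = out[1:10]

means = mlmc_estimate_vector_qoi(
    [coarse_solve, fine_solve],   # hypothetical evaluators
    profile_qoi,
    [2000, 200], () -> randn(5))
# length(means) == 10: one MLMC mean per component.
```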

MultilevelMonteCarlo.evaluate_monte_carlo — Function
evaluate_monte_carlo(n_samples, draw_parameters, level, qoi_functions;
                     parallel=false) -> Vector{Float64}

Standard (single-level) Monte Carlo estimator for multiple QoIs.

Arguments

  • n_samples: Number of samples.
  • draw_parameters: () -> params.
  • level: Model evaluator L(params) -> model_output.
  • qoi_functions: [q₁, …, qQ] — QoI extractors.

Returns

Vector{Float64} of length Q with estimates [E[q₁], …, E[qQ]].
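A usage sketch with a single (finest) model; `fine_solve` is a placeholder for a user-supplied evaluator:

```julia
using MultilevelMonteCarlo

# Plain single-level MC: 10_000 evaluations of one model,
# two scalar QoIs extracted from each output.
est = evaluate_monte_carlo(10_000, () -> randn(), fine_solve,
                           [out -> out, out -> out^2])
```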

evaluate_monte_carlo(n_samples, draw_parameters, qoi_function;
                     parallel=false) -> Float64

Standard Monte Carlo estimator of E[Q] for a single scalar QoI.

MultilevelMonteCarlo.evaluate_monte_carlo_vector_qoi — Function
evaluate_monte_carlo_vector_qoi(n_samples, draw_parameters, level, qoi_function;
                                 parallel=false) -> Vector{Float64}

Standard (single-level) Monte Carlo estimator for a single vector-valued QoI.

Arguments

  • n_samples: Number of samples.
  • draw_parameters: () -> params.
  • level: Model evaluator L(params) -> model_output.
  • qoi_function: q(model_output) -> AbstractVector — must return a fixed-length vector.

Returns

Vector{Float64} — the component-wise MC mean estimate.


Sample Allocation

MultilevelMonteCarlo.variance_per_level — Function
variance_per_level(levels, qoi_functions, samples_per_level, draw_parameters;
                   variance_qoi=nothing, parallel=false)
                   -> (variances_corrections, variances)

Estimate the variance of a scalar QoI and its multilevel corrections at each level. Statistics are computed online (Welford's algorithm).

The returned variances drive optimal sample allocation via optimal_samples_per_level. A single scalar variance_qoi is used for allocation; the individual qoi_functions are only needed to construct the default variance_qoi.

Arguments

  • levels: Vector of model evaluators [L₁, …, Lₗ], one per fidelity level. Each Lₗ(params) -> model_output runs the forward model.
  • qoi_functions: Vector of QoI extractors [q₁, …, qQ]. Each qⱼ(model_output) -> scalar. Used to construct the default variance_qoi.
  • samples_per_level: Pilot sample counts, one per level.
  • draw_parameters: () -> params.

Keyword Arguments

  • variance_qoi: v(model_output) -> scalar whose correction variance drives allocation. Default: v(out) = Σᵢ qᵢ(out)².
  • parallel::Bool = false: Use threading within each level.

Returns

(variances_corrections, variances):

  • variances_corrections[l] = Var[v(Lₗ₊₁(p)) − v(Lₗ(p))] for l = 1…L−1
  • variances[l] = Var[v(Lₗ(p))] for l = 1…L

For a single-level ensemble variances_corrections is an empty vector.
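A pilot-run sketch (evaluators hypothetical); the two return values feed directly into optimal_samples_per_level:

```julia
using MultilevelMonteCarlo

vc, v = variance_per_level(
    [coarse_solve, fine_solve],      # hypothetical evaluators
    [out -> out],                    # single scalar QoI
    [100, 100],                      # pilot samples per level
    () -> randn())
# v  has length 2: per-level variances of the variance QoI.
# vc has length 1: variance of the level-2 correction.
```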

MultilevelMonteCarlo.optimal_samples_per_level — Function
optimal_samples_per_level(variances_corrections, variances, cost_per_level, tolerance)

Compute optimal sample counts using the classical MLMC allocation formula that minimises total work for a target mean-square-error budget.

Arguments

  • variances_corrections: Var[correction] at levels 2…L (length L−1).
  • variances: Var[QoI] at each level (length L). Only variances[1] is used directly; levels 2+ draw from variances_corrections.
  • cost_per_level: Cost per sample at each level (length L).
  • tolerance: Target RMSE. Variance budget = tolerance² / 2.

Returns

Vector of optimal sample counts (integers).
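Writing V₁ = variances[1], Vₗ = variances_corrections[l−1] for l ≥ 2, Cₗ = cost_per_level[l], and ε = tolerance, the standard allocation (a sketch; the implementation may round or clamp differently) is

\[N_l = \left\lceil \frac{2}{\varepsilon^2} \sqrt{\frac{V_l}{C_l}} \sum_{m=1}^{L} \sqrt{V_m C_m} \right\rceil\]

which minimises total cost $\sum_l N_l C_l$ subject to the variance budget $\sum_l V_l / N_l \le \varepsilon^2/2$.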


Sample Collection & Storage

MultilevelMonteCarlo.MLMCSamples — Type
MLMCSamples

Container holding raw QoI samples from an MLMC run.

Fields

  • fine::Vector{Matrix{Float64}}: fine[lvl] is (n_qois × N_lvl) — QoI values evaluated at the fine level.
  • coarse::Vector{Matrix{Float64}}: coarse[lvl] is (n_qois × N_lvl) — QoI values evaluated at the coarse level (lvl-1). For level 1 this is a matrix of zeros (no coarser level exists).
  • corrections::Vector{Matrix{Float64}}: corrections[lvl] = fine[lvl] - coarse[lvl].
  • n_levels::Int: Number of levels L.
  • n_qois::Int: Number of QoI rows in each matrix.
MultilevelMonteCarlo.mlmc_sample — Function
mlmc_sample(levels, qoi_functions, samples_per_level, draw_parameters;
            parallel=false, netcdf_path=nothing) -> MLMCSamples

Run an MLMC sampling campaign and return (or save) all raw QoI samples.

For each level l ≥ 2 and each sample i, the model is evaluated at the fine level l and the coarse level l−1 with the same parameters, and every QoI extractor is applied to both outputs. At level 1 only the single level is evaluated and the coarse entries are zero.

Arguments

  • levels, qoi_functions, samples_per_level, draw_parameters: same as mlmc_estimate.

Keyword Arguments

  • parallel::Bool = false: thread samples within each level.
  • netcdf_path::Union{String,Nothing} = nothing: if a file path is given, all samples are written to a single NetCDF-4 file in addition to being returned.

Returns

An MLMCSamples struct containing fine, coarse, and corrections matrices for every level.
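A usage sketch (levels, QoIs, and the output path are illustrative):

```julia
using MultilevelMonteCarlo

samples = mlmc_sample(levels, qois, [500, 50], draw_parameters;
                      netcdf_path = "run1.nc")   # also written to disk
samples.corrections[2]   # (n_qois × 50) level-2 correction matrix
```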


PDF & CDF Estimation

MultilevelMonteCarlo.estimate_pdf_mlmc_kernel_density — Function
estimate_pdf_mlmc_kernel_density(samples::MLMCSamples, qoi_index::Int;
                                 bandwidth=nothing) -> Function

Construct a kernel density estimate of the PDF for QoI qoi_index from MLMC samples, using the multilevel telescoping-sum structure.

The estimator is:

\[\hat{f}(u) = \frac{1}{N_1} \sum_{i=1}^{N_1} K_h(u - Q_1^{(i)}) + \sum_{l=2}^{L} \frac{1}{N_l} \sum_{i=1}^{N_l} \bigl[K_h(u - Q_l^{(i)}) - K_h(u - Q_{l-1}^{(i)})\bigr]\]

where $K_h(x) = \frac{1}{h} \varphi(x/h)$ is a Gaussian kernel with bandwidth $h$.

Arguments

  • samples: An MLMCSamples struct.
  • qoi_index: Which QoI (row index) to estimate the PDF for.

Keyword Arguments

  • bandwidth: Kernel bandwidth. If nothing, Silverman's rule of thumb is used based on the finest-level fine samples.

Returns

A callable f(u) -> Float64 that evaluates the estimated PDF at point u.
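A usage sketch, assuming `samples` was produced by mlmc_sample:

```julia
using MultilevelMonteCarlo

pdf = estimate_pdf_mlmc_kernel_density(samples, 1)   # PDF of QoI 1
pdf(0.0)                      # density at u = 0
xs = range(-3, 3; length = 200)
density = pdf.(xs)            # evaluate on a grid, e.g. for plotting
```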

MultilevelMonteCarlo.estimate_cdf_mlmc_kernel_density — Function
estimate_cdf_mlmc_kernel_density(samples::MLMCSamples, qoi_index::Int;
                                 bandwidth=nothing) -> Function

Construct a kernel CDF estimate for QoI qoi_index from MLMC samples.

Uses the same MLMC telescoping sum as the PDF estimator, but with the Gaussian CDF kernel $\Phi((u - x)/h)$ instead of the PDF kernel.

Returns

A callable F(u) -> Float64 that evaluates the estimated CDF at point u.

MultilevelMonteCarlo.estimate_pdf_maxent — Function
estimate_pdf_maxent(samples::MLMCSamples, qoi_index::Int;
                    R::Int=10, maxiter::Int=100, tol::Float64=1e-12,
                    support=nothing) -> (pdf_func, λ, a, b)

Estimate the PDF via the Maximum Entropy method using moments estimated from MLMC samples.

The PDF is approximated as:

\[\tilde{f}_U(u) = \exp\!\left(\sum_{k=0}^{R} \lambda_k \phi_k\!\left(\frac{2(u-a)}{b-a} - 1\right)\right)\]

where $\phi_k$ are orthonormal Legendre polynomials on $[-1,1]$, and $\lambda_k$ are determined by matching generalized moments via Newton's method. The Jacobian is computed using ForwardDiff.jl.

Arguments

  • samples: MLMCSamples struct.
  • qoi_index: Which QoI to estimate the PDF for.

Keyword Arguments

  • R::Int = 10: Number of polynomial moments (polynomial degree).
  • maxiter::Int = 100: Maximum Newton iterations.
  • tol::Float64 = 1e-12: Newton convergence tolerance (‖residual‖₂).
  • support: Tuple (a, b) for the support interval. If nothing, estimated from the sample range with 10% padding.

Returns

A tuple (pdf_func, λ, a, b) where:

  • pdf_func(u) evaluates the MaxEnt PDF at u (in original coordinates).
  • λ is the coefficient vector of length R+1.
  • a, b are the support bounds.
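A usage sketch, assuming `samples` was produced by mlmc_sample; the support interval here is illustrative:

```julia
using MultilevelMonteCarlo

pdf, λ, a, b = estimate_pdf_maxent(samples, 1; R = 8,
                                   support = (-4.0, 4.0))
pdf((a + b) / 2)     # evaluate at the midpoint of the support
# length(λ) == R + 1 == 9 coefficients
```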
MultilevelMonteCarlo.estimate_cdf_maxent — Function
estimate_cdf_maxent(samples::MLMCSamples, qoi_index::Int; kwargs...)
                    -> (cdf_func, λ, a, b)

Estimate the CDF via the Maximum Entropy method. First computes the MaxEnt PDF via estimate_pdf_maxent, then integrates it numerically.

All keyword arguments are forwarded to estimate_pdf_maxent.

Returns

A tuple (cdf_func, λ, a, b) where cdf_func(u) evaluates the CDF at u.


2-D PDF & CDF Estimation

MultilevelMonteCarlo.estimate_pdf_mlmc_kernel_density_2d — Function
estimate_pdf_mlmc_kernel_density_2d(samples::MLMCSamples,
    qoi_indices::NTuple{2,Int}; bandwidth=nothing) -> Function

Construct a 2-D kernel density estimate of the joint PDF for two QoIs using the MLMC telescoping sum with a bivariate Gaussian kernel:

\[K_{\mathbf{h}}(\mathbf{x}) = \frac{1}{h_1 h_2} \varphi\!\left(\frac{x_1}{h_1}\right) \varphi\!\left(\frac{x_2}{h_2}\right)\]

Arguments

  • samples: An MLMCSamples struct.
  • qoi_indices: Tuple (j₁, j₂) of the two QoI row indices.

Keyword Arguments

  • bandwidth: Per-dimension bandwidths (h₁, h₂). Default: Silverman's rule per dimension using finest-level fine samples.

Returns

A callable f(x₁, x₂) -> Float64 evaluating the estimated joint PDF.

MultilevelMonteCarlo.estimate_cdf_mlmc_kernel_density_2d — Function
estimate_cdf_mlmc_kernel_density_2d(samples::MLMCSamples,
    qoi_indices::NTuple{2,Int}; bandwidth=nothing) -> Function

Construct a 2-D kernel CDF estimate for two QoIs using product Gaussian CDF kernels and the MLMC telescoping sum:

\[\hat{F}(x_1, x_2) = \frac{1}{N_1} \sum_{i=1}^{N_1} \Phi\!\left(\frac{x_1 - Q_{1,1}^{(i)}}{h_1}\right) \Phi\!\left(\frac{x_2 - Q_{1,2}^{(i)}}{h_2}\right) + \sum_{l=2}^{L} \frac{1}{N_l}\sum_{i=1}^{N_l}\bigl[\cdots\bigr]\]

Returns

A callable F(x₁, x₂) -> Float64 evaluating the estimated joint CDF.

MultilevelMonteCarlo.estimate_pdf_maxent_2d — Function
estimate_pdf_maxent_2d(samples::MLMCSamples, qoi_indices::NTuple{2,Int};
    R::Int=4, maxiter::Int=200, tol::Float64=1e-10,
    support=nothing) -> (pdf_func, Λ, a, b)

Estimate the 2-D joint PDF via Maximum Entropy using tensor-product Legendre moments estimated from MLMC samples.

The PDF is:

\[f(x,y) = \frac{4}{(b_1-a_1)(b_2-a_2)} \exp\!\left(\sum_{k=0}^{R}\sum_{l=0}^{R} \Lambda_{kl}\, P_k(\xi_1)\, P_l(\xi_2)\right)\]

where $\xi_i = 2(x_i - a_i)/(b_i - a_i) - 1$ maps to $[-1,1]$.

Arguments

  • samples: An MLMCSamples struct.
  • qoi_indices: Tuple (j₁, j₂) of the two QoI row indices.

Keyword Arguments

  • R::Int = 4: Polynomial degree per dimension. Total parameters = (R+1)².
  • maxiter, tol: Newton iteration controls.
  • support: ((a₁,b₁), (a₂,b₂)). Default: sample range + 10% padding.

Returns

(pdf_func, Λ, (a₁,b₁), (a₂,b₂)) where pdf_func(x₁,x₂) evaluates the PDF.

MultilevelMonteCarlo.estimate_cdf_maxent_2d — Function
estimate_cdf_maxent_2d(samples::MLMCSamples, qoi_indices::NTuple{2,Int};
    kwargs...) -> (cdf_func, Λ, (a₁,b₁), (a₂,b₂))

Estimate the 2-D joint CDF via the Maximum Entropy method. Computes the MaxEnt PDF via estimate_pdf_maxent_2d and integrates it numerically.

Returns

(cdf_func, Λ, bounds₁, bounds₂) where cdf_func(x₁,x₂) evaluates the CDF.


Rank Histograms

MultilevelMonteCarlo.ml_gregory_resample — Function
ml_gregory_resample(samples::MLMCSamples, qoi_index::Int,
                    number_of_resamples::Int) -> Vector{Float64}

Generate resampled values from an MLMC ensemble using Gregory's multilevel inverse-CDF method.

Fine and coarse QoI samples are sorted independently at each level to form per-level empirical quantile functions, which are then combined via the MLMC telescoping sum:

\[\hat{Q}^{-1}(u) = \hat{F}_1^{-1}(u) + \sum_{l=2}^{L}\bigl[\hat{F}_l^{-1}(u) - \hat{G}_l^{-1}(u)\bigr]\]

where $\hat{F}_l^{-1}$ and $\hat{G}_l^{-1}$ are the empirical quantile functions of the fine and coarse samples at level $l$. Random draws $u \sim \mathrm{Uniform}(0,1)$ are passed through this inverse CDF to produce resamples.

Arguments

  • samples: An MLMCSamples struct.
  • qoi_index: Which QoI (row index) to resample.
  • number_of_resamples: How many i.i.d. resampled values to generate.

Returns

Vector{Float64} of resampled QoI values.
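A usage sketch (levels, QoIs, and sample counts illustrative):

```julia
using MultilevelMonteCarlo

samples = mlmc_sample(levels, qois, [500, 50], draw_parameters)
vals = ml_gregory_resample(samples, 1, 10_000)
# `vals` approximates i.i.d. draws from the MLMC-implied
# distribution of QoI 1.
```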

MultilevelMonteCarlo.rank_histogram_gregory — Function
rank_histogram_gregory(observations, levels, qoi_functions,
                       samples_per_level, draw_parameters;
                       number_of_resamples=1000, parallel=false)
                       -> Vector{Int}

Compute a rank histogram using Gregory's MLMC inverse-CDF resampling for a given list of observations.

For each observation $y$ in observations:

  1. Run a fresh MLMC sampling campaign via mlmc_sample.
  2. Generate number_of_resamples values from the MLMC ensemble using ml_gregory_resample.
  3. Sort the resampled array and record the rank of $y$.

If the MLMC ensemble correctly represents the distribution of the observations, the returned ranks are approximately $\mathrm{Uniform}\{1,\ldots,N_r+1\}$ where $N_r$ = number_of_resamples.

Arguments

  • observations: Vector{<:Real} of observed values to rank.
  • levels: Model evaluators [L₁, …, L_L].
  • qoi_functions: [q(model_output) -> scalar] — a vector with a single QoI.
  • samples_per_level: Sample counts for each MLMC ensemble.
  • draw_parameters: () -> params.

Keyword Arguments

  • number_of_resamples::Int = 1000: Resampled values per ensemble.
  • parallel::Bool = false: Thread MLMC sampling within each ensemble.

Returns

Vector{Int} of ranks, each in $\{1,\ldots,N_r+1\}$.

rank_histogram_gregory(levels, qoi_function, draw_parameters,
                       number_of_rank_samples, samples_per_level;
                       number_of_resamples=1000, parallel=false)
                       -> Vector{Int}

Convenience wrapper that draws number_of_rank_samples observations from the finest level and then calls the observation-based rank_histogram_gregory.

MultilevelMonteCarlo.rank_histogram_cdf — Function
rank_histogram_cdf(observations, levels, qoi_functions,
                   samples_per_level, cdf_method, draw_parameters;
                   parallel=false) -> Vector{Float64}

Compute a probability-integral-transform (PIT) histogram using an MLMC-based CDF estimate for a given list of observations.

For each observation $y$ in observations:

  1. Run a fresh MLMC sampling campaign.
  2. Estimate the CDF $\hat{F}$ from the MLMC samples using cdf_method.
  3. Record $\hat{F}(y)$.

If the CDF estimate is well calibrated for the observations, the returned PIT values are approximately $\mathrm{Uniform}(0,1)$.

Arguments

  • observations: Vector{<:Real} of observed values to evaluate.
  • levels: Model evaluators [L₁, …, L_L].
  • qoi_functions: [q(model_output) -> scalar] — a vector with a single QoI.
  • samples_per_level: Sample counts for each MLMC ensemble.
  • cdf_method: (samples::MLMCSamples, qoi_index::Int) -> cdf_func where cdf_func(x)::Float64. Pass estimate_cdf_mlmc_kernel_density for KDE, or (s, i) -> first(estimate_cdf_maxent(s, i; R=6)) for MaxEnt.
  • draw_parameters: () -> params.

Keyword Arguments

  • parallel::Bool = false: Thread MLMC sampling within each ensemble.

Returns

Vector{Float64} of PIT values, each ideally in $[0, 1]$.
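A usage sketch; `observations`, `levels`, and `draw_parameters` are placeholders for user-supplied inputs:

```julia
using MultilevelMonteCarlo

pit = rank_histogram_cdf(
    observations,                       # hypothetical Vector{Float64}
    levels, [out -> out], [500, 50],
    estimate_cdf_mlmc_kernel_density,   # KDE-based cdf_method
    draw_parameters)
# A histogram of `pit` should be approximately flat on [0, 1]
# if the ensemble is calibrated.
```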

rank_histogram_cdf(levels, qoi_function, draw_parameters,
                   number_of_rank_samples, samples_per_level,
                   cdf_method; parallel=false) -> Vector{Float64}

Convenience wrapper that draws number_of_rank_samples observations from the finest level and then calls the observation-based rank_histogram_cdf.


Multivariate Rank Histogram

MultilevelMonteCarlo.estimate_cdf_multivariate_mlmc_kernel_density — Function
estimate_cdf_multivariate_mlmc_kernel_density(samples::MLMCSamples,
    qoi_indices::AbstractVector{<:Integer}; bandwidth=nothing) -> Function

Estimate the joint multivariate CDF for the specified QoIs from MLMC samples using a product of Gaussian-CDF kernels with the MLMC telescoping sum.

The estimator uses a product of univariate Gaussian CDF kernels:

\[\hat{F}(\mathbf{x}) = \frac{1}{N_1} \sum_{i=1}^{N_1} \prod_{k=1}^{d} \Phi\!\left(\frac{x_k - Q_{1,k}^{(i)}}{h_k}\right) + \sum_{l=2}^{L} \frac{1}{N_l} \sum_{i=1}^{N_l} \left[\prod_{k} \Phi\!\left(\frac{x_k - F_{l,k}^{(i)}}{h_k}\right) - \prod_{k} \Phi\!\left(\frac{x_k - C_{l,k}^{(i)}}{h_k}\right)\right]\]

where $\Phi$ is the standard normal CDF and $h_k$ is the per-dimension bandwidth (Silverman's rule by default).

Arguments

  • samples: An MLMCSamples struct.
  • qoi_indices: Which QoI rows to include as dimensions of the joint CDF.

Keyword Arguments

  • bandwidth: Per-dimension bandwidths as a vector. If nothing, Silverman's rule is applied per dimension using finest-level fine samples.

Returns

A callable F(x::AbstractVector) -> Float64 evaluating the estimated joint CDF.

MultilevelMonteCarlo.ml_bootstrap_resample_multivariate — Function
ml_bootstrap_resample_multivariate(samples::MLMCSamples,
    qoi_indices::AbstractVector{<:Integer},
    number_of_resamples::Int) -> Matrix{Float64}

Generate multivariate resamples from an MLMC ensemble using a bootstrap-style telescoping sum.

For each resample, a random sample index is drawn independently at each level (shared across all QoI dimensions within that level), and the MLMC telescoping sum is formed:

\[\hat{\mathbf{x}} = \mathbf{F}_1^{(i_1)} + \sum_{l=2}^{L} \bigl[\mathbf{F}_l^{(i_l)} - \mathbf{C}_l^{(i_l)}\bigr]\]

This preserves within-sample correlations between QoI dimensions at each level.

Arguments

  • samples: An MLMCSamples struct.
  • qoi_indices: Which QoI rows to include as dimensions.
  • number_of_resamples: How many resampled vectors to generate.

Returns

Matrix{Float64} of size (d, number_of_resamples).

MultilevelMonteCarlo.multivariate_rank_histogram — Function
multivariate_rank_histogram(observations, levels, qoi_functions,
    samples_per_level, draw_parameters;
    number_of_resamples=1000, cdf_method=nothing, parallel=false)
    -> Vector{Float64}

Compute a multivariate rank histogram (MRH) using the MLMC-estimated multivariate CDF for a given matrix of observations.

Implements the approach of Gneiting et al. (2008): the multivariate rank of an observation $\mathbf{x}_0$ relative to an ensemble is

\[\mathrm{rank}_{\mathrm{MRH}}(\mathbf{x}_0, \hat{F}_m) = \hat{F}_{G}\bigl(G(\mathbf{x}_0)\bigr)\]

where $G(\mathbf{x}) = \hat{F}(\mathbf{x})$ is the multivariate CDF and $\hat{F}_G$ is the empirical CDF of the $G$-values over the ensemble.

For each observation column in observations:

  1. Run mlmc_sample to obtain MLMC samples.
  2. Estimate $G = \hat{F}$ via the chosen CDF method.
  3. Generate an ensemble via ml_bootstrap_resample_multivariate.
  4. Compute $g_0 = G(\mathbf{y})$ and $g_j = G(\mathbf{x}_j)$ for each ensemble member.
  5. Record the PIT value $\hat{F}_G(g_0) = \frac{1}{m}\#\{j : g_j \le g_0\}$.

If the MLMC ensemble is well calibrated, the returned PIT values are approximately $\mathrm{Uniform}(0,1)$.

Arguments

  • observations: Matrix{<:Real} of size (d, n) — each column is one d-dimensional observation.
  • levels: Model evaluators [L₁, …, L_L]. Each returns a d-dimensional output (or an object from which qoi_functions extract d scalars).
  • qoi_functions: [q₁, …, q_d] — d QoI extractors, each qⱼ(model_output) -> scalar. Currently must be exactly 2 QoIs.
  • samples_per_level: MLMC sample counts per level.
  • draw_parameters: () -> params.

Keyword Arguments

  • number_of_resamples::Int = 1000: Ensemble size for each rank evaluation.
  • cdf_method: (samples, (j₁,j₂)) -> F̂(x₁,x₂), a callable that takes MLMCSamples and a tuple of QoI indices and returns a 2-D CDF function. Default: estimate_cdf_mlmc_kernel_density_2d. For MaxEnt pass (s, idx) -> first(estimate_cdf_maxent_2d(s, idx; R=4)).
  • parallel::Bool = false: Thread MLMC sampling within each ensemble.

Returns

Vector{Float64} of PIT values, each ideally in $[0, 1]$.
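A usage sketch; `levels` and `draw_parameters` are placeholders, the observations are synthetic, and the MaxEnt cdf_method closure follows the form given above:

```julia
using MultilevelMonteCarlo

obs = randn(2, 20)    # 20 two-dimensional observations (illustrative)
pit = multivariate_rank_histogram(
    obs, levels, [out -> out[1], out -> out[2]],
    [500, 50], draw_parameters;
    cdf_method = (s, idx) -> first(estimate_cdf_maxent_2d(s, idx; R = 4)))
```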

multivariate_rank_histogram(levels, qoi_functions, draw_parameters,
    number_of_rank_samples, samples_per_level;
    number_of_resamples=1000, cdf_method=nothing, parallel=false)
    -> Vector{Float64}

Convenience wrapper that draws number_of_rank_samples observations from the finest level and then calls the observation-based multivariate_rank_histogram.


Online Statistics (Internals)

MultilevelMonteCarlo.OnlineAccumulator — Type
OnlineAccumulator()

Welford's online accumulator for computing running mean and variance in a single pass without storing individual samples.

Two accumulators can be combined with merge (Chan's parallel algorithm), making this safe for threaded use with per-thread instances.
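A minimal sketch of per-thread accumulation followed by a merge; it assumes OnlineAccumulator and update! are accessible from the package namespace:

```julia
using MultilevelMonteCarlo: OnlineAccumulator, update!

# Two independent accumulators (e.g. one per thread), then merge.
a, b = OnlineAccumulator(), OnlineAccumulator()
for x in 1:5
    update!(a, Float64(x))
end
for x in 6:10
    update!(b, Float64(x))
end
merged = merge(a, b)   # same statistics as one pass over 1:10
```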

Base.merge — Method

Merge two accumulators using Chan's parallel algorithm.

MultilevelMonteCarlo.accumulate_samples — Function
accumulate_samples(f, n; parallel=false) -> OnlineAccumulator

Call f() exactly n times and accumulate scalar return values online. When parallel=true, per-thread accumulators are merged via Chan's algorithm.

acc = accumulate_samples(1000; parallel=true) do
    my_expensive_computation()
end
MultilevelMonteCarlo.accumulate_vector_samples — Function
accumulate_vector_samples(body!, n, k; parallel=false) -> Vector{OnlineAccumulator}

Execute body!(accs) exactly n times, where accs is a k-element vector of OnlineAccumulators. The caller should call update!(accs[j], val) inside body! to record observations. When parallel=true each thread gets its own accumulator vector; results are merged afterwards.

This avoids per-sample allocation — only the accumulators are mutated.

accs = accumulate_vector_samples(1000, 2; parallel=true) do accs
    x = rand()
    update!(accs[1], x)
    update!(accs[2], x^2)
end