DISInclusiveKL

class viabel.DISInclusiveKL(approx, model, num_mc_samples, ess_target, temper_prior, temper_prior_params, use_resampling=True, num_resampling_batches=1, w_clip_threshold=10)

Inclusive Kullback-Leibler divergence using Distilled Importance Sampling.

Attributes:
approx

The approximation family.

model

The model.

num_mc_samples

Number of Monte Carlo samples to use to approximate the objective.

Methods

__call__(var_param)

Evaluate objective and its gradient.

update(var_param, direction)

Update the variational parameter in optimization.
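A minimal sketch of how these two methods combine in one manual optimization step; `objective` is a constructed DISInclusiveKL instance (see the construction sketch after the parameter descriptions below), `var_param` an initial variational parameter, and the step size 0.01 is illustrative. The sketch also assumes update returns the updated parameter, which is a reading of the method description rather than documented behavior.

value, grad = objective(var_param)                     # __call__: objective value and its gradient
var_param = objective.update(var_param, -0.01 * grad)  # step along a scaled negative-gradient direction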

__init__(approx, model, num_mc_samples, ess_target, temper_prior, temper_prior_params, use_resampling=True, num_resampling_batches=1, w_clip_threshold=10)
Parameters:
approx: `ApproximationFamily` object
model: `Model` object
num_mc_samples: `int`

Number of Monte Carlo samples used to approximate the KL divergence (N in the paper).

ess_target: `int`

The effective sample size (ESS) target used to adjust epsilon (M in the paper). It is also the number of samples used in resampling.

temper_prior: `Model` object

A prior distribution used to temper the model. Typically a multivariate normal.

temper_prior_params: `numpy.ndarray` object

Parameters for the temper prior. Typically mean 0 and variance 1.

use_resampling: `bool`

Whether to use resampling.

num_resampling_batches: `int`

Number of resampling batches. The resampling batch size is max(1, ess_target / num_resampling_batches); for example, ess_target=100 with num_resampling_batches=4 gives batches of 25 samples.

w_clip_threshold: `float`

The maximum importance weight; weights above this threshold are clipped.
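A minimal construction sketch under stated assumptions: the target is a toy standard-normal log density; `MFGaussian` and `Model` are taken from the wider viabel API; the tempering prior is given here as a standard-normal `MFGaussian` with parameters encoding mean 0 and variance 1, which is an assumption about the expected format (the description above calls for a `Model` object, so check the viabel source for the exact type and parameter layout).

import numpy as np
from viabel import DISInclusiveKL, MFGaussian, Model  # MFGaussian and Model assumed importable from viabel

dim = 2

# Toy target: log density of a standard 2-D Gaussian (stand-in for a real posterior).
def log_density(x):
    return -0.5 * np.sum(x ** 2, axis=-1)

model = Model(log_density)
approx = MFGaussian(dim)

# Tempering prior with mean 0 and variance 1; the parameter layout
# [means, log-stds] is an assumption.
temper_prior = MFGaussian(dim)
temper_prior_params = np.zeros(2 * dim)

objective = DISInclusiveKL(approx, model,
                           num_mc_samples=100,
                           ess_target=50,
                           temper_prior=temper_prior,
                           temper_prior_params=temper_prior_params,
                           use_resampling=True,
                           num_resampling_batches=1,
                           w_clip_threshold=10)

The resulting objective can then be driven manually through __call__ and update as sketched under Methods, or handed to one of viabel's stochastic optimizers.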