Inference Library

Importance Sampling

(traces, log_norm_weights, lml_est) = importance_sampling(model::GenerativeFunction,
    model_args::Tuple, observations::ChoiceMap, num_samples::Int)

(traces, log_norm_weights, lml_est) = importance_sampling(model::GenerativeFunction,
    model_args::Tuple, observations::ChoiceMap,
    proposal::GenerativeFunction, proposal_args::Tuple,
    num_samples::Int)

Run importance sampling, returning a vector of traces with associated log weights.

The log weights are normalized. Also return an estimate of the log marginal likelihood of the observations (lml_est). The observations are a choice map of addresses (and values) that must be sampled by the model under the given model arguments. The first variant uses the internal proposal distribution of the model. The second variant uses a custom proposal distribution defined by the given generative function. All addresses of random choices sampled by the proposal should also be sampled by the model function.

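For example, a minimal sketch of the first variant on a toy model (the model, its argument, and the observed value are hypothetical, and the ~ address syntax assumes a recent version of the modeling DSL):

using Gen

@gen function model(x::Float64)
    slope ~ normal(0, 1)            # latent quantity to infer
    y ~ normal(slope * x, 0.1)      # observed quantity
end

observations = choicemap((:y, 2.5)) # constrain the observed address :y

# 1000 weighted traces using the model's internal proposal
(traces, log_norm_weights, lml_est) = importance_sampling(model, (1.0,), observations, 1000)

# posterior expectation of :slope under the normalized weights
posterior_mean = sum(exp(lw) * tr[:slope] for (tr, lw) in zip(traces, log_norm_weights))
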
(trace, lml_est) = importance_resampling(model::GenerativeFunction,
    model_args::Tuple, observations::ChoiceMap, num_samples::Int)

(traces, lml_est) = importance_resampling(model::GenerativeFunction,
    model_args::Tuple, observations::ChoiceMap,
    proposal::GenerativeFunction, proposal_args::Tuple,
    num_samples::Int)

Run sampling importance resampling, returning a single trace.

Unlike importance_sampling, the memory used is constant in the number of samples.

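A sketch of the second variant, reusing the hypothetical model and observations from the importance sampling example above; note that the custom proposal takes only its own arguments (not a trace), and must sample only addresses that the model also samples:

# A hand-written proposal for :slope, centered near the observed value.
@gen function slope_proposal(x::Float64, y_obs::Float64)
    slope ~ normal(y_obs / x, 0.5)
end

(trace, lml_est) = importance_resampling(model, (1.0,), observations,
                                         slope_proposal, (1.0, 2.5), 1000)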

Markov Chain Monte Carlo

The following inference library methods take a trace and return a new trace.

(new_trace, accepted) = metropolis_hastings(trace, selection::AddressSet)

Perform a Metropolis-Hastings update that proposes new values for the selected addresses from the internal proposal (often using ancestral sampling).

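For example, a sketch of a simple MCMC loop over the hypothetical model from the importance sampling example, resimulating the selected address from the internal proposal at each iteration:

function do_mcmc(num_iters::Int)
    # initial trace consistent with the observations
    (trace, _) = generate(model, (1.0,), observations)
    for iter in 1:num_iters
        (trace, accepted) = metropolis_hastings(trace, select(:slope))
    end
    trace
end

trace = do_mcmc(100)
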
(new_trace, accepted) = metropolis_hastings(trace, proposal::GenerativeFunction, proposal_args::Tuple)

Perform a Metropolis-Hastings update that proposes new values for some subset of random choices in the given trace using the given proposal generative function.

The proposal generative function should take as its first argument the current trace of the model, and remaining arguments proposal_args. If the proposal modifies addresses that determine the control flow in the model, values must be provided by the proposal for any addresses that are newly sampled by the model.

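For example, a sketch of a Gaussian drift proposal over the hypothetical :slope address, continuing from the trace produced above; the current model trace is always passed as the proposal's first argument:

@gen function drift_proposal(trace, step_size::Float64)
    slope ~ normal(trace[:slope], step_size)   # random walk around the current value
end

(trace, accepted) = metropolis_hastings(trace, drift_proposal, (0.2,))
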
(new_trace, accepted) = metropolis_hastings(trace, proposal::GenerativeFunction, proposal_args::Tuple, involution::Function)

Perform a generalized Metropolis-Hastings update based on an involution (bijection that is its own inverse) on a space of assignments.

The `involution` Julia function has the following signature:

(new_trace, bwd_choices::ChoiceMap, weight) = involution(trace, fwd_choices::ChoiceMap, fwd_ret, proposal_args::Tuple)

The generative function proposal is executed on arguments (trace, proposal_args...), producing an assignment fwd_choices and return value fwd_ret. For each value of model arguments (contained in trace) and proposal_args, the involution function applies an involution that maps the tuple (get_choices(trace), fwd_choices) to the tuple (get_choices(new_trace), bwd_choices). Note that fwd_ret is a deterministic function of fwd_choices and proposal_args. When only discrete random choices are used, the weight must be equal to get_score(new_trace) - get_score(trace).

Including Continuous Random Choices

When continuous random choices are used, the weight must include an additive term equal to the log absolute value of the determinant of the Jacobian of the bijection on the continuous random choices obtained by currying the involution on the discrete random choices.

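Below is a self-contained sketch of the involution variant for a purely discrete move (all names are hypothetical, and it assumes using Gen). The forward proposal makes no random choices; the involution flips the discrete choice :z, so applying it twice recovers the original trace and both the forward and backward choice maps are empty. Newer Gen releases express involutions with a trace-transform DSL instead; this sketch follows the function signature documented here.

@gen function mix_model()
    z ~ bernoulli(0.5)
    y ~ normal(z ? -2.0 : 2.0, 1.0)
end

@gen function flip_proposal(trace)
    nothing                       # a deterministic move: no forward choices
end

function flip_involution(trace, fwd_choices, fwd_ret, proposal_args)
    constraints = choicemap((:z, !trace[:z]))
    # Only a discrete choice changes, so the required weight is
    # get_score(new_trace) - get_score(trace), which is what update
    # returns for a fully constrained discrete change.
    (new_trace, weight, _, _) = update(trace, get_args(trace), (), constraints)
    (new_trace, choicemap(), weight)
end

(trace, _) = generate(mix_model, (), choicemap((:y, 0.3)))
(trace, accepted) = metropolis_hastings(trace, flip_proposal, (), flip_involution)
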
(new_trace, accepted) = mh(trace, selection::AddressSet)
(new_trace, accepted) = mh(trace, proposal::GenerativeFunction, proposal_args::Tuple)
(new_trace, accepted) = mh(trace, proposal::GenerativeFunction, proposal_args::Tuple, involution::Function)

Alias for metropolis_hastings. Perform a Metropolis-Hastings update on the given trace.

(new_trace, accepted) = mala(trace, selection::AddressSet, tau::Real)

Apply a Metropolis-Adjusted Langevin Algorithm (MALA) update to the selected continuous random choices, with step size tau.

(new_trace, accepted) = hmc(trace, selection::AddressSet, mass=0.1, L=10, eps=0.1)

Apply a Hamiltonian Monte Carlo (HMC) update.

Neal, Radford M. "MCMC using Hamiltonian dynamics." Handbook of Markov Chain Monte Carlo 2.11 (2011): 2.

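A sketch applying both gradient-based kernels to the hypothetical continuous :slope address from the earlier examples (the selected choices must support gradients); tau is the MALA step size, and the HMC defaults shown in the signature above are used:

selection = select(:slope)
(trace, accepted) = mala(trace, selection, 0.01)
(trace, accepted) = hmc(trace, selection)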

Optimization over Random Choices

new_trace = map_optimize(trace, selection::AddressSet, 
    max_step_size=0.1, tau=0.5, min_step_size=1e-16, verbose=false)

Perform backtracking gradient ascent to optimize the log probability of the trace over selected continuous choices.

Selected random choices must have support on the entire real line.

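A sketch that moves the hypothetical :slope choice from the earlier examples to a nearby local optimum of the trace's log probability, using the default step-size settings:

trace = map_optimize(trace, select(:slope))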

Particle Filtering

state = initialize_particle_filter(model::GenerativeFunction, model_args::Tuple,
    observations::ChoiceMap, proposal::GenerativeFunction, proposal_args::Tuple,
    num_particles::Int)

Initialize the state of a particle filter using a custom proposal for the initial latent state.

state = initialize_particle_filter(model::GenerativeFunction, model_args::Tuple,
    observations::ChoiceMap, num_particles::Int)

Initialize the state of a particle filter, using the default proposal for the initial latent state.

particle_filter_step!(state::ParticleFilterState, new_args::Tuple, argdiff,
    observations::ChoiceMap, proposal::GenerativeFunction, proposal_args::Tuple)

Perform a particle filter update, where the model arguments are adjusted, new observations are added, and a custom proposal is used for new latent state.

particle_filter_step!(state::ParticleFilterState, new_args::Tuple, argdiff,
    observations::ChoiceMap)

Perform a particle filter update, where the model arguments are adjusted, new observations are added, and the default proposal is used for new latent state.

did_resample::Bool = maybe_resample!(state::ParticleFilterState;
    ess_threshold::Float64=length(state.traces)/2, verbose=false)

Do a resampling step if the effective sample size is below the given threshold.

estimate = log_ml_estimate(state::ParticleFilterState)

Return the particle filter's current estimate of the log marginal likelihood.

traces = get_traces(state::ParticleFilterState)

Return the vector of traces in the current state, one for each particle.

log_weights = get_log_weights(state::ParticleFilterState)

Return the vector of log weights for the current state, one for each particle.

The weights are not normalized, and are in log-space.

traces::Vector = sample_unweighted_traces(state::ParticleFilterState, num_samples::Int)

Sample a vector of num_samples traces from the weighted collection of traces in the given particle filter state.

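A sketch of a complete filtering loop over a hypothetical state-space model and observation sequence, using the default-proposal variants above (assuming using Gen). The argdiff value follows the conventions of the Gen version in use; a recent release would pass a tuple such as (UnknownChange(),):

@gen function ssm(T::Int)
    x = 0.0
    for t in 1:T
        x = {(:x, t)} ~ normal(x, 1.0)       # latent random walk
        {(:y, t)} ~ normal(x, 0.5)           # noisy observation
    end
end

ys = [0.2, 0.5, 1.1, 0.9]                    # hypothetical observed series

function run_filter(ys::Vector{Float64}, num_particles::Int)
    state = initialize_particle_filter(ssm, (1,), choicemap(((:y, 1), ys[1])), num_particles)
    for t in 2:length(ys)
        maybe_resample!(state, ess_threshold=num_particles/2)
        particle_filter_step!(state, (t,), (UnknownChange(),), choicemap(((:y, t), ys[t])))
    end
    state
end

state = run_filter(ys, 100)
lml = log_ml_estimate(state)                  # log marginal likelihood estimate
samples = sample_unweighted_traces(state, 10) # approximate posterior samples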

Supervised Training

train!(gen_fn::GenerativeFunction, data_generator::Function,
       update::ParamUpdate,
       num_epoch, epoch_size, num_minibatch, minibatch_size; verbose::Bool=false)

Train the given generative function to maximize the expected conditional log probability (density) that gen_fn generates the assignment constraints given inputs, where the expectation is taken under the output distribution of data_generator.

The function data_generator is a function of no arguments that returns a tuple (inputs, constraints), where inputs is a Tuple of inputs (arguments) to gen_fn and constraints is a ChoiceMap. The update specifies the optimization algorithm and the trainable parameters to which it applies. Training is equivalent to minimizing the expected KL divergence from the conditional distribution constraints | inputs of the data generator to the distribution represented by the generative function, where the expectation is taken under the marginal distribution on inputs determined by the data generator.

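A sketch of supervised training of a hypothetical generative function with a single trainable parameter, where the data generator simulates labeled pairs from a hypothetical ground-truth process (slope 3). The epoch and minibatch arguments are passed positionally as in the signature above; some releases accept them as keyword arguments instead:

@gen function regressor(x::Float64)
    @param slope::Float64
    y ~ normal(slope * x, 1.0)
end
init_param!(regressor, :slope, 0.0)

function data_generator()
    x = 10 * rand()
    y = 3 * x + randn()
    ((x,), choicemap((:y, y)))    # (inputs, constraints)
end

param_update = ParamUpdate(FixedStepGradientDescent(1e-4), regressor)
train!(regressor, data_generator, param_update, 10, 1000, 10, 100; verbose=true)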

Variational Inference

black_box_vi!(model::GenerativeFunction, args::Tuple,
              observations::ChoiceMap,
              proposal::GenerativeFunction, proposal_args::Tuple,
              update::ParamUpdate;
              iters=1000, samples_per_iter=100, verbose=false)

Fit the parameters of a generative function (proposal) to the posterior distribution implied by the given model and observations using stochastic gradient methods.

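A sketch fitting a hypothetical mean-field variational family to the posterior over :slope in the model from the importance sampling example; the family's trainable parameters are the mean and log standard deviation of a normal distribution:

@gen function approx()
    @param slope_mu::Float64
    @param slope_log_std::Float64
    slope ~ normal(slope_mu, exp(slope_log_std))
end
init_param!(approx, :slope_mu, 0.0)
init_param!(approx, :slope_log_std, 0.0)

param_update = ParamUpdate(FixedStepGradientDescent(1e-3), approx)
black_box_vi!(model, (1.0,), observations, approx, (), param_update;
              iters=500, samples_per_iter=50)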