`PDMPFlux.jl` provides a fast and efficient implementation of Piecewise Deterministic Markov Process (PDMP) samplers, using the grid-based Poisson thinning approach proposed in Andral and Kamatani (2024).

Thanks to automatic differentiation engines, `PDMPFlux.jl` only requires the dimension `dim` and the potential `U`, i.e., the negative log density of the target distribution (e.g., a posterior):
$$
U(x) = -\log p(x) + \text{const}.
$$
Currently, `julia >= 1.11` is required for compatibility.

To install the package, use Julia's package manager:
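The thinning idea behind the package can be illustrated in a few lines. The following is a standalone sketch, not the package's internals, and `first_event_time` is a hypothetical helper name: it simulates the first event time of an inhomogeneous Poisson process with rate `λ(t)` by proposing candidates from a dominating constant rate `λ̄` and accepting each with probability `λ(t) / λ̄`.

```julia
using Random

# Sketch of Poisson thinning: draw the first event time of an inhomogeneous
# Poisson process with rate λ(t), given a constant upper bound λ̄ ≥ λ(t).
function first_event_time(λ, λ̄; rng=Random.default_rng())
    t = 0.0
    while true
        t += randexp(rng) / λ̄        # candidate event from the dominating process
        if rand(rng) * λ̄ < λ(t)      # accept with probability λ(t) / λ̄
            return t
        end
    end
end
```

In the grid-based approach of Andral and Kamatani (2024), such upper bounds are constructed automatically on a time grid, which is what allows the package to require only `dim` and `U`.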
```julia-repl
(@v1.11) pkg> add PDMPFlux
```
The following example demonstrates how to sample from a standard Gaussian distribution using a Zig-Zag sampler.
```julia
using PDMPFlux

# Negative log density of the standard Gaussian (up to an additive constant)
function U_Gauss(x::Vector)
    return sum(x.^2) / 2
end

dim = 10
sampler = ZigZagAD(dim, U_Gauss)  # Zig-Zag sampler with automatic differentiation
N_sk, N, xinit, vinit = 1_000_000, 1_000_000, zeros(dim), ones(dim)
samples = sample(sampler, N_sk, N, xinit, vinit, seed=2024)
jointplot(samples)
```
For finer control, you can provide the gradient manually. Also, by breaking the `sample()` function into two steps, `sample_skeleton()` and `sample_from_skeleton()`, you can use the `plot_traj()` and `diagnostic()` functions to diagnose the sampler:
```julia
using PDMPFlux
using Zygote

N_sk = 1_000_000  # number of skeleton points
N = 1_000_000     # number of samples

# Gradient of the banana-shaped target U(x) = x₁²/2 + (x₂ - (x₁² - 1))²/2 + Σⱼ₌₃ xⱼ²/2
function ∇U_banana(x::Vector)
    mean_x2 = x[1]^2 - 1
    return - vcat(- x[1] + 2 * x[1] * (x[2] - mean_x2), - (x[2] - mean_x2), - x[3:end])  # don't forget the minus sign!
end

dim = 50
xinit = ones(dim)
vinit = ones(dim)
grid_size = 0  # use constant bounds

sampler = ZigZag(dim, ∇U_banana, grid_size=grid_size)  # manually providing the gradient
output = sample_skeleton(sampler, N_sk, xinit, vinit)  # simulate skeleton points
samples = sample_from_skeleton(sampler, N, output)     # get samples from the skeleton points

plot_traj(output, 10000)
diagnostic(output)
jointplot(samples)
```
Markov Chain Monte Carlo (MCMC) methods are the standard tools for sampling from distributions with unknown normalizing constants. However, PDMPs (also known as Event Chain Monte Carlo) offer a promising alternative due to their

- rejection-free simulation strategy, and
- continuous and non-reversible dynamics,

particularly in high-dimensional and big-data contexts, as discussed in Bouchard-Côté et al. (2018) and Bierkens et al. (2019).
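To make the continuous, non-reversible dynamics concrete, here is a self-contained sketch of one-dimensional Zig-Zag dynamics for a standard Gaussian target (`zigzag_1d` is an illustrative name, not part of the package): the state moves along straight lines, and the velocity flips at event times of a Poisson process with rate `max(0, v * U'(x))`. No proposal is ever rejected.

```julia
using Random

# Sketch of 1D Zig-Zag dynamics for U(x) = x^2 / 2 (standard Gaussian target).
# Between events the state moves deterministically, x(t) = x + v*t with v ∈ {-1, +1};
# the velocity flips at events of a Poisson process with rate λ(x, v) = max(0, v * x).
function zigzag_1d(T; x=0.0, v=1.0, rng=Random.default_rng())
    t = 0.0
    skeleton = [(t, x, v)]
    while t < T
        E = randexp(rng)                    # exponential clock
        a = v * x                           # current (unclipped) event rate
        τ = -a + sqrt(max(a, 0.0)^2 + 2E)   # exact inversion of ∫₀^τ max(0, a + s) ds = E
        t += τ; x += v * τ; v = -v          # move along a straight line, then flip
        push!(skeleton, (t, x, v))
    end
    return skeleton
end
```

For this simple rate the event time can be inverted in closed form; for general targets no such inversion exists, which is exactly where Poisson thinning with automatically constructed upper bounds comes in.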
Despite their potential, practical applications of PDMPs have remained limited by the lack of efficient and flexible implementations. Inspired by Andral and Kamatani (2024) and their `jax`-based implementation `pdmp_jax`, `PDMPFlux.jl` is my attempt to fill this gap, with the aid of existing automatic differentiation engines.
The following samplers are currently implemented:

- Zig-Zag sampler, proposed by Bierkens, Fearnhead & Roberts (2019).
- Bouncy Particle Sampler, proposed by Bouchard-Côté et al. (2018).
- Forward Event Chain Monte Carlo, proposed by Michel, Durmus & Sénécal (2020).
- Boomerang Sampler, proposed by Bierkens et al. (2020).
- Speed-Up Zig-Zag, proposed by Vasdekis and Roberts (2023).
- Sticky Zig-Zag, proposed by Bierkens et al. (2023).
Figures: 1D Zig-Zag on Cauchy | 1D Zig-Zag on Gaussian | Cauchy vs. Gaussian density plot
- The automatic Poisson thinning implementation in `PDMPFlux.jl` is based on Andral and Kamatani (2024), *Automated Techniques for Efficient Sampling of Piecewise-Deterministic Markov Processes*, and its accompanying Python package `pdmp_jax`. `pdmp_jax` has a `jax`-based implementation and is typically about four times faster than the current `PDMPFlux.jl`.
- Both `ForwardDiff.jl` and `Zygote.jl` are used for automatic differentiation, each with its own trade-offs.
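As a minimal standalone illustration of the two backends (assuming both packages are installed; `U` here is just the Gaussian potential from the first example), forward mode has low per-call overhead while reverse mode computes the full gradient in one backward pass:

```julia
using ForwardDiff, Zygote

U(x) = sum(x .^ 2) / 2  # standard Gaussian potential, so ∇U(x) = x

x = [0.5, -1.0, 2.0]
g_fd = ForwardDiff.gradient(U, x)   # forward mode: cost grows with the dimension
g_zy = Zygote.gradient(U, x)[1]     # reverse mode: scales better for high-dimensional U
```

Both results agree with the analytic gradient `x`, so either backend can supply `∇U` to a sampler.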
- `pdmp_jax` by Charly Andral, on which this repository is strongly based and to which it is greatly indebted.
- `ForwardDiff.jl` and `Zygote.jl` are used for automatic differentiation.
- Other PDMP packages: