diff --git a/previews/PR106/.documenter-siteinfo.json b/previews/PR106/.documenter-siteinfo.json index 82a785b..eac6936 100644 --- a/previews/PR106/.documenter-siteinfo.json +++ b/previews/PR106/.documenter-siteinfo.json @@ -1 +1 @@ -{"documenter":{"julia_version":"1.11.2","generation_timestamp":"2024-12-06T13:55:01","documenter_version":"1.8.0"}} \ No newline at end of file +{"documenter":{"julia_version":"1.11.2","generation_timestamp":"2024-12-06T14:28:25","documenter_version":"1.8.0"}} \ No newline at end of file diff --git a/previews/PR106/api/index.html b/previews/PR106/api/index.html index 563ddc3..b6a7b55 100644 --- a/previews/PR106/api/index.html +++ b/previews/PR106/api/index.html @@ -2,8 +2,8 @@ AdvancedMH.jl · AdvancedMH

AdvancedMH.jl

Documentation for AdvancedMH.jl

Structs

AdvancedMH.MetropolisHastingsType
MetropolisHastings{D}

MetropolisHastings has one field, proposal. proposal is a Proposal, NamedTuple of Proposal, or Array{Proposal} in the shape of your data. For example, if you wanted the sampler to return a NamedTuple with shape

x = (a = 1.0, b=3.8)

then the proposal would be

proposal = (a=StaticProposal(Normal(0,1)), b=StaticProposal(Normal(0,1)))

Other allowed proposals are

p1 = StaticProposal(Normal(0,1))
 p2 = StaticProposal([Normal(0,1), InverseGamma(2,3)])
 p3 = StaticProposal((a=Normal(0,1), b=InverseGamma(2,3)))
-p4 = StaticProposal((x=1.0) -> Normal(x, 1))

The sampler is constructed using

spl = MetropolisHastings(proposal)

When using MetropolisHastings with the function sample, the following keyword arguments are allowed:

  • initial_params defines the initial parameterization for your model. If none is given, the initial parameters will be drawn from the sampler's proposals.

  • param_names is a vector of strings to be assigned to parameters. This is only used if chain_type=Chains.

  • chain_type is the type of chain you would like returned to you. Supported types are chain_type=Chains if MCMCChains is imported, or chain_type=StructArray if StructArrays is imported.
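As a sketch of how these pieces fit together (assuming Distributions and MCMCChains are loaded; the target density and draw count here are illustrative, not from the docstring):

```julia
using AdvancedMH, Distributions, MCMCChains

# Wrap an illustrative log-density (standard normal) in a DensityModel.
model = DensityModel(x -> logpdf(Normal(), x))

# A static proposal that draws candidates from Normal(0, 1).
spl = MetropolisHastings(StaticProposal(Normal(0, 1)))

# `param_names` and `chain_type` are keyword arguments forwarded by `sample`.
chain = sample(model, spl, 1_000; param_names=["x"], chain_type=Chains)
```

With chain_type=Chains the result is an MCMCChains.Chains object; omitting the keyword returns the sampler's raw transitions.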

source

Functions

AdvancedMH.DensityModelType
DensityModel{F} <: AbstractModel

DensityModel wraps a self-contained log-likelihood function logdensity.

Example:

l(x) = logpdf(Normal(), x)
-DensityModel(l)
source
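As a rough usage sketch, the wrapped model can be passed straight to sample (assuming Distributions is loaded; the proposal and draw count are illustrative):

```julia
using AdvancedMH, Distributions

# Wrap a self-contained log-likelihood in a DensityModel.
l(x) = logpdf(Normal(), x)
model = DensityModel(l)

# Random-walk Metropolis-Hastings with a Normal(0, 1) proposal.
spl = RWMH(Normal(0, 1))

# Draw 1_000 samples, returned as a vector of NamedTuples.
chain = sample(model, spl, 1_000; chain_type=Vector{NamedTuple})
```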

Samplers

AdvancedMH.RobustAdaptiveMetropolis.RAMType
RAM

Robust Adaptive Metropolis-Hastings (RAM).

This is a simple implementation of the RAM algorithm described in [VIH12].

Fields

  • α: target acceptance rate

  • γ: negative exponent of the adaptation decay rate

  • S: initial lower-triangular Cholesky factor

  • eigenvalue_lower_bound: lower bound on eigenvalues of the adapted covariance matrix

  • eigenvalue_upper_bound: upper bound on eigenvalues of the adapted covariance matrix

Examples

The following demonstrates how to implement a simple Gaussian model and sample from it using the RAM algorithm.

julia> using AdvancedMH, Random, Distributions, MCMCChains, LogDensityProblems, LinearAlgebra
+p4 = StaticProposal((x=1.0) -> Normal(x, 1))

The sampler is constructed using

spl = MetropolisHastings(proposal)

When using MetropolisHastings with the function sample, the following keyword arguments are allowed:

  • initial_params defines the initial parameterization for your model. If none is given, the initial parameters will be drawn from the sampler's proposals.

  • param_names is a vector of strings to be assigned to parameters. This is only used if chain_type=Chains.

  • chain_type is the type of chain you would like returned to you. Supported types are chain_type=Chains if MCMCChains is imported, or chain_type=StructArray if StructArrays is imported.

source

Functions

AdvancedMH.DensityModelType
DensityModel{F} <: AbstractModel

DensityModel wraps a self-contained log-likelihood function logdensity.

Example:

l(x) = logpdf(Normal(), x)
+DensityModel(l)
source

Samplers

AdvancedMH.RobustAdaptiveMetropolis.RAMType
RAM

Robust Adaptive Metropolis-Hastings (RAM).

This is a simple implementation of the RAM algorithm described in [VIH12].

Fields

  • α: target acceptance rate. Default: 0.234.

  • γ: negative exponent of the adaptation decay rate. Default: 0.6.

  • S: initial lower-triangular Cholesky factor. Default: nothing.

  • eigenvalue_lower_bound: lower bound on eigenvalues of the adapted Cholesky factor. Default: 0.0.

  • eigenvalue_upper_bound: upper bound on eigenvalues of the adapted Cholesky factor. Default: Inf.

Examples

The following demonstrates how to implement a simple Gaussian model and sample from it using the RAM algorithm.

julia> using AdvancedMH, Random, Distributions, MCMCChains, LogDensityProblems, LinearAlgebra
 
 julia> # Define a Gaussian with zero mean and some covariance.
        struct Gaussian{A}
@@ -35,7 +35,7 @@
        Random.seed!(1234);
 
 julia> # Sample!
-       chain = sample(model, RAM(), 10_000; chain_type=Chains, num_warmup=10_000, progress=false, initial_params=zeros(2));
+       chain = sample(model, RAM(), 10_000; chain_type=Chains, num_warmup, progress=false, initial_params=zeros(2));
 
 julia> norm(cov(Array(chain)) - [1.0 0.5; 0.5 1.0]) < 0.2
-true

References

source
  • [VIH12]: Vihola (2012) Robust adaptive Metropolis algorithm with coerced acceptance rate, Statistics and Computing.
+true

It's also possible to restrict the eigenvalues of the adapted factor to avoid values that are either too small or too large. See p. 13 in [VIH12].

julia> chain = sample(
           model,
           RAM(eigenvalue_lower_bound=0.1, eigenvalue_upper_bound=2.0),
           10_000;
           chain_type=Chains, num_warmup, progress=false, initial_params=zeros(2)
       );

julia> norm(cov(Array(chain)) - [1.0 0.5; 0.5 1.0]) < 0.2
true

References

source
diff --git a/previews/PR106/index.html b/previews/PR106/index.html index 7fdd6c8..c478f20 100644 --- a/previews/PR106/index.html +++ b/previews/PR106/index.html @@ -109,4 +109,4 @@ # Sample from the posterior. chain = sample(model, spl, 100000; initial_params=ones(2), chain_type=StructArray, param_names=["μ", "σ"])

Usage with LogDensityProblems.jl

As above, we can define the model with the LogDensityProblems.jl interface. We can implement the gradient of the log density function manually, or use LogDensityProblemsAD.jl to provide us with the gradient computation used in MALA. Using our implementation of the LogDensityProblems.jl interface above:

using LogDensityProblemsAD
 model_with_ad = LogDensityProblemsAD.ADgradient(Val(:ForwardDiff), LogTargetDensity())
-sample(model_with_ad, spl, 100000; initial_params=ones(2), chain_type=StructArray, param_names=["μ", "σ"])
+sample(model_with_ad, spl, 100000; initial_params=ones(2), chain_type=StructArray, param_names=["μ", "σ"]) diff --git a/previews/PR106/search_index.js b/previews/PR106/search_index.js index 3ce878a..e0ceef1 100644 --- a/previews/PR106/search_index.js +++ b/previews/PR106/search_index.js
This is only\n\nused if chain_type=Chains.\n\nchain_type is the type of chain you would like returned to you. Supported\n\ntypes are chain_type=Chains if MCMCChains is imported, or chain_type=StructArray if StructArrays is imported.\n\n\n\n\n\n","category":"type"},{"location":"api/#Functions","page":"AdvancedMH.jl","title":"Functions","text":"","category":"section"},{"location":"api/","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"DensityModel","category":"page"},{"location":"api/#AdvancedMH.DensityModel","page":"AdvancedMH.jl","title":"AdvancedMH.DensityModel","text":"DensityModel{F} <: AbstractModel\n\nDensityModel wraps around a self-contained log-liklihood function logdensity.\n\nExample:\n\nl(x) = logpdf(Normal(), x)\nDensityModel(l)\n\n\n\n\n\n","category":"type"},{"location":"api/#Samplers","page":"AdvancedMH.jl","title":"Samplers","text":"","category":"section"},{"location":"api/","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"RAM","category":"page"},{"location":"api/#AdvancedMH.RobustAdaptiveMetropolis.RAM","page":"AdvancedMH.jl","title":"AdvancedMH.RobustAdaptiveMetropolis.RAM","text":"RAM\n\nRobust Adaptive Metropolis-Hastings (RAM).\n\nThis is a simple implementation of the RAM algorithm described in [VIH12].\n\nFields\n\nα: target acceptance rate\nγ: negative exponent of the adaptation decay rate\nS: initial lower-triangular Cholesky factor\neigenvalue_lower_bound: lower bound on eigenvalues of the adapted covariance matrix\neigenvalue_upper_bound: upper bound on eigenvalues of the adapted covariance matrix\n\nExamples\n\nThe following demonstrates how to implement a simple Gaussian model and sample from it using the RAM algorithm.\n\njulia> using AdvancedMH, Random, Distributions, MCMCChains, LogDensityProblems, LinearAlgebra\n\njulia> # Define a Gaussian with zero mean and some covariance.\n struct Gaussian{A}\n Σ::A\n end\n\njulia> # Implement the LogDensityProblems interface.\n LogDensityProblems.dimension(model::Gaussian) = 
size(model.Σ, 1)\n\njulia> function LogDensityProblems.logdensity(model::Gaussian, x)\n d = LogDensityProblems.dimension(model)\n return logpdf(MvNormal(zeros(d),model.Σ), x)\n end\n\njulia> LogDensityProblems.capabilities(::Gaussian) = LogDensityProblems.LogDensityOrder{0}()\n\njulia> # Construct the model. We'll use a correlation of 0.5.\n model = Gaussian([1.0 0.5; 0.5 1.0]);\n\njulia> # Number of samples we want in the resulting chain.\n num_samples = 10_000;\n\njulia> # Number of warmup steps, i.e. the number of steps to adapt the covariance of the proposal.\n # Note that these are not included in the resulting chain, as `discard_initial=num_warmup`\n # by default in the `sample` call. To include them, pass `discard_initial=0` to `sample`.\n num_warmup = 10_000;\n\njulia> # Set the seed so get some consistency.\n Random.seed!(1234);\n\njulia> # Sample!\n chain = sample(model, RAM(), 10_000; chain_type=Chains, num_warmup=10_000, progress=false, initial_params=zeros(2));\n\njulia> norm(cov(Array(chain)) - [1.0 0.5; 0.5 1.0]) < 0.2\ntrue\n\nReferences\n\n[VIH12]: Vihola (2012) Robust adaptive Metropolis algorithm with coerced acceptance rate, Statistics and computing.\n\n\n\n\n\n","category":"type"},{"location":"#AdvancedMH.jl","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"","category":"section"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"(Image: Stable) (Image: Dev) (Image: AdvancedMH-CI)","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"AdvancedMH.jl currently provides a robust implementation of random walk Metropolis-Hastings samplers.","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"Further development aims to provide a suite of adaptive Metropolis-Hastings implementations.","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"AdvancedMH works by allowing users to define composable Proposal structs in 
different formats.","category":"page"},{"location":"#Usage","page":"AdvancedMH.jl","title":"Usage","text":"","category":"section"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"First, construct a DensityModel, which is a wrapper around the log density function for your inference problem. The DensityModel is then used in a sample call.","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"# Import the package.\nusing AdvancedMH\nusing Distributions\nusing MCMCChains\n\nusing LinearAlgebra\n\n# Generate a set of data from the posterior we want to estimate.\ndata = rand(Normal(0, 1), 30)\n\n# Define the components of a basic model.\ninsupport(θ) = θ[2] >= 0\ndist(θ) = Normal(θ[1], θ[2])\ndensity(θ) = insupport(θ) ? sum(logpdf.(dist(θ), data)) : -Inf\n\n# Construct a DensityModel.\nmodel = DensityModel(density)\n\n# Set up our sampler with a joint multivariate Normal proposal.\nspl = RWMH(MvNormal(zeros(2), I))\n\n# Sample from the posterior.\nchain = sample(model, spl, 100000; param_names=[\"μ\", \"σ\"], chain_type=Chains)","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"Output:","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"Object of type Chains, with data of type 100000×3×1 Array{Float64,3}\n\nIterations = 1:100000\nThinning interval = 1\nChains = 1\nSamples per chain = 100000\ninternals = lp\nparameters = μ, σ\n\n2-element Array{ChainDataFrame,1}\n\nSummary Statistics\n\n│ Row │ parameters │ mean │ std │ naive_se │ mcse │ ess │ r_hat │\n│ │ Symbol │ Float64 │ Float64 │ Float64 │ Float64 │ Any │ Any │\n├─────┼────────────┼──────────┼──────────┼─────────────┼────────────┼─────────┼─────────┤\n│ 1 │ μ │ 0.156152 │ 0.19963 │ 0.000631285 │ 0.00323033 │ 3911.73 │ 1.00009 │\n│ 2 │ σ │ 1.07493 │ 0.150111 │ 0.000474693 │ 0.00240317 │ 3707.73 │ 1.00027 │\n\nQuantiles\n\n│ Row │ parameters │ 2.5% │ 25.0% │ 50.0% │ 75.0% │ 97.5% │\n│ │ 
Symbol │ Float64 │ Float64 │ Float64 │ Float64 │ Float64 │\n├─────┼────────────┼──────────┼───────────┼──────────┼──────────┼──────────┤\n│ 1 │ μ │ -0.23361 │ 0.0297006 │ 0.159139 │ 0.283493 │ 0.558694 │\n│ 2 │ σ │ 0.828288 │ 0.972682 │ 1.05804 │ 1.16155 │ 1.41349 │\n","category":"page"},{"location":"#Usage-with-[LogDensityProblems.jl](https://github.com/tpapp/LogDensityProblems.jl)","page":"AdvancedMH.jl","title":"Usage with LogDensityProblems.jl","text":"","category":"section"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"Alternatively, you can define your model with the LogDensityProblems.jl interface:","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"using LogDensityProblems\n\n# Use a struct instead of `typeof(density)` for sake of readability.\nstruct LogTargetDensity end\n\nLogDensityProblems.logdensity(p::LogTargetDensity, θ) = density(θ) # standard multivariate normal\nLogDensityProblems.dimension(p::LogTargetDensity) = 2\nLogDensityProblems.capabilities(::LogTargetDensity) = LogDensityProblems.LogDensityOrder{0}()\n\nsample(LogTargetDensity(), spl, 100000; param_names=[\"μ\", \"σ\"], chain_type=Chains)","category":"page"},{"location":"#Proposals","page":"AdvancedMH.jl","title":"Proposals","text":"","category":"section"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"AdvancedMH offers various methods of defining your inference problem. Behind the scenes, a MetropolisHastings sampler simply holds some set of Proposal structs. AdvancedMH will return posterior samples in the \"shape\" of the proposal provided – currently supported methods are Array{Proposal}, Proposal, and NamedTuple{Proposal}. 
For example, proposals can be created as:","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"# Provide a univariate proposal.\nm1 = DensityModel(x -> logpdf(Normal(x,1), 1.0))\np1 = StaticProposal(Normal(0,1))\nc1 = sample(m1, MetropolisHastings(p1), 100; chain_type=Vector{NamedTuple})\n\n# Draw from a vector of distributions.\nm2 = DensityModel(x -> logpdf(Normal(x[1], x[2]), 1.0))\np2 = StaticProposal([Normal(0,1), InverseGamma(2,3)])\nc2 = sample(m2, MetropolisHastings(p2), 100; chain_type=Vector{NamedTuple})\n\n# Draw from a `NamedTuple` of distributions.\nm3 = DensityModel(x -> logpdf(Normal(x.a, x.b), 1.0))\np3 = (a=StaticProposal(Normal(0,1)), b=StaticProposal(InverseGamma(2,3)))\nc3 = sample(m3, MetropolisHastings(p3), 100; chain_type=Vector{NamedTuple})\n\n# Draw from a functional proposal.\nm4 = DensityModel(x -> logpdf(Normal(x,1), 1.0))\np4 = StaticProposal((x=1.0) -> Normal(x, 1))\nc4 = sample(m4, MetropolisHastings(p4), 100; chain_type=Vector{NamedTuple})","category":"page"},{"location":"#Static-vs.-Random-Walk","page":"AdvancedMH.jl","title":"Static vs. Random Walk","text":"","category":"section"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"Currently there are only two methods of inference available. Static MH simply draws from the prior, with no conditioning on the previous sample. Random walk will add the proposal to the previously observed value. If you are constructing a Proposal by hand, you can determine whether the proposal is a StaticProposal or a RandomWalkProposal using","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"static_prop = StaticProposal(Normal(0,1))\nrw_prop = RandomWalkProposal(Normal(0,1))","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"Different methods are easily composeable. 
One parameter can be static and another can be a random walk, each of which may be drawn from separate distributions.","category":"page"},{"location":"#Multiple-chains","page":"AdvancedMH.jl","title":"Multiple chains","text":"","category":"section"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"AdvancedMH.jl implements the interface of AbstractMCMC which means sampling of multiple chains is supported for free:","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"# Sample 4 chains from the posterior serially, without thread or process parallelism.\nchain = sample(model, spl, MCMCSerial(), 100000, 4; param_names=[\"μ\",\"σ\"], chain_type=Chains)\n\n# Sample 4 chains from the posterior using multiple threads.\nchain = sample(model, spl, MCMCThreads(), 100000, 4; param_names=[\"μ\",\"σ\"], chain_type=Chains)\n\n# Sample 4 chains from the posterior using multiple processes.\nchain = sample(model, spl, MCMCDistributed(), 100000, 4; param_names=[\"μ\",\"σ\"], chain_type=Chains)","category":"page"},{"location":"#Metropolis-adjusted-Langevin-algorithm-(MALA)","page":"AdvancedMH.jl","title":"Metropolis-adjusted Langevin algorithm (MALA)","text":"","category":"section"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"AdvancedMH.jl also offers an implementation of MALA if the ForwardDiff and DiffResults packages are available. ","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"A MALA sampler can be constructed by MALA(proposal) where proposal is a function that takes the gradient computed at the current sample. 
It is required to specify an initial sample initial_params when calling sample.","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"# Import the package.\nusing AdvancedMH\nusing Distributions\nusing MCMCChains\nusing ForwardDiff\nusing StructArrays\n\nusing LinearAlgebra\n\n# Generate a set of data from the posterior we want to estimate.\ndata = rand(Normal(0, 1), 30)\n\n# Define the components of a basic model.\ninsupport(θ) = θ[2] >= 0\ndist(θ) = Normal(θ[1], θ[2])\ndensity(θ) = insupport(θ) ? sum(logpdf.(dist(θ), data)) : -Inf\n\n# Construct a DensityModel.\nmodel = DensityModel(density)\n\n# Set up the sampler with a multivariate Gaussian proposal.\nσ² = 0.01\nspl = MALA(x -> MvNormal((σ² / 2) .* x, σ² * I))\n\n# Sample from the posterior.\nchain = sample(model, spl, 100000; initial_params=ones(2), chain_type=StructArray, param_names=[\"μ\", \"σ\"])","category":"page"},{"location":"#Usage-with-[LogDensityProblems.jl](https://github.com/tpapp/LogDensityProblems.jl)-2","page":"AdvancedMH.jl","title":"Usage with LogDensityProblems.jl","text":"","category":"section"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"As above, we can define the model with the LogDensityProblems.jl interface. We can implement the gradient of the log density function manually, or use LogDensityProblemsAD.jl to provide us with the gradient computation used in MALA. 
Using our implementation of the LogDensityProblems.jl interface above:","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"using LogDensityProblemsAD\nmodel_with_ad = LogDensityProblemsAD.ADgradient(Val(:ForwardDiff), LogTargetDensity())\nsample(model_with_ad, spl, 100000; initial_params=ones(2), chain_type=StructArray, param_names=[\"μ\", \"σ\"])","category":"page"}] +[{"location":"api/#AdvancedMH.jl","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"","category":"section"},{"location":"api/","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"Documentation for AdvancedMH.jl","category":"page"},{"location":"api/#Structs","page":"AdvancedMH.jl","title":"Structs","text":"","category":"section"},{"location":"api/","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"MetropolisHastings","category":"page"},{"location":"api/#AdvancedMH.MetropolisHastings","page":"AdvancedMH.jl","title":"AdvancedMH.MetropolisHastings","text":"MetropolisHastings{D}\n\nMetropolisHastings has one field, proposal. proposal is a Proposal, NamedTuple of Proposal, or Array{Proposal} in the shape of your data. For example, if you wanted the sampler to return a NamedTuple with shape\n\nx = (a = 1.0, b=3.8)\n\nThe proposal would be\n\nproposal = (a=StaticProposal(Normal(0,1)), b=StaticProposal(Normal(0,1)))\n\nOther allowed proposals are\n\np1 = StaticProposal(Normal(0,1))\np2 = StaticProposal([Normal(0,1), InverseGamma(2,3)])\np3 = StaticProposal((a=Normal(0,1), b=InverseGamma(2,3)))\np4 = StaticProposal((x=1.0) -> Normal(x, 1))\n\nThe sampler is constructed using\n\nspl = MetropolisHastings(proposal)\n\nWhen using MetropolisHastings with the function sample, the following keyword arguments are allowed:\n\ninitial_params defines the initial parameterization for your model. If\n\nnone is given, the initial parameters will be drawn from the sampler's proposals.\n\nparam_names is a vector of strings to be assigned to parameters. 
This is only\n\nused if chain_type=Chains.\n\nchain_type is the type of chain you would like returned to you. Supported\n\ntypes are chain_type=Chains if MCMCChains is imported, or chain_type=StructArray if StructArrays is imported.\n\n\n\n\n\n","category":"type"},{"location":"api/#Functions","page":"AdvancedMH.jl","title":"Functions","text":"","category":"section"},{"location":"api/","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"DensityModel","category":"page"},{"location":"api/#AdvancedMH.DensityModel","page":"AdvancedMH.jl","title":"AdvancedMH.DensityModel","text":"DensityModel{F} <: AbstractModel\n\nDensityModel wraps around a self-contained log-liklihood function logdensity.\n\nExample:\n\nl(x) = logpdf(Normal(), x)\nDensityModel(l)\n\n\n\n\n\n","category":"type"},{"location":"api/#Samplers","page":"AdvancedMH.jl","title":"Samplers","text":"","category":"section"},{"location":"api/","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"RAM","category":"page"},{"location":"api/#AdvancedMH.RobustAdaptiveMetropolis.RAM","page":"AdvancedMH.jl","title":"AdvancedMH.RobustAdaptiveMetropolis.RAM","text":"RAM\n\nRobust Adaptive Metropolis-Hastings (RAM).\n\nThis is a simple implementation of the RAM algorithm described in [VIH12].\n\nFields\n\nα: target acceptance rate. Default: 0.234.\nγ: negative exponent of the adaptation decay rate. Default: 0.6.\nS: initial lower-triangular Cholesky factor. Default: nothing.\neigenvalue_lower_bound: lower bound on eigenvalues of the adapted Cholesky factor. Default: 0.0.\neigenvalue_upper_bound: upper bound on eigenvalues of the adapted Cholesky factor. 
Default: Inf.\n\nExamples\n\nThe following demonstrates how to implement a simple Gaussian model and sample from it using the RAM algorithm.\n\njulia> using AdvancedMH, Random, Distributions, MCMCChains, LogDensityProblems, LinearAlgebra\n\njulia> # Define a Gaussian with zero mean and some covariance.\n struct Gaussian{A}\n Σ::A\n end\n\njulia> # Implement the LogDensityProblems interface.\n LogDensityProblems.dimension(model::Gaussian) = size(model.Σ, 1)\n\njulia> function LogDensityProblems.logdensity(model::Gaussian, x)\n d = LogDensityProblems.dimension(model)\n return logpdf(MvNormal(zeros(d),model.Σ), x)\n end\n\njulia> LogDensityProblems.capabilities(::Gaussian) = LogDensityProblems.LogDensityOrder{0}()\n\njulia> # Construct the model. We'll use a correlation of 0.5.\n model = Gaussian([1.0 0.5; 0.5 1.0]);\n\njulia> # Number of samples we want in the resulting chain.\n num_samples = 10_000;\n\njulia> # Number of warmup steps, i.e. the number of steps to adapt the covariance of the proposal.\n # Note that these are not included in the resulting chain, as `discard_initial=num_warmup`\n # by default in the `sample` call. To include them, pass `discard_initial=0` to `sample`.\n num_warmup = 10_000;\n\njulia> # Set the seed so get some consistency.\n Random.seed!(1234);\n\njulia> # Sample!\n chain = sample(model, RAM(), 10_000; chain_type=Chains, num_warmup, progress=false, initial_params=zeros(2));\n\njulia> norm(cov(Array(chain)) - [1.0 0.5; 0.5 1.0]) < 0.2\ntrue\n\nIt's also possible to restrict the eigenvalues to avoid either too small or too large values. See p. 
13 in [VIH12].\n\n``jldoctest ram-gaussian julia> chain = sample( model, RAM(eigenvaluelowerbound=0.1, eigenvalueupperbound=2.0), 10000; chaintype=Chains, numwarmup, progress=false, initialparams=zeros(2) );\n\njulia> norm(cov(Array(chain)) - [1.0 0.5; 0.5 1.0]) < 0.2 true ````\n\nReferences\n\n[VIH12]: Vihola (2012) Robust adaptive Metropolis algorithm with coerced acceptance rate, Statistics and computing.\n\n\n\n\n\n","category":"type"},{"location":"#AdvancedMH.jl","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"","category":"section"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"(Image: Stable) (Image: Dev) (Image: AdvancedMH-CI)","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"AdvancedMH.jl currently provides a robust implementation of random walk Metropolis-Hastings samplers.","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"Further development aims to provide a suite of adaptive Metropolis-Hastings implementations.","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"AdvancedMH works by allowing users to define composable Proposal structs in different formats.","category":"page"},{"location":"#Usage","page":"AdvancedMH.jl","title":"Usage","text":"","category":"section"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"First, construct a DensityModel, which is a wrapper around the log density function for your inference problem. The DensityModel is then used in a sample call.","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"# Import the package.\nusing AdvancedMH\nusing Distributions\nusing MCMCChains\n\nusing LinearAlgebra\n\n# Generate a set of data from the posterior we want to estimate.\ndata = rand(Normal(0, 1), 30)\n\n# Define the components of a basic model.\ninsupport(θ) = θ[2] >= 0\ndist(θ) = Normal(θ[1], θ[2])\ndensity(θ) = insupport(θ) ? 
sum(logpdf.(dist(θ), data)) : -Inf\n\n# Construct a DensityModel.\nmodel = DensityModel(density)\n\n# Set up our sampler with a joint multivariate Normal proposal.\nspl = RWMH(MvNormal(zeros(2), I))\n\n# Sample from the posterior.\nchain = sample(model, spl, 100000; param_names=[\"μ\", \"σ\"], chain_type=Chains)","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"Output:","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"Object of type Chains, with data of type 100000×3×1 Array{Float64,3}\n\nIterations = 1:100000\nThinning interval = 1\nChains = 1\nSamples per chain = 100000\ninternals = lp\nparameters = μ, σ\n\n2-element Array{ChainDataFrame,1}\n\nSummary Statistics\n\n│ Row │ parameters │ mean │ std │ naive_se │ mcse │ ess │ r_hat │\n│ │ Symbol │ Float64 │ Float64 │ Float64 │ Float64 │ Any │ Any │\n├─────┼────────────┼──────────┼──────────┼─────────────┼────────────┼─────────┼─────────┤\n│ 1 │ μ │ 0.156152 │ 0.19963 │ 0.000631285 │ 0.00323033 │ 3911.73 │ 1.00009 │\n│ 2 │ σ │ 1.07493 │ 0.150111 │ 0.000474693 │ 0.00240317 │ 3707.73 │ 1.00027 │\n\nQuantiles\n\n│ Row │ parameters │ 2.5% │ 25.0% │ 50.0% │ 75.0% │ 97.5% │\n│ │ Symbol │ Float64 │ Float64 │ Float64 │ Float64 │ Float64 │\n├─────┼────────────┼──────────┼───────────┼──────────┼──────────┼──────────┤\n│ 1 │ μ │ -0.23361 │ 0.0297006 │ 0.159139 │ 0.283493 │ 0.558694 │\n│ 2 │ σ │ 0.828288 │ 0.972682 │ 1.05804 │ 1.16155 │ 1.41349 │\n","category":"page"},{"location":"#Usage-with-[LogDensityProblems.jl](https://github.com/tpapp/LogDensityProblems.jl)","page":"AdvancedMH.jl","title":"Usage with LogDensityProblems.jl","text":"","category":"section"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"Alternatively, you can define your model with the LogDensityProblems.jl interface:","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"using LogDensityProblems\n\n# Use a struct instead of 
`typeof(density)` for sake of readability.\nstruct LogTargetDensity end\n\nLogDensityProblems.logdensity(p::LogTargetDensity, θ) = density(θ) # standard multivariate normal\nLogDensityProblems.dimension(p::LogTargetDensity) = 2\nLogDensityProblems.capabilities(::LogTargetDensity) = LogDensityProblems.LogDensityOrder{0}()\n\nsample(LogTargetDensity(), spl, 100000; param_names=[\"μ\", \"σ\"], chain_type=Chains)","category":"page"},{"location":"#Proposals","page":"AdvancedMH.jl","title":"Proposals","text":"","category":"section"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"AdvancedMH offers various methods of defining your inference problem. Behind the scenes, a MetropolisHastings sampler simply holds some set of Proposal structs. AdvancedMH will return posterior samples in the \"shape\" of the proposal provided – currently supported methods are Array{Proposal}, Proposal, and NamedTuple{Proposal}. For example, proposals can be created as:","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"# Provide a univariate proposal.\nm1 = DensityModel(x -> logpdf(Normal(x,1), 1.0))\np1 = StaticProposal(Normal(0,1))\nc1 = sample(m1, MetropolisHastings(p1), 100; chain_type=Vector{NamedTuple})\n\n# Draw from a vector of distributions.\nm2 = DensityModel(x -> logpdf(Normal(x[1], x[2]), 1.0))\np2 = StaticProposal([Normal(0,1), InverseGamma(2,3)])\nc2 = sample(m2, MetropolisHastings(p2), 100; chain_type=Vector{NamedTuple})\n\n# Draw from a `NamedTuple` of distributions.\nm3 = DensityModel(x -> logpdf(Normal(x.a, x.b), 1.0))\np3 = (a=StaticProposal(Normal(0,1)), b=StaticProposal(InverseGamma(2,3)))\nc3 = sample(m3, MetropolisHastings(p3), 100; chain_type=Vector{NamedTuple})\n\n# Draw from a functional proposal.\nm4 = DensityModel(x -> logpdf(Normal(x,1), 1.0))\np4 = StaticProposal((x=1.0) -> Normal(x, 1))\nc4 = sample(m4, MetropolisHastings(p4), 100; 
chain_type=Vector{NamedTuple})","category":"page"},{"location":"#Static-vs.-Random-Walk","page":"AdvancedMH.jl","title":"Static vs. Random Walk","text":"","category":"section"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"Currently only two methods of inference are available: static and random walk. Static MH simply draws from the proposal distribution, with no conditioning on the previous sample. Random-walk MH adds the proposal draw to the previously accepted value. If you are constructing a Proposal by hand, you can determine whether the proposal is a StaticProposal or a RandomWalkProposal using","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"static_prop = StaticProposal(Normal(0,1))\nrw_prop = RandomWalkProposal(Normal(0,1))","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"Different methods are easily composable. One parameter can be static and another can be a random walk, each of which may be drawn from separate distributions.","category":"page"},{"location":"#Multiple-chains","page":"AdvancedMH.jl","title":"Multiple chains","text":"","category":"section"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"AdvancedMH.jl implements the AbstractMCMC interface, which means sampling of multiple chains is supported for free:","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"# Sample 4 chains from the posterior serially, without thread or process parallelism.\nchain = sample(model, spl, MCMCSerial(), 100000, 4; param_names=[\"μ\",\"σ\"], chain_type=Chains)\n\n# Sample 4 chains from the posterior using multiple threads.\nchain = sample(model, spl, MCMCThreads(), 100000, 4; param_names=[\"μ\",\"σ\"], chain_type=Chains)\n\n# Sample 4 chains from the posterior using multiple processes.\nchain = sample(model, spl, MCMCDistributed(), 100000, 4; param_names=[\"μ\",\"σ\"], 
chain_type=Chains)","category":"page"},{"location":"#Metropolis-adjusted-Langevin-algorithm-(MALA)","page":"AdvancedMH.jl","title":"Metropolis-adjusted Langevin algorithm (MALA)","text":"","category":"section"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"AdvancedMH.jl also offers an implementation of MALA if the ForwardDiff and DiffResults packages are available. ","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"A MALA sampler can be constructed by MALA(proposal) where proposal is a function that takes the gradient computed at the current sample. It is required to specify an initial sample initial_params when calling sample.","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"# Import the package.\nusing AdvancedMH\nusing Distributions\nusing MCMCChains\nusing ForwardDiff\nusing StructArrays\n\nusing LinearAlgebra\n\n# Generate a set of data from the posterior we want to estimate.\ndata = rand(Normal(0, 1), 30)\n\n# Define the components of a basic model.\ninsupport(θ) = θ[2] >= 0\ndist(θ) = Normal(θ[1], θ[2])\ndensity(θ) = insupport(θ) ? sum(logpdf.(dist(θ), data)) : -Inf\n\n# Construct a DensityModel.\nmodel = DensityModel(density)\n\n# Set up the sampler with a multivariate Gaussian proposal.\nσ² = 0.01\nspl = MALA(x -> MvNormal((σ² / 2) .* x, σ² * I))\n\n# Sample from the posterior.\nchain = sample(model, spl, 100000; initial_params=ones(2), chain_type=StructArray, param_names=[\"μ\", \"σ\"])","category":"page"},{"location":"#Usage-with-[LogDensityProblems.jl](https://github.com/tpapp/LogDensityProblems.jl)-2","page":"AdvancedMH.jl","title":"Usage with LogDensityProblems.jl","text":"","category":"section"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"As above, we can define the model with the LogDensityProblems.jl interface. 
We can implement the gradient of the log density function manually, or use LogDensityProblemsAD.jl to provide us with the gradient computation used in MALA. Using our implementation of the LogDensityProblems.jl interface above:","category":"page"},{"location":"","page":"AdvancedMH.jl","title":"AdvancedMH.jl","text":"using LogDensityProblemsAD\nmodel_with_ad = LogDensityProblemsAD.ADgradient(Val(:ForwardDiff), LogTargetDensity())\nsample(model_with_ad, spl, 100000; initial_params=ones(2), chain_type=StructArray, param_names=[\"μ\", \"σ\"])","category":"page"}] }