Replies: 3 comments 1 reply
-
This is not possible in the current version; the error message should be improved, though. A workaround is x = datavar(Vector{Float64})
...
Ax ~ ContinuousTransition(x, a, W) where {meta = ContinuousTransitionMeta(squaremat)}
This might lead to problems with inference, though, because I'm not sure if …
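To make the meta transformation concrete, here is a minimal plain-Julia sketch (no RxInfer required; the dimensions are illustrative) of what a squaremat-style function does: it maps the flattened entries a back into the matrix A that multiplies the state.

```julia
# Sketch: the transformation passed to ContinuousTransitionMeta
# reshapes the flattened entries `a` into the transition matrix A.
in_dim, out_dim = 2, 2
squaremat = a -> reshape(a, in_dim, out_dim)

a = [1.0, 2.0, 3.0, 4.0]   # flattened entries of A (Julia is column-major)
A = squaremat(a)           # 2×2 matrix [1.0 3.0; 2.0 4.0]

x = [1.0, 0.0]
A * x                      # A acting on a state vector, here [1.0, 2.0]
```

Note that reshape is column-major, so a holds the columns of A stacked on top of each other.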
-
@hv10 no worries! RxInfer.jl can appear hard, especially for newcomers to Bayesian inference. I am glad you've discovered it! I will drop a way you can make use of CTransition in the current version of RxInfer:
in_dim, out_dim = 2, 2
squaremat = a -> reshape(a, in_dim, out_dim)
@model function online_ctransition(out_dim)
    μx_prev = datavar(Vector{Float64})
    Σx_prev = datavar(Matrix{Float64})
    μa = datavar(Vector{Float64})
    Σa = datavar(Matrix{Float64})
    y = datavar(Vector{Float64})

    a ~ MvNormalMeanCovariance(μa, Σa)
    x_prev ~ MvNormalMeanCovariance(μx_prev, Σx_prev)
    x ~ CTransition(x_prev, a, diageye(out_dim)) where {meta = CTMeta(squaremat)}
    y ~ MvNormalMeanCovariance(x, tiny * diageye(out_dim))
end
# so we need to impose joint constraints on the state within CTransition
online_constraints = @constraints begin
q(x_prev, x, a) = q(x_prev, x)q(a)
end;
# as you want autoupdates, we need to define them
autoupdates = @autoupdates begin
μx_prev, Σx_prev = mean_cov(q(x))
μa, Σa = mean_cov(q(a))
end;
# we need to define initial marginals as we imposed factorization
a = MvNormalMeanCovariance(zeros(in_dim*out_dim), diageye(in_dim*out_dim))
initmarginals = (a = a, x=MvNormalMeanCovariance(zeros(out_dim), diageye(out_dim)))
# some dummy data
dumb_data = [randn(out_dim) for _ in 1:100]
# as I don't have a stream, I will use a static one
static_datastream = from(dumb_data) |> map(NamedTuple{(:y,), Tuple{Vector{Float64}}}, (d) -> (y = d, ))
engine = infer(
    model = online_ctransition(out_dim),
    constraints = online_constraints,
    datastream = static_datastream,
    autoupdates = autoupdates,
    keephistory = length(dumb_data),
    historyvars = (x = KeepLast(), a = KeepLast()),
    initmarginals = initmarginals,
    iterations = 10,
    free_energy = true,
    autostart = true,
    free_energy_diagnostics = nothing,
)
As it's an "online" setting, your posteriors will be stored in the history variable, e.g. …
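For completeness, this is roughly how I would pull the results out afterwards; it is a sketch assuming the streaming-inference engine exposes a history field keyed by variable name (as configured via keephistory/historyvars above), so adjust to your RxInfer version:

```julia
# Sketch (assumes the `engine` from the infer call above):
xs = engine.history[:x]   # per-timestep posteriors over the state x
as = engine.history[:a]   # per-timestep posteriors over the entries of A

# e.g. a point estimate of A after the last observation,
# using the same reshape convention as `squaremat`:
A_est = reshape(mean(last(as)), in_dim, out_dim)
```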
-
That really helped clear up some questions :)
-
Hi,
I am fairly new to the whole Bayesian learning field, therefore I have some (probably stupid) questions.
Within the model I am trying to infer the values of a matrix A; I thought it would be sensible to combine a ContinuousTransition with an MvNormal prior over its entries.
The (relevant) code in question looks roughly like this:
I use this code in the datastream setting, therefore using autoupdates to update μ based on the mean of q(a). Now I am encountering the following issues:
… x, leading to a very similar error.
When replacing the datavars with static vectors (e.g. rand(4)) it can create the model without issues.
I am probably misunderstanding the role of datavars, or whether the underlying goal of the modelling is even sensible, and I would be happy for any guidance you can give me on how to fix / proceed / change the approach.
Thanks, Noel