Missing prediction #409

Open
Sepideh-Adamiat opened this issue Jan 21, 2025 · 1 comment

Comments

@Sepideh-Adamiat
Contributor

I made a custom node and want to check the message forwarded out of it by using it in a model. Here is the implementation:

using RxInfer

struct MyNode end
@node MyNode Stochastic [out, in1, in2]

# rule specification
@rule MyNode(:out, Marginalisation) (m_in1::UnivariateNormalDistributionsFamily, m_in2::UnivariateNormalDistributionsFamily) = begin
    min1, vin1 = mean_var(m_in1)
    min2, vin2 = mean_var(m_in2)
    return NormalMeanVariance(min1 + min2, vin1 + vin2)
end

@rule MyNode(:in1, Marginalisation) (m_out::UnivariateNormalDistributionsFamily, m_in2::UnivariateNormalDistributionsFamily) = begin
    min2, vin2 = mean_var(m_in2)
    mout, vout = mean_var(m_out)
    return NormalMeanVariance(mout - min2, vout + vin2) 
end

@rule MyNode(:in2, Marginalisation) (m_out::UnivariateNormalDistributionsFamily, m_in1::UnivariateNormalDistributionsFamily) = begin
    min1, vin1 = mean_var(m_in1)
    mout, vout = mean_var(m_out)
    return NormalMeanVariance(mout - min1, vout + vin1) 
end

@rule MyNode(:in1, Marginalisation) (q_out::Any, m_in2::UnivariateNormalDistributionsFamily) = begin
    min2, vin2 = mean_var(m_in2)
    return NormalMeanVariance(mean(q_out) - min2, vin2)
end

@rule MyNode(:in2, Marginalisation) (q_out::Any, m_in1::UnivariateNormalDistributionsFamily) = begin
    min1, vin1 = mean_var(m_in1)
    return NormalMeanVariance(mean(q_out) - min1, vin1)
end
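As a side note, the rules above implement Gaussian belief propagation for an additive relation y = in1 + in2: the forward message adds means and variances, while the backward messages subtract the mean and still add variances. A quick plain-Julia sanity check (no RxInfer needed; the tuples here are just hypothetical stand-ins for (mean, variance) pairs):

```julia
# Messages as (mean, variance) tuples; mirrors the @rule bodies above.
forward(m1, m2)  = (m1[1] + m2[1], m1[2] + m2[2])  # message towards :out
backward(mo, m2) = (mo[1] - m2[1], mo[2] + m2[2])  # message towards :in1 (or :in2)

m_in1 = (2.0, 1.0)  # prior on A in the model below
m_in2 = (1.0, 1.0)  # prior on B

forward(m_in1, m_in2)        # → (3.0, 2.0)
backward((3.0, 2.0), m_in2)  # → (2.0, 3.0)
```

The forward result (3.0, 2.0) is what the prediction for y should be, so any `missing` returned by inference points at the rules not being called rather than at the rules themselves.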
@model function My_model(y)
    A ~ NormalMeanVariance(2.0, 1.0)
    B ~ NormalMeanVariance(1.0, 1.0)
    y ~ MyNode(A, B)
end

result = infer(
    model = My_model(),
    predictvars = (y = KeepLast(), ),
)

But when I retrieve the prediction for y using:

result.predictions[:y]

it returns missing. As an alternative, I also tried

result = infer(
    model = My_model(),
    data  = (y = missing , ),
)

The result is the same.

@bvdmitri
Member

I will repeat my message from Slack here for better visibility. We may fix this in the future; for now, here is a workaround:

julia> result = infer(
           model = My_model(),
           data = (y = UnfactorizedData(missing), ),
       )
Inference results:
  Posteriors       | available for (A, B)
  Predictions      | available for (y)


julia> result.predictions
Dict{Symbol, NormalMeanVariance{Float64}} with 1 entry:
  :y => NormalMeanVariance{Float64}(μ=3.0, v=2.0)

This is an unfortunate detail of the implementation. Because y is treated as a data entry, the automatic constraints build a structured factorization for your model, which means the BP rules you wrote are never called. Instead, RxInfer attempts to compute the joint marginal over A and B, but since the data is missing there is no information to compute it, so the inference engine happily returns missing as the result.

Not great at all, but `UnfactorizedData` was an attempt to override the default behavior and force the BP rules. It is referenced here and was a way to fix a broken example where predictions were completely off due to a similar issue.
