This repository has been archived by the owner on Nov 17, 2023. It is now read-only.
Dear all, the title says it all: is there a recommended way to implement a Gaussian Mixture Model (with trainable pi, mu, log_sigma) using the tools in gluon.probability.distribution? (Kindly asking for help from the expert @xidulu.) Regards
Answered by xidulu on Mar 8, 2021
Answer selected by feevos
@feevos
My recommended way is to use EM:
at the E step, you update pi using the per-component responsibilities, whose probabilities can be computed with MultivariateNormal;
at the M step, you update mu and log_sigma by running a few optimization steps on the (responsibility-weighted) negative log-likelihood, which can also be implemented easily using distribution.MultivariateNormal.
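A framework-agnostic sketch of this EM scheme, in plain NumPy so it runs anywhere (in gluon.probability you would replace the hand-written log density with MultivariateNormal(...).log_prob and the manual gradients with autograd). The function names, the diagonal-covariance simplification, and the hyperparameters below are my own assumptions, not from the thread:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_normal_pdf(x, mu, log_sigma):
    # Diagonal-covariance Gaussian log density.
    # x: (N, D); mu, log_sigma: (K, D) -> returns (N, K)
    var = np.exp(2.0 * log_sigma)                          # (K, D)
    diff2 = (x[:, None, :] - mu[None, :, :]) ** 2          # (N, K, D)
    return -0.5 * np.sum(diff2 / var + 2.0 * log_sigma + np.log(2.0 * np.pi), axis=-1)

def em_step(x, pi, mu, log_sigma, lr=0.05, n_grad_steps=10):
    # E step: responsibilities r[n, k], then update the mixture weights pi.
    log_p = log_normal_pdf(x, mu, log_sigma) + np.log(pi)  # (N, K)
    log_p -= log_p.max(axis=1, keepdims=True)              # numerical stability
    r = np.exp(log_p)
    r /= r.sum(axis=1, keepdims=True)
    pi = r.mean(axis=0)
    # M step: a few gradient steps on the responsibility-weighted
    # negative log-likelihood, updating mu and log_sigma.
    for _ in range(n_grad_steps):
        var = np.exp(2.0 * log_sigma)
        diff = x[:, None, :] - mu[None, :, :]              # (N, K, D)
        grad_mu = -np.einsum('nk,nkd->kd', r, diff / var)
        grad_ls = -np.einsum('nk,nkd->kd', r, diff ** 2 / var - 1.0)
        mu = mu - lr * grad_mu / len(x)
        log_sigma = log_sigma - lr * grad_ls / len(x)
    return pi, mu, log_sigma

# Toy data: two well-separated 2-D clusters, K = 2 components.
x = np.concatenate([rng.normal(-3, 1, (200, 2)), rng.normal(3, 1, (200, 2))])
pi = np.full(2, 0.5)
mu = rng.normal(0, 1, (2, 2))
log_sigma = np.zeros((2, 2))
for _ in range(50):
    pi, mu, log_sigma = em_step(x, pi, mu, log_sigma)
```

With a Gluon MultivariateNormal, the M step would instead record the weighted NLL inside an autograd scope and call a Trainer step on the mu and log_sigma parameters.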