# Weighted loglikelihoods in mixture model

I’d like to sample from a Gaussian mixture model where each observation `x[n]` has a non-negative weight `w[n]`, i.e. each observation’s log-likelihood should be scaled by its corresponding weight. How can I incorporate the weights into the following script, so that samples are drawn from the weighted observations?

``````stan
for (n in 1:N)
  target += log_mix(lambda,
                    normal_lpdf(x[n] | mu[1], sigma[1]),
                    normal_lpdf(x[n] | mu[2], sigma[2]));
``````

Hey, the general guidance is not to do this kind of weighting in Bayesian models, because the resulting model can’t really be said to be generative: once you’ve fitted it, it can’t generate any data, since generating data would require the weights.

That said, there’s a discussion here. You can do it by:

(A) just multiplying each observation’s contribution to the target by its weight, i.e.

``````stan
for (n in 1:N)
  target += log_mix(lambda,
                    normal_lpdf(x[n] | mu[1], sigma[1]),
                    normal_lpdf(x[n] | mu[2], sigma[2])) * w[n];
``````
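For intuition on why (A) behaves like weighting the data: with integer weights, scaling each observation’s mixture log-likelihood by `w[n]` gives exactly the same target as a dataset in which each `x[n]` is replicated `w[n]` times. A quick numerical sketch in Python (the parameter values here are arbitrary illustrations, not from the original post):

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

def mix_lpdf(x, lam, mu, sigma):
    # Two-component mixture log-density, matching Stan's
    # log_mix(lam, lpdf1, lpdf2) = log_sum_exp(log(lam)+lpdf1, log1m(lam)+lpdf2)
    return logsumexp([np.log(lam) + norm.logpdf(x, mu[0], sigma[0]),
                      np.log(1 - lam) + norm.logpdf(x, mu[1], sigma[1])])

lam, mu, sigma = 0.3, [0.0, 5.0], [1.0, 2.0]
x = np.array([0.7, 4.2, -1.3])
w = np.array([2, 1, 3])  # integer weights, for the replication argument

# Weighted target, as in (A)
weighted = sum(w[n] * mix_lpdf(x[n], lam, mu, sigma) for n in range(len(x)))

# Same target from a dataset with each x[n] repeated w[n] times
replicated = sum(mix_lpdf(xr, lam, mu, sigma) for xr in np.repeat(x, w))

print(np.isclose(weighted, replicated))  # the two targets coincide
```

Non-integer weights interpolate this behaviour, which is exactly why the result is no longer generative: no actual dataset corresponds to a fractional replication count.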

(B) or you may be able to model it as if your variance is heteroscedastic, shrinking the variance for points with bigger weights, i.e.

``````stan
for (n in 1:N)
  target += log_mix(lambda,
                    normal_lpdf(x[n] | mu[1], sigma[1] / w[n]),
                    normal_lpdf(x[n] | mu[2], sigma[2] / w[n]));
``````
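One caveat on (B), as an aside beyond the original answer: for a single normal component, multiplying the log-density by `w` matches dividing `sigma` by `sqrt(w)` (not by `w`) up to an additive constant that doesn’t involve `mu`, so `sigma / w[n]` shrinks more aggressively than the log-likelihood weighting in (A). A sketch of that identity, with arbitrary illustrative values:

```python
import numpy as np
from scipy.stats import norm

w, sigma, x = 3.0, 2.0, 0.7

def diff(mu):
    # w * log N(x | mu, sigma)  minus  log N(x | mu, sigma / sqrt(w))
    return w * norm.logpdf(x, mu, sigma) - norm.logpdf(x, mu, sigma / np.sqrt(w))

# The difference is constant in mu, so both targets have the same
# shape as a function of mu and yield the same inference for it.
print(np.isclose(diff(0.0), diff(5.0)))
```

For a mixture the two approaches differ more substantially, since the weight in (A) sits outside `log_mix` while (B) rescales each component inside it.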