Fit a mixture distribution with a given variance

I’m trying to model data with a mixture of two exponentials with means u1 and u2 and mixing probabilities p and 1-p.

It’s important that the fitted distribution have a mean of 1 (which I enforce in my model by making u2 a function of u1 and p) and a given variance.
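To make the mean constraint concrete (this sketch is not from the original post; the values of p and u1 are arbitrary illustrations): solving p·u1 + (1-p)·u2 = 1 for u2 gives u2 = (1 - p·u1)/(1 - p), and since an exponential with mean u has E[X²] = 2u², the mixture variance has a closed form as well:

```python
# Hypothetical parameter values, purely for illustration.
p, u1 = 0.3, 0.5

# Mean-1 constraint: p*u1 + (1-p)*u2 = 1  =>  u2 = (1 - p*u1) / (1 - p)
u2 = (1 - p * u1) / (1 - p)
mean = p * u1 + (1 - p) * u2  # = 1 by construction

# For an exponential with mean u, E[X^2] = 2*u^2, so the mixture variance is
# Var = 2*(p*u1^2 + (1-p)*u2^2) - mean^2
var = 2 * (p * u1**2 + (1 - p) * u2**2) - 1.0
```

With u1 constrained to (0, 1), u2 ≥ 1 follows automatically, and the variance is always at least 1 (the variance of a single unit-mean exponential).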

If I simply fit the model below, the variance of the resulting distribution is too low relative to the observed data, likely for two reasons:

  1. There is autocorrelation and dependence on latent variables.
  2. A higher number of exponentials would be a better fit, but I want to stick with a simpler model.

I will eventually be modelling the autocorrelation and latent variables, so it won’t be possible to derive constraints on p or u1 to match a given variance in the same way I did for u2 to get a mean of 1.

Any suggestions on how to get the variance of the distribution fitted below to match a given value?

```stan
data {
  int<lower=0> N;
  array[N] real<lower=0> y;
}
parameters {
  real<lower=0,upper=1> p;
  real<lower=0,upper=1> u1;
}
model {
  real u2 = (1 - p * u1) / (1 - p);  // enforces a mixture mean of 1
  p ~ beta(2, 2);
  u1 ~ beta(2, 2);
  for (i in 1:N) {
    target += log_mix(p, exponential_lpdf(y[i] | 1 / u1),
                         exponential_lpdf(y[i] | 1 / u2));
  }
}
```
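One possibility for the standalone model above (a sketch, not from the original post, and it won't carry over once autocorrelation and latent variables are added): with the mean fixed at 1, the closed-form mixture variance 2(p·u1² + (1-p)·u2²) - 1 is strictly decreasing in u1 on (0, 1), so for any fixed p you can solve for the u1 that hits a target variance, e.g. by bisection:

```python
def mixture_var(p, u1):
    """Variance of the two-exponential mixture with its mean constrained to 1."""
    u2 = (1 - p * u1) / (1 - p)
    return 2 * (p * u1**2 + (1 - p) * u2**2) - 1.0

def solve_u1(p, target_var, lo=1e-9, hi=1.0 - 1e-9, tol=1e-12):
    """Bisect for the u1 in (0, 1) giving target_var at fixed p.

    mixture_var is decreasing in u1 on (0, 1): at u1 -> 1 the mixture
    collapses to a single unit-mean exponential (variance 1), and at
    u1 -> 0 the variance rises to 2/(1-p) - 1. target_var must lie in
    that interval for a solution to exist.
    """
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mixture_var(p, mid) > target_var:
            lo = mid  # variance still too high: move u1 upward
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

This turns the variance constraint into a deterministic transform (like the one already used for u2), leaving only one free parameter. Alternatively, a soft constraint on the implied variance in the model block would keep both p and u1 free while pulling the fit toward the target variance.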