Observing a non-linear function of latent variables

I gave some thought to whether a Jacobian adjustment is needed in this model, and I now believe it is not. Since this thread has grown long, here is the model I'm going to comment on.

data {
  int<lower=0> n;
  vector<lower=0,upper=1>[n] w;
  vector[n] V;
}
transformed data {
  vector[n] lb_y = log(fmax(-V./(1 - w), 0)); // note: log(0) = -Inf
}
parameters {
  real mean_x, mean_y;
  real<lower=0> sd_x, sd_y;
  vector<lower=lb_y>[n] y; // elementwise lower bound keeps exp_x positive
}
transformed parameters {
  // deterministic transform of y given the data; positive by construction
  vector<lower=0>[n] exp_x = (V + (1 - w).*exp(y))./w;
}
model {
  // prior
  mean_x ~ normal(0, 1);
  sd_x ~ normal(0, 1);
  mean_y ~ normal(0, 1);
  sd_y ~ normal(0, 1);
  y ~ normal(mean_y, sd_y);

  // likelihood
  exp_x ~ lognormal(mean_x, sd_x);

  // target += ???; // a Jacobian adjustment is *not* needed in this model
}
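As a side check on the bound computed in `transformed data`, here is a small NumPy sketch (with made-up `w` and `V` values, not anything from this thread) confirming that `y > lb_y` implies `exp_x > 0`, with `exp_x` reaching exactly zero at a finite bound:

```python
import numpy as np

# Made-up stand-ins for the data block; not values from the thread.
w = np.array([0.3, 0.6, 0.9])
V = np.array([-0.5, 0.2, -1.0])

# Lower bound from transformed data: y must exceed log(-V / (1 - w)),
# which is -inf (i.e., no constraint) whenever V >= 0.
with np.errstate(divide="ignore"):
    lb_y = np.log(np.maximum(-V / (1 - w), 0.0))

# Any y strictly above its bound gives a strictly positive exp_x ...
y = np.where(np.isfinite(lb_y), lb_y + 0.1, 0.0)
exp_x = (V + (1 - w) * np.exp(y)) / w
print(np.all(exp_x > 0))  # True

# ... and exp_x hits exactly 0 at a finite bound, since exp(lb_y) = -V / (1 - w).
print(np.isclose((V[0] + (1 - w[0]) * np.exp(lb_y[0])) / w[0], 0.0))  # True
```

This is just the algebra behind the declared constraints, so Stan's validation of `exp_x`'s `<lower=0>` declaration should never fire for admissible `y`.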

The reason a Jacobian adjustment is not needed is simple: we impose priors on precisely the parameters of the model, i.e., on the five variables declared in the parameters block, so no density is being defined through a change of variables. The transformed parameter exp_x enters only through the likelihood term, and evaluating a likelihood at a deterministic function of the parameters does not require a Jacobian adjustment. (I'd love to hear from @NikVetr or @Bob_Carpenter to put this example to bed.)
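For concreteness, writing `mean_x, sd_x, mean_y, sd_y` as $\mu_x, \sigma_x, \mu_y, \sigma_y$, the sampling statements above accumulate (up to constants) the following log density, which is what the `~` statements literally increment:

```latex
\log p(\mu_x, \sigma_x, \mu_y, \sigma_y, y \mid V, w)
  = \log \operatorname{normal}(\mu_x \mid 0, 1)
  + \log \operatorname{normal}(\sigma_x \mid 0, 1)
  + \log \operatorname{normal}(\mu_y \mid 0, 1)
  + \log \operatorname{normal}(\sigma_y \mid 0, 1)
  + \sum_{i=1}^{n} \log \operatorname{normal}(y_i \mid \mu_y, \sigma_y)
  + \sum_{i=1}^{n} \log \operatorname{lognormal}\!\left( \frac{V_i + (1 - w_i)\, e^{y_i}}{w_i} \,\middle|\, \mu_x, \sigma_x \right)
  + \text{const}.
```

Note there is no $\log \lvert d\,\texttt{exp\_x}_i / d y_i \rvert$ term anywhere. (The Jacobian of the unconstraining transform implied by `<lower=lb_y>` on `y`, like that of any declared constraint, is handled by Stan automatically and is a separate matter.)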