I’m developing an analysis where I need a parameter with a uniform distribution that interacts with both the “transformed data” block and the “transformed parameters” block. I’m referring to lambda in the code below.
data
{
int<lower=1> N;
array[N] int y;
array[N] int n;
array[N] real x;
}
transformed data
{
vector[N] x2;
for (i in 1:N){
if (lambda != 0) {
x2[i] = (pow(x[i], lambda) - 1) / (lambda);
}else{
x2[i] = log(x[i]);
}
}
}
parameters
{
real beta0;
real beta1;
real lambda;
}
transformed parameters
{
array[N] real p;
for (c in 1:N)
{
p[c] = inv_logit(beta0 + beta1 * x2[c] * lambda);
}
}
model
{
beta0 ~ normal(0, 100);
beta1 ~ normal(0, 100);
lambda ~ uniform(-5, 5);
for(j in 1:N)
{
y[j] ~ binomial(n[j], p[j]);
}
}
The error displayed is:
Semantic error in ‘string’, line 12, column 10 to column 16:
Identifier ‘lambda’ not in scope.
The error is telling you that transformed data is computed once from the data, before sampling, so it cannot reference the parameter lambda. Note also that if lambda = 0 then beta1 * (pow(x[i], lambda) - 1) = 0, as in the original code. You can then delete the transformed data block.
As side notes:
If you are not interested in saving the p estimates directly, the model will probably be faster without the loop and with binomial_logit . This would also get rid of the loop over y and the transformed parameters block.
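That side note could look something like the sketch below (untested; it assumes lambda is declared as real<lower=-5, upper=5> in the parameters block, so the uniform(-5, 5) prior is implied by the bounds, and it computes x2 as a local variable in the model block, since x2 depends on the parameter lambda):

```stan
model
{
  // local variable: recomputed at each log-density evaluation, never saved
  vector[N] x2;
  for (i in 1:N) {
    x2[i] = (lambda != 0) ? (pow(x[i], lambda) - 1) / lambda : log(x[i]);
  }
  beta0 ~ normal(0, 100);
  beta1 ~ normal(0, 100);
  // no explicit prior statement needed for lambda: the
  // <lower=-5, upper=5> bounds already imply uniform(-5, 5)

  // vectorised likelihood on the logit scale, replacing the loop over y
  y ~ binomial_logit(n, beta0 + beta1 * lambda * x2);
}
```

With binomial_logit the inv_logit call disappears as well, since the second argument is taken on the logit scale.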
Hello! I am very grateful for your help, thank you very much! But in the block below I have an if/else where, if lambda = 0, log(x[i]) is calculated instead.
transformed data
{
vector[N] x2;
for (i in 1:N){
if (lambda != 0) {
x2[i] = (pow(x[i], lambda) - 1) / (lambda); // here the expression is divided by lambda
}else{
x2[i] = log(x[i]);
}
}
}
Thinking about the way you wrote it, how could I implement the “if/else” in that style?
Hello @stijn thank you very much for your help!
Below is the code working as expected.
data
{
int<lower=1> N;
array[N] int y;
array[N] int n;
array[N] real x;
}
parameters
{
real beta0;
real beta1;
real<lower=-5, upper=5> lambda;
}
transformed parameters
{
array[N] real p;
vector[N] x2;
for (i in 1:N)
{
if (lambda != 0)
{
x2[i] = (pow(x[i], lambda) - 1) / lambda;
}
else
{
x2[i] = log(x[i]);
}
}
for (c in 1:N)
{
p[c] = inv_logit(beta0 + beta1 * x2[c] * lambda);
}
}
model
{
beta0 ~ normal(0, 100);
beta1 ~ normal(0, 100);