Problems using the generated quantities block

I'm building a standard linear regression model and I want to include a generated quantities block that uses the dot_self() function. The problem is that I can't get simulation samples. The error is: Stan model 'LinearRegression' does not contain samples. I think the function dot_self() is not being recognized as a function. I show the Stan code and R code below. Thanks in advance.

Note: I am sure the data I'm passing in is correct, because the model without the generated quantities block works perfectly.


data {
  int<lower=1> N;
  int<lower=1> K;
  matrix[N, K] X;
  vector[N] y;
}
parameters {
  vector[K] beta;
  real<lower=0> sigma;
}
model {
  vector[N] mu;
  mu = X * beta;

  beta ~ normal(0, 10);
  sigma ~ cauchy(0, 5);
  y ~ normal(mu, sigma);
}
generated quantities {
  real rss;
  real totalss;
  real<lower=0, upper=1> R2;
  vector[N] mu;
  mu = X * beta;
  rss = dot_self(y - mu);
  totalss = dot_self(y - mean(y));
  R2 = 1 - rss / totalss;
}



dat = list(N = N, K = ncol(X), y = y, X = X)
fit3 = stan(file = "C:.... LinearRegression.stan", data = dat, iter = 100, chains = 4)

print(fit3, digits=3, prob=c(.025,.5,.975))

Hi @Vibass7, welcome to the Stan forums.

Can you try running your model with the argument cores=1 and checking what the error message is? When using multiple cores some of the error messages can get suppressed, so it's possible that there is some useful information about the error that isn't being displayed.

Thanks @jonah! I ran my model with cores=1 and now I get these errors.

[1] "Error in sampler$call_sampler(args_list[[i]]) : "                                                                                                                                                              
[2] "  Exception: model307034ec1faa_LinearRegression_namespace::write_array: R2 is -6.02356, but must be greater than or equal to 0  (in 'model307034ec1faa_LinearRegression' at line 21)"
[3] "In addition: Warning message:"                                                                                                                                                                                 
[4] "In system(paste(CXX, ARGS), ignore.stdout = TRUE, ignore.stderr = TRUE) :"                                                                                                                                     
[5] "  'C:/rtools40/usr/mingw_/bin/g++' not found"                                                                                                                                                                  
[1] "error occurred during calling the sampler; sampling not done"

And this WARNING

Chain 1: Gradient evaluation took 0.001 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 10 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: WARNING: There aren't enough warmup iterations to fit the
Chain 1:          three stages of adaptation as currently configured.
Chain 1:          Reducing each adaptation stage to 15%/75%/10% of
Chain 1:          the given number of warmup iterations:
Chain 1:            init_buffer = 7
Chain 1:            adapt_window = 38
Chain 1:            term_buffer = 5

Sorry it took me so long to respond! I try to bookmark posts I need to respond to but forgot for this one.

This is the key part of the error. It looks like you're getting negative values for something declared to be greater than or equal to 0. In this case it must be that rss/totalss is greater than 1, so that when you subtract it from 1 you get a negative number. Have you seen this paper we have on Bayesian R^2?

I think that might help. If you don’t have access to the journal here’s a preprint from Andrew’s website:
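To see how this happens, here's a toy R sketch with made-up numbers (not your model's output): for an individual posterior draw of beta that fits the data worse than the sample mean of y does, rss exceeds totalss and 1 - rss/totalss goes negative, which violates the <lower=0> constraint on R2. The variance-based Bayesian R^2 from that paper stays in [0, 1] by construction.

y  <- c(1, 2, 3, 4, 5)
mu <- c(5, 4, 3, 2, 1)  # a poor posterior draw of the fitted values

# residual-based R2 can go negative for a bad draw
rss     <- sum((y - mu)^2)         # dot_self(y - mu) in Stan
totalss <- sum((y - mean(y))^2)    # dot_self(y - mean(y)) in Stan
1 - rss / totalss                  # -3 here: outside [0, 1]

# variance-based Bayesian R2 (Gelman et al.) is always in [0, 1]
var_fit <- var(mu)
var_res <- var(y - mu)
var_fit / (var_fit + var_res)      # 0.2 here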

And this warning

is saying that you’ve told it to run for too few warmup iterations for Stan to do its normal recommended warmup process. I don’t think that’s causing this error you’re seeing, but it’s something to be aware of.
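If you want to get rid of that warning, a call along these lines (the filename here is just a placeholder for your own path; iter = 2000 with warmup = 1000 matches rstan's defaults) gives Stan enough warmup iterations for its full adaptation schedule:

fit3 = stan(file = "LinearRegression.stan", data = dat,
            iter = 2000, warmup = 1000, chains = 4, cores = 1)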

@jonah Thank you so much! I'm going to read the paper.
