I am super new to this, so apologies in advance for a very basic question. I am fitting a basic multiple regression model using the following code.
## data {
## // Define variables in data
## // Number of observations (an integer)
## int<lower=0> N;
## // Number of parameters
## int<lower=0> p;
## // Variables
## real wt82_71[N];
## int<lower=0> qsmk[N];
## int<lower=0> sex[N];
## real<lower=0> age[N];
## int<lower=0> race[N];
## real<lower=0> smokeyrs[N];
## }
##
## parameters {
## // Define parameters to estimate
## real beta[p];
##
## // standard deviation (a positive real number)
## real<lower=0> sigma;
## }
##
## transformed parameters {
## // Mean
## real mu[N];
## for (i in 1:N) {
## mu[i] <- beta[1] + beta[2]*qsmk[i] + beta[3]*sex[i] + beta[4]*age[i] + beta[5]*race[i] + beta[6]*smokeyrs[i];
## }
## }
##
## model {
## // Prior part of Bayesian inference (flat if unspecified)
##
## // Likelihood part of Bayesian inference
## wt82_71 ~ normal(mu, sigma);
## }
I have the following output.
## Bayesian
print(resStan, pars = c("beta","sigma"))
## Inference for Stan model: stan_code.
## 3 chains, each with iter=3000; warmup=500; thin=10;
## post-warmup draws per chain=250, total post-warmup draws=750.
##
## mean se_mean sd 2.5% 25% 50% 75% 97.5% n_eff Rhat
## beta[1] 9.93 0.03 0.81 8.35 9.37 9.91 10.52 11.46 709 1
## beta[2] 2.95 0.02 0.42 2.12 2.67 2.94 3.24 3.76 750 1
## beta[3] -0.11 0.01 0.38 -0.84 -0.37 -0.11 0.13 0.65 742 1
## beta[4] -0.22 0.00 0.03 -0.28 -0.24 -0.22 -0.19 -0.16 625 1
## beta[5] 0.07 0.02 0.55 -1.03 -0.25 0.05 0.40 1.27 732 1
## beta[6] 0.06 0.00 0.03 0.00 0.04 0.06 0.08 0.13 750 1
## sigma 7.48 0.00 0.13 7.24 7.39 7.48 7.57 7.76 750 1
##
## Samples were drawn using NUTS(diag_e) at Mon Feb 23 17:50:03 2015.
## For each parameter, n_eff is a crude measure of effective sample size,
## and Rhat is the potential scale reduction factor on split chains (at
## convergence, Rhat=1).
It gives me my intercept as beta[1]. How can I update this code so the parameters are labeled beta0, beta1, beta2, etc., instead of beta[1], beta[2], beta[3], etc.?
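In case it helps, here is a sketch of what I was imagining: declaring each coefficient as its own named scalar rather than as elements of the `beta` array (I made up the names `beta0`...`beta5` myself, and I'm not sure this is the idiomatic way to do it):

```stan
parameters {
  // One named scalar per coefficient instead of real beta[p]
  real beta0;            // intercept
  real beta1;            // coefficient on qsmk
  real beta2;            // coefficient on sex
  real beta3;            // coefficient on age
  real beta4;            // coefficient on race
  real beta5;            // coefficient on smokeyrs

  // standard deviation (a positive real number)
  real<lower=0> sigma;
}

transformed parameters {
  // Mean, written in terms of the named scalars
  real mu[N];
  for (i in 1:N) {
    mu[i] <- beta0 + beta1*qsmk[i] + beta2*sex[i]
             + beta3*age[i] + beta4*race[i] + beta5*smokeyrs[i];
  }
}
```

I assume I would then also need to list the new names explicitly when printing, e.g. `print(resStan, pars = c("beta0", "beta1", "beta2", "beta3", "beta4", "beta5", "sigma"))` — but I don't know whether there is a cleaner way. Is something like this the right approach?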
Thanks!