Large discrepancies between stan_glm and standard glm

Hi all,

I’m having a strange issue with my model that I’m hoping someone here can help me troubleshoot. I’m putting together my first Bayesian models on a study I just finished, and since they’re my first models, I’m running a “frequentist” model alongside them to make sure the coefficient estimates are roughly the same. I’m using a weakly informative prior and not trying to push the results in the slightest, yet the results of the two models are dramatically different. I don’t get any errors or warnings, so I can’t figure out why.

Code is here:

library(rstanarm)

Priors_MEmodel <- student_t(df = 5, location = c(0, 0, 0), scale = c(2, 2, 2), autoscale = FALSE)


Main_EffectsModel <- stan_glm(Accept_Reject ~ Discount + Floor,
                              family = binomial(link = "logit"),
                              data = sonadata_clean,
                              prior = Priors_MEmodel,
                              #prior_intercept = normal(),
                              prior_PD = TRUE,
                              algorithm = "sampling",
                              #mean_PPD = TRUE,
                              #adapt_delta = 0.95,
                              #QR = FALSE,
                              #sparse = FALSE,
                              chains = 3, iter = 5000, cores = 3)

library(jmv)
frequentist_model <- logRegBin(data = sonadata_clean, dep = Accept_Reject,
                               covs = NULL,
                               factors = vars(Floor, Discount),
                               blocks = list(list("Discount", "Floor")),
                               refLevels = list(list(var = "Accept_Reject", ref = "0"),
                                                list(var = "Discount", ref = "0"),
                                                list(var = "Floor", ref = "0")),
                               pseudoR2 = c("r2n"),
                               ci = TRUE,
                               OR = TRUE)

library(bayestestR)
describe_posterior(Main_EffectsModel)

frequentist_model

The results of the two models are attached in a picture. If anyone has any suggestions on how to diagnose and fix this, that would be amazing. Nothing I’ve tried so far (enabling/disabling various stan_glm settings, adjusting the prior) brings the Bayesian estimates closer to the frequentist ones.

I think you are sampling from the prior here.

Try prior_PD = FALSE.
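
With prior_PD = TRUE, rstanarm draws from the prior predictive distribution instead of conditioning on the outcome, so your data never enter the estimates at all. A minimal sketch of what the corrected call might look like (same model and arguments as yours, with only prior_PD changed; prior_summary() is just a quick way to double-check which priors were actually applied):

Main_EffectsModel <- stan_glm(Accept_Reject ~ Discount + Floor,
                              family = binomial(link = "logit"),
                              data = sonadata_clean,
                              prior = Priors_MEmodel,
                              prior_PD = FALSE,  # condition on the data, not just the prior
                              chains = 3, iter = 5000, cores = 3)

prior_summary(Main_EffectsModel)  # confirm the priors rstanarm actually used

Once the posterior is conditioned on the data, the estimates should land much closer to the frequentist fit under a weakly informative prior.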

Thanks for catching that! That totally fixed everything.