Slow sampling when fitting an auction model
Hi all,
I am working on an auction model, and sampling in Stan is very slow. It took my laptop over 19 hours to fit 4 MCMC chains with 200 iterations each. Moreover, all post-warmup transitions exceeded the maximum treedepth. I'm wondering whether the use of an integral in the model dramatically slows down the sampling. If so, how can I improve the performance? If not, what else could be slowing down the sampling?
The auction model I am working on, in brief, models the bid b_{i} of bidder i in terms of the vector of explanatory variables X_i, the number of competitors J_i - 1, and the cost c_{i} through
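In the Stan code below this relationship appears as `(bids - costs) ~ normal(inv_hazards, 1)`; writing \lambda for the total auction hazard returned by `getGenerativeTotalAuctionHazard` (a notation I introduce here for exposition), the bid equation is of the form

```latex
b_i = c_i + \lambda\!\left(c_i;\, X_i,\, J_i\right)^{-1} + \varepsilon_i,
\qquad \varepsilon_i \sim \mbox{normal}(0, 1)
```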
where \mbox{F}(c; X_i^\top\beta, \sigma) is the cdf of the truncated lognormal distribution \mbox{lognormal}(X_i^\top\beta, \sigma^2)\,\mbox{T}[0, \bar{c}], defined by the hyperparameters \beta and \sigma^2 and the upper bound \bar{c}. The cost c_i is hierarchically modeled by the truncated lognormal distribution
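Concretely, the cost statement in the Stan code (`costs[i] ~ lognormal(X[i] * beta, sigma) T[,c_max]`) corresponds to

```latex
c_i \sim \mbox{lognormal}\!\left(X_i^\top\beta,\ \sigma^2\right)\,\mbox{T}[0, \bar{c}]
```

and, since the lognormal has support on (0, \infty), the truncated cdf has the standard closed form (with \Phi the standard normal cdf)

```latex
\mbox{F}(c;\, X_i^\top\beta,\ \sigma)
  = \frac{\Phi\!\left(\frac{\log c - X_i^\top\beta}{\sigma}\right)}
         {\Phi\!\left(\frac{\log \bar{c} - X_i^\top\beta}{\sigma}\right)},
  \qquad 0 < c \le \bar{c}
```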
The target is to obtain posterior inference for the costs c_i of all bidders and the hyperparameters \beta and \sigma.
The following lists the parameters and model blocks of the Stan code. The complete Stan code is in generative_model.stan (3.0 KB). One can reproduce the simulation by running auction_proj.R (1.3 KB), which calls data.R (3.6 KB) to generate the data.
parameters {
  vector[K] beta;
  real<lower=0> sigma;
  vector<lower=0, upper=c_max>[N] costs;
}
model {
  vector[N] inv_hazards;
  // priors
  beta ~ normal(0, 1);
  sigma ~ normal(1, 0.01);
  for (i in 1:N) {
    costs[i] ~ lognormal(X[i] * beta, sigma) T[, c_max]; // dgp cost distribution
  }
  // likelihood
  inv_hazards = getGenerativeTotalAuctionHazard(
    N,
    J,      // num bidders in each auction
    costs,
    X,
    beta,
    sigma,
    c_max,
    x_r
  );
  (bids - costs) ~ normal(inv_hazards, 1);
}
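As a side note, the truncated-lognormal loop can also be written with the truncation normalization made explicit. The sketch below (assuming `X` is declared as `matrix[N, K]`, as the expression `X[i] * beta` suggests) is equivalent to `costs[i] ~ lognormal(X[i] * beta, sigma) T[,c_max]`, since upper truncation just subtracts the log cdf at the truncation point; it also computes `X * beta` once instead of row by row:

```stan
model {
  vector[N] mu = X * beta;       // linear predictor, computed once
  costs ~ lognormal(mu, sigma);  // vectorized lognormal lpdf
  for (i in 1:N) {
    // normalization for the upper truncation at c_max
    target += -lognormal_lcdf(c_max | mu[i], sigma);
  }
  // ... likelihood as above ...
}
```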
Thanks,
Lu Zhang