PyStan with pickle only saves first 100 values?


I am using Python, and I am saving my results as the model together with the fit, as follows:

import os, sys
import pystan
import pickle

# get the model
stan_model = pystan.StanModel(file=model_path)
# fit it
fit_temp = stan_model.sampling(data=data_bandit_temp, 
                        chains=4, iter=2000, warmup=1000, thin=1, init='random', verbose=True, 
                        control = {"adapt_delta":0.95, "stepsize":1, "max_treedepth":10}, n_jobs=-1)

# save both results with the model
with open(save_path, "wb") as whata:   #Pickling
    pickle.dump({"model" : stan_model, "outputs" : fit_temp}, whata)
    print('Saved results to ', save_path)

The problem is that when I load it back, the output looks truncated in the same way as when you just run fit_temp in a cell:

# load the output and the model
with open(load_path, "rb") as fp:   # Unpickling
    results_load = pickle.load(fp)

and I get the following regardless of whether I use print() or just the bare variable, as here (run in Jupyter):

WARNING:pystan:Truncated summary with the 'fit.__repr__' method. For the full summary use 'print(fit)'

{'model': <pystan.model.StanModel at 0x19d06596b00>,
 Warning: Shown data is truncated to 100 parameters
 For the full summary use 'print(fit)'
 Inference for Stan model: anon_model_6678bc9cc0772a1eb46d99989a774102.
 4 chains, each with iter=2000; warmup=1000; thin=1; 
 post-warmup draws per chain=1000, total post-warmup draws=4000.

What is going on? Are only the first 100 values saved? How can I save the full output?

I’m on Win10:

  • numpy 1.16.2,
  • pystan
  • pickle unknown
  • sys 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)]


You need to unpack that dictionary.

Printing a dict calls __repr__ on each of its values.
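Nothing is lost in the pickle; the truncation is purely a display artifact of __repr__. A minimal sketch with a stand-in class (hypothetical `Fit`, standing in for a PyStan fit whose __repr__ truncates its summary):

```python
import pickle

class Fit:
    """Stand-in for a PyStan fit: __repr__ truncates, the data does not."""
    def __init__(self, values):
        self.values = values
    def __repr__(self):
        # Only shows the first 3 values, like PyStan's 100-parameter cap.
        return f"<Fit: first 3 of {len(self.values)} values: {self.values[:3]}>"

fit = Fit(list(range(1000)))

# Pickle a dict holding the fit, exactly as in the question.
blob = pickle.dumps({"model": "stan_model", "outputs": fit})
loaded = pickle.loads(blob)

# Printing the dict calls __repr__ on each value -> truncated display.
print(loaded)

# But the full data survived the round trip.
fit_back = loaded["outputs"]
print(len(fit_back.values))  # all 1000 values are still there
```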

So how do I do that? When I tried to look up a value that I was positive was there, I got a KeyError.

fit = results_load["outputs"]

Ah, I see the problem… my stupidity, I didn’t realise what the dict looks like… Thanks very much @ahartikainen!
