If it were just one coin, I guess you're asking: what's the chance of it landing heads given observations y=(y_1,\dots,y_n) where y_i \in \{0,1\}? So we are interested in P(\theta \mid y). We assume y_i \sim bernoulli(\theta) and \theta \sim beta(\alpha,\beta). Wikipedia has a lovely page listing lots of common conjugate priors. You can also derive them yourself. In your case, Beta is a conjugate prior for the Bernoulli, so the posterior has a simple closed form: \theta \mid y \sim beta(\alpha + \sum_{i=1}^n y_i, \beta + n - \sum_{i=1}^n y_i). You don't need any Stan modelling for that. If you want to do it for several coins, I'd suggest just doing it for each coin category separately.
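The conjugate update is just two additions, so it can be sketched in a few lines of Python (function name and the flat Beta(1, 1) default are mine, for illustration):

```python
import numpy as np

def beta_bernoulli_posterior(y, alpha=1.0, beta=1.0):
    """Conjugate update: Beta(alpha, beta) prior + Bernoulli observations y."""
    y = np.asarray(y)
    heads = y.sum()
    # posterior is Beta(alpha + #heads, beta + #tails)
    return alpha + heads, beta + len(y) - heads

# e.g. 7 heads in 10 flips under a flat Beta(1, 1) prior
a_post, b_post = beta_bernoulli_posterior([1, 1, 1, 1, 1, 1, 1, 0, 0, 0])
post_mean = a_post / (a_post + b_post)  # posterior mean = a / (a + b) = 8/12
```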

If, on the other hand, you want a model that estimates \theta from a set of factors, you can turn your categorical factors (weight, size, etc.) into dummy variables (i.e. a 0/1 indicator for every possible value) and then do a standard logistic regression, treating the binary outcome as your response variable and the dummy-coded factors as your predictors; \theta is then the modelled probability that the outcome is 1.
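The dummy-coding step is mechanical; a minimal sketch in plain Python (the helper name is mine):

```python
def dummy_code(values, categories):
    """Turn a categorical column into one 0/1 indicator list per category."""
    return {c: [1 if v == c else 0 for v in values] for c in categories}

weight = ["light", "heavy", "medium", "light"]
dummies = dummy_code(weight, ["light", "medium", "heavy"])
# dummies["light"] -> [1, 0, 0, 1]
# dummies["heavy"] -> [0, 1, 0, 0]
```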

For example, let’s take just weight as a factor, and say it can have 3 values. We’d thus turn weight into 3 binary-valued vectors w_1, w_2, w_3, and let o be the binary outcome. We model o \sim bernoulli\_logit(c + \alpha w_1 + \beta w_2 + \gamma w_3). (Since w_1 + w_2 + w_3 = 1 for every observation, the intercept c is redundant with the three dummies; in practice you’d drop either c or one category as the reference.) The posterior parameters c, \alpha, \beta, \gamma then tell you how to estimate P(o=1 \mid w) = \theta for any given combination.
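Concretely, once you have estimates (or posterior draws) of c, \alpha, \beta, \gamma, you recover \theta by applying the inverse logit to the linear predictor. A sketch with made-up parameter values:

```python
import math

def inv_logit(x):
    """Inverse of the logit link: maps the linear predictor to a probability."""
    return 1.0 / (1.0 + math.exp(-x))

# hypothetical posterior means for c, alpha, beta, gamma (illustration only)
c, alpha, beta, gamma = -0.2, 0.5, 0.1, -0.4

# P(o = 1 | weight = category 1), i.e. w = (1, 0, 0)
theta = inv_logit(c + alpha * 1 + beta * 0 + gamma * 0)  # inv_logit(0.3) ≈ 0.574
```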

E.g. (untested)

```
data {
  int<lower=1> N;
  array[N] int<lower=0, upper=1> o;
  vector<lower=0, upper=1>[N] w1;
  vector<lower=0, upper=1>[N] w2;
  vector<lower=0, upper=1>[N] w3;
}
parameters {
  real c;      // regression coefficients should be unconstrained,
  real alpha;  // not restricted to be positive
  real beta;
  real gamma;
}
model {
  o ~ bernoulli_logit(c + alpha * w1 + beta * w2 + gamma * w3);
}
```
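To fit this, you'd pass the data in the shape the `data` block expects, e.g. a plain dict built from the dummy coding above (category labels here are hypothetical):

```python
weight = ["light", "heavy", "medium", "light", "heavy"]
o = [1, 0, 1, 1, 0]

# one entry per variable declared in the Stan data block
stan_data = {
    "N": len(o),
    "o": o,
    "w1": [1 if w == "light" else 0 for w in weight],
    "w2": [1 if w == "medium" else 0 for w in weight],
    "w3": [1 if w == "heavy" else 0 for w in weight],
}
```

With CmdStanPy, for instance, this dict could be handed to `CmdStanModel(...).sample(data=stan_data)`.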