# Categorical mixture model?

I’m trying to implement a model for a categorical response variable in which the predictor for one of the responses can take either of two states, depending on the value of another variable in the data. Specifically, my data are responses from a memory-based choice experiment, and the model I’m after is one in which a participant either remembers a piece of information or does not. Remembering it affects their relative preferences for the different responses by changing the strength of the signal that one response provides, and the probability that they remember it depends on the other variable I mentioned.

Initially I thought the easiest way to go about this would be to use brms’ mixture functionality, but this appears not to work with the categorical family. My next thought was that I could incorporate the mixture directly into the model formula itself with a non-linear formula, by writing something like `nlf(a ~ (x >= y)*z + (x < y)*w)`. Here, the linear predictor for `a` (one of the responses) is `z` if `x` (estimated) is greater than or equal to `y` (a variable in the data), and `w` otherwise. However, this doesn’t quite capture what I want either: the value should be `z` in the first case, and, in the second case, `z` with probability `x/y` or `0` with probability `1 - x/y`.
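To make that concrete, here is a small sketch of the generative process I have in mind for the signal that response `a` receives (in Python rather than brms syntax, with `x`, `y`, and `z` playing the same roles as above):

```python
import random

def signal_for_a(x, y, z):
    """Signal contributed to response a under the intended model.

    x: latent memory-strength parameter (estimated)
    y: threshold variable observed in the data
    z: signal strength when the information is remembered
    """
    if x >= y:
        # Above the threshold, the information is remembered for certain.
        return z
    # Below it, the information is remembered with probability x / y,
    # and contributes no signal otherwise.
    return z if random.random() < x / y else 0.0
```

The `(x >= y)*z + (x < y)*w` formula can express the two branches, but not the coin flip inside the second branch.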

My next thought was that perhaps a mixture of this sort would be equivalent to weighting `z` by `x/y` when x < y, but some simple simulations quickly put paid to that idea.
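To illustrate why, here is a minimal example (in Python rather than R, with hypothetical values for the signal `z` and the remembering probability `x/y`): because the softmax/logit choice rule is non-linear, a mixture of two choice probabilities is not the same as the choice probability implied by the weighted signal.

```python
import math

def softmax_prob(u_a, u_b=0.0):
    """P(choose a) in a two-option softmax (logit) choice."""
    return math.exp(u_a) / (math.exp(u_a) + math.exp(u_b))

z = 2.0  # signal strength when remembered (hypothetical value)
p = 0.5  # P(remember) = x / y (hypothetical value)

# True mixture: the signal is z with probability p and 0 otherwise,
# so the choice probability is a mixture of two softmax probabilities.
p_mixture = p * softmax_prob(z) + (1 - p) * softmax_prob(0.0)

# Candidate shortcut: weight the signal itself by p before the softmax.
p_weighted = softmax_prob(p * z)

print(p_mixture, p_weighted)  # ~0.690 vs ~0.731
```

With these numbers the two choice probabilities differ by about 0.04, so weighting the signal is not equivalent to mixing on the probability scale.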

At this point, I’m wondering whether there is any way I can make the model in brms. Does anyone have any bright ideas for me?