I do not completely understand what is going on, and I am just a novice user, but I did a few Stan runs and I think I have some hints about what is happening.

With X and Y spanning [1, 10] and b_m = 2 (chosen so I can do the math in my head), the value of *ratio* in the model

`ratio[i] = X[i] / (Y[i] + b_m) + 1 / (Y[i]/b_m + 1);`

will be in the range [3/12, 12/3]. (Note that the expression simplifies algebraically to (X[i] + b_m) / (Y[i] + b_m), which makes the ranges easy to see.) If the fitting process tries a negative b_m value, say -2, the situation is a little more complicated. When X and Y are well above 2, say in the range [3, 10], *ratio* will span [1/8, 8/1]. Those values are certainly not the same as when b_m = 2, but they are not wildly different. When Y is near 2, however, *ratio* blows up because Y + b_m goes to zero. In that case, the only way to increase the calculated log probability is to increase sigma so that those few wild values have some probability density. If N is large, the calculated log probability will be a very strong function of b_m when it is negative, preferring values as far from any Y value as possible. The lp__ surface will be very peaky, and a chain will tend to stay near where it starts.
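Here is a small Python sketch (mine, not part of your model; the X, Y, and b_m values are just illustrative) showing the blow-up numerically as Y approaches 2 with b_m = -2:

```python
import numpy as np

def ratio(X, Y, b_m):
    # Same expression as in the Stan model; algebraically (X + b_m) / (Y + b_m)
    return X / (Y + b_m) + 1.0 / (Y / b_m + 1.0)

Y = np.array([3.0, 5.0, 2.1, 2.01])
X = np.full_like(Y, 5.0)

good = ratio(X, Y, 2.0)    # b_m = 2: all values stay in a modest range
bad = ratio(X, Y, -2.0)    # b_m = -2: values explode as Y approaches 2

print(np.round(good, 3))   # [1.4   1.    1.707 1.746]
print(np.round(bad, 3))    # [  3.   1.  30. 300.]
```

The last two entries of `bad` are the kind of wild *ratio* values that force sigma to grow so the likelihood can cover them.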

I think another factor in the observed behavior is that Stan randomly initializes parameters uniformly in [-2, 2] on the unconstrained scale unless explicit initial values are set. With multiple chains and large N, it is fairly likely that at least one chain will start in the bad negative b_m range and get stuck. I do not think the prior specification modifies the initialization behavior, but I may be wrong about that. If I am right, setting a prior on b_m will not help as long as b_m values below -1 are supported, because a chain can still start in the region where it can get stuck due to Y values where Y + b_m goes to zero.
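To put a rough number on that (assuming the default uniform(-2, 2) initialization, and that b_m is unconstrained so the unconstrained and constrained scales coincide):

```python
# Under uniform(-2, 2) initialization, the chance a chain's initial b_m
# lands below -1 is the length of [-2, -1) over the length of [-2, 2]
p_bad = 1.0 / 4.0

# With 4 independently initialized chains, the chance at least one
# starts in the bad region is the complement of all 4 avoiding it
n_chains = 4
p_at_least_one = 1.0 - (1.0 - p_bad) ** n_chains

print(round(p_at_least_one, 3))  # 0.684
```

So with the default four chains, getting at least one stuck chain is more likely than not, which matches what the runs below show.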

Here is some edited output from a run with your original model and N = 200. You can see that chain 3 finds b_m near -2 with a very small sd. Its lp__ is MUCH worse than the other chains, but it is stuck. Also, sigma is very large for that chain because, for some points, the calculated ratio values are miles away from the actual r values.

```
, , chains = chain:1
parameter         mean          sd
b_m          2.2119536 0.059135560
sigma        0.1014174 0.005317003
lp__       356.2567261 1.074719669

, , chains = chain:2
parameter         mean          sd
b_m          2.2106468 0.056079842
sigma        0.1014685 0.005187229
lp__       356.3442797 1.073179447

, , chains = chain:3
parameter         mean          sd
b_m         -2.000805  0.000829513
sigma       36.134660  1.901796925
lp__      -813.771634  0.963752625

, , chains = chain:4
parameter         mean          sd
b_m          2.2097303 0.055092151
sigma        0.1009864 0.005210844
lp__       356.3395068 0.990030658
```

Here are results from another run with your `normal(2.0, 0.5)` prior placed on b_m. Now two chains get stuck at negative values. Chain 1 below is stuck in a place very close to chain 3 in the previous result.

```
, , chains = chain:1
parameter         mean           sd
b_m         -2.000752 0.0008670722
sigma       36.086717 1.7148653649
lp__      -845.746086 1.0490657236

, , chains = chain:2
parameter         mean           sd
b_m         -1.816427  0.002699981
sigma       11.106288  0.537259635
lp__      -607.233117  0.865268460

, , chains = chain:3
parameter         mean           sd
b_m          2.2053075 0.053667885
sigma        0.1011639 0.005219636
lp__       356.2724433 1.004095468

, , chains = chain:4
parameter         mean           sd
b_m          2.209487  0.056054877
sigma        0.101199  0.005196984
lp__       356.240830  1.026173929
```

The solution, as you have observed, is to not allow negative b_m values, or at least to constrain them above -1 (e.g. declare `real<lower=-1> b_m;`) so that Y + b_m cannot reach zero for any Y in the data. It would still be a good idea to have priors on all parameters.

I hope none of that is too far off base.