# Estimating random effects in nonlinear model

Hello!

I am trying to model my behavioral data using brms. I know from the literature that a model that describes them well should be a hyperbola:

ip = 1 / (1 + k * delay)

The parameter k is the parameter of interest. It takes only positive values, usually less than 1. I would like to estimate it simultaneously at the population level and for each individual using a multilevel model. Unfortunately, the individual-level estimates I get are essentially meaningless, even on simulated data. Below is the code I use for simulation and model fitting.

```r
library(brms)

# generating data frame: 10 subjects, 6 delays each
sim = data.frame(matrix(NA, 60, 4))
sim[, 1] = as.character(rep(1:10, each = 6))
sim[, 2] = rep(c(1, 6, 12, 24, 60, 120), 10)
sim[, 4] = round(rep(c(0.001, 0.0015, 0.005, 0.01, 0.015,
                       0.05, 0.1, 0.15, 0.2, 0.3), each = 6), 3)
sim[, 3] = 1 / (1 + sim[, 4] * sim[, 2])
colnames(sim) = c("id", "delay", "ip_value", "true_k")

# fitting model
check_fit = brm(
  formula = bf(ip_value ~ 1 / (1 + k * delay), k ~ 1 + (1 | id), nl = TRUE),
  data = sim, family = "gaussian", chains = 2, iter = 6000,
  prior = set_prior("lognormal(0.01, 2)", nlpar = "k", lb = 0),
  save_pars = save_pars(all = TRUE)
)
ranef(check_fit)
```

I suspect I am defining the model incorrectly, but from the materials available online I can't tell how to do it right. I would greatly appreciate your help or a pointer to relevant materials.

• Operating System: macOS Big Sur 11.4
• brms Version: 2.15

Since `k` is bounded below at zero but you want the person-level deviations to be normally distributed, it might make more sense to model `k` on the log scale instead: define a nonlinear parameter `logk`, give it Gaussian varying effects, and exponentiate it inside the formula so the implied `k = exp(logk)` stays positive.

```r
bf(ip_value ~ 1 / (1 + exp(logk) * delay), logk ~ 1 + (1 | id), nl = TRUE)
```
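A minimal sketch of the full refit under that parameterization, using the simulated `sim` data from above. The `normal(-4, 2)` prior on `logk` is an assumption (roughly centered on the simulated k values on the log scale, since log(0.02) ≈ -3.9), not a recommendation from the original thread:

```r
# refit on the log scale: logk is unbounded, so Gaussian
# person-level deviations (1 | id) are a natural fit for it
check_fit_log = brm(
  bf(ip_value ~ 1 / (1 + exp(logk) * delay),
     logk ~ 1 + (1 | id), nl = TRUE),
  data = sim, family = "gaussian", chains = 2, iter = 6000,
  # assumed prior: wide normal on the log scale (no lb needed,
  # because exp() already keeps the implied k positive)
  prior = set_prior("normal(-4, 2)", nlpar = "logk"),
  save_pars = save_pars(all = TRUE)
)

# person-level estimates are population intercept + deviation,
# still on the log scale; exponentiate to recover each k
coef(check_fit_log)$id
```

Note that the recovered per-person values of `exp(logk)` can then be compared directly against the `true_k` column of the simulated data.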