Given the parameters \mu, \sigma, and hu, how does one compute the variance of the hurdle lognormal distribution? Based on this part of the brms GitHub repository, I know the mean of the hurdle lognormal can be computed as

\exp(\mu + \sigma^2 / 2) \cdot (1 - hu)
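That mean formula follows directly from the mixture structure: with probability hu the outcome is exactly zero, and otherwise it is drawn from a lognormal X with parameters \mu and \sigma. A short derivation (my own notation, with Y the hurdle lognormal outcome):

```latex
\begin{aligned}
\mathbb{E}[Y] &= hu \cdot 0 + (1 - hu)\,\mathbb{E}[X] \\
              &= (1 - hu)\exp\!\left(\mu + \sigma^2 / 2\right)
\end{aligned}
```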

Here’s that formula in action:

library(tidyverse)

# define population parameters
mu    <- 1
sigma <- 1
hu    <- 0.25

# number of sample draws
n <- 1e6

# simulate
set.seed(1)
tibble(y = c(rep(0, times = n * hu),
             rlnorm(n = n * (1 - hu), meanlog = mu, sdlog = sigma))) %>%
  # summarize
  summarise(sample_mean     = mean(y),
            mean_by_formula = exp(mu + sigma^2 / 2) * (1 - hu))

Think of the hurdle lognormal as a two-component mixture: a lognormal distribution with weight 1 - hu and a point mass (a delta function) at zero with weight hu. For the derivation of the variance of a mixture of one-dimensional distributions, see Mixture distribution - Wikipedia under the Moments heading.
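Applying that mixture result here: the zero component contributes nothing to the raw moments, so the second raw moment of the outcome Y is (1 - hu) \cdot \mathbb{E}[X^2] = (1 - hu)\exp(2\mu + 2\sigma^2), and the variance follows from \operatorname{Var}(Y) = \mathbb{E}[Y^2] - (\mathbb{E}[Y])^2:

(1 - hu)\exp(2\mu + 2\sigma^2) - (1 - hu)^2 \exp(2\mu + \sigma^2)

A quick simulation check in the same style as the mean demo above (base R only, no tidyverse needed), using my own variable names:

```r
# population parameters, matching the example above
mu    <- 1
sigma <- 1
hu    <- 0.25
n     <- 1e6

# simulate the hurdle lognormal as a mixture of zeros and lognormal draws
set.seed(1)
y <- c(rep(0, times = n * hu),
       rlnorm(n = n * (1 - hu), meanlog = mu, sdlog = sigma))

# variance by the mixture-moment formula
var_by_formula <- (1 - hu) * exp(2 * mu + 2 * sigma^2) -
  (1 - hu)^2 * exp(2 * mu + sigma^2)

var(y)          # sample variance
var_by_formula  # ~ 29.65 for these parameters
```

The sample variance converges more slowly than the sample mean here (the lognormal's fourth moment is large), so expect a somewhat looser agreement than in the mean demo.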