Multiple regression: one sampling statement for all components, or one per component?

Maybe this is a stupid question, but I was wondering whether, in principle, this:

  simplex[N] pi;
  observed[g] ~ normal(a[1,g]*pi[1] + a[2,g]*pi[2] + ..., sigma);

is equivalent to this:

  simplex[N] pi;
  observed[g] ~ normal(a[n,g]*pi[n], sigma); // looped over n; sigma might vary per n

I am trying to work out the difference in what "the model sees" between the two.
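For what it's worth, here is a quick Python sketch (toy numbers, nothing to do with any real data) of the two log likelihoods the sampler would see. Version 1 gives each observation a single mean that is the weighted sum over components; version 2, with the sampling statement looped over n, makes each observation contribute N separate terms, so the two target densities are not the same:

```python
import math

def normal_logpdf(y, mu, sigma):
    # log of the normal density at y with location mu and scale sigma
    return -0.5 * ((y - mu) / sigma) ** 2 - math.log(sigma) - 0.5 * math.log(2 * math.pi)

# toy setup: N = 2 components, G = 3 observations (all values made up)
a = [[0.5, 1.0, 1.5],    # a[n][g]
     [2.0, 0.5, 1.0]]
pi = [0.3, 0.7]          # simplex weights
obs = [1.2, 0.8, 1.1]
sigma = 1.0

# version 1: one statement, mean is the weighted sum over components
ll1 = sum(normal_logpdf(obs[g], sum(a[n][g] * pi[n] for n in range(2)), sigma)
          for g in range(3))

# version 2: one statement per component n (each observation enters N times)
ll2 = sum(normal_logpdf(obs[g], a[n][g] * pi[n], sigma)
          for g in range(3) for n in range(2))

print(ll1, ll2)  # the two log likelihoods differ
```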

I am also not sure how R's linear regression and support vector regression (SVM) implement their cost functions.

Thanks

This helped me better understand the implementation of the error model:

https://stats.stackexchange.com/questions/46151/how-to-derive-the-least-square-estimator-for-multiple-linear-regression
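As a concrete companion to that derivation, here is a minimal Python sketch of the least-squares estimator it arrives at, solving the normal equations (X'X) beta = X'y for a toy one-predictor regression with an intercept (the data are made up):

```python
# simple linear regression via the normal equations (X'X) beta = X'y,
# where X has an intercept column and one predictor column
x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 3.9, 6.2, 7.8]
n = len(x)

# entries of X'X and X'y
sx, sy = sum(x), sum(y)
sxx = sum(xi * xi for xi in x)
sxy = sum(xi * yi for xi, yi in zip(x, y))

# solve the 2x2 system [[n, sx], [sx, sxx]] beta = [sy, sxy] by Cramer's rule
det = n * sxx - sx * sx
intercept = (sxx * sy - sx * sxy) / det
slope = (n * sxy - sx * sy) / det

print(intercept, slope)  # the least-squares fit to the toy data
```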

Least squares is pretty obvious once you realize that the log of the normal density is negative one half the squared, scaled distance from the mean, i.e., -0.5 * ((y - mu)/sigma)^2, plus a constant. You can code a simple regression in Stan on the unit scale as simply

  target += -0.5 * dot_self(y - mu);
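To see numerically why dropping the additive constant is harmless (a quick Python check, not Stan code): on the unit scale, the log density minus -0.5*(y - mu)^2 is the same constant for every (y, mu), so maximizing the likelihood is the same as minimizing the sum of squares:

```python
import math

def normal_logpdf(y, mu, sigma=1.0):
    # log of the normal density at y
    return -0.5 * ((y - mu) / sigma) ** 2 - math.log(sigma) - 0.5 * math.log(2 * math.pi)

# the difference between the log density and -0.5*(y - mu)^2 is constant in y
mu = 0.4
consts = [normal_logpdf(y, mu) + 0.5 * (y - mu) ** 2 for y in (-1.0, 0.0, 2.5)]
print(consts)  # every entry equals -0.5 * log(2 * pi)
```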