I am trying to calculate the mean and standard deviation of the area of a polygon of which I have sample (x,y) coordinates of the polygon’s coordinates (but I don’t know the real (x,y) coordinates of the polygon’s corners).
This is my model, but I am not sure I am doing this write (it compiles and runs, but I think I am not getting values that make sense.
data {
int<lower=0> N; // Number of data points
int<lower=0> C;
matrix[C, N] x; // x coordinates
matrix[C, N] y; // y coordinates
}
parameters {
real<lower=0> mean_a;
real<lower=0> sd_a;
}
model {
real area = 0.5 * abs(sum(x[1:N-1,] .* y[2:N,] - y[1:N-1,] .* x[2:N,]));
// Prior for mean_a and sd_a
mean_a ~ normal(0, 50);
sd_a ~ cauchy(0, 20);
// Likelihood
target += normal_lpdf(area | mean_a, sd_a);
}
and this is an example data set:
{
"N": 3,
"C": 4,
"x": [
[1, 2, 1],
[4, 4.5, 4.7],
[2, 1.5, 1.5],
[3.8, 4.2, 4]
],
"y": [
[1, 2, 1],
[4, 4.5, 4.7],
[2, 1.5, 1.5],
[3.8, 4.2, 4]
]
}
chatGPT helped with the model specification but it doesn’t look right…