Stephen Martin provides a good explanation of why a uniform prior over correlation matrices cannot lead to a uniform marginal distribution for any correlation coefficient (unless we’re dealing only with 2 dimensions):
The answer lies in the constraints of the correlation matrix. Correlation matrices must be symmetric and positive semi-definite (PSD). That PSD constraint alters where probability mass can exist, and where it will accumulate marginally. When you only have 2 variables, this is not an issue. But when you have more than 2 variables, one correlation’s value constrains what the other correlations can be, if the PSD constraint is to be met. I.e., you cannot create a correlation matrix, fill the off-diagonals with uniform(-1, 1) values, and expect it to be PSD. As K increases, the chance that a correlation matrix built from uniformly distributed elements is also PSD rapidly drops to near zero.
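We can check this claim with a quick simulation (a sketch; `psd_fraction` is a hypothetical helper name): fill the off-diagonals of a K×K matrix with uniform(-1, 1) draws and see how often the result is actually PSD.

```python
import numpy as np

rng = np.random.default_rng(0)

def psd_fraction(K, n_trials=2000):
    """Fraction of symmetric matrices with unit diagonal and
    uniform(-1, 1) off-diagonals that are positive semi-definite."""
    count = 0
    for _ in range(n_trials):
        R = np.eye(K)
        iu = np.triu_indices(K, k=1)
        R[iu] = rng.uniform(-1, 1, size=len(iu[0]))
        R = R + R.T - np.eye(K)  # symmetrize, keep unit diagonal
        # eigvalsh is for symmetric matrices; PSD iff all eigenvalues >= 0
        if np.linalg.eigvalsh(R).min() >= 0:
            count += 1
    return count / n_trials

for K in (2, 3, 5, 8):
    print(K, psd_fraction(K))
```

For K = 2 every such matrix is PSD (the eigenvalues are 1 ± r), but the acceptance fraction collapses quickly: roughly 0.6 at K = 3, a couple of percent at K = 5, and essentially zero by K = 8.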
…
You may want to know what the marginal prior for each correlation is, given a uniform LKJ prior and K variables. A useful result is provided, as always, by Ben Goodrich. He stated:
In general, when there are K variables, then the marginal distribution of a single correlation is Beta on the (-1,1) interval with both shape parameters equal to K / 2.