Maximum entropy sampling and optimal Bayesian experimental design
Published in: Journal of the Royal Statistical Society: Series B (Statistical Methodology), 2000, Vol. 62 (1), pp. 145-157
Main Authors: Sebastiani, P.; Wynn, H. P.
Format: Article
Language: English
Summary: When Shannon entropy is used as a criterion in the optimal design of experiments, advantage can be taken of the classical identity representing the joint entropy of parameters and observations as the sum of the marginal entropy of the observations and the preposterior conditional entropy of the parameters. Following previous work in which this idea was used in spatial sampling, the method is applied to standard parameterized Bayesian optimal experimental design. Under suitable conditions, which cover non-linear as well as linear regression models, it is shown in a few steps that maximizing the marginal entropy of the sample is equivalent to minimizing the preposterior entropy, the usual Bayesian criterion, thus avoiding the use of conditional distributions. Using this marginal formulation, it is shown that under normality assumptions every standard model with a two-point prior distribution on the parameters gives an optimal design supported on a single point. Other results include a new asymptotic formula, which applies when the error variance is large, and bounds on the support size.
ISSN: 1369-7412; 1467-9868
DOI: 10.1111/1467-9868.00225
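
The equivalence stated in the summary rests on the standard decomposition of joint entropy. The sketch below is illustrative only: the notation (θ for the parameters, y for the observations, ξ for the design) is not taken from the paper, and it assumes an additive-error model in which the error entropy does not depend on the parameters or the design for a fixed sample size.

```latex
% Minimal sketch of the entropy identity behind the marginal formulation
% (illustrative notation, not the paper's own):
%   theta = parameters, y = observations, xi = design.
\begin{align*}
  H(\theta, y \mid \xi)
    &= H(y \mid \xi) + \mathbb{E}_{y}\bigl[ H(\theta \mid y, \xi) \bigr]
       && \text{marginal entropy of $y$ plus preposterior entropy of $\theta$} \\
    &= H(\theta) + \mathbb{E}_{\theta}\bigl[ H(y \mid \theta, \xi) \bigr]
       && \text{prior entropy of $\theta$ plus expected conditional entropy of $y$.}
\end{align*}
% Assumption: for an additive-error model $y = \eta(\xi, \theta) + \varepsilon$ with a
% fixed sample size, $H(\theta)$ and $\mathbb{E}_{\theta}[H(y \mid \theta, \xi)]$ do not
% vary with the design $\xi$. Under that assumption, maximizing the marginal entropy
% $H(y \mid \xi)$ over designs is equivalent to minimizing the preposterior entropy
% $\mathbb{E}_{y}[H(\theta \mid y, \xi)]$, the usual Bayesian criterion.
```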