Analyze as randomized—Why dropping block effects in designed experiments is a bad idea
Published in: Agronomy Journal, 2024-05, Vol. 116(3), pp. 1371-1381
Format: Article
Language: English
Summary: Agricultural experiments are often laid out as blocked designs such as the randomized complete block design (RCBD) or the split-plot design (SPD). Statistical analysis should follow the principle "analyze as randomized." However, block effects are often not modeled, or are dropped from the model when non-significant. Additionally, if linear mixed models are fitted by REML with a non-negativity constraint on all variances, an implicit model reduction occurs when the variance estimate for random block effects becomes negative in the final iteration. This study investigates the consequences of these types of model reduction on the Type I error rate and the standard error of a treatment difference (s.e.d.) by Monte Carlo simulation for experiments designed as RCBD or SPD. The number of blocks and treatments and the ratio of block or main-plot error variance to residual error variance were varied, resulting in 27 scenarios. Dropping the block effect by default deflated Type I error rates and increased the s.e.d. in the RCBD. For the SPD, Type I error rates were inflated for the sub-plot factor and the main-plot-by-sub-plot interaction, but deflated for the main-plot factor. Adverse effects of model reduction were reduced, but did not vanish completely, when model reduction was based on significance testing. Implicit model reduction led to inflated Type I error rates for small variance ratios and small datasets. We conclude that block effects as well as main-plot error effects must always be included in the statistical analysis.
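To make the explicit model-reduction scenario concrete, here is a minimal Monte Carlo sketch in the spirit of the study (not the authors' code; the block count, treatment count, variance ratio, and 5% nominal level are illustrative assumptions). It simulates null RCBD data and compares the Type I error rate of the treatment F-test with the block term modeled versus dropped.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def rcbd_treatment_pvalue(y, include_blocks=True):
    """F-test p-value for treatments in an RCBD; y has shape (blocks, treatments)."""
    b, t = y.shape
    grand = y.mean()
    ss_trt = b * ((y.mean(axis=0) - grand) ** 2).sum()
    ss_blk = t * ((y.mean(axis=1) - grand) ** 2).sum()
    ss_tot = ((y - grand) ** 2).sum()
    if include_blocks:
        ss_err = ss_tot - ss_trt - ss_blk   # blocks modeled
        df_err = (b - 1) * (t - 1)
    else:
        ss_err = ss_tot - ss_trt            # block variation pooled into error
        df_err = b * t - t
    f = (ss_trt / (t - 1)) / (ss_err / df_err)
    return stats.f.sf(f, t - 1, df_err)

def type1_rate(n_blocks=4, n_trts=5, var_ratio=1.0, n_sim=10_000, include_blocks=True):
    """Share of nominal 5% tests rejecting when no treatment effect exists."""
    rejections = 0
    for _ in range(n_sim):
        block_fx = rng.normal(0.0, np.sqrt(var_ratio), size=(n_blocks, 1))
        y = block_fx + rng.normal(0.0, 1.0, size=(n_blocks, n_trts))  # null: no treatment effect
        rejections += rcbd_treatment_pvalue(y, include_blocks) < 0.05
    return rejections / n_sim

print("blocks modeled :", type1_rate(include_blocks=True))    # close to 0.05
print("blocks dropped :", type1_rate(include_blocks=False))   # deflated below 0.05
```

With the block term dropped, the error mean square absorbs the block variance, so the F statistic shrinks and the empirical rejection rate falls below the nominal level, consistent with the deflation the summary reports for the RCBD.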
Core Ideas
Do not drop block effects or main-plot error effects in the analysis of designed experiments.
Dropping the block or main-plot error effect in blocked experiments biases Type I error rates.
Block effects can be dropped implicitly in a REML-based analysis if the true block variance is relatively small (see the sketch after this list).
Dropping block effects only when they are non-significant avoids most adverse effects on Type I error rates.
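A sketch of the implicit reduction, assuming statsmodels' MixedLM (the dataset, seed, and variance values are hypothetical, not taken from the study): with a true block variance far below the residual variance, the REML estimate of the block variance can hit the zero boundary imposed by the non-negativity constraint, which silently removes the block effect from the fitted model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

# Small RCBD with a true block variance far below the residual variance,
# the regime in which REML variance estimates often hit the zero boundary.
n_blocks, n_trts = 4, 5
block = np.repeat(np.arange(n_blocks), n_trts)
trt = np.tile(np.arange(n_trts), n_blocks)
y = rng.normal(0.0, 0.1, n_blocks)[block] + rng.normal(0.0, 1.0, n_blocks * n_trts)
df = pd.DataFrame({"y": y, "trt": trt, "block": block})

# Random-intercept model for blocks, fitted by REML (variances constrained >= 0).
fit = smf.mixedlm("y ~ C(trt)", df, groups="block").fit(reml=True)
block_var = float(fit.cov_re.iloc[0, 0])
print(f"REML block variance estimate: {block_var:.4f}")
# A (near-)zero estimate means the block effect has been dropped implicitly,
# even though the randomization restricted plots within blocks.
```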
ISSN: 0002-1962, 1435-0645
DOI: 10.1002/agj2.21570