On the Use of a Log-Rate Model for Survey-Weighted Categorical Data
Authors: Thomas M. Loughin and Christopher R. Bilder
Affiliations: Department of Statistics and Actuarial Science, Simon Fraser University, Burnaby, British Columbia, Canada (tloughin@sfu.ca); Department of Statistics, University of Nebraska-Lincoln, Lincoln, Nebraska, USA
Abstract: For the analysis of survey-weighted categorical data, one recommended method of analysis is a log-rate model. For each cell in a contingency table, the survey weights are averaged across subjects and incorporated into an offset for a loglinear model. Supposedly, one can then proceed with the analysis of unweighted observed cell counts. We provide theoretical and simulation-based evidence to show that the log-rate analysis is not an effective statistical analysis method and should not be used in general. The root of the problem is its failure to properly account for variability in the individual weights within cells of a contingency table. This results in goodness-of-fit tests that have higher-than-nominal error rates and confidence intervals for odds ratios that have lower-than-nominal coverage.
Keywords: Clogg-Eliason; Contingency table; Loglinear model; Offset; Rao-Scott; Survey sampling
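To make the log-rate setup described in the abstract concrete, the following is a minimal sketch, not taken from the paper: a Poisson loglinear model is fit to the unweighted cell counts of a hypothetical two-way table, with the log of the within-cell average survey weight entering as a fixed offset. The data values, variable names, and the choice of an independence model are assumptions made purely for illustration.

```python
# Minimal sketch of a log-rate fit (hypothetical data, independence model).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical 2x2 table: unweighted observed counts and the mean survey
# weight among the subjects falling in each cell.
cells = pd.DataFrame({
    "row":        ["A", "A", "B", "B"],
    "col":        ["X", "Y", "X", "Y"],
    "count":      [120,  80,  95, 105],   # unweighted observed cell counts
    "avg_weight": [1.8, 2.3, 1.5, 2.0],   # average survey weight per cell
})

# Log-rate model: Poisson loglinear model for the counts, with
# log(average weight) held fixed as an offset rather than estimated.
fit = smf.glm(
    "count ~ C(row) + C(col)",
    data=cells,
    family=sm.families.Poisson(),
    offset=np.log(cells["avg_weight"]),
).fit()

print(fit.summary())
```

As the abstract argues, treating the averaged weight as a fixed offset ignores the variability of the individual weights within each cell, which is what leads to inflated goodness-of-fit error rates and under-covering confidence intervals for odds ratios.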