Please use this identifier to cite or link to this item: https://hdl.handle.net/10419/268486 
Year of Publication: 
2023
Series/Report no.: 
Discussion Paper No. 2023/2
Publisher: 
Freie Universität Berlin, School of Business & Economics, Berlin
Abstract: 
High nonresponse rates have become the rule in survey sampling. Panel surveys suffer additional sample losses through panel attrition, which are generally thought to worsen the bias caused by initial nonresponse. Under certain conditions, however, an initial-wave nonresponse bias may vanish in later panel waves. We study such a "fade away" of an initial nonresponse bias in the context of regression analysis. Using a time-series approach for the covariate and the error terms, we derive the bias of the cross-sectional OLS estimate of the slope coefficient. With no subsequent attrition and only serial correlation, the initial bias converges to zero. If the nonresponse affects permanent components, the initial bias decreases to a limit determined by the size of those permanent components. Attrition is treated here as a worst-case scenario with a steady selective drift in the same direction as in the initial panel wave. It is shown that the fade-away effect dampens the attrition effect to a large extent, depending on the temporal stability of the covariate and the dependent variable. The attrition effect may be further reduced by a weighted regression analysis in which the weights are estimated attrition probabilities based on the lagged dependent variable. The results are discussed with respect to surveys with uncertain selection procedures that are used in a longitudinal fashion, such as access panels.
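The following is a minimal simulation sketch, not the authors' code: it illustrates the setting the abstract describes, namely cross-sectional OLS slope estimates under selective initial nonresponse and selective attrition, together with an inverse-probability-weighted (IPW) re-estimate whose weights come from attrition probabilities modelled on the lagged dependent variable. All numerical choices (the true slope, AR(1) persistence, and the selection and dropout rules) are illustrative assumptions, not values taken from the paper.

```python
# Sketch: fade-away of initial nonresponse bias and IPW correction for attrition.
# Assumed setup: y_t = beta * x_t + e_t with AR(1) covariate and AR(1) error.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, T = 20000, 6
beta, rho_x, rho_e = 1.0, 0.7, 0.7          # true slope and persistence (assumed)

# AR(1) covariate and error, so cross-sectional dependence on wave 1 fades over time
x = np.zeros((n, T)); e = np.zeros((n, T))
x[:, 0] = rng.normal(size=n); e[:, 0] = rng.normal(size=n)
for t in range(1, T):
    x[:, t] = rho_x * x[:, t - 1] + np.sqrt(1 - rho_x**2) * rng.normal(size=n)
    e[:, t] = rho_e * e[:, t - 1] + np.sqrt(1 - rho_e**2) * rng.normal(size=n)
y = beta * x + e

# Initial-wave nonresponse: response probability decreasing in y_1 (selective)
alive = rng.random(n) < 1 / (1 + np.exp(y[:, 0]))

for t in range(T):
    if t > 0:
        # Worst-case attrition: dropout keeps drifting in the same direction,
        # with retention probability depending on the lagged dependent variable
        keep = rng.random(n) < 1 / (1 + np.exp(-2 + 0.5 * y[:, t - 1]))
        alive = alive & keep

    # Unweighted cross-sectional OLS among current respondents
    ols = sm.OLS(y[alive, t], sm.add_constant(x[alive, t])).fit()

    if t > 0:
        # IPW correction: model wave-to-wave retention among last wave's
        # respondents on lagged y, then weight by 1 / estimated retention prob.
        ret = alive[prev_alive].astype(float)
        logit = sm.Logit(ret, sm.add_constant(y[prev_alive, t - 1])).fit(disp=0)
        p = logit.predict(sm.add_constant(y[alive, t - 1]))
        wls = sm.WLS(y[alive, t], sm.add_constant(x[alive, t]),
                     weights=1 / p).fit()
        print(f"wave {t + 1}: OLS slope {ols.params[1]:.3f}, "
              f"IPW slope {wls.params[1]:.3f}")
    else:
        print(f"wave {t + 1}: OLS slope {ols.params[1]:.3f} (true beta = {beta})")

    prev_alive = alive.copy()
```

In this sketch the unweighted slope is most biased in wave 1 and moves back toward the true value in later waves when the selection acts only through serially correlated components, while the IPW regression further reduces the part of the bias that is driven by ongoing selective dropout.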
Subjects: 
Regression Analysis
Nonresponse Bias
Panel Attrition
Inverse Probability Weighting
Document Type: 
Working Paper
