Minimum Distance Estimation of Dynamic Models with Errors-in-Variables
Author: Nikolay Gospodinov
Publisher:
ISBN:
Category :
Languages : en
Pages : 37
Book Description
Empirical analysis often involves using inexact measures of desired predictors. The bias created by the correlation between the problematic regressors and the error term motivates the need for instrumental variables estimation. This paper considers a class of estimators that can be used when external instruments may not be available or are weak. The idea is to exploit the relation between the parameters of the model and the least squares biases. In cases when this mapping is not analytically tractable, a special algorithm is designed to simulate the latent predictors without completely specifying the processes that induce the biases. The estimators perform well in simulations of the autoregressive distributed lag model and the dynamic panel model. The methodology is used to re-examine the Phillips curve, in which the real activity gap is latent.
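The bias-inversion idea (choosing parameters so that the least squares bias they imply matches the bias observed in the data) can be sketched for a noisy AR(1), where measurement error attenuates the OLS coefficient. The sketch below is a minimal, hypothetical simulation-based illustration assuming the measurement-noise variance is known; it is not the paper's estimator:

```python
import numpy as np

def simulate(rho, sigma_u, T, rng):
    """Latent AR(1) x_t = rho*x_{t-1} + e_t, observed with noise u_t."""
    x = np.zeros(T)
    e = rng.standard_normal(T)
    for t in range(1, T):
        x[t] = rho * x[t - 1] + e[t]
    return x + sigma_u * rng.standard_normal(T)

def ols_ar1(y):
    """OLS slope of y_t on y_{t-1}; attenuated when y is measured with error."""
    return (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])

def md_estimate(y, sigma_u, grid, n_sim=20, seed=0):
    """Pick the rho whose implied (simulated) biased OLS coefficient
    best matches the biased OLS coefficient computed from the data."""
    target = ols_ar1(y)
    T = len(y)
    best, best_gap = None, np.inf
    for rho in grid:
        rng = np.random.default_rng(seed)  # common random numbers across candidates
        mean_ols = np.mean([ols_ar1(simulate(rho, sigma_u, T, rng))
                            for _ in range(n_sim)])
        if abs(mean_ols - target) < best_gap:
            best, best_gap = rho, abs(mean_ols - target)
    return best
```

For data generated with rho = 0.8 and sigma_u = 0.5, the OLS coefficient concentrates near 0.73 (attenuation), while the bias-matching estimate recovers a value near 0.8 without any external instrument.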
Minimum Distance Estimation of the Errors-In-Variables Model Using Linear Cumulant Equations
Author: Timothy Erickson
Publisher:
ISBN:
Category :
Languages : en
Pages : 31
Book Description
We consider an errors-in-variables model with multiple mismeasured regressors. We develop closed-form minimum distance estimators from any number of estimating equations, which are linear in the third and higher cumulants of the observable variables. Using the cumulant estimators alters qualitative inference relative to ordinary least squares in two applications related to investment and leverage regressions. The estimators perform well in Monte Carlo simulations calibrated to resemble the data from our applications. Although the cumulant estimators are asymptotically equivalent to the moment estimators of Erickson and Whited (2002), the finite-sample performance of the cumulant estimators exceeds that of the moment estimators.
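The cumulant idea can be illustrated in its simplest single-regressor form: with a skewed latent regressor and independent measurement error, a ratio of two third-order moments identifies the slope (a Geary-type estimator). The sketch below is a hypothetical illustration of that special case, not the authors' multi-regressor estimator:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
chi = rng.exponential(1.0, n)            # skewed latent regressor (third cumulant = 2)
x = chi + rng.normal(0.0, 0.7, n)        # observable, contaminated with measurement error
y = 2.0 * chi + rng.normal(0.0, 1.0, n)  # true slope beta = 2

xd, yd = x - x.mean(), y - y.mean()
b_ols = (xd @ yd) / (xd @ xd)            # attenuated toward zero
# Third-moment ratio: E[y~^2 x~] = beta^2 * k3(chi) and E[y~ x~^2] = beta * k3(chi),
# so the ratio identifies beta whenever k3(chi) is nonzero.
b_cum = (yd**2 @ xd) / (yd @ xd**2)
```

Identification here relies on the latent regressor being skewed; with a symmetric latent regressor the third cumulant vanishes and higher cumulants are needed.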
Minimum Distance Measurement Errors Model Fitting
Author: Weixing Song
Publisher:
ISBN:
Category : Error analysis (Mathematics)
Languages : en
Pages : 212
Book Description
Minimum Distance Estimation of Nonstationary Time Series Models
Author: Hyungsik Roger Moon
Publisher:
ISBN:
Category :
Languages : en
Pages : 0
Book Description
This paper establishes the consistency and limit distribution of minimum distance (MD) estimators for time series models with deterministic or stochastic trends. We consider models that are linear in the variables but involve nonlinear restrictions across parameters. Two complications arise. First, the unrestricted and restricted parameter spaces have to be rotated to separate fast-converging components of the MD estimator from slowly converging ones. Second, if the model includes stochastic trends, it is desirable to use a random matrix to weight the discrepancy between the unrestricted and restricted parameter estimates. In this case, the objective function of the MD estimator has a stochastic limit. We provide regularity conditions for the nonlinear restriction function that are easier to verify than the stochastic equicontinuity conditions that typically arise from direct estimation of the restricted parameters. We derive the optimal weight matrix when the limit distribution of the unrestricted estimator is mixed normal and propose a goodness-of-fit test based on over-identifying restrictions. To illustrate MD estimation, we analyze a permanent-income model based on a linear-quadratic dynamic programming problem and a present-value model.
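A generic minimum distance step of this kind, choosing the restricted parameter to minimize a weighted discrepancy from the unrestricted estimates, can be sketched as follows. The numbers and the restriction g(beta) = (beta, beta^2) are hypothetical, chosen only to give one over-identifying restriction; the weight matrix is the inverse of an (assumed known) covariance:

```python
import numpy as np

# Unrestricted estimates of two coefficients and their (assumed known) covariance;
# the model restricts theta = g(beta) = (beta, beta**2), a nonlinear
# cross-parameter restriction with one over-identifying restriction.
theta_hat = np.array([0.52, 0.25])
V = np.array([[0.04, 0.01],
              [0.01, 0.09]])
W = np.linalg.inv(V)                     # weight by inverse covariance

def g(beta):
    return np.array([beta, beta**2])

def Q(beta):
    d = theta_hat - g(beta)
    return d @ W @ d                     # weighted discrepancy

grid = np.linspace(-1.0, 1.0, 20001)
beta_md = grid[np.argmin([Q(b) for b in grid])]
J = Q(beta_md)                           # goodness-of-fit (J) statistic
```

The estimate lands between the two unrestricted candidates (0.52 and sqrt(0.25) = 0.5), weighted by their precision, and the minimized objective J serves as an over-identification test statistic.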
Minimum Distance Procedures in Nonlinear Random Coefficient Models
Estimation of Dynamic Models with Error Components
Author: Theodore Wilbur Anderson
Publisher:
ISBN:
Category : Econometrics
Languages : en
Pages : 52
Book Description
Errors-in-Variables Methods in System Identification
Author: Torsten Söderström
Publisher: Springer
ISBN: 3319750011
Category : Technology & Engineering
Languages : en
Pages : 495
Book Description
This book presents an overview of the different errors-in-variables (EIV) methods that can be used for system identification. Readers will explore the properties of an EIV problem. Such problems play an important role when the purpose is to determine the physical laws that describe the process, rather than to predict or control its future behaviour. EIV problems typically occur when the purpose of the modelling is to gain physical insight into a process. Identifiability of the model parameters in EIV problems is a non-trivial issue, and sufficient conditions for identifiability are given. The author covers various modelling aspects that, taken together, lead to a solution, including the characterization of noise properties, the extension to multivariable systems, and continuous-time models. The solutions presented consist of methods that are compatible with a given set of noisy data, something that traditional approaches such as (total) least squares do not provide. A number of identification methods for the EIV problem are presented. Each method is accompanied by a detailed analysis based on statistical theory, and the relationships between the different methods are explained. The methods covered include: instrumental variables methods; methods based on bias compensation; covariance matching methods; and prediction error and maximum-likelihood methods. The book shows how many of the methods can be applied in either the time or the frequency domain and provides special methods adapted to the case of periodic excitation. It concludes with a chapter specifically devoted to practical aspects and user perspectives that will facilitate the transfer of the theoretical material to applications in real systems.
Errors-in-Variables Methods in System Identification gives readers the possibility of recovering true system dynamics from noisy measurements, while solving over-determined systems of equations, making it suitable for statisticians and mathematicians alike. The book also acts as a reference for researchers and computer engineers because of its detailed exploration of EIV problems.
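One of the simplest methods in the bias-compensation family subtracts the known input-noise contribution from the least squares normal equations. The sketch below illustrates this for a hypothetical static single-input gain, assuming the input-noise variance is known; it is a toy instance of the idea, not a method taken from the book:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
x0 = rng.normal(0.0, 1.0, n)             # noise-free input
x = x0 + rng.normal(0.0, 0.5, n)         # measured input; input-noise variance 0.25
y = 1.5 * x0 + rng.normal(0.0, 0.3, n)   # noisy output, true gain 1.5

var_u = 0.25                             # assumed known input-noise variance
b_ls = (x @ y) / (x @ x)                 # ordinary LS: biased toward zero
b_bcls = (x @ y) / (x @ x - n * var_u)   # bias-compensated LS
```

Ordinary least squares converges to the attenuated value 1.5/(1 + 0.25) = 1.2 here, while removing the noise variance from the denominator restores consistency; in practice the noise variance must itself be estimated, which is where the harder parts of the EIV literature begin.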
Moments of Least-squares Estimates in the Errors-in-variables Model
Author: Philippe Jules Bonan
Publisher:
ISBN:
Category :
Languages : en
Pages : 170
Book Description
Encyclopedia of Statistical Sciences, Volume 3
Author:
Publisher: John Wiley & Sons
ISBN: 0471743844
Category : Mathematics
Languages : en
Pages : 706
Book Description
Encyclopedia of Statistical Sciences.
Minimum Distance Estimation in an Additive Effects Outliers Model
Author: Sunil Kumar Dhar
Publisher:
ISBN:
Category : Estimation theory
Languages : en
Pages : 168
Book Description