Journal of Econometrics

Published by Elsevier BV

Print ISSN: 0304-4076

Articles


Wealth Accumulation and Factors Accounting for Success

March 2011 · 424 Reads

Anan Pawasutipaisit · Robert M Townsend

We use detailed income, balance sheet, and cash flow statements constructed for households in a long monthly panel in an emerging market economy, and some recent contributions in economic theory, to document and better understand the factors underlying success in achieving upward mobility in the distribution of net worth. Wealth inequality is decreasing over time, and many households work their way out of poverty and lower wealth over the seven-year period. The accounts establish that, mechanically, this is largely due to savings rather than incoming gifts and remittances. In turn, the growth of net worth can be decomposed household by household into the savings rate and how productively those savings are used, the return on assets (ROA). The latter plays the larger role. ROA is, in turn, positively correlated with higher education of household members, younger age of the head, and with a higher debt/asset ratio and lower initial wealth, so it seems from cross-sections that the financial system is imperfectly channeling resources to productive and poor households. Household fixed effects account for the larger part of ROA, and this success is largely persistent, undercutting the story that successful entrepreneurs are those who simply get lucky. Persistence does vary across households, and in at least one province with much change and increasing opportunities, ROA changes as households move over time to higher-return occupations. But for those households with high and persistent ROA, the savings rate is higher, consistent with some micro-founded macro models with imperfect credit markets. Indeed, high-ROA households save by investing in their own enterprises and adopt consistent financial strategies for smoothing fluctuations. More generally, growth of wealth and savings levels and/or rates are correlated with TFP and with the household fixed effects that are the larger part of ROA.
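
The decomposition the abstract describes can be written out explicitly. The following is a sketch of one natural accounting identity consistent with the text, not necessarily the paper's exact formulation: with net worth W, savings S, net income Y, and total assets A,

    \frac{\Delta W_t}{W_t} = \frac{S_t}{W_t}
      = \underbrace{\frac{S_t}{Y_t}}_{\text{savings rate}} \times
        \underbrace{\frac{Y_t}{A_t}}_{\text{ROA}} \times \frac{A_t}{W_t},

so that, holding the leverage term A_t/W_t fixed, wealth growth is the product of how much of its income a household saves and how productively its assets generate that income.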

Discrete Factor Approximations in Simultaneous Equation Models: Estimating the Impact of a Dummy Endogenous Variable on a Continuous Outcome

November 1999 · 113 Reads

This paper contains a Monte Carlo evaluation of estimators used to control for endogeneity of dummy explanatory variables in continuous outcome regression models. When the true model has bivariate normal disturbances, estimators using discrete factor approximations compare favorably to efficient estimators in terms of precision and bias; these approximation estimators dominate all the other estimators examined when the disturbances are non-normal. The experiments also indicate that one should liberally add points of support to the discrete factor distribution. The paper concludes with an application of the discrete factor approximation to the estimation of the impact of marriage on wages.
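
As a hedged illustration of the approximation being evaluated (a sketch in generic notation, not the paper's exact specification): in a model with continuous outcome y, dummy endogenous variable d, and a common unobserved factor eta inducing the endogeneity, the discrete factor method replaces the bivariate normal disturbance with a finite mixture over K points of support:

    y_i = x_i'\beta + \gamma d_i + \lambda \eta_i + u_i, \qquad
    d_i = \mathbf{1}\{ z_i'\alpha + \eta_i + v_i > 0 \},

    L_i(\theta) = \sum_{k=1}^{K} \pi_k \, f(y_i \mid d_i, x_i, \eta = \mu_k)\,
                  \Pr(d_i \mid z_i, \eta = \mu_k),

with support points \mu_k, weights \pi_k summing to one, and loading \lambda estimated jointly with the other parameters. The advice to "liberally add points of support" corresponds to increasing K.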

On Spatial Processes and Asymptotic Inference under Near-Epoch Dependence

September 2012 · 123 Reads

The development of a general inferential theory for nonlinear models with cross-sectionally or spatially dependent data has been hampered by a lack of appropriate limit theorems. To facilitate a general asymptotic inference theory relevant to economic applications, this paper first extends the notion of near-epoch dependent (NED) processes used in the time series literature to random fields. The class of processes that is NED on, say, an α-mixing process, is shown to be closed under infinite transformations, and thus accommodates models with spatial dynamics. This would generally not be the case for the smaller class of α-mixing processes. The paper then derives a central limit theorem and law of large numbers for NED random fields. These limit theorems allow for fairly general forms of heterogeneity including asymptotically unbounded moments, and accommodate arrays of random fields on unevenly spaced lattices. The limit theorems are employed to establish consistency and asymptotic normality of GMM estimators. These results provide a basis for inference in a wide range of models with spatial dependence.
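
For readers new to the concept, the time series definition of near-epoch dependence, transplanted to random fields roughly as the paper does (a sketch; the paper's exact conditions differ in detail): a field {X_i} is L_2-NED on a base field {\varepsilon_i} if

    \| X_i - \mathbb{E}[\, X_i \mid \mathcal{F}_i(s) \,] \|_2 \le d_i\, \psi(s),
    \qquad \psi(s) \to 0 \text{ as } s \to \infty,

where \mathcal{F}_i(s) is the \sigma-field generated by the base field at locations within distance s of i and the d_i are scaling constants. The content is that X_i is well approximated by local functions of the base field even when it depends on the entire lattice, which is what makes the class closed under the infinite transformations arising from spatial dynamics.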

Specification and Estimation of Spatial Autoregressive Models with Autoregressive and Heteroskedastic Disturbances

July 2010 · 367 Reads

This study develops a methodology of inference for a widely used Cliff-Ord type spatial model containing spatial lags in the dependent variable, exogenous variables, and the disturbance terms, while allowing for unknown heteroskedasticity in the innovations. We first generalize the GMM estimator suggested in Kelejian and Prucha (1998, 1999) for the spatial autoregressive parameter in the disturbance process. We also define IV estimators for the regression parameters of the model and give results concerning the joint asymptotic distribution of those estimators and the GMM estimator. Much of the theory is kept general to cover a wide range of settings.
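
The Cliff-Ord model in question has the standard SARAR(1,1) form (notation here is generic, not copied from the paper):

    y_n = \lambda W_n y_n + X_n \beta + u_n, \qquad
    u_n = \rho M_n u_n + \varepsilon_n,

where W_n and M_n are known spatial weights matrices and the innovations \varepsilon_n are independent with heteroskedasticity of unknown form, i.e. E[\varepsilon_n \varepsilon_n'] is diagonal but not proportional to the identity. The GMM estimator for \rho is built from quadratic moment conditions in the disturbances, and the spatial lag W_n y_n is instrumented with, for example, linearly independent columns of (X_n, W_n X_n, W_n^2 X_n).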

Wages, Welfare Benefits and Migration

May 2010 · 88 Reads

Differences in economic opportunities give rise to strong migration incentives, across regions within countries, and across countries. In this paper we focus on responses to differences in welfare benefits across States. We apply the model developed in Kennan and Walker (2008), which emphasizes that migration decisions are often reversed, and that many alternative locations must be considered. We model individual decisions to migrate as a job search problem. A worker starts the life-cycle in some home location and must determine the optimal sequence of moves before settling down. The model is sparsely parameterized. We estimate the model using data from the National Longitudinal Survey of Youth (1979). Our main finding is that income differences do help explain the migration decisions of young welfare-eligible women, but large differences in benefit levels provide surprisingly weak migration incentives.
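
A sketch of the location choice problem described, in generic dynamic programming notation (the estimated specification has a richer state than shown here): with x recording the current location and wage or benefit draws, and j indexing candidate locations,

    v(x) = \max_j \Big\{ u(x, j) - c(\ell(x), j) + \beta\, \mathbb{E}\big[ v(x') \mid x, j \big] \Big\},

where u(x, j) is flow income or utility in location j, c(\ell(x), j) is the cost of moving there from the current location \ell(x), and the expectation is over next period's draws. Reversed moves arise naturally: a worker may return to a previously visited location when its draw turns out to be better.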

Comparing IV with Structural Models: What Simple IV Can and Cannot Identify

May 2010 · 68 Reads

This paper compares the economic questions addressed by instrumental variables estimators with those addressed by structural approaches. We discuss Marschak's Maxim: estimators should be selected on the basis of their ability to answer well-posed economic problems with minimal assumptions. A key identifying assumption that allows structural methods to be more informative than IV can be tested with data and does not have to be imposed.

Testing the Correlated Random Coefficient Model

October 2010 · 97 Reads

The recent literature on instrumental variables (IV) features models in which agents sort into treatment status on the basis of gains from treatment as well as on baseline-pretreatment levels. Components of the gains known to the agents and acted on by them may not be known by the observing economist. Such models are called correlated random coefficient models. Sorting on unobserved components of gains complicates the interpretation of what IV estimates. This paper examines testable implications of the hypothesis that agents do not sort into treatment based on gains. In it, we develop new tests to gauge the empirical relevance of the correlated random coefficient model to examine whether the additional complications associated with it are required. We examine the power of the proposed tests. We derive a new representation of the variance of the instrumental variable estimator for the correlated random coefficient model. We apply the methods in this paper to the prototypical empirical problem of estimating the return to schooling and find evidence of sorting into schooling based on unobserved components of gains.
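
A minimal statement of the model under discussion (a sketch in standard notation, not the paper's exact system): with outcome Y, binary treatment D, and person-specific gain \beta_i,

    Y_i = \alpha + \beta_i D_i + \varepsilon_i, \qquad \beta_i = \bar{\beta} + \eta_i,

where \eta_i is the idiosyncratic component of the gain. Sorting on gains means that D_i is correlated with \eta_i; the hypothesis being tested is, roughly, that this correlation is absent, in which case IV identifies the common coefficient \bar{\beta}.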

A random coefficient probit model with an application to a study of migration

February 1979 · 62 Reads

This paper extends the ordered response polytomous probit model to a possibly more realistic framework in which the coefficients are allowed to be random. The new method is then used to analyze family migration behavior. It is seen that the new method allows us to make inferences that were not possible in the fixed coefficient model.
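
To see what randomness in the coefficients changes, here is a hedged binary-outcome sketch (the paper's ordered polytomous case generalizes this): if y_i^* = x_i'\beta_i + \varepsilon_i with \beta_i \sim N(\bar{\beta}, \Sigma) independent of \varepsilon_i \sim N(0, 1), then

    \Pr(y_i = 1 \mid x_i) = \Phi\!\left( \frac{x_i'\bar{\beta}}{\sqrt{1 + x_i'\Sigma x_i}} \right),

so the effective scale of the index varies with x_i. It is this extra structure that supports inferences about heterogeneous responses which the fixed-coefficient probit cannot deliver.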

Conception intervals and the substitution of fertility over time

May 1985 · 19 Reads

This paper applies the waiting-time regression methods of Olsen and Wolpin (1983) to an analysis of fertility. A utility-maximizing model is set up and used to provide some guidance for an empirical analysis. The data are from an experimental guaranteed-job program, the Youth Incentive Entitlement Pilot Project, aimed at young women 16 to 20 years old, from poverty-level families, and not yet high school graduates. The waiting-time regression method of estimation permits each youth to be used as her own control, revealing how eligibility for the jobs program changes the durations of periods between live-birth conceptions. Of the women surveyed, 3890 had 1 birth, 429 had 2, 112 had 3, 26 had 4, and 7 had 5. Without the person-specific control described here, the most important factors affecting fertility are the number of siblings (a negative effect), labor market attachment of the parents, especially the father, and the presence of the natural father. With the person-specific control, the results predicted by economic theory do emerge: even adolescent and young women consider the economic consequences of fertility, with effects against fertility when wages are high and in favor of fertility when wages are lower. Post-program effects (taking place after youths lose eligibility for the program) show a rather rapid making up of foregone fertility, reducing the likelihood of net reductions in total fertility.

Family size and social utility: income distribution dominance criteria

October 1989 · 95 Reads

"This paper generalizes previous results on income distribution dominance in the case where the population of income recipients is broken down into groups with distinct utility functions. The example taken here is that of income redistribution across families of different sizes. The paper first investigates the simplest assumptions that can be made about family utility functions. A simple dominance criterion is then derived under the only assumptions that family functions are increasing and concave with income and the marginal utility of income increases with family size."

National Estimates of Gross Employment and Job Flows from the Quarterly Workforce Indicators with Demographic and Industry Detail

March 2011 · 61 Reads

The Quarterly Workforce Indicators (QWI) are local labor market data produced and released every quarter by the United States Census Bureau. Unlike any other local labor market series produced in the U.S. or the rest of the world, the QWI measure employment flows for workers (accessions and separations), jobs (creations and destructions), and earnings for demographic subgroups (age and gender), industry (NAICS industry groups), detailed geography (block (experimental), county, Core-Based Statistical Area, and Workforce Investment Area), and ownership (private, all) with fully interacted publication tables. The current QWI data cover 47 states, about 98% of the private workforce in those states, and about 92% of all private employment in the entire economy. State participation is sufficiently extensive to permit us to present the first national estimates constructed from these data. We focus on worker, job, and excess (churning) reallocation rates, rather than on levels of the basic variables. This permits comparison to existing series from the Job Openings and Labor Turnover Survey and the Business Employment Dynamics series from the Bureau of Labor Statistics (BLS). The national estimates from the QWI are an important enhancement to existing series because they include demographic and industry detail for both worker and job flow data compiled from underlying micro-data that have been integrated at the job and establishment levels by the Longitudinal Employer-Household Dynamics Program at the Census Bureau. The estimates presented herein were compiled exclusively from public-use data series and are available for download.
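
The reallocation rates referred to have, in one commonly used normalization (a sketch; the official QWI documentation's exact denominators may differ), the form

    WR = \frac{A + S}{E}, \qquad JR = \frac{JC + JD}{E}, \qquad CR = WR - JR,

with accessions A, separations S, job creations JC, job destructions JD, and employment E in the quarter. WR is the worker reallocation rate, JR the job reallocation rate, and CR the excess (churning) reallocation rate: worker flows over and above what job creation and destruction alone would require.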

Limit Theory for Panel Data Models with Cross Sectional Dependence and Sequential Exogeneity

June 2013 · 84 Reads

The paper derives a general Central Limit Theorem (CLT) and asymptotic distributions for sample moments related to panel data models with large n. The results allow for the data to be cross-sectionally dependent, while at the same time allowing the regressors to be only sequentially rather than strictly exogenous. The setup is sufficiently general to accommodate situations where cross-sectional dependence stems from spatial interactions and/or from the presence of common factors. The latter leads to the need for random norming. The limit theorem for sample moments is derived by showing that the moment conditions can be recast such that a martingale difference array central limit theorem can be applied. We prove such a central limit theorem by first extending results for stable convergence in Hall and Heyde (1980) to non-nested martingale arrays relevant for our applications. We illustrate our result by establishing a generalized estimation theory for GMM estimators of a fixed effect panel model without imposing i.i.d. or strict exogeneity conditions. We also discuss a class of Maximum Likelihood (ML) estimators that can be analyzed using our CLT.
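
For concreteness, a sketch of the distinction driving the paper, in generic panel notation: strict exogeneity requires E[\varepsilon_{it} \mid x_{i1}, \ldots, x_{iT}] = 0 for all t, whereas sequential exogeneity requires only

    \mathbb{E}\left[ \varepsilon_{it} \mid x_{i1}, \ldots, x_{it} \right] = 0,

so regressors may respond to past shocks, with lagged dependent variables as the leading example. Stacking the implied moment conditions in the time direction yields the martingale difference array structure that the CLT exploits.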

Simultaneous Equations for Hazards: Marriage Duration and Fertility Timing

April 1993 · 94 Reads

"This paper develops an approach to simultaneity among hazard equations which is similar in spirit to simultaneous Tobit models. It introduces a class of continuous time models which incorporates two forms of simultaneity across related processes--when the hazard rate of one process depends (1) on the hazard rate of another process or (2) on the actual current state of or prior outcomes of a related multi-episode process. This paper also develops an approach to modeling the notion of 'multiple clocks' in which one process may depend on the duration of a related process, in addition to its own. Maximum likelihood estimation is proposed based on specific parametric assumptions. The model is developed in the context of and empirically applied to the joint determination of marital duration and timing of marital conceptions."

Nonparametric model validations for hidden Markov models with applications in financial econometrics

June 2011 · 48 Reads

We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for the transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable to continuous-time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.
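
The functional connection invoked can be sketched as follows (an illustration under textbook HMM assumptions, not the paper's exact construction): if the observable Y_t is conditionally independent of everything else given the hidden state X_t, with emission density g and hidden transition kernel q, then

    p(y_{t+1} \mid y_t) = \int\!\!\int g(y_{t+1} \mid x_{t+1})\, q(x_{t+1} \mid x_t)\,
                          p(x_t \mid y_t)\, dx_{t+1}\, dx_t,

so the observable transition density aggregates the hidden Markov kernel, and a confidence envelope for the left-hand side carries information about a parametric specification of q.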

[Figures: distributions of estimated break fractions in the no break, one break, and two break models.]
Inference regarding multiple structural changes in linear models with endogenous regressors
Full-text available

October 2012 · 90 Reads

This paper considers the linear model with endogenous regressors and multiple changes in the parameters at unknown times. It is shown that minimization of a Generalized Method of Moments criterion yields inconsistent estimators of the break fractions, but minimization of the Two Stage Least Squares (2SLS) criterion yields consistent estimators of these parameters. We develop a methodology for estimation and inference of the parameters of the model based on 2SLS. The analysis covers the cases where the reduced form is either stable or unstable. The methodology is illustrated via an application to the New Keynesian Phillips Curve for the US.
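
A sketch of the estimator analyzed (generic notation): for candidate break fractions \tau = (\tau_1, \ldots, \tau_m), estimate the structural parameters by 2SLS separately on each implied regime and choose

    \hat{\tau} = \arg\min_{\tau} \sum_{j=1}^{m+1} SSR_j^{2SLS}(\tau),

the fractions minimizing the sum of second-stage residual sums of squares across regimes. The paper's point is that this criterion yields consistent break fraction estimators, while the analogous GMM criterion does not.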

Stratified partial likelihood estimation

November 1999 · 63 Reads

When multiple durations are generated by a single unit, they may be related in a way that is not fully captured by the regressors. The omitted unit-specific variables might vary over the durations. They might also be correlated with the variables in the regression component. The authors propose an estimator that responds to these concerns and develop a specification test for detecting unobserved unit-specific effects. Data from Malaysia reveal that concentration of child mortality in some families is imperfectly explained by observed explanatory variables, and that failure to control for unobserved heterogeneity seriously biases the parameter estimates.
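
For reference, the stratified partial likelihood has the familiar Cox form with risk sets restricted to the unit (a sketch in standard notation, treating each unit i as its own stratum):

    L(\beta) = \prod_i \prod_j \frac{\exp(x_{ij}'\beta)}{\sum_{k \in R_i(t_{ij})} \exp(x_{ik}'\beta)},

where j indexes failures within unit i and R_i(t) is the set of unit-i durations still at risk at time t. Because each factor compares durations only within the same unit, any unit-specific multiplicative effect on the hazard cancels from the ratio, which is what delivers robustness to the omitted unit-specific variables.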

[Figure: distribution of break fraction estimators for a change from I(0) to I(1), T = 100.]
Ratio-based estimators for a change point in persistence

November 2012 · 60 Reads

We study estimation of the date of change in persistence, from I(0) to I(1) or vice versa. Contrary to statements in the original papers, our analytical results establish that the ratio-based break point estimators of Kim [Kim, J.Y., 2000. Detection of change in persistence of a linear time series. Journal of Econometrics 95, 97-116], Kim et al. [Kim, J.Y., Belaire-Franch, J., Badillo Amador, R., 2002. Corrigendum to "Detection of change in persistence of a linear time series". Journal of Econometrics 109, 389-392] and Busetti and Taylor [Busetti, F., Taylor, A.M.R., 2004. Tests of stationarity against a change in persistence. Journal of Econometrics 123, 33-66] are inconsistent when a mean (or other deterministic component) is estimated for the process. In such cases, the estimators converge to random variables with upper bound given by the true break date when persistence changes from I(0) to I(1). A Monte Carlo study confirms the large sample downward bias and also finds substantial biases in moderate sized samples, partly due to properties at the end points of the search interval.
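
One common form of the ratio statistic underlying these estimators (a hedged sketch; the cited papers differ in details such as the treatment of deterministic components): with estimated residuals \hat{\varepsilon}_t and candidate break fraction \tau,

    \Lambda_T(\tau) =
      \frac{ (T - \lfloor \tau T \rfloor)^{-2} \sum_{t = \lfloor \tau T \rfloor + 1}^{T}
             \big( \sum_{s = \lfloor \tau T \rfloor + 1}^{t} \hat{\varepsilon}_s \big)^2 }
           { \lfloor \tau T \rfloor^{-2} \sum_{t = 1}^{\lfloor \tau T \rfloor}
             \big( \sum_{s = 1}^{t} \hat{\varepsilon}_s \big)^2 },

with the break date estimated by maximizing \Lambda_T(\tau) over a search interval for a change from I(0) to I(1). The inconsistency result above concerns precisely the effect of estimating the mean when forming the residuals.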

Inference in a Synchronization Game with Social Interactions

January 2009 · 33 Reads

This paper studies inference in a continuous time game where an agent's decision to quit an activity depends on the participation of other players. In equilibrium, similar actions can be explained not only by direct influences but also by correlated factors. Our model can be seen as a simultaneous duration model with multiple decision makers and interdependent durations. We study the problem of determining the existence and uniqueness of equilibrium stopping strategies in this setting. This paper provides results and conditions for the detection of these endogenous effects. First, we show that the presence of such effects is a necessary and sufficient condition for simultaneous exits. This allows us to set up a nonparametric test for the presence of such influences which is robust to multiple equilibria. Second, we provide conditions under which parameters in the game are identified. Finally, we apply the model to data on desertion in the Union Army during the American Civil War and find evidence of endogenous influences.

Monitoring mortality

October 1983 · 20 Reads

"A state-space model is developed which provides estimates of decrements in a dynamic environment. The model integrates the actual unfolding experience and a priori or Bayesian views of the rates. The estimates of present rates and predicted future rates are continually updated and associated standard errors have simple expressions. The model is described and applied in the context of mortality estimation but it should prove useful in other actuarial applications. The approach is particularly suitable for dynamic environments where data are scarce and updated parameter estimates are required on a regular basis. To illustrate the method it is used to monitor the unfolding mortality experience of the retired lives under an actual pension plan."

Nonparametric Transfer Function Models

July 2010 · 100 Reads

In this paper a class of nonparametric transfer function models is proposed to model nonlinear relationships between 'input' and 'output' time series. The transfer function is smooth with unknown functional forms, and the noise is assumed to be a stationary autoregressive-moving average (ARMA) process. The nonparametric transfer function is estimated jointly with the ARMA parameters. By modeling the correlation in the noise, the transfer function can be estimated more efficiently. The parsimonious ARMA structure improves the estimation efficiency in finite samples. The asymptotic properties of the estimators are investigated. The finite-sample properties are illustrated through simulations and one empirical example.
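
The model class, sketched in generic notation:

    y_t = f(x_t) + \eta_t, \qquad \phi(B)\, \eta_t = \theta(B)\, \varepsilon_t,

where f is the smooth transfer function of unknown form (possibly depending on lags of the input as well), B is the backshift operator, and \phi, \theta are the AR and MA polynomials of the stationary ARMA noise. Estimating f jointly with the ARMA parameters exploits the noise correlation, which is the source of the efficiency gain the abstract notes.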


Impact factors

September 2005 · 106 Reads

In this paper we discuss sensitivity of forecasts with respect to the information set considered in prediction; a sensitivity measure called the impact factor, IF, is defined. This notion is specialized to the case of VAR processes integrated of order 0, 1, and 2. For stationary VARs this measure corresponds to the sum of the impulse response coefficients. For integrated VAR systems, the IF has a direct interpretation in terms of long-run forecasts. Various applications of this concept are reviewed; they include questions of policy effectiveness and of forecast uncertainty due to data revisions. A unified approach to inference on the IF is given, showing under what circumstances standard asymptotic inference can also be conducted in systems integrated of order 1 and 2. It is shown how the results reported here can be used to calculate similar sensitivity measures for models with a simultaneity structure.
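
In the stationary case the definition reduces to a familiar object (a sketch; the paper's general definition also covers the I(1) and I(2) cases): for a VAR with moving average representation y_t = \sum_j \Psi_j \varepsilon_{t-j}, the impact factor of a shock is the cumulated response

    IF = \sum_{j=0}^{\infty} \Psi_j = \Psi(1),

the sum of the impulse response coefficients, which measures how a change in the conditioning information shifts long-horizon forecasts.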



