Risky Curves
Author: Daniel Friedman
Publisher: Routledge
ISBN: 1317821246
Category : Business & Economics
Languages : en
Pages : 152
Book Description
For several decades, the orthodox economics approach to understanding choice under risk has been to assume that each individual maximizes some sort of personal utility function defined over purchasing power. This volume contends that even the best wisdom from the orthodox theory has not yet been able to do better than supposedly naïve models that use rules of thumb, or that focus on the consumption possibilities and economic constraints facing the individual. The authors support this contention by first revisiting the origins of orthodox theory. They then recount decades of failed attempts to obtain meaningful empirical validation or calibration of the theory: estimated shapes and parameters of the "curves" have varied erratically from domain to domain (e.g., individual choice versus aggregate behavior), from context to context, from one elicitation mechanism to another, and even within the same individual at different times, sometimes just minutes apart. This book proposes a return to a simpler sort of scientific theory of risky choice, one that focuses not upon unobservable curves but upon the potentially observable opportunities and constraints facing decision makers. It argues that such an opportunities-based model offers superior possibilities for scientific advancement. At the very least, linear utility, in the presence of constraints, is a useful bar for the "curved" alternatives to clear.
The Economics of Search
Author: Brian McCall
Publisher: Routledge
ISBN: 1134422350
Category : Business & Economics
Languages : en
Pages : 574
Book Description
The economics of search is a prominent component of economic theory, with a richness and elegance that underpin a host of practical applications. In this book, Brian and John McCall present a comprehensive overview of the economic theory of search, from the classical model of job search formulated 40 years ago to recent developments in equilibrium models of search. The book gives decision-theoretic foundations to seemingly slippery issues in labour market theory, estimation theory, and economic dynamics in general, and surveys the entire field of the economics of search, including its history, theory, and econometric applications. It covers theoretical models of the economics of search as well as the estimation methods used in search theory; topics include job search, turnover, unemployment, liquidity, house selling, real options, and auctions. The mathematical methods used in search theory, such as dynamic programming, are reviewed, as are structural estimation methods and econometric methods for duration models. The authors also explore the classic sequential search model and its extensions, in addition to recent advances in equilibrium search theory.
Estimating Discount Functions with Consumption Choices Over the Lifecycle
Author: David I. Laibson
Publisher:
ISBN:
Category : Consumption (Economics)
Languages : en
Pages : 64
Book Description
Intertemporal preferences are difficult to measure. We estimate time preferences using a structural buffer stock consumption model and the Method of Simulated Moments. The model includes stochastic labor income, liquidity constraints, child and adult dependents, liquid and illiquid assets, revolving credit, retirement, and discount functions that allow short-run and long-run discount rates to differ. Data on retirement wealth accumulation, credit card borrowing, and consumption-income comovement identify the model. Our benchmark estimates imply a 40% short-term annualized discount rate and a 4.3% long-term annualized discount rate. Almost all specifications reject the restriction to a constant discount rate. Our quantitative results are sensitive to assumptions about the return on illiquid assets and the coefficient of relative risk aversion. When we jointly estimate the coefficient of relative risk aversion and the discount function, the short-term discount rate is 15% and the long-term discount rate is 3.8%.
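One standard way to let short-run and long-run discount rates differ, as the abstract describes, is a quasi-hyperbolic (beta-delta) discount function, where D(0) = 1 and D(t) = βδ^t for t ≥ 1. The sketch below uses illustrative parameter values chosen only so that the implied rates roughly match the 40% and 4.3% figures quoted above; they are hypothetical, not the paper's estimated parameters.

```python
import math

def discount(t, beta, delta):
    """Quasi-hyperbolic discount function: D(0) = 1, D(t) = beta * delta**t for t >= 1."""
    return 1.0 if t == 0 else beta * delta ** t

# Illustrative values (hypothetical, not the paper's estimates),
# picked so the implied rates land near 40% short-run / 4.3% long-run.
beta, delta = 0.7, 0.958

# Short-run rate: implied annualized rate between year 0 and year 1.
short_run = -math.log(discount(1, beta, delta))
# Long-run rate: implied annualized rate between two distant years,
# where the beta term cancels and only delta matters.
long_run = -math.log(discount(11, beta, delta) / discount(10, beta, delta))

print(f"short-run annualized discount rate: {short_run:.1%}")
print(f"long-run annualized discount rate: {long_run:.1%}")
```

Because β multiplies every future period equally, it drops out of the ratio D(t+1)/D(t) for t ≥ 1, which is why the long-run rate depends only on δ while the first-year rate also absorbs the one-time β penalty.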
Sufficient Statistics for Welfare Analysis
Author: Raj Chetty
Publisher:
ISBN:
Category : Welfare economics
Languages : en
Pages : 64
Book Description
The debate between "structural" and "reduced-form" approaches has generated substantial controversy in applied economics. This article reviews a recent literature in public economics that combines the advantages of reduced-form strategies -- transparent and credible identification -- with an important advantage of structural models -- the ability to make predictions about counterfactual outcomes and welfare. This recent work has developed formulas for the welfare consequences of various policies that are functions of high-level elasticities rather than deep primitives. These formulas provide theoretical guidance for the measurement of treatment effects using program evaluation methods. I present a general framework that shows how many policy questions can be answered by identifying a small set of sufficient statistics. I use this framework to synthesize the modern literature on taxation, social insurance, and behavioral welfare economics. Finally, I discuss topics in labor economics, industrial organization, and macroeconomics that can be tackled using the sufficient statistic approach.
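The idea of a welfare formula driven by a high-level elasticity rather than deep primitives can be illustrated with the textbook Harberger approximation, a canonical sufficient-statistic example: to second order, the deadweight loss of a small tax depends only on the compensated elasticity, the tax rate, and the size of the base. This sketch is a generic textbook illustration, not code or numbers from the article.

```python
def harberger_dwl(elasticity, tax_rate, tax_base):
    """Textbook second-order Harberger approximation:
    DWL ~ 0.5 * elasticity * tax_rate**2 * tax_base.
    The elasticity is the single 'sufficient statistic' here; no
    underlying preference parameters are needed."""
    return 0.5 * elasticity * tax_rate ** 2 * tax_base

# Hypothetical numbers: a 20% tax on a $100 base with elasticity 0.5.
print(harberger_dwl(0.5, 0.20, 100.0))  # 1.0
```

Two economies with very different preferences but the same elasticity, tax rate, and base yield the same approximate welfare cost, which is exactly the point of the sufficient-statistic approach.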
Monthly Labor Review
Author:
Publisher:
ISBN:
Category : Labor laws and legislation
Languages : en
Pages : 418
Book Description
Publishes in-depth articles on labor subjects, current labor statistics, information about current labor contracts, and book reviews.
On the Desirability of Fiscal Constraints in a Monetary Union
Author: V. V. Chari
Publisher:
ISBN:
Category : Fiscal policy
Languages : en
Pages : 32
Book Description
"The desirability of fiscal constraints in monetary unions depends critically on whether the monetary authority can commit to follow its policies. If it can commit, then debt constraints can only impose costs. If it cannot commit, then fiscal policy has a free-rider problem, and debt constraints may be desirable. This type of free-rider problem is new and arises only because of a time inconsistency problem"--NBER website
How Large are the Classification Errors in the Social Security Disability Award Process?
Author: Hugo Benítez-Silva
Publisher:
ISBN:
Category : Disability evaluation
Languages : en
Pages : 72
Book Description
This paper presents an 'audit' of the multistage application and appeal process that the U.S. Social Security Administration (SSA) uses to determine eligibility for disability benefits from the Disability Insurance (DI) and Supplemental Security Income (SSI) programs. We study a subset of individuals from the Health and Retirement Study (HRS) who applied for DI or SSI benefits between 1992 and 1996. We compare the SSA's ultimate award decision (i.e., after allowing for appeals) to the applicant's self-reported disability status. We use these data to estimate classification error rates under the hypothesis that applicants' self-reported disability status and the SSA's ultimate award decision are noisy but unbiased indicators of a latent 'true disability status' indicator. We find that approximately 20% of SSI/DI applicants who are ultimately awarded benefits are not disabled, and that 60% of applicants who were denied benefits are disabled. Our analysis also yields insights into the patterns of self-selection induced by varying delays and award probabilities at various levels of the application and appeal process. We construct an optimal statistical screening rule, using a subset of the objective health indicators that the SSA uses in making award decisions, that results in significantly lower classification error rates than the SSA's current award process.
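The two error rates in the abstract, the share of awardees who are not disabled and the share of denied applicants who are disabled, can be made concrete with a small simulation: draw a latent disability status, let the award decision be a noisy signal of it, and tabulate the conditional error rates. All parameters below are hypothetical, chosen only to illustrate the definitions; they are not the paper's estimates or its estimation method.

```python
import random

random.seed(0)

# Hypothetical parameters for illustration only (not the paper's values).
p_disabled = 0.5         # share of applicants who are truly disabled
p_award_given_d = 0.7    # P(award | truly disabled)
p_award_given_nd = 0.2   # P(award | not disabled)

awarded = denied = awarded_not_disabled = denied_disabled = 0
for _ in range(100_000):
    disabled = random.random() < p_disabled
    p_award = p_award_given_d if disabled else p_award_given_nd
    if random.random() < p_award:
        awarded += 1
        awarded_not_disabled += not disabled
    else:
        denied += 1
        denied_disabled += disabled

# The two classification error rates defined in the abstract:
print("P(not disabled | awarded) ~", awarded_not_disabled / awarded)
print("P(disabled | denied)      ~", denied_disabled / denied)
```

With these made-up parameters, Bayes' rule gives P(not disabled | awarded) = 0.5·0.2 / 0.45 ≈ 0.22 and P(disabled | denied) = 0.5·0.3 / 0.55 ≈ 0.27, and the simulated frequencies converge to those values.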
The Great Depression and the Friedman-Schwartz Hypothesis
Author: Lawrence J. Christiano
Publisher:
ISBN:
Category : Depressions
Languages : en
Pages : 112
Book Description
We evaluate the Friedman-Schwartz hypothesis that a more accommodative monetary policy could have greatly reduced the severity of the Great Depression. To do this, we first estimate a dynamic, general equilibrium model using data from the 1920s and 1930s. Although the model includes eight shocks, the story it tells about the Great Depression turns out to be a simple and familiar one. The contraction phase was primarily a consequence of a shock that induced a shift away from privately intermediated liabilities, such as demand deposits and liabilities that resemble equity, and towards currency. The slowness of the recovery from the Depression was due to a shock that increased the market power of workers. We identify a monetary base rule which responds only to the money demand shocks in the model. We solve the model with this counterfactual monetary policy rule. We then simulate the dynamic response of this model to all the estimated shocks. Based on the model analysis, we conclude that if the counterfactual policy rule had been in place in the 1930s, the Great Depression would have been relatively mild.