
Why Some Dynamic Stochastic General Equilibrium Models Fail

by Keynesian Technology, December 8th, 2024

Too Long; Didn't Read

This section examines the implications of Theorems 3 and 5, discussing the fragility of DSGE models under specific assumptions. While compactness and stochastic equilibrium are central to the models' validity, failures in non-linear systems highlight significant limitations, such as exploding objective functions and indeterminacy. Insights from statistical physics and theoretical computer science offer alternative perspectives, emphasizing the need for plurality in economic modeling and cautious interpretation of DSGE predictions.

Author:

(1) David Staines.

Abstract

1 Introduction

2 Mathematical Arguments

3 Outline and Preview

4 Calvo Framework and 4.1 Household’s Problem

4.2 Preferences

4.3 Household Equilibrium Conditions

4.4 Price-Setting Problem

4.5 Nominal Equilibrium Conditions

4.6 Real Equilibrium Conditions and 4.7 Shocks

4.8 Recursive Equilibrium

5 Existing Solutions

5.1 Singular Phillips Curve

5.2 Persistence and Policy Puzzles

5.3 Two Comparison Models

5.4 Lucas Critique

6 Stochastic Equilibrium and 6.1 Ergodic Theory and Random Dynamical Systems

6.2 Equilibrium Construction

6.3 Literature Comparison

6.4 Equilibrium Analysis

7 General Linearized Phillips Curve

7.1 Slope Coefficients

7.2 Error Coefficients

8 Existence Results and 8.1 Main Results

8.2 Key Proofs

8.3 Discussion

9 Bifurcation Analysis

9.1 Analytic Aspects

9.2 Algebraic Aspects (I) Singularities and Covers

9.3 Algebraic Aspects (II) Homology

9.4 Algebraic Aspects (III) Schemes

9.5 Wider Economic Interpretations

10 Econometric and Theoretical Implications and 10.1 Identification and Trade-offs

10.2 Econometric Duality

10.3 Coefficient Properties

10.4 Microeconomic Interpretation

11 Policy Rule

12 Conclusions and References


Appendices

A Proof of Theorem 2 and A.1 Proof of Part (i)

A.2 Behaviour of ∆

A.3 Proof Part (iii)

B Proofs from Section 4 and B.1 Individual Product Demand (4.2)

B.2 Flexible Price Equilibrium and ZINSS (4.4)

B.3 Price Dispersion (4.5)

B.4 Cost Minimization (4.6) and (10.4)

B.5 Consolidation (4.8)

C Proofs from Section 5, and C.1 Puzzles, Policy and Persistence

C.2 Extending No Persistence

D Stochastic Equilibrium and D.1 Non-Stochastic Equilibrium

D.2 Profits and Long-Run Growth

E Slopes and Eigenvalues and E.1 Slope Coefficients

E.2 Linearized DSGE Solution

E.3 Eigenvalue Conditions

E.4 Rouche’s Theorem Conditions

F Abstract Algebra and F.1 Homology Groups

F.2 Basic Categories

F.3 De Rham Cohomology

F.4 Marginal Costs and Inflation

G Further Keynesian Models and G.1 Taylor Pricing

G.2 Calvo Wage Phillips Curve

G.3 Unconventional Policy Settings

H Empirical Robustness and H.1 Parameter Selection

H.2 Phillips Curve

I Additional Evidence and I.1 Other Structural Parameters

I.2 Lucas Critique

I.3 Trend Inflation Volatility

8.3 Discussion

This section begins by defending the only real assumption in the theorems; subsequent paragraphs proceed on the premise that it has been met.


The only economic restriction, that the patient limit β → 1 dominates, is very weak. It rules out non-ergodic solutions and is a requirement for standard econometric analysis. The discussion surrounding Corollary 2, in Section 6.4, suggests it should prove a reasonable approximation. It is rare in the empirical literature to see calibrations of β below 0.99 for developed nations. I am committed to testing all limiting assumptions under plausible model conditions in a future paper.[64]


The key message of Theorem 3 is that a dynamic stochastic general equilibrium need not exist. Theorem 5 implies one will not exist under plausible parameters for the Rotemberg model.[65] This surely puts paid to its claim to be a model of the business cycle, and it explains a widespread pattern in the literature: leading practitioners have struggled to compute non-linear New Keynesian models accurately. Carlstrom et al. [2014], Herbst and Schorfheide [2016] and Ascari et al. [2018b] explicitly discuss non-convergent simulations from popular New Keynesian models.[66]


When a solution does not exist, a package such as Dynare will return an error message, whilst a self-programmed routine will show successive iterations diverging. A key insight is that when a linearized model is indeterminate about its stochastic steady state, at least one of the underlying infinite horizon optimization problems has no solution, in the sense that its objective function is divergent. In this paper, that means there is no way to define the household's expected lifetime utility or the expected profitability of the resetting firm. The amplitude of the business cycle would be arbitrarily large, so the boundary properties imply that the supposedly infinitely lived representative household expects to starve or expire from overwork with probability one. Popular programming packages warn macroeconomists about indeterminacy for good reason.
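
A minimal stylized sketch, not taken from the paper, of what that divergence looks like numerically: assume, purely for illustration, that the relevant flow term grows at a hypothetical gross rate g per period, so the discounted objective Σ_t β^t u_t has no finite value whenever βg ≥ 1 and the partial sums a routine would compute simply keep growing.

    # Stylized numerical sketch (not the paper's model): if the flow term u_t in a
    # discounted objective sum_t beta^t * u_t grows at a hypothetical gross rate g
    # with beta * g >= 1, the partial sums never settle down, which is exactly what
    # a self-programmed routine reports as diverging iterations.
    beta = 0.99           # standard discount-factor calibration
    g = 1.05              # hypothetical explosive growth rate of the flow term
    u, partial_sum = 1.0, 0.0
    for t in range(201):
        partial_sum += (beta ** t) * u
        u *= g
        if t % 50 == 0:
            print(f"t = {t:3d}   partial sum = {partial_sum:12.2f}")
    # The printed partial sums grow without bound, so expected lifetime utility
    # (or the resetting firm's expected profit) cannot be defined.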


Claims of indeterminacy ignore the fact that dynamics have to be consistent with the existence of the underlying objective function. Macroeconomists have traditionally ignored this, preferring to focus on linear approximations, an approach that offers natural tractability. From this standpoint, when I compute a limiting solution, I am using the non-linear objective function as a refinement strategy to rule out multiplicity and other unsustainable solutions implied by the linear model. This parallels similar developments in microeconomics and game theory. Moreover, it utilises, and indeed justifies, the first strand of the Lucas critique. Microfoundations serve a scientific purpose over and above their role in deriving common approximate solutions.


Nevertheless, the purpose of this paper is not to close down debate or rule out plausible concepts a priori. Fernández-Villaverde et al. [2023] define a coherent notion of multiple stochastic steady states in continuous time. There is no guarantee uniqueness would carry over to an environment with a single large player (a government or central bank) facing a non-linear optimization problem, so there should be sufficient flexibility. Features like hysteresis and structural change, which appear inconsistent with stochastic equilibrium, could be incorporated as limiting cases, as in Bouchaud and Farmer [2023]. Insofar as this new rigorous approach were to impose restrictions on the scenarios that can be analyzed in DSGE, I would advocate greater plurality, including the application of ideas and techniques from agent-based modelling, along with greater emphasis on temporary policy deviations, as in Ascari et al. [2018a].


The phenomenon uncovered in Theorem 5 has three mathematical interpretations. The first is a failure of local-in-time existence. Reliance on infinite horizon optimization means that a DSGE model exists for all time or not at all. This is considered pathological in mathematical physics, where systems can typically be observed for some time before any blow-up, which makes it easier to interpret blow-ups and understand the properties of a system computationally before rigorous proofs can be supplied. This existence problem has prevented the division of labor between theoretical and mathematical physics from being successfully applied to macroeconomics.


Secondly, from a statistical physics perspective, there is a phase transition at the mean field limit. This means that the equilibrium, which we would expect to exist when a large number of firms price strategically against one another, breaks down in the limit where the firms compete against the aggregate population. However, this interpretation is less intuitive in business cycle analysis; it may make more sense in other economic applications.[67]


The third perspective comes from theoretical computer science. Researchers there would rather re-scale the exploding objective to make it zero and declare its ratio with the social optimum, the Price of Anarchy, to have exploded. For background, consult Koutsoupias and Papadimitriou [1999], Roughgarden [2005] and Roughgarden [2015].
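
For reference, the standard definition in those papers is the ratio PoA = (cost of the worst-case equilibrium outcome) / (cost of the social optimum) ≥ 1, taken with respect to a non-negative cost or welfare-loss measure. On that reading, when the equilibrium objective diverges while the planner's benchmark remains finite, the Price of Anarchy is infinite, which is the sense in which it is said above to have exploded.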


These results confirm the instrumentalist perspective, famously expounded by Friedman [1953], that models are meant to be useful rather than realistic. No central banker should lose sleep worrying about exploding price dispersion or about inflation hitting some fictitious upper bound.[68] However, the result does reveal that a plausible model can break down when fed unfavorable inputs, which means one cannot take its predictions too seriously away from sensible parameter values. Indeed, the insight generalizes across DSGE: the facility to make testable predictions is intimately connected with the possibility of non-existence.


This makes the models less robust. Consider the plausible extension where households are uncertain about the state of the economy and form Bayesian beliefs about the parameters of the economy γ, including the shock processes. Typical conjugate priors have unbounded tails and hence will place positive mass on values for which the model implodes, and the households' optimization problem along with it. The focus here is on the inflation reaction in the policy rule, where there is evidence of considerable dispersion in estimates and variance across econometric methodologies, such that confidence intervals would include blow-up values. This accords with our intuition that macroeconomics is a scientific endeavour, but a less precise one than physics, where it is very difficult to find flaws in the behavior of well-known equations. A greater emphasis on optimal policy and learning might help, particularly in monetary economics.
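
As a loose numerical illustration of this point (the numbers below are hypothetical and are not estimates from this paper), any posterior with unbounded tails for the inflation-response coefficient assigns strictly positive probability to the explosive region; a sketch:

    # Illustrative only: hypothetical numbers, not estimates from the paper.
    # A normal (unbounded-tailed) posterior for the Taylor-rule inflation
    # coefficient places strictly positive mass on the blow-up region.
    from math import erf, sqrt

    mean, sd = 1.5, 0.35          # hypothetical posterior mean and standard deviation
    blow_up_boundary = 1.0        # hypothetical edge of the explosive region

    def normal_cdf(x, mu, sigma):
        # cumulative probability of a normal(mu, sigma) distribution at x
        return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

    p_blow_up = normal_cdf(blow_up_boundary, mean, sd)
    print(f"Posterior mass on the blow-up region: {p_blow_up:.3f}")
    # With these made-up numbers roughly 8% of the posterior mass lies where the
    # household's objective is undefined, so credible intervals straddle blow-up
    # values, as described above.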


This paper is available on arxiv under CC 4.0 license.


[64] There is a longstanding tradition in microeconomics of using the patient limit to rule out non-stationary solutions, which can be multitudinous in game theory. This extends back to the folk theorem proven in Friedman [1971]. Proposition 23 is the only case where I consider dynamics away from the patient limit but without treating existence.


[65] In addition to the historical evidence discussed in the introduction, a body of contemporary evidence typically favors an under-active response to contemporaneous inflation (see Chortareas and Magonis [2008], Taylor and Williams [2010], Hansen et al. [2011] and Sviták [2013]). This is also reflected in the views of economic forecasters and the lack of public appreciation for the textbook policy stance (Mitchell and Pearce [2010], Pierdzioch et al. [2012], Carvalho and Nechio [2014] and Dräger et al. [2016]).


[66] Judd et al. [2017] demonstrate, via a formal statistical test, that existing approximations do not accurately simulate a typical large New Keynesian model, but their method conflates problems of a priori accuracy with problems of ex post existence.


[67] This issue is discussed more formally and explained with commuting diagrams by Carmona et al. [2013].


[68] Consult Alvarez et al. [2018], who provide evidence of modest responses of real quantities to high inflation in Argentina.