
Peter von zur Muehlen
- Ph.D
- Economist at Board of Governors of the Federal Reserve System
About
116 Publications
10,678 Reads
872 Citations
Introduction
Peter von zur Muehlen worked in the Research and Statistics Division of the Board of Governors of the Federal Reserve System from 1968 to 2001. His research involves macroeconomic modeling, econometric theory, the modeling of expectations, robust control, environmental economics, and game theory. His current projects include studies on optimal fiscal policy in climate models under ambiguity, the econometrics of panel data, and agent-based modeling of expectations.
Current institution
Additional affiliations
- March 2001 - present: Independent Research (Position: Independent Research)
- April 2001 - December 2014
Education
September 1961 - May 1963
Publications (116)
In a Ramsey policy regime, heterogeneity in beliefs about the potential costs of climate change is shown to produce policy ambiguities that alter carbon prices and taxation. Three sources of ambiguity are considered: (i) the private sector is skeptical, with beliefs that are unknown to the government, (ii) private agents have pessimistic doubts abo...
Typically, the explanatory variables included in a regression model, in conjunction with the omitted relevant regressors implied by the usual error term, have both direct and indirect effects on the dependent variable. Attempts to obtain their separate estimates have been plagued with simultaneity issues. To circumvent these problems, this paper de...
Thirty-six years ago, introducing a distinction between factors and concomitants in regressions, John W. Pratt and Robert Schlaifer determined that the error term in a regression represents the net effect of omitted relevant regressors. As this paper demonstrates, this assumption poses a problem whenever the purpose of a model is to explain an econ...
It has been argued that whenever regression models involve nonstationary and trending variables, estimation methods appropriate to stationary series cannot be applied to such models and instead require cointegration techniques. Unfortunately, extant methodologies applied to cointegration are a trap: if the error term of a cointegration regression c...
Thirty-five years ago, J. W. Pratt and Robert Schlaifer published a critique of then ruling econometric techniques. Introducing a distinction between factors and concomitants in regressions, they determined that a "condition for consistent estimation stated in virtually every book on econometrics is meaningless in one common form, impossible to sat...
It is often thought that the error term in a regression represents the net effect of omitted variables. This poses a problem whenever the purpose of a model is to explain an economic phenomenon, because the estimated coefficients as well as the error will be wrong in the sense that they are not unique. But a model that is not unique cannot be a cau...
As every econometrician knows, in a regression with one regressor, the dependent and explanatory variables may be spuriously correlated if both have been affected by some third variable, a common cause. In a highly regarded article, Granger and Newbold (1974) were not concerned with this type of spurious correlation but proposed an intuitively...
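For readers unfamiliar with the experiment behind this point, the following is a minimal sketch (with simulated data, not anything from the paper) of the Granger-Newbold exercise: regressing one independent random walk on another tends to produce spuriously "significant" slope coefficients.

```python
# Minimal sketch of the Granger-Newbold (1974) spurious regression experiment.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
T, reps = 100, 500
significant = 0
for _ in range(reps):
    x = np.cumsum(rng.standard_normal(T))   # independent random walk
    y = np.cumsum(rng.standard_normal(T))   # another, unrelated random walk
    res = sm.OLS(y, sm.add_constant(x)).fit()
    if abs(res.tvalues[1]) > 1.96:           # nominal 5% test on the slope
        significant += 1

print(f"Share of 'significant' slopes: {significant / reps:.2f}")  # far above 0.05
```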
We appreciate the effort and thoughtfulness of Raunig’s (2017) attempted critique of Swamy et al. (2015).[...]
In recent years, the learnability of rational expectations equilibria (REE) and determinacy of economic structures have rightfully joined the usual performance criteria among the sought-after goals of policy design. Some contributions to the literature, including Bullard and Mitra [2002. Learning about monetary policy rules. Journal of Monetary Eco...
In recent years, the learnability of rational expectations equilibria (REE) and determinacy of economic structures have rightfully joined the usual performance criteria among the sought-after goals of policy design. Some contributions to the literature, including Bullard and Mitra (2001) and Evans and Honkapohja (2002), have made significant headwa...
In the late 1960s and into the 1970s, the United States experienced a burst of inflation the origins of which seemed hard to uncover. This paper advances the idea that the Fed simply got the model wrong. We assume that the true model of the economy is a variant of the standard New Keynesian model, but the Fed estimates its Phillips curve with a red...
We examine learning, model misspecification, and robust policy responses to misspecification in a quasi-real-time environment. The laboratory for the analysis is the Sargent (1999) explanation for the origins of inflation in the 1970s and the subsequent disinflation. Three robust policy rules are derived that differ according to the extent that mi...
We examine learning, model misspecification, and robust policy responses to misspecification in a quasi-real-time environment. The laboratory for the analysis is the Sargent (1999) explanation for the origins of inflation in the 1970s and the subsequent disinflation. Three robust policy rules are derived that differ according to the extent that mis...
We examine learning, model misspecification, and robust policy responses to misspecification in a quasi-real-time environment. The laboratory for the analysis is the Sargent (1999) explanation for the origins of inflation in the 1970s and the subsequent disinflation. Three robust policy rules are derived that differ according to the extent that mis...
This paper derives and presents mean leads and lags as well as patterns of relative importance weights implied by the PAC (polynomial-adjustment-cost) error-correction equations which form the core of the FRB/US model at the Federal Reserve Board. Relative importance weights measure the contributions of past and future expected changes in fundament...
The monetary policy rules that are widely discussed--notably the Taylor rule--are remarkable for their simplicity. One reason for the apparent preference for simple ad hoc rules over optimal rules might be the assumption of full information maintained in the computation of an optimal rule. Arguably this makes optimal control rules less robust to mo...
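As an illustration of the simplicity at issue, here is a minimal sketch of the original Taylor (1993) rule; the function name and input values are hypothetical and not taken from the paper.

```python
# Minimal sketch of the Taylor (1993) rule, all quantities in percent.
def taylor_rule(inflation, output_gap, r_star=2.0, pi_star=2.0):
    """Nominal funds-rate prescription from the simple ad hoc rule."""
    return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap

print(taylor_rule(inflation=3.0, output_gap=-1.0))  # 5.0
```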
This paper analyzes the optimality of reactive feedback rules advocated by neo-Keynesians, and constant money growth rules proposed by monetarists. The basis for this controversy is not merely a disagreement concerning sources and impacts of uncertainty in the economy, but also an apparent fundamental difference in the attitude toward uncertainty a...
This paper analyzes the optimality of reactive feedback rules advocated by neo-Keynesians, and constant money growth rules proposed by monetarists. The basis for this controversy is not merely a disagreement concerning sources and impacts of uncertainty in the economy, but also an apparent fundamental difference in the attitude toward uncertainty a...
This paper derives and presents mean leads and lags as well as patterns of relative importance weights implied by the PAC (polynomial-adjustment-cost) error-correction equations which form the core of the FRB/US model at the Federal Reserve Board. Relative importance weights measure the contributions of past and future expected changes in fundament...
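A minimal sketch of the mean-lead and mean-lag calculation described in this abstract; the weight vectors below are hypothetical illustrations, not the FRB/US estimates.

```python
# Mean leads and lags as weighted-average horizons of relative importance weights.
import numpy as np

lag_weights  = np.array([0.40, 0.30, 0.20, 0.10])   # hypothetical weights on past changes, lags 0..3
lead_weights = np.array([0.50, 0.25, 0.15, 0.10])   # hypothetical weights on expected future changes, leads 0..3

def mean_horizon(weights):
    """Weighted-average horizon: sum_j j*w_j / sum_j w_j."""
    j = np.arange(len(weights))
    return float(j @ weights / weights.sum())

print("mean lag :", mean_horizon(lag_weights))    # 1.00
print("mean lead:", mean_horizon(lead_weights))   # 0.85
```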
This paper describes the role of expectations and how they are modeled in FRB/US, the macroeconomic model of the United States developed at the Board of Governors of the Federal Reserve System during the last seven years. Two principles guide the specification of behavioral equations in the model. One is the assumption of optimizing behavior by rat...
This paper derives and simulates with US data a two-sector growth model to explain the significant drop in the US saving rate, attributing most of it to a crowding-in effect of government saving due to the surplus in the US Federal Government's budget. The paper also demonstrates the effects on personal and gross saving rates, equity wealth, dispos...
This paper studies an endogenous growth economy with two sectors owned by a life-time utility maximizing representative household, one sector producing consumption goods and the other sector producing knowledge and technology. The stationary Euler conditions are used to derive relationships describing balanced growth that may be analyzed for its s...
This paper explores Knightian model uncertainty as a possible explanation of the considerable difference between estimated interest rate rules and optimal feedback descriptions of monetary policy. We focus on two types of uncertainty: (i) unstructured model uncertainty reflected in additive shock error processes that result from omitted-variable mi...
This paper explores Knightian model uncertainty as a possible explanation of the considerable difference between estimated interest rate rules and optimal feedback descriptions of monetary policy. We focus on two types of uncertainty: (i) unstructured model uncertainty reflected in additive shock error processes that result from omitted-variable mi...
This paper derives and estimates a dynamic model of quasi-fixed factor demand for US industries. Interdependence of factor demand and inventories imposes eigenvalue restrictions on the co-integrating vectors derived from the system of Euler equations that are tested with US data. The dynamic Euler equations can be represented by systems of Johansen...
This paper derives the equilibrium Nash strategies of two central banks for economies related by trade. Both policy makers seek to stabilize inflation and output but are constrained by their economies' interactions. Into this mix I add the further assumption of Knightian uncertainty regarding the channels of monetary policy, compelling banks to ad...
This paper derives the equilibrium Nash strategies of two central banks for economies related by trade. Both policy makers seek to stabilize inflation and output but are constrained by their economies' interactions. Into this mix I add the further assumption of Knightian uncertainty regarding the channels of monetary policy, compelling banks to ado...
In the 2-1/2 years between March, 1996 and September, 1998 the civilian unemployment rate in the United States dropped a full percentage point, the 12-month CPI inflation rate fell nearly 1-1/2 percentage points, a major crisis developed in emerging economies, and commodity prices collapsed. During the same period, the FOMC elected to make just two...
In recent years, there has been a renewal of interest in the use of rules for governing monetary policy. Theory tells us that there are advantages to precommitting to a policy rule. Given this interest, it is perhaps surprising that the rules under discussion are not rules that are optimal in the sense of having been computed from an optimal control...
The normal assumption of full information is dropped and the choice of monetary policy rules is instead examined when private agents must learn the rule. A small, forward-looking model is estimated and stochastic simulations conducted with agents using discounted least squares to learn of a change of preferences or a switch to a more complex rule....
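A minimal sketch of the discounted (constant-gain) least squares updating that such learning agents might use; the data-generating rule, gain, and sample size below are hypothetical choices for illustration.

```python
# Constant-gain (discounted) recursive least squares learning of a policy rule.
import numpy as np

rng = np.random.default_rng(1)
T = 200
true_coef = np.array([1.0, 1.5])        # hypothetical rule to be learned: i_t = 1.0 + 1.5*pi_t + noise
gain = 0.05                             # discounting of older observations
beta = np.zeros(2)                      # agents' current estimate of the rule
R = np.eye(2)                           # estimate of the regressor second-moment matrix

for t in range(T):
    x = np.array([1.0, rng.normal(2.0, 1.0)])         # regressors: constant and inflation
    i = true_coef @ x + 0.1 * rng.standard_normal()   # observed policy rate
    R = R + gain * (np.outer(x, x) - R)               # update the moment matrix
    beta = beta + gain * np.linalg.solve(R, x) * (i - x @ beta)  # update beliefs toward the new observation

print(beta)  # settles near [1.0, 1.5], with residual variation due to the constant gain
```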
This paper outlines several optimization techniques useful for deriving optimal monetary and fiscal policy rules in linear dynamic perfect foresight models. Two broad categories, differing in the treatment of risk, are presented. The first group comprises certainty-equivalent optimization, based on risk neutrality, and includes closed-loop optimal...
The Core VAR is an auxiliary vector-autoregressive model used as a mechanism for generating expectations by the firms and households that populate FRBUS, the Federal Reserve's forward-looking macro model of the US economy. This paper studies the autocorrelation functions and power spectra of the Core VAR model in FRBUS and compares them with...
The macroeconomic costs of disinflation are considered for the United States in a rational expectations macroeconometric model with sticky prices and imperfect information regarding monetary policy objectives. The analysis centers on simulation experiments using the Board's new quarterly macroeconometric model, FRB/US, within which are nested both...
The costs of disinflation are explored using the Board's new sticky-price rational expectations macroeconometric model of the U.S. economy, FRB/US. The model nests both model consistent and `restricted-information rational' expectations. Monetary policy is governed by interest-rate reaction functions of which two are considered: the well-known Tayl...
This note presents a multivariate generalization of a polynomial adjustment cost model due to Peter Tinsley, "Fitting both data and Theories: Polynomial adjustment costs and error-correction decision rules," Finance and Economics Discussion Series 93-21, 1993, Federal Reserve Board, Washington, DC.
Co-integration analysis, as developed by Granger (1981), has been widely used to test for the existence of equilibrium relationships among economic variables. Trust in the outcome of co-integration tests as an aid in identifying long-run relationships is unfounded because a critical element in the methodology, the so-called co-integrating vector,...
FRB/US is a large-scale quarterly econometric model of the U.S. economy, developed to replace the MPS model. Most behavioral equations are based on specifications of optimizing behavior containing explicit expectations of firms, households, and financial markets. Although expectations are explicit, the empirical fits of the structural descriptions...
This paper derives a general-equilibrium model from assumptions that mimic those embodied in the Federal Reserve Board's Quarterly Model of the United States (FRBUS). The aim is to produce a conceptual framework with which to analyze the dynamic and steady-state properties of the newly estimated model along lines that conform with current thinking...
The conclusions of a logically consistent economic theory which strictly adheres to Aristotle's axioms of logic are factually true if its sufficient conditions are all factually true. Alternatively, if a conclusion of such a theory is false, then at least one of its assumptions is false. Unfortunately, the factual truth of sufficient conditions can...
At a 1995 FOMC meeting, several participants inquired about the potential value and meaning of the spread between the Treasury bill rate and nominal GDP for monetary policy. The idea comes from Salomon Brothers, who regard it as an indicator of policy, in the sense that a high spread should be viewed as a clue that monetary policy should ease. Th...
This paper analyzes the mechanics of simple interest rate rules for two models: one with backward-looking and the other with forward-looking, rational expectations. The approach is to consider policy faced with a specific task: reducing inflation in a stabilizing manner. The two models are: (i) the expectations-augmented Phillips curve and (ii) the...
This paper analyzes the behavior of prices and finished-goods inventories in a model of monopolistic competition, where the motivation for holding inventories is the prospect of lost sales. An eventual goal of the present investigation is the development of an empirical framework, based on a realistic model of an industry or economy, that is able t...
For an economy characterized by neo-Keynesian wage rigidity, an optimal open market rule is derived based on financial market information, including auction price behavior. Simulations of a small model of the United States--estimated via full information maximum likelihood together with a numerical procedure for solving dynamic, linear rational exp...
The real output effects of disinflation are shown to depend not only on whether inflation is flexible or persistent, but also on the type of policy rule used by the monetary authority to implement disinflation. This paper contrasts the consequences of a planned reduction in inflation for (1) an expectations-augmented Phillips curve featuring inert...
An issue with monetary policy rules to guide inflation is the indeterminacy of the price level. In the contexts of a traditional backward-looking and a forward-looking New-Keynesian Phillips curve, the paper examines the dynamic and steady state properties of interest rate rules anchored alternatively on real and nominal GDP and on real and nominal...
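A minimal sketch of the textbook determinacy check behind questions of this kind, in the simplest flexible-price setting (a Fisher equation with the real rate normalized to zero and a rule responding to the price level); the coefficients are hypothetical and the example is not taken from the paper.

```python
# Fisher equation i_t = E_t p_{t+1} - p_t with rule i_t = phi_p * p_t,
# so E_t p_{t+1} = (1 + phi_p) * p_t. The price level p_t is a jump variable,
# so a unique bounded path requires the root 1 + phi_p to lie outside the unit circle.
def price_level_determinate(phi_p: float) -> bool:
    return abs(1.0 + phi_p) > 1.0

print(price_level_determinate(0.0))  # False: a pure interest rate peg leaves the price level indeterminate
print(price_level_determinate(0.5))  # True: responding to the nominal anchor pins the price level down
```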
A standard theoretical assumption of present-value models of asset prices is that agents' expectations of future inflation are embedded in auction prices such as primary commodity prices and the term structure of interest rates. This paper provides an empirical assessment of the use of auction price information in short-run monetary policy, using t...
This paper shows that the standard equations in the econometric literature connecting structural with reduced-form coefficients are arbitrary. There is therefore little basis for relying on textbook conditions for the identifiability of coefficients in a linear simultaneous equation system as necessary and sufficient. This result is reinforced by t...
This paper examines the properties of interest rate rules aimed at controlling aggregate price inflation. Policies are compared in two models having either flexible or sticky inflation. The latter is assumed to derive from a traditional, adaptive-expectations augmented Phillips curve. The flexible inflation model derives from the modern view, due to...
Within a Phillips-curve framework for inflation and using an experimental monthly Federal Reserve Board index of commodity prices, this paper tests the usefulness of commodity prices as predictors of future out-of-sample inflation in the US, measured by the CPI and the GDP deflator. A variety of specifications show that commodity prices do not sign...
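A minimal sketch of this kind of out-of-sample comparison, using simulated series rather than the Board's experimental index; the sample sizes and the AR(1) baseline are hypothetical choices for illustration.

```python
# Recursive one-step-ahead inflation forecasts, with and without lagged commodity-price growth.
import numpy as np

rng = np.random.default_rng(2)
T = 240
commod = rng.standard_normal(T)                                # commodity-price growth (simulated)
infl = np.zeros(T)
for t in range(1, T):
    infl[t] = 0.7 * infl[t - 1] + 0.3 * rng.standard_normal()  # inflation process (simulated)

def rolling_rmse(use_commodities, start=120):
    errors = []
    for t in range(start, T - 1):
        X = np.column_stack([np.ones(t), infl[:t]])
        if use_commodities:
            X = np.column_stack([X, commod[:t]])
        b = np.linalg.lstsq(X, infl[1:t + 1], rcond=None)[0]    # fit on data through period t
        x_next = [1.0, infl[t]] + ([commod[t]] if use_commodities else [])
        errors.append(infl[t + 1] - np.array(x_next) @ b)       # one-step-ahead forecast error
    return float(np.sqrt(np.mean(np.square(errors))))

print("RMSE, AR(1) baseline          :", rolling_rmse(False))
print("RMSE, adding commodity prices :", rolling_rmse(True))
```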
In this paper we introduce a class of tentatively plausible, fixed-coefficient models of money demand and evaluate their forecast performance. When these models are reestimated allowing all coefficients to vary over time, the forecasting performance improves dramatically. Aside from offering insights about improved methods of analyzing time series...
This paper considers the "Lucas"-critique issue of how the indicator role of auction prices is affected when the central bank attempts to exploit the correlation between auction prices and inflation. This question is examined using a simple macroeconomic model with rational expectations (perfect foresight). The policy instrument is the short-term i...
Co-integration analysis as developed by Granger (1981) has been widely used to test for the existence of equilibrium relationships among economic variables. Trust in the outcome of co-integration tests as an aid in identifying long-run relationships is unfounded because a critical element in the methodology, the so-called co-integrating vector, is...
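A minimal sketch of the residual-based (Engle-Granger style) co-integration test at which this critique is aimed, using simulated series; the data-generating process is a hypothetical illustration.

```python
# Estimate the co-integrating vector by OLS, then test the residual for a unit root.
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(3)
T = 300
common_trend = np.cumsum(rng.standard_normal(T))    # shared stochastic trend
x = common_trend + rng.standard_normal(T)
y = 2.0 * common_trend + rng.standard_normal(T)     # long-run relation between y and x by construction

t_stat, p_value, _ = coint(y, x)                     # Engle-Granger residual-based test
print(f"test statistic {t_stat:.2f}, p-value {p_value:.3f}")  # small p-value rejects the null of no co-integration
```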
The ontological basis for causality testing must be some empirically interpretable system leading one to specify a logically valid a priori law that can be shown to exist. Such paradigms as conventional, linear and non-linear models, be they structural or time series, tend to embody contradictory assumptions that, unfortunately, make causal interpr...
Money stock targeting has been criticized as unresponsive to real sector objectives, such as employment. A recently proposed alternative, nominal income targeting, has been criticized as an excessively ambitious policy objective that will ultimately downgrade the credibility of monetary policy. This paper proposes a framework for intermediate targe...
The conclusions of a logically consistent economic theory which strictly adheres to Aristotle's axioms of logic are factually true if its sufficient conditions are all factually true. Alternatively, if a conclusion of such a theory is false, then at least one of its assumptions is false. Unfortunately, the factual truth of sufficient conditions can...
Causality tests developed by Sims and Granger are flawed for several reasons. First, when two variables X and Y are uncorrelated, X has no linear predictive value for Y, but X and Y may be nonlinearly related unless they are statistically independent, in which case X and Y are not related at all. The right-hand side variables in a regression equati...
This paper explores alternatives to the narrow measure of the money stock, M1, as potential target guidelines for short- and intermediate-run policy. Using stochastic simulations of the Federal Reserve Board quarterly model, that mimic the outcomes of a variety of intermediate targeting policies, a ranking can be established based on the consequent...
The three pillars of econometric modeling are: (1) the reduced-form, (2) the recursive form, and (3) the structural form. Each of these techniques exists for the purpose of revealing the joint probability distribution of current and lagged endogenous variables conditional on the values of exogenous variables. The purpose of this paper is to discuss...
This paper explores the dynamic implications for optimal price and sales policies in a simple monopolistic model with constraints on inventory adjustment. Given a linear demand function and bounds on the rate of change in inventories, the question is posed: What is the optimal price and sales policy to attain three alternative terminal inventory obje...
The altered allocations of money market volatility obtained by alternative monetary policy procedures are illustrated by stochastic simulations of a staff monthly model. The results indicate the nature of the tradeoff between short-run volatility in the money stock and in the funds rate that is available to money stock targeting procedures.
This paper introduces an empirical measure of the cost of allocating money market volatility between the money stock and the Federal funds rate, the principal purpose being to devise a framework for data-based measures of the short-run implied by alternative operating procedures.
Questions
Question (1)
It appears that the two ways to publicize a revised version are to add the new version as a new paper and then either (1) leave the old version intact or (2) delete the old version. In the latter case, all the credits from people having read the paper disappear from my score. In the former case, two versions remain, even though the first one has become obsolete. Is there a better way?