Forecasting and Austrian Economics

Note: Below the dash is the text of my 2016 Fall Semester Paper for a Ph.D. class called Theory of the Market Process I. The actual paper file can be downloaded here, and it has some better formatting applied.

Forecasting and Austrian Economics

John Vandivier


The Austrian School has received criticism for opposition to empirical inquiry. In contrast, the present article explores methods of forecasting consistent with an Austrian view. I provide a limited defense of econometric analysis using the Austrian perspective. I suggest certain adjustments to standard econometric analysis on the grounds of Austrian criticism and I validate these findings by exploring the relationship between the unemployment rate and the American Recovery and Reinvestment Act of 2009. In the final section I review non-econometric techniques for compatibility with the Austrian view.

KEYWORDS: forecasting, unemployment, macroeconomic policy, moral hazard

JEL CODES: B41, B53, C10, C54


1 Introduction

This paper explores methods of forecasting consistent with an Austrian view. I argue that some methods of forecasting are consistent with the Austrian view. I present a limited defense of econometrics from an Austrian perspective. I discuss adjustments to standard econometrics consistent with the Austrian view and provide an empirical test by exploring the relationship between the unemployment rate and the American Recovery and Reinvestment Act of 2009. I investigate compatibility of non-econometric methods of modelling and forecasting with the Austrian view.

This first section reviews the purpose and motivation of the paper. Section 2 presents a limited defense of econometrics from an Austrian perspective. Section 3 discusses adjustments which might improve standard econometric forecasting based on an Austrian perspective. In section 3 I also include a modest empirical test of a suggested adjustment with an applied analysis of the American Recovery and Reinvestment Act of 2009. I find the adjustment reduces the probability of an erroneous forecast. Finally, section 4 discusses non-econometric methods of forecasting.

Backhouse (2000) and other scholars have criticized the Austrian School of economics for its “anti-empirical tendencies.” In contrast, Rothbard (1994) describes at least three approaches to empirical investigation presented in the Austrian literature. The mistaken claim by Backhouse and others is understandable: many scholars have defended the thesis that Austrian analysis is empirical in nature, but econometric reports, the sort of report common in applied economics, are not frequent products of Austrian journals.

It is trivial to identify criticism of econometric methods by Austrians. Garrison (1993) provides one example, saying, among other things, “The economist’s audience is interested in the issue of causality; his mathematical and econometric techniques are not up to the task.” It is also trivial to identify Austrians giving credit to more usual sorts of economic analysis. In the same article Garrison states, “Systems of equations can be used to describe abstract states of general equilibrium, and econometrics can provide some quantification of actual economic magnitudes. There should be no objection to this.”

There may be many explanations for why Austrians sometimes criticize and sometimes credit mathematical and econometric methods with which they do not generally publish. I can think of three, and I think they are all true to varying extents. First, the Austrian School may be composed of some people who do not value these methods and others who do. The result is that Austrianism qua Austrianism is neutral on the matter, and heterogeneity is embraced. The second explanation is that Austrians consider particular uses of statistics and particular econometric methods plausible while rejecting others as nonsense. Roger Koppl, the former editor of Advances in Austrian Economics, provides evidence for this explanation: he has published a criticism of representative agent methodology (2011) and he has also published an experimental study (Cowan and Koppl, 2011).

Finally, it is a common view among Austrians that empirical measurement is valuable for predictive estimation and technical calculation but cannot add to economic theory. Caldwell (1992) notes, for example, that this was Hayek’s position.

2 A Limited Defense of Econometrics

I have briefly mentioned that some Austrians favor econometric models for various tasks, but that amounts to a survey of preferences rather than a logical argument for the conditional usefulness of such models. This section presents three such arguments. I will argue that prediction is a prerequisite for human action, that some sort of mathematical modelling is required to resolve ambiguity in many cases of logical analysis, and that econometric modelling is an instance of pattern prediction.

The proof from human action takes the form of a simple logical proof:

  1. An individual must expect that some behavior has the power to alleviate uneasiness in order for that person to act (Brody and Mises, 1951).
  2. Human action occurs.
  3. From 1 and 2: Therefore, individuals form expectations about changes over time.
  4. Forecasting is the process of forming expectations about changes over time.
  5. From 3 and 4: Therefore, individuals engage in forecasting.

Notice that the same argument can be adapted into an argument against radical uncertainty. If uncertainty is able to dominate certainty, as is the case under radical uncertainty, then an individual is left unable to form an expectation that any particular means will alleviate any degree of uneasiness. As a result, if radical uncertainty is the case we would not expect human action to be observed. However, human action is observed. By contradiction, then, uncertainty exists but it must not be a dominant force. This point may seem only tangential, but it will lead into an econometric adjustment which is suggested in section 3.

The point has been established that all individuals must engage in forecasting, but it is granted that there are many ways of doing this. Econometrics is sometimes infeasible or costly, but in other cases it can improve the accuracy and precision of estimates and provide a means to greater technical efficiency. Three alternatives, though by no means the only ones, include intuition, guess-and-check strategies, and logical analysis. Here I will seriously consider logical analysis, which is the usual Austrian mode of analysis. Logical analysis has the great benefit that, when properly carried out, it obtains a strong claim to truth, even in the absence of good data or in the presence of apparently contrary data. Logical analysis, however, has two weaknesses. First, it is subject to a selection bias which is sometimes, though not always, self-defeating in the technical sense of a logical defeater (Malmgren, 2006). Second, logical analysis yields an indeterminate outcome in some cases.

Consider a hypothetical case where an individual owns a gas station. It is the beginning of a beautiful summer, not too warm, so the owner believes travel is likely to increase and drive up fuel consumption over the next two months. The same day he receives news that his beverage vendor is increasing wholesale prices. This individual’s gas station revenue is equally divided between fuel and beverage sales. On a purely logical analysis it is indeterminate whether his revenue will increase, decrease, or remain constant. The logical analyst is not completely without an answer: if these are the three options available and the tendencies in either direction are not considered to be different, then on balance the owner does not have a strong reason to expect a change in revenue, although perhaps he should invest in some risk protection. That sort of analysis is logically correct, but with econometrics we can compare the specific relative magnitudes of these forces and arrive at a more precise calculation which may lead to higher profits. Admittedly our prediction will be subject not only to risk but also to uncertainty, a problem addressed in section 3.

The final portion of this section is an argument that econometrics constitutes a representation of pattern prediction and therefore falls squarely within mainline Austrian analysis. I present two arguments to this effect. The first is that linear regression can be logically established as an optimal modelling technique. An analyst may also be able to make a logical case for quadratic or cubic models. These models qualify as so-called structural equation models (Ullman, 2003), but models which are more complex or take exotic functional forms are difficult to defend on the grounds of pure logic.

In this proof of linear regression, I am not interested in a mathematical proof about the capacity of the least squares method to minimize residual size. Rather, I am interested in proving that mathematical functions of increasing order can be taken as representations of real-world patterns. Suppose you are running a lemonade stand. You are selling quite a bit of lemonade over the course of a month and you decide to open a second lemonade stand about a block away. You set your friend up to run that one. He says he has run a lemonade stand before and he sold lots of lemonade at that time. You aren’t able to decide whether he is bluffing and in fact doesn’t know what he is doing or whether he is being honest and may well be more experienced than you. In short, you are uncertain about his prospective salesmanship.

The question is: what level of sales should you expect from him? Given that he seems to be your equal, but you have some level of uncertainty, you should expect he will produce the same level of revenue as you presently produce. Suppose we graph this in abstract space with the number of lemonade stands along the X-axis and the total revenue of all stands along the Y-axis. In such a space we would graph an upward-sloping line. Now we add a third stand, and things become interesting.

With the introduction of a third stand we introduce the possibility of a nonlinear relationship. If there are increasing gains from having multiple stands, we would see an upward-bending curve. We might instead see diminishing returns, which take the graphical shape of a root function. Figure 1 illustrates several of the curves we might see. In the presence of uncertainty an individual does not know a priori which curve will be generated from a plot of an increasing number of lemonade stands. Given the logical condition that the marginal individual may produce relatively more, less, or an equal amount of revenue, on balance we should suppose the average expectation of an equal amount absent further data. It is also interesting to note that the linear relationship is the average of every possible nonlinear relationship when such relationships are weighted equally. As such, the linear relationship becomes the most expected relationship, barring further information. In this way we can derive the expectation of a linear regression from pure theory.

We can further derive the quadratic relationship from pure theory, to the degree that diminishing marginal returns are granted in theory. Finally, if we can logically establish the existence of a point of inflection then we have grounds for the theoretical supposition of a cubic effect. It may be possible to theoretically or structurally suppose additional mathematical patterns of change, but I have found the intuition becomes unconvincing for complex models. Perhaps one could argue a sine-shaped function for economic cycles, but, for example, why not a cosine, tangent, or other such function?

The second argument that statistical models constitute representations of pattern predictions is straightforward and concise: Hayek came up with the concept of a pattern prediction and he classified statistical models as such (1964).

Theory and logical structure provide a way to generate an abstract model or models. After an abstract model is generated then econometrics becomes important because it allows for determination of the specific magnitudes of various relationships. These magnitudes may not add much in theory, but they are of immense practical value and can serve to improve welfare. Additionally, econometrics can be used to rank the quality of competing models which may be of relatively ambiguous rank in pure theory.

I want to emphasize that the complexity argument as a critique of econometrics is valid, but it only serves as a means to expect additional variance or error. Complexity per se does not suggest a particular direction or systematic kind of error, and the critique does not amount to grounds for positive rejection of a useful tool.

3 A Few Adjustments to Usual Econometric Forecasts

This section will discuss three adjustments to usual econometric forecasts: A correction for complexity, a correction for moral hazard, and a correction for uncertainty. I provide an empirical test for complexity error correction and uncertainty correction.

I have mentioned that the progressive extension of a structural model from pure theory becomes implausible as the underlying theory grows complex, and also that the real world is complex. This amounts to a general expectation that models will be underspecified relative to that complexity. The expectation is reinforced by the idea that models have decreasing marginal benefit from investments of time and resources in their creation. Because no model is expected to be fully specified, there is always some expected degree of what might be called complexity error. This is really just another sort of measurement or specification error. As argued above, if no particular direction of error is expected, the sole expected result is an unbiased increase in error. This expectation may be corrected for logically in the following way.

Consider that A is a larger number than B. We can establish rationally expected relative magnitudes and frequencies a priori, but we cannot establish absolute measures. In fact, we can construct an entire statistical distribution based on a priori ordinal reasoning, although such a system may not have much empirical value. If an analyst is uncertain about the magnitudes of A and B but is given their order, an expected value for their relative size can be rationally established as follows:

  1. A > B > 0
  2. 1 > B/A > 0
  3. Given that B/A is randomly selected among possible values from 0 to 1, the expected value is .5.
  4. Expressed alternatively, A is expected to be twice as large as B.

Let’s apply our ordinal correction logic to the case of error correction. Given that there is some complexity error which was not accounted for by the regression, the true error is rationally expected to be twice as large as the stated error. That is how we can correct for complexity: By doubling the stated error in the model.
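The ordinal argument above can be checked with a quick Monte Carlo sketch. The code and seed are mine, added for illustration; the only assumption carried over from the text is that the ratio B/A is treated as uniformly distributed on (0, 1):

```python
import random

# Monte Carlo check of the ordinal argument: if all we know is that
# 0 < B < A, and B/A is treated as uniformly distributed on (0, 1),
# then the expected ratio is 0.5 -- i.e., A is expected to be twice
# as large as B, which motivates doubling the stated error.
random.seed(42)
N = 100_000
ratios = [random.random() for _ in range(N)]  # B/A drawn uniformly from (0, 1)
mean_ratio = sum(ratios) / N

print(round(mean_ratio, 2))  # approximately 0.5
```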

The second form of error is error from moral hazard. Rothbard (1960) argues convincingly that the government has an incentive to produce statistics which it can leverage in its own interest. In some cases the direction of this bias is clear, and in other cases it is logically ambiguous. For example, the government might want to inflate estimates of GDP to appear more effective, or it might want to deflate them to spur demand for policy solutions. In such cases of logical ambiguity, as in the case of the gas station owner, the analyst should simply increase their error expectations. In other cases the government has an arguably clear incentive in a particular direction. While it might have an incentive to spur policy demand, it seems it would have a clear incentive to make its policy solutions appear successful. We will shortly analyze the 2009 stimulus, for example, and I would expect the government inflated estimates of its effectiveness on the grounds of a bias from moral hazard.

When it comes to the use of government statistics we may plausibly omit the moral hazard correction, even when we anticipate its direction, on the grounds that the output of the prediction model will be compared against government statistics. Since the input and output variables contain a consistent bias, manually adding a bias correction would make the predicted values incomparable to the reported values.

The final adjustment of interest is a correction for uncertainty. Suppose that some good is priced with an expected value of P. Given that P is not only risky but also uncertain, does this increase or decrease the value of P? The answer is that uncertainty does not modify the expected value, it simply reduces our confidence in whatever point value we have already established. Given that the value of P is truly uncertain we have no reason to think it is any less valuable than P, and we equally have no reason to think it is any more valuable than P. Whereas previously the estimated value was P, the estimated value is now P, P+, or P-. The resulting expected value is equal to P, but the probability of obtaining P is smaller.
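The claim that uncertainty leaves the expected value at P while making P itself less likely can be illustrated as a mean-preserving spread. The following sketch uses hypothetical numbers of my own choosing:

```python
import random

# Illustration of the uncertainty adjustment: adding a symmetric,
# zero-mean disturbance to a price P is a mean-preserving spread.
# The expected value stays at P, but the probability of obtaining
# exactly P falls. All numbers here are hypothetical.
random.seed(1)
P = 10.0
draws = [P + random.choice([-1.0, 0.0, 1.0]) for _ in range(90_000)]

mean = sum(draws) / len(draws)
share_at_P = sum(1 for d in draws if d == P) / len(draws)

print(round(mean, 1))        # still approximately 10.0
print(round(share_at_P, 2))  # only about a third of outcomes hit P exactly
```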

In January of 2009, the Office of the Vice-President Elect and the Council of Economic Advisers jointly released a plan describing the American Recovery and Reinvestment Act of 2009 (Romer and Bernstein, 2009). The plan included forecasts, and the researchers disclaimed, “It should be understood that all of the estimates presented in this memo are subject to significant margins of error.” However, no numbers for these errors were given.

Based on the report it was apparent the researchers had access to unemployment data through Q3 of 2008. I attempted to replicate their forecast by obtaining unemployment data from Q1 of 2000 through Q3 of 2008 from the U.S. Bureau of Labor Statistics. Table 1 outlines three specifications attempted including a linear, quadratic, and cubic model which considers the unemployment rate to be a function of time. The preferred specification was the cubic model which is illustrated in Figure 2. This specification obtained statistical significance for each independent variable as well as a substantial match of the data on visual inspection.
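The specification search described above can be sketched in a few lines. The series below is an illustrative stand-in of my own construction, not the actual BLS data used in the paper:

```python
import numpy as np

# Regress the unemployment rate on linear, quadratic, and cubic
# functions of time and compare in-sample fit. The data here are
# hypothetical, NOT the BLS series analyzed in the paper.
rng = np.random.default_rng(0)
t = np.arange(35)  # 35 quarters: Q1 2000 through Q3 2008
rate = 4.0 + 0.05 * t - 0.004 * t**2 + 0.0001 * t**3  # hypothetical trend
rate = rate + rng.normal(0, 0.1, t.size)  # sampling noise

sse = {}
for degree, label in [(1, "linear"), (2, "quadratic"), (3, "cubic")]:
    coefs = np.polyfit(t, rate, degree)
    sse[label] = float(np.sum((rate - np.polyval(coefs, t)) ** 2))
    print(f"{label}: SSE = {sse[label]:.3f}")

# Forecast 4 and 8 quarters ahead (1 and 2 years) with the cubic model:
cubic = np.polyfit(t, rate, 3)
print(np.polyval(cubic, [t[-1] + 4, t[-1] + 8]))
```

Because the models are nested, the sum of squared errors can only fall as the polynomial degree rises; the paper's preference for the cubic model rests on coefficient significance and visual fit, not on in-sample error alone.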

In Table 2 I report the actual and forecasted unemployment rates for 1 and 2 years after the report was distributed. Model 3, the preferred model, forecasted the same unemployment rate on the 1-year estimate, but on the 2-year estimate it significantly outperformed the estimates reported by the researchers. As the table reports, the government estimates fell significantly and substantially below the actual values. The gross error of the government forecasts, combined with the direction of those errors, is consistent with a bias from moral hazard. The degree of this error is surprising compared to the accuracy of Model 3, which is a relatively trivial model. The lower bound on some estimates extends into nonsensical negative values; if the confidence intervals on the government estimates did likewise, this may be a reason such values were not published.

The model which produced point estimates with the greatest degree of error was Model 2. As an empirical test of the validity of the error correction measures previously discussed, I applied corrections for complexity error and uncertainty error. Because the lower bound of the estimate had already passed 0, the effective lower bound stands at 0. The corrected upper bound estimate of the unemployment rate was 17.8, resulting from the following equation:

17.8 = [(4)(1.96)(0.1610624) + 4.108746]
       + 127[(4)(1.96)(0.0071574) + 0.0502388]
       − 127²[(4)(1.96)(0.0000666) − 0.0004541]

As shown in the equation, the usual 1.96 standard deviations used to obtain a 95% confidence interval were multiplied by 4 in this case. This is because the error is doubled once for complexity and once again for uncertainty. The result is that the observed unemployment rate falls well within the corrected upper bound.
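The arithmetic can be reproduced directly from the coefficients and standard errors stated in the equation:

```python
# Reproducing the corrected upper-bound arithmetic: each coefficient
# of the quadratic model is shifted by 4 * 1.96 standard errors --
# 1.96 for a 95% interval, doubled once for complexity error and
# once again for uncertainty. The forecast quarter is t = 127.
k = 4 * 1.96
t = 127

upper = (k * 0.1610624 + 4.108746) \
      + t * (k * 0.0071574 + 0.0502388) \
      - t**2 * (k * 0.0000666 - 0.0004541)

print(round(upper, 1))  # 17.8
```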

4 Non-Econometric Methods

There are two particular fields in applied economics and quantitative social science that I think are worth mentioning in any discussion about the compatibility of Austrian economics with various empirical methods. These two fields are agent-based modelling and experimental economics. I also argue that prediction markets are a technique for forecasting which is consistent with the Austrian view.

The emphasis on a process view of action is a unique and valuable contribution of Austrian thought to the broader field of economics. Not many modelling techniques are capable of implementing this complex orientation. Agent-based modelling, however, is a candidate.

Agent-based modelling (ABM) is the computational study of social agents as evolving systems of autonomous interacting agents (Janssen, 2005). Fictitious individuals are created under this approach: they are placed in a computer-simulated environment and endowed with programmatic goals and a variety of proficiencies and preferences. In contrast to other economic models, ABM imposes very few overarching rules on the system. Instead, ABM focuses on modelling many detailed fictitious agents whose interactions produce a wider emergent order, sometimes to unanticipated effect.
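A toy sketch can make the idea concrete. The agents, valuations, and price-adjustment rule below are entirely hypothetical inventions of mine, not a model from the ABM literature; the point is only that no top-down equilibrium condition is imposed, yet an aggregate price emerges from local decisions:

```python
import random

random.seed(3)

class Trader:
    """An autonomous agent with its own private valuation of a good."""
    def __init__(self):
        self.valuation = random.uniform(5.0, 15.0)

    def accept(self, price):
        # The agent buys whenever the posted price is below its valuation.
        return price <= self.valuation

def simulate(num_agents=200, rounds=200):
    agents = [Trader() for _ in range(num_agents)]
    price = 1.0
    for _ in range(rounds):
        buyers = sum(a.accept(price) for a in agents)
        # Decentralized adjustment: price rises with excess demand,
        # falls with excess supply. No equilibrium is imposed directly.
        price *= 1 + 0.1 * (buyers / num_agents - 0.5)
    return price

# The price gravitates toward the median valuation (about 10), an
# emergent outcome of many individual accept/reject decisions.
print(round(simulate(), 1))
```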

Because ABM is a recent field requiring technical knowledge not held by many economists, and given that the Austrian School is a relatively small group of economists, it would be reasonable to expect an overlapping literature to be thin or absent. Instead, there seems to be a current trend in ABM of capturing the Kirznerian model of entrepreneurship. Holian and Newell (2011), for example, provide an agent-based model with Kirznerian entrepreneurship and identify some interesting patterns regarding the degree of alertness, including the observation that an economy with high levels of alertness will experience diverse products but equilibrate slowly compared to a less alert economy.

Hayek in one place declared experiment impossible in the social sciences, and yet in another place he was among the first to suggest it may be feasible, though he proceeded to state it would hardly be worth the expense (Cevolani, 2011). Years later, Paxson and Wenzel (2016) do not seem terribly deterred. They argue that experimental economics can play a critical role by filling in the gap between human intention and human action, which Mises had relegated to the purview of psychologists.

Perhaps I can agree with both sides. Experimental economics provides a highly synthetic environment in which it is extremely difficult, perhaps impossible, to generate an emergent order reflective of natural society. At the same time, it is an indispensable tool for investigating the peculiarities of individual decision making. As such, I agree that experimental economics can be used to fill in some of the gap between intention and action, but I also want to emphasize that experimental economics can help construct realistic agents for use in agent-based modelling and thereby provide an indirect but important role in the study of emergence.

Krasnozhon and Levendis (2015) make a convincing case that prediction markets are a price-centric, Austrian-friendly means of incorporating information about the future into present calculations. I am a particular fan of prediction markets on the simple grounds of efficacy. Using a sample of 964 polls, Berg et al. (2008) report that prediction markets outperform polls about 74% of the time for long-term predictions.


References

Backhouse, Roger E. “Progress in heterodox economics.” Journal of the History of Economic Thought 22.2 (2000): 149-155.

Berg, Joyce E., Forrest D. Nelson, and Thomas A. Rietz. “Prediction market accuracy in the long run.” International Journal of Forecasting 24.2 (2008): 285-300.

Brody, Alexander, and Ludwig von Mises. “Human Action: A Treatise on Economics.” (1951): 606-608.

Caldwell, Bruce. “Hayek the falsificationist? A refutation.” Research in the History of Economic Thought and Methodology 10.1 (1992): 1-15.

Cevolani, Gustavo. “Hayek in the lab. Austrian School, game theory, and experimental economics.” (2011).

Cowan, Everard James, and Roger Koppl. “An experimental study of blind proficiency tests in forensic science.” The Review of Austrian Economics 24.3 (2011): 251-271.

Garrison, Roger W. “Mises and His Methods.” The Meaning of Ludwig von Mises: Contributions in Economics, Sociology, Epistemology, and Political Philosophy. (1993).

Hayek, Friedrich A. “The theory of complex phenomena.” The critical approach to science and philosophy (1964): 332-349.

Holian, Matthew J., and Graham D. Newell. “An Agent-Based Model of Entrepreneurship.” Draft paper (2011).

Janssen, Marco A. “Agent-based modelling.” Modelling in ecological economics (2005): 155-172.

Koppl, Roger. “Against representative agent methodology.” The Review of Austrian Economics 24.1 (2011): 43-55.

Krasnozhon, Leonid, and John Levendis. “Mises and prediction markets: Can markets forecast?.” The Review of Austrian Economics 28.1 (2015): 41-52.

Malmgren, Anna-Sara. “Is there a priori knowledge by testimony?.” The Philosophical Review 115.2 (2006): 199-241.

Paxson, Nathaniel, and Nikolai G. Wenzel. “Praxeology, Experimental Economics and the Process of Choice: FA Hayek and Vernon Smith on the Misesian Action Axiom.” The Review of Austrian Economics 29.2 (2016): 163-176.

Romer, Christina, and Jared Bernstein. “The job impact of the American recovery and reinvestment plan.” (2009).

Rothbard, Murray N. “Reviewed Work: “Austrian Economics: Tensions and New Directions”.” Southern Economic Journal, vol. 61, no. 2, (1994): 559–560.

Rothbard, Murray N. “The politics of political economists: Comment.” The Quarterly Journal of Economics 74.4 (1960): 659-665.

Ullman, Jodie B., and Peter M. Bentler. Structural equation modeling. John Wiley & Sons, Inc., 2003.

U.S. Bureau of Labor Statistics, “Labor Force Statistics from the Current Population Survey, 2000-2016.” (2016).


