Economic model
In economics, a model is a theoretical construct representing economic processes by a set of variables and a set of logical and/or quantitative relationships between them. An economic model is a simplified, often mathematical, framework designed to illustrate complex processes. Frequently, economic models posit structural parameters. A model may have various exogenous variables, and those variables may change to create various responses by economic variables. Methodological uses of models include investigation, theorizing, and fitting theories to the world.
Overview
In general terms, economic models have two functions: first as a simplification of and abstraction from observed data, and second as a means of selection of data based on a paradigm of econometric study.

Simplification is particularly important for economics given the enormous complexity of economic processes. This complexity can be attributed to the diversity of factors that determine economic activity; these factors include: individual and cooperative decision processes, resource limitations, environmental and geographical constraints, institutional and legal requirements and purely random fluctuations. Economists therefore must make a reasoned choice of which variables and which relationships between these variables are relevant and which ways of analyzing and presenting this information are useful.
Selection is important because the nature of an economic model will often determine what facts will be looked at and how they will be compiled. For example, inflation is a general economic concept, but to measure inflation requires a model of behavior, so that an economist can differentiate between changes in relative prices and changes in price that are to be attributed to inflation.
In addition to their professional academic interest, uses of models include:
- Forecasting economic activity in a way in which conclusions are logically related to assumptions;
- Proposing economic policy to modify future economic activity;
- Presenting reasoned arguments to politically justify economic policy at the national level, to explain and influence company strategy at the level of the firm, or to provide intelligent advice for economic decisions at the level of the household.
- Planning and allocation, in the case of centrally planned economies, and on a smaller scale in logistics and management of businesses.
- In finance, predictive models have been used since the 1980s for trading. For example, emerging market bonds were often traded based on economic models predicting the growth of the developing nation issuing them. Since the 1990s many long-term risk management models have incorporated economic relationships between simulated variables in an attempt to detect high-exposure future scenarios.
Economic models in current use do not pretend to be theories of everything economic; any such pretensions would immediately be thwarted by computational infeasibility and the incompleteness or lack of theories for various types of economic behavior. Therefore, conclusions drawn from models will be approximate representations of economic facts. However, properly constructed models can remove extraneous information and isolate useful approximations of key relationships. In this way more can be understood about the relationships in question than by trying to understand the entire economic process.
The details of model construction vary with type of model and its application, but a generic process can be identified. Generally any modelling process has two steps: generating a model, then checking the model for accuracy. The diagnostic step is important because a model is only useful to the extent that it accurately mirrors the relationships that it purports to describe. Creating and diagnosing a model is frequently an iterative process in which the model is modified with each iteration of diagnosis and respecification. Once a satisfactory model is found, it should be double checked by applying it to a different data set.
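As a hedged illustration of this generate, diagnose, and revalidate loop, the following sketch fits a simple linear relationship, inspects its residuals, and then re-checks the fit on a separate data set; the data, functional form, and variable names are invented for the example rather than drawn from any particular study.

```python
import numpy as np

# Illustrative sketch only: the data-generating process and the parameter
# values below are invented for the example.
rng = np.random.default_rng(0)

# Synthetic "observed" data: consumption as a function of income.
income = rng.uniform(20, 100, size=200)
consumption = 5.0 + 0.8 * income + rng.normal(0.0, 3.0, size=200)

# Step 1: generate (estimate) the model C = a + b*Y by least squares.
X = np.column_stack([np.ones_like(income), income])
a_hat, b_hat = np.linalg.lstsq(X, consumption, rcond=None)[0]

# Step 2: diagnose by inspecting the residuals for structure the model misses;
# in practice this step drives iterative respecification of the model.
residuals = consumption - (a_hat + b_hat * income)
print(f"estimates: a = {a_hat:.2f}, b = {b_hat:.2f}, residual sd = {residuals.std():.2f}")

# Step 3: double-check the accepted model against a different data set.
income_new = rng.uniform(20, 100, size=100)
consumption_new = 5.0 + 0.8 * income_new + rng.normal(0.0, 3.0, size=100)
oos_residuals = consumption_new - (a_hat + b_hat * income_new)
print(f"out-of-sample residual sd = {oos_residuals.std():.2f}")
```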
Types of models
According to whether all the model variables are deterministic, economic models can be classified as stochastic or non-stochastic models; according to whether all the variables are quantitative, economic models are classified as discrete or continuous choice models; according to the model's intended purpose or function, it can be classified as quantitative or qualitative; according to the model's ambit, it can be classified as a general equilibrium model, a partial equilibrium model, or even a non-equilibrium model; according to the economic agents' characteristics, models can be classified as rational agent models, representative agent models, etc.
- Stochastic models are formulated using stochastic processes. They model economically observable values over time. Most of econometrics is based on statistics to formulate and test hypotheses about these processes or to estimate parameters for them. A widely used class of simple econometric models, popularized by Tinbergen and later Wold, are the autoregressive models, in which the stochastic process satisfies some relation between current and past values. Examples of these are autoregressive moving average models and related ones such as autoregressive conditional heteroskedasticity (ARCH) and GARCH models for the modelling of heteroskedasticity. A minimal simulation of such an autoregressive process is sketched after this list.
- Non-stochastic models may be purely qualitative or quantitative. In some cases the economic predictions of a model merely assert the direction of movement of economic variables, and so the functional relationships are used only in a qualitative sense: for example, if the price of an item increases, then the demand for that item will decrease. For such models, economists often use two-dimensional graphs instead of functions.
- Qualitative models – although almost all economic models involve some form of mathematical or quantitative analysis, qualitative models are occasionally used. One example is qualitative scenario planning in which possible future events are played out. Another example is non-numerical decision tree analysis. Qualitative models often suffer from lack of precision.
- An accounting model is one based on the premise that for every credit there is a debit. More symbolically, an accounting model expresses some principle of conservation in the form: algebraic sum of inflows = sinks − sources.
- Optimality and constrained optimization models – Other examples of quantitative models are based on principles such as profit or utility maximization. An example of such a model is given by the comparative statics of taxation on the profit-maximizing firm. The profit of a firm is given by π(q) = pq − C(q) − tq, where p is the price the product commands in the market when supplied at the rate q, C(q) is the cost of bringing the product to market at that rate, and t is the tax per unit sold; the resulting first-order condition and its comparative statics are worked out after this list.
- Aggregate models. Macroeconomics needs to deal with aggregate quantities such as output, the price level, the interest rate and so on. Real output is actually a vector of goods and services, such as cars, passenger airplanes, computers, food items, secretarial services, home repair services, etc. Similarly, price is the vector of individual prices of goods and services. Models in which the vector nature of the quantities is maintained are used in practice, for example Leontief input-output models are of this kind. However, for the most part, these models are computationally much harder to deal with and harder to use as tools for qualitative analysis. For this reason, macroeconomic models usually lump together different variables into a single quantity such as output or price. Moreover, quantitative relationships between these aggregate variables are often parts of important macroeconomic theories. This process of aggregation and functional dependency between various aggregates usually is interpreted statistically and validated by econometrics. For instance, one ingredient of the Keynesian model is a functional relationship between consumption and national income: C = C(Y). This relationship plays an important role in Keynesian analysis.
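As referenced in the stochastic-models item above, the following is a minimal sketch of an autoregressive model, assuming the simplest AR(1) form with invented parameter values; it simulates the process and recovers the autoregressive coefficient by ordinary least squares.

```python
import numpy as np

# Minimal AR(1) sketch: x_t = c + phi * x_{t-1} + e_t, with invented parameters.
rng = np.random.default_rng(42)
c, phi, sigma, n = 0.5, 0.7, 1.0, 1000

x = np.zeros(n)
for t in range(1, n):
    x[t] = c + phi * x[t - 1] + rng.normal(0.0, sigma)

# Estimate c and phi by regressing x_t on a constant and x_{t-1}.
X = np.column_stack([np.ones(n - 1), x[:-1]])
c_hat, phi_hat = np.linalg.lstsq(X, x[1:], rcond=None)[0]
print(f"true phi = {phi}, estimated phi = {phi_hat:.3f}")
```

Richer specifications such as ARMA, ARCH and GARCH extend this same idea with moving-average terms or a time-varying error variance.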
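For the constrained-optimization item above, the comparative statics of the per-unit tax can be worked out under textbook assumptions (an interior maximum and a strictly convex cost function); this is a generic sketch rather than a derivation from any specific source.

```latex
% Profit of the firm facing price p, output rate q, cost C(q), and per-unit tax t:
\[
  \pi(q) = p\,q - C(q) - t\,q .
\]
% First-order condition for the profit-maximizing output q*:
\[
  \left.\frac{d\pi}{dq}\right|_{q=q^{*}} = p - C'(q^{*}) - t = 0
  \quad\Longrightarrow\quad C'(q^{*}) = p - t .
\]
% Differentiating the first-order condition with respect to t, with C''(q^{*}) > 0:
\[
  \frac{dq^{*}}{dt} = -\frac{1}{C''(q^{*})} < 0 ,
\]
% so raising the per-unit tax lowers the firm's profit-maximizing output.
```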
Problems with economic models
History
One of the major problems addressed by economic models has been understanding economic growth. An early attempt to provide a technique to approach this came from the French physiocratic school in the eighteenth century. Among these economists, François Quesnay was known particularly for his development and use of tables he called Tableaux économiques. These tables have in fact been interpreted in more modern terminology as a Leontief model; see the Phillips reference below.

All through the 18th century simple probabilistic models were used to understand the economics of insurance. This was a natural extrapolation of the theory of gambling, and played an important role both in the development of probability theory itself and in the development of actuarial science. Many of the giants of 18th century mathematics contributed to this field. Around 1730, De Moivre addressed some of these problems in the 3rd edition of The Doctrine of Chances. Even earlier, Nicolas Bernoulli studied problems related to savings and interest in the Ars Conjectandi. In 1730, Daniel Bernoulli studied "moral probability" in his book Mensura Sortis, where he introduced what would today be called "logarithmic utility of money" and applied it to gambling and insurance problems, including a solution of the paradoxical Saint Petersburg problem. All of these developments were summarized by Laplace in his Analytical Theory of Probabilities. Clearly, by the time David Ricardo came along he had a lot of well-established mathematics to draw from.
Tests of macroeconomic predictions
In the late 1980s the Brookings Institution compared 12 leading macroeconomic models available at the time. They compared the models' predictions for how the economy would respond to specific economic shocks. Although the models simplified the world and started from stable, known, common parameters, the various models gave significantly different answers. For instance, in calculating the impact of a monetary loosening on output, some models estimated a 3% change in GDP after one year, one gave almost no change, and the rest were spread in between.

Partly as a result of such experiments, modern central bankers no longer have as much confidence that it is possible to 'fine-tune' the economy as they had in the 1960s and early 1970s. Modern policy makers tend to use a less activist approach, explicitly because they lack confidence that their models will actually predict where the economy is going, or the effect of any shock upon it. The new, more humble, approach sees danger in dramatic policy changes based on model predictions, because of several practical and theoretical limitations in current macroeconomic models; in addition to the theoretical pitfalls, some problems specific to aggregate modelling are:
- Limitations in model construction caused by difficulties in understanding the underlying mechanisms of the real economy.
- The law of unintended consequences, on elements of the real economy not yet included in the model.
- The time lag in both receiving data and the reaction of economic variables to policy makers' attempts to 'steer' them in the direction that central bankers want them to move. Milton Friedman has vigorously argued that these lags are so long and unpredictably variable that effective management of the macroeconomy is impossible.
- The difficulty in correctly specifying all of the parameters even if the structural model and data were perfect.
- The fact that all the model's relationships and coefficients are stochastic, so that the error term becomes very large quickly, and the available snapshot of the input parameters is already out of date.
- Modern economic models incorporate the reaction of the public and the market to the policy maker's actions; once this feedback is included in the model, it becomes much harder to influence some of the simulated variables.
Comparison with models in other sciences
See Unreasonable ineffectiveness of mathematics § Economics and finance.
Effects of deterministic chaos on economic models
Economic and meteorological simulations may share a fundamental limit to their predictive powers: chaos. Although the modern mathematical work on chaotic systems began in the 1970s, the danger of chaos had been identified and defined in Econometrica as early as 1958. It is straightforward to design economic models that are susceptible to butterfly effects of initial-condition sensitivity.
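As a hedged illustration of such initial-condition sensitivity, the sketch below uses a logistic-map style dynamic; the equation and parameter values are purely illustrative and are not drawn from the 1958 Econometrica work. Two trajectories that start almost identically diverge within a few dozen iterations.

```python
# Illustrative only: a chaotic logistic-map dynamic x_{t+1} = r * x_t * (1 - x_t).
r = 3.9                       # in the chaotic regime for the logistic map
x, y = 0.400000, 0.400001     # nearly identical initial conditions
for _ in range(60):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
print(f"after 60 steps: x = {x:.6f}, y = {y:.6f}, gap = {abs(x - y):.6f}")
```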
However, the econometric research program to identify which variables are chaotic has largely concluded that aggregate macroeconomic variables probably do not behave chaotically. This would mean that refinements to the models could ultimately produce reliable long-term forecasts. However, the validity of this conclusion has generated two challenges:
- In 2004 Philip Mirowski challenged this view and those who hold it, saying that chaos in economics is suffering from a biased "crusade" against it by neo-classical economists in order to preserve their mathematical models.
- The variables in finance may well be subject to chaos. Also in 2004, the University of Canterbury study Economics on the Edge of Chaos concluded that, after noise was removed from S&P 500 returns, evidence of deterministic chaos was found.
Critique of hubris in planning
A key strand of free market economic thinking is that the market's invisible hand guides an economy to prosperity more efficiently than central planning using an economic model. One reason, emphasized by Friedrich Hayek, is the claim that many of the true forces shaping the economy can never be captured in a single plan. This is an argument that cannot be made through a conventional economic model, because it says that there are critical systemic elements that will always be omitted from any top-down analysis of the economy.

Examples of economic models
- Cobb–Douglas model of production
- Solow–Swan model of economic growth
- Lucas islands model of money supply
- Heckscher–Ohlin model of international trade
- Black–Scholes model of option pricing
- AD–AS model, a macroeconomic model of aggregate demand and aggregate supply
- IS–LM model of the relationship between interest rates and asset markets
- Ramsey–Cass–Koopmans model of economic growth