Winner of the New Statesman SPERI Prize in Political Economy 2016
Showing posts with label Jo Michell. Show all posts

Wednesday, 12 April 2017

Economics is an inexact science

When I wrote about why the BBC should treat a clear consensus in economics the same way as it now treats climate science, I got a number of comments about why economics is not a science. A common theme was that economics cannot prove theories ‘beyond doubt’ in the way the hard sciences can. A more sophisticated version of this complaint is that most economic theories cannot be disproved in the way that Popper thought scientific theories could be disproved.

All this ignores a key feature of any social science: its inexact nature. Rather than proof beyond doubt, we have accumulations of evidence that confirm the applicability of some theories and reject the applicability of others. Economists’ views about which models are applicable change as this evidence accumulates.

A good example, as Noah Smith suggests, involves the minimum wage. The basic economic model suggested that even a modest minimum wage should significantly reduce employment, but economists discovered that the evidence did not show this. As that evidence accumulated, alternative theories and models (monopsony and search) came to be seen as more relevant. It is this response to evidence that makes economics a science.

Jo Michell writes “The scientific method of forming a hypothesis and then testing that hypothesis against reality can never be the final arbiter of knowledge, as it can in the physical sciences.” He is right that no single experiment or regression can kill a theory, but wrong that the accumulation of evidence is not the final arbiter, because no other arbiter is available. He links to a post by Noah Smith which talks about the failures of forecasting. But as that post makes clear, this is not about data rejecting models, but about the inability of models to predict the future. We would never dream of condemning medics because they cannot predict the exact time of our death, still less suggest that this failure indicates they are not doing science.

Of course economics involves cases where economists appear too reluctant to give up their favoured models. You can find similar stories in the hard sciences; there will be more of them in economics because its inexact nature makes it easier to discount any single piece of evidence. What I cannot understand is what leads someone like Russ Roberts to argue against the use of evidence, and to claim instead that “economics is primarily a way of organizing one’s thinking”. Astrology is also a way of organising one’s thinking, but it fails because the evidence does not back it up.

That comparison is slightly unfair, because while the theory behind astrology is obviously implausible, the basic principles of microeconomics are not. In a class on economic methodology I once drew a huge tree that showed how most of economics could be derived from principles of rational choice. But go beyond the basics, and add in complications involving information and transactions costs (to name but two) and you very quickly derive competing models. There is no single model that comes from thinking like an economist, so for that reason alone we need data to tell us which models are more applicable.

So thinking like an economist does not tell me at what point raising the minimum wage will start to reduce employment. But why would anyone want to keep their models from being shown to be relevant or otherwise by data? The only reason I can think of is that some models give answers that are ideologically convenient. Of course allowing data to establish the relevance of some models over others does not make economics ideology proof. For example, people can always select the one study that suggests that fiscal policy does not influence output and ignore the hundreds that show otherwise. That is why the accumulation of evidence, which includes its replicability, is so important. If you think economics has problems in that respect, have a look at psychology.

This is why economists’ views about the long term impact of Brexit should be treated as knowledge rather than just opinion. Here knowledge is shorthand for the accumulation of evidence consistent with plausible theory. Sometimes the theory is common sense: making trade more difficult will reduce trade. Estimates of the size of the trade reduction based on evidence are uncertain, but they are better than estimates based on wishful thinking. Empirical gravity equations consistently show that geography still matters a great deal in determining how much is traded. Finally, there is clear evidence that trade is positively associated with productivity growth. To say that all this has no more worth than some politician’s opinion is ultimately to degrade evidence and the science that interprets it.
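The gravity equations referred to here are not spelled out in the post. In their standard textbook form (the symbols below are generic, not estimates from any particular study), they relate bilateral trade to the partners’ economic size and the distance between them, and are usually estimated in logs:

```latex
T_{ij} = G\,\frac{Y_i^{\alpha}\, Y_j^{\beta}}{D_{ij}^{\gamma}}
\quad\Longrightarrow\quad
\ln T_{ij} = \ln G + \alpha \ln Y_i + \beta \ln Y_j - \gamma \ln D_{ij}
```

The robust empirical finding is that the distance elasticity \(\gamma\) is large (typically estimated near one), which is the sense in which geography “still matters a lot”.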



Friday, 24 February 2017

The NAIRU: a response to critics

When I wrote my piece on NAIRU bashing, I mainly had in mind a few newspaper articles I had read which said we cannot reliably estimate it so why not junk the concept. What I had forgotten, however, is that for heterodox economists of a certain hue, the NAIRU is a trigger word, a bit like methodology is for mainstream economists. It conjures up lots of bad associations.

As a result, I got comments on my blog that were almost unbelievable. The most colourful was “NAIRU is the economic equivalent of "Muslim ban"”. At least two wanted to hold me directly responsible for any unemployment at the NAIRU. For example: “So according to you a fraction of the workforce needs to be kept unemployed.” Which is a bit like saying to doctors: “So according to you some people have to be allowed to die as a result of cancer.”

I have to say straight away that not everyone responded in that way. Some were much more thoughtful and constructive (like Jo Michell, for example). But the less thoughtful reactions are interesting in a way too.

I need to recap what the NAIRU is, particularly because heterodox economists seem to imagine it is many things it is not. Let’s take a very simple Phillips curve

Inflation this period = expected inflation next period - a·U + b

where ‘a’ is a parameter and U is a measure of excess supply/demand in the economy. Unemployment will be one measure of that excess supply, but it is far from a perfect measure. (That my previous post was about excess supply, rather than actual unemployment, was obvious from what I wrote.) ‘b’ stands for a collection of slow moving variables. These could include a measure of union power, or how mobile labour was, or the degree of monopoly in the goods market. The NAIRU is defined as

NAIRU = b/a

If U is less than the NAIRU over a sustained period then inflation will rise, which will raise inflation expectations, which increases inflation further, and so on.
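That ratcheting mechanism is easy to see numerically. Here is a minimal sketch of the Phillips curve above, assuming (purely for illustration) adaptive expectations, so that expected inflation equals last period’s realised inflation; the parameter values are hypothetical:

```python
# Hypothetical Phillips curve parameters: inflation = expected - a*U + b
a, b = 0.5, 2.0
nairu = b / a                 # NAIRU = b/a = 4.0 here

def inflation_path(u, expected=2.0, periods=5):
    """Inflation path for a constant unemployment rate u."""
    path = []
    for _ in range(periods):
        inflation = expected - a * u + b   # the Phillips curve
        path.append(round(inflation, 2))
        expected = inflation               # expectations adapt to realised inflation
    return path

print(inflation_path(u=nairu))  # at the NAIRU inflation is stable: [2.0, 2.0, 2.0, 2.0, 2.0]
print(inflation_path(u=3.0))    # below the NAIRU it ratchets up: [2.5, 3.0, 3.5, 4.0, 4.5]
```

With unemployment held at the NAIRU, inflation simply reproduces expectations; hold it below the NAIRU and each period’s extra inflation feeds into next period’s expectations, so inflation rises without limit.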

The concept is of interest to policymakers involved in demand management. They have to decide how much they can push demand before inflation starts rising. If they are independent central banks, they have to accept the world as it is. The NAIRU is a description of how the economy works: nothing more or less. This is why complaints that economists who use or estimate the concept are somehow responsible for those left unemployed are so dumb.

Of course you can criticise the concept of the NAIRU, but logically that has to involve criticism of the Phillips curve from which it comes. It is also reasonable to argue that the concept is fine, but that the NAIRU is so difficult to measure that it would be better not to try to estimate it or to let it guide policy. I have a lot of sympathy with that view at the moment, which is why I argue that, in the US right now, policy makers should find the NAIRU by allowing inflation to rise above target. But that point of view was irrelevant to my previous post, which was about the concept of the NAIRU, not its measurement.

As far as the concept is concerned, I think the strongest attacks come from thinking about hysteresis, as Jo Michell suggests. But even here, we add a complication to the NAIRU analysis, rather than overturn that analysis altogether. What hysteresis does is to make periods where unemployment is above the NAIRU extremely costly. It also means that periods of being slightly below the current NAIRU might be justified if they reduce the NAIRU itself.

I want to end by adding two reflections. The first relates to modelling the NAIRU. There once was, following the work of Layard and Nickell, an empirical literature that attempted to model for OECD countries a time series for the NAIRU, using proxy variables for things like union power, the benefit regime and geographical mismatch. With the dominance of the microfoundations methodology that work appears to have decreased, although to some extent it is still there in work based on matching models. I would be very interested to know if that time series analysis, now potentially enriched by matching models and flow data, has continued in any way.

The second relates to the sharp reactions to my original post I noted at the start, and the hostility displayed by some heterodox economists (I stress some) to the concept. I have been trying to decide what annoys me about this so much. I think it is this. The concept of the NAIRU, or equivalently the Phillips curve, is very basic to macroeconomics. It is hard to teach about inflation, unemployment and demand management without it. Those trying to set interest rates in independent central banks are, for the most part, doing what they can to find the optimal balance between inflation and unemployment.

Accepting the concept of the NAIRU does not mean you have to agree with their judgements. But if you want to argue that they could be doing something better, you need to use the language of macroeconomics. You can say, as many besides myself have done, that the NAIRU is either a lot lower than central bank estimates, or is currently so uncertain that these estimates should not influence policy. But if you say that the NAIRU has to be Bashed, Smashed, And Trashed, you will not get anywhere.

I also get very annoyed when I hear refutation by reference (as here for example). It would be so easy to write my blog posts that way. Instead I generally try to explain or present an argument that I hope is understandable. Economics is usually not so hard that this is impossible, although finding the right words is never easy. Economics is certainly not a religion, where all you have to do is choose which sect you belong to and then follow great works.     

Wednesday, 16 November 2016

Macroeconomics and Ideology

Jo Michell has a long post in which he enters a debate between Ann Pettifor and myself about the role of mainstream macroeconomics in austerity. Ann wanted to pin a large part of the blame for austerity on mainstream macroeconomics, and Jo largely sides with her. Now I have great respect for Jo’s attempts to bridge the divide between mainstream and heterodox economics, but here he is wrong about austerity, and he also paints a rather distorted picture of the history of macroeconomic thought.

Let’s start with austerity. I think he would agree that the consensus model of the business cycle among mainstream Keynesians for the last decade or two is the New Keynesian (NK) model. That model is absolutely clear about austerity. At the zero lower bound (ZLB) you do not cut government spending. It will reduce output. No ifs or buts.

So to argue that mainstream macro was pushing for austerity you would have to argue that mainstream economists thought the NK model was deficient in some important and rather fundamental respect. This was just not happening. One of the leading macroeconomists of the last decade or two, if not the leading one, is Michael Woodford. His book is something of a bible for those using the New Keynesian model. In June 2010 he wrote “Simple Analytics of the Government Expenditure Multiplier”, showing that increases in government spending could be particularly effective at the ZLB. The interest in that paper for those working in this area was not that this form of fiscal policy would have some effect - that was obvious to those like myself working on monetary and fiscal policy using the NK model - but that it could generate very large multipliers.

This consensus that austerity would be damaging and fiscal stimulus useful was a major reason why we had fiscal stimulus in the UK and US in 2009, and why even the IMF backed fiscal stimulus in 2009. There were some from Chicago in particular who argued against that stimulus, but as bloggers like DeLong, Krugman and myself pointed out, they simply showed up their ignorance of the NK model. Krugman in particular was very familiar with ZLB macro, having done some important work on Japan’s lost decade.

What changed this policy consensus in 2010 was not agitation from the majority of mainstream academic macroeconomists, but two other events: the Eurozone crisis, and the election of a right wing government in the UK and a right wing Congress in the US.

Jo tries to argue that because discussion of the ZLB was not in the macroeconomic textbooks, it was not part of the consensus. But textbooks are notorious for being about 30 years out of date, and most still base their teaching around IS/LM rather than the NK model. Now it might just be possible that right wing policy makers were misled by the consensus assignment taught in these textbooks, and that it was just a coincidence that these policy makers chose spending cuts rather than tax increases (and later tax cuts!), but that seems rather unlikely. You do not have to be working in the field to realise that the pre-financial crisis consensus for using changes in interest rates as the stabilisation tool of choice kind of depended on being able to change interest rates!

Moving on from austerity, Jo’s post also tries to argue that mainstream macroeconomics has always been heavily influenced by neoliberal ideology. To do that he gives a short account of the post-war history of macroeconomic thought that has Friedman, a well known member of the Mont Pelerin Society, as its guiding light, at least before New Classical economics came along. There is so much that could be said here, but let me limit myself to two points.

First, the idea that Keynesian economics was about short term periods of excess or deficient demand rather than permanent stagnation pre-dated Friedman, and goes back to the earliest attempts to formalise Keynesian economics. It was called the neoclassical synthesis. It was why the Keynesian Bob Solow could give an account of growth theory that assumed full employment.

Second, the debates around monetarism in the 1970s were not about the validity of that Keynesian model, but about its parameters and policy activism. Friedman’s own contributions to macroeconomic theory, such as the permanent income hypothesis and the expectations augmented Phillips curve, did not obviously steer theory in a neoliberal direction. His main policy proposal, targeting the money supply, lost out to policy activism using changes to interest rates. And Friedman certainly did not approve of New Classical views on macroeconomic policy.

Jo may be on firmer ground when he argues that the neoliberal spirit of the 1980s might have had something to do with the success of New Classical economics, but I do not think it was at all central. As I have argued many times, the New Classical revolution was successful because rational expectations made sense to economists used to applying rationality in their microeconomics, and once you accept rational expectations then there were serious problems with the then dominant Keynesian consensus. I suppose you could try to argue a link between the appeal of microfoundations as a methodology and neoliberalism, but I think it would be a bit of a stretch.

This brings me to my final point. Jo notes that I have suggested an ideological influence behind the development of Real Business Cycle (RBC) theory, but asks why I stop there. He then writes
“It’s therefore odd that when Simon discusses the relationship between ideology and economics he chooses to draw a dividing line between those who use a sticky-price New Keynesian DSGE model and those who use a flexible-price New Classical version. The beliefs of the latter group are, Simon suggests, ideological, while those of the former group are based on ideology-free science. This strikes me as arbitrary. Simon’s justification is that, despite the evidence, the RBC model denies the possibility of involuntary unemployment. But the sticky-price version – which denies any role for inequality, finance, money, banking, liquidity, default, long-run unemployment, the use of fiscal policy away from the ZLB, supply-side hysteresis effects and plenty else besides – is acceptable.”

This misses a crucial distinction. The whole rationale of RBC theory was to show that business cycles were just an optimal response to technology shocks in a market clearing world. This would always deny an essential feature of business cycles, which is involuntary unemployment (IU). It is absurd to argue that NK theory denies all the things on Jo’s list. Abstraction is different from denial. The Solow model does not deny the existence of business cycles, but just assumes (rightly or wrongly) that they are not essential in looking at aspects of long term economic growth. Jo is right that the very basic NK model does not include IU, but there is nothing in the NK model that denies its possibility. Indeed it is fairly easy to elaborate the model to include it.

Why does the very basic NK model not include IU? The best thing to read on this is Woodford’s bible, but the basic idea is to focus on a model that allows variations in aggregate demand to be the driving force behind business cycles. I happen to think that is right: that is what drives these cycles, and IU is a consequence. Or to put it another way, you could still get business cycles even if the labour market always cleared.

To suggest, as Jo seems to, that the development of NK models had something to do with the Third Way politics of Blair and Bill Clinton is really far-fetched. It was the inevitable response to RBC theory: to its refusal to incorporate rigid prices, for which there is again strong evidence, and to its inability to allow for IU.

That’s all. I do not want to talk about globalisation and trade theory, partly because it is not my field, but also because I suspect there is some culpability there. I would also never want to suggest, as Jo implies I would, that ideological influence is confined to the New Classical part of macroeconomics. But just as it is absurd to deny any such influence, it is also wrong to imagine that the discipline and ideology are inextricably entwined. The mainstream consensus against 2010 austerity is proof of that.


Sunday, 11 September 2016

Stock-Flow Consistent models: response to Jo Michell

Jo has a thoughtful and constructive response to my post discussing a recent Bank of England paper that presents a new Stock-Flow Consistent (SFC) model. One of the reasons it is constructive is because it is not tribal: too many followers of heterodox schools seem to just want to rubbish mainstream macro and suggest their particular school represents the new dawn. So I thought I might make a few points on Jo’s post that might be helpful.

  1. A model that includes a lot of institutional detail is not a virtue in itself: indeed if at the end of the day these institutions do not matter too much it is an unnecessary and distracting feature. A useful way to think about modelling approaches is in terms of the validity of simplifications or short-cuts. It is for this reason that my method of theoretical deconstruction outlined and demonstrated here for large models is so important. By trying to relate large model properties to simpler models, you find out where additional detail is important or unnecessary. And of course, the answer to that problem may be context specific.

  2. I hope I never said SFC models were “accounting, not economics”, because that statement makes no sense: any behavioural model contains some kind of theory. What I think I said was that these models often seemed ‘light on theory’, which means that they say a great deal about the accounting and rather little about the theory.

    For example, to say that consumers have a desired wealth to income ratio is light on theory. Why do they have such a ratio? Is it because of a precautionary motive? If it is, that will mean that this desired ratio will be influenced by the behaviour of banks. The liquidity structure of wealth will be important, so they may react differently to housing wealth and financial assets. Now the theory behind the equations in the Bank’s paper may be informed by a rich theoretical tradition, but it is normal to at least reference that tradition when outlining the equations of the model.

  3. It is true that stock-flow accounting is important in modelling, in the sense that doing it stops you making silly errors. But it is not dissimilar to identities or market clearing conditions in this respect. You would never call a class of models ‘National Income Identity models’. [1] If the point is to emphasise that stocks matter to behavioural decisions about flows, then that is a theoretical point. As Jo says, DSGE models are stock-flow consistent, but in the basic model consumers have no desired wealth ratio: it is that behavioural assumption, not the accounting, that matters. So when Jo says this absence should ring alarm bells, he is making a theoretical statement.

    I think Jo is right that the SFC name is unfortunate, but you can make a similar case for the name DSGE. It only matters when some people believe that stock-flow consistency is some kind of heterodox invention. Equally the label DSGE becomes a problem when economists start thinking that macroeconomists cannot do partial equilibrium any more, a point that Blanchard makes in his discussion of DSGE models.

  4. When Jo tries to connect the unimportance of stocks in DSGE models to the return to full employment I think he is painting with too broad a brush. Let’s take a simple example. In the baseline small open economy model of mainstream macro, a temporary shock that leads to a current account deficit will permanently reduce welfare, because net assets permanently fall: the trade balance has to improve, a permanent depreciation worsens the terms of trade, and consumption is therefore lower. In that case what happens to stocks has a permanent effect. Conversely, if you alter the model by replacing the consumption function with one based on Blanchard/Yaari consumers, there would be a feedback from wealth to consumption which would mean the shock no longer had a permanent effect.

  5. In using the quote of mine about ‘not their field’ from a previous post he is rather unfair. As I go on to say, mainstream macro was at fault in neglecting finance, and pretty well every mainstream macroeconomist will say the same. The point I wanted to make was that it is not true that they all did this because they were sure it did not matter, or that the sector would regulate itself. What I have argued in this paper is that macroeconomists might well not have neglected the financial sector if they had allowed more traditional aggregate (i.e. non-microfounded) models to continue to be a legitimate area of academic research. Some might want to argue that this neglect of the financial sector reflected that mainstream macroeconomists were inherently neoliberal and believed financial markets looked after themselves. Perhaps some were, but plenty of others were not.

  6. I also think it is a bit unfair to suggest that I was criticising the model in the Bank’s paper. As it represents an alternative to DSGE models it should be welcomed. (Especially so for the Bank. Many public institutions, like the Fed, have maintained their aggregate models alongside DSGE models: the Bank of England has not.) What I was criticising was (a) the emphasis in the paper on the accounting at the expense of theoretical discussion (b) that the paper ignored the non-DSGE non-Post Keynesian modelling tradition.
[1] It may well be that the models I quoted, like the 1970s Treasury model, were SFC because of the influence of Godley, but they would have thought that this was just good modelling, and not a defining aspect of what they were doing.
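As a coda to point 3 above, the accounting discipline itself is easy to state. Here is a minimal sketch of what stock-flow consistency checks, using a hypothetical two-sector economy (households and government) with purely illustrative numbers, not anything from the Bank’s model: every flow has a source and a destination, so sector financial balances sum to zero each period, and each sector’s stock of net wealth changes only through its flow balance:

```python
# Hypothetical two-sector economy; all numbers are illustrative.
# Stock-flow consistency: sector balances sum to zero each period,
# and stocks change exactly by the corresponding flow balances.

def simulate(periods):
    hh_wealth, gov_debt = 0.0, 0.0
    for g, t in periods:                    # government spending, taxes
        hh_balance = g - t                  # households receive g as income, pay t in tax
        gov_balance = t - g
        assert abs(hh_balance + gov_balance) < 1e-9   # balances sum to zero
        hh_wealth += hh_balance             # stocks move only via flow balances
        gov_debt -= gov_balance
        assert abs(hh_wealth - gov_debt) < 1e-9       # hh assets = gov liabilities
    return hh_wealth, gov_debt

print(simulate([(30.0, 25.0), (30.0, 28.0)]))  # -> (7.0, 7.0)
```

Note that the checks here are pure accounting: they constrain nothing about behaviour. Any theory of why households run the surpluses they do has to be added on top, which is the ‘light on theory’ point made in the post.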