As I read the most recent post in Greg Mankiw's Blog, I find the data he presents intriguing. He compares two years (2007 and 2010) to show that the share of workers paid the minimum wage, measured as a % of all workers, increased from 2.3% to 6%. Over those years the federal minimum wage rose from $5.15 to $7.25 in nominal terms. Greg is asking his students to evaluate the link between the two. As one of his former students, I feel obliged to look at the data in more detail to see what I can say.
My quick reaction is that there is an automatic relation between increasing the minimum wage and the % of individuals who are paid that wage. If we assume that firms keep employment constant and simply pay the higher wage, the increase will simply reflect those who were previously paid between the old and the new minimum wage. There is the possibility that some workers are fired, in which case the increase would be smaller.
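A quick simulation illustrates this mechanical effect. The wage distribution below is made up (a log-normal truncated at the old floor), not actual CPS data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up log-normal wage distribution, truncated at the old floor
# (purely illustrative, NOT actual wage data).
old_floor, new_floor = 5.15, 7.25
wages = np.maximum(rng.lognormal(mean=2.3, sigma=0.5, size=100_000), old_floor)

# Share of workers at the old minimum before the increase.
share_at_old = np.mean(wages <= old_floor)

# Mechanical scenario: employment constant, everyone below the new
# floor is simply bumped up to it.
wages_after = np.maximum(wages, new_floor)
share_at_new = np.mean(wages_after <= new_floor)
```

Everyone who previously earned between $5.15 and $7.25 is now exactly at the new minimum, so the share at the minimum rises with no change in employment; firing some of those workers would shrink, but not eliminate, the jump.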
My second reaction was about the fact that the period 2007-10 is special. Not only have we seen an increase in the minimum wage, we have also gone through a deep recession. It is possible that some wages have fallen and old employees have been replaced by new ones who are now paid a lower wage, right at the level of the minimum wage. This would also cause an increase in the number of workers paid the minimum wage. But, of course, there is a potential second effect of a recession going in the opposite direction: it can be that during recessions those who lose their jobs are workers being paid lower wages and, as a result, you might see the percentage decreasing as opposed to increasing.
So I was curious to see how this figure - the % of workers being paid the minimum wage- has changed over the last business cycles and whether this could also be behind its recent increase.
So I plotted three variables for 1979-2010. In blue you see the % of workers paid the minimum wage (Greg's variable). As we can see, this percentage has been decreasing since 1979. Within that downward trend we also see three spikes: around 1991, around 1997 and in 2008-10. The shape of this line, including the spikes, correlates very well with the minimum wage (in green). It is measured in real terms (1996 dollars) and its scale is on the right-hand side of my chart (the scale does not start at zero so that some meaningful variation is visible). The real minimum wage has also been decreasing since 1979, a decrease that has been interrupted by increases in 1991, 1997 and 2007-09. These three increases coincide with the spikes in the blue line, the % of workers being paid the minimum wage. So the mechanical explanation is very visible in the chart: as you raise the minimum wage, you see more workers being paid that rate.
What about the business cycle, or labor market conditions in general? The unemployment rate is in red (left-hand-side axis). Interestingly, the unemployment rate, or the business cycle more generally, is also correlated with the blue line. Recessions are periods where we generally see spikes in the blue line. There is an exception, though: the recession of 2001 saw unemployment increase without any change in the % of workers paid the minimum wage.
The difficulty in the figure above is that there is a correlation between the three variables and it is difficult to establish causality or assess the strength of each of the two effects. Two of the last three increases in minimum wages are not far from recessionary episodes which makes it very difficult to understand the potential role of the business cycle [Yes, there is a possible reading of those episodes as an increase in the minimum wage causing the recessions, but as we know well this is not what led to the 1990 or 2007 recessions].
One thing that might help tell the two factors apart is to notice that we have an episode of an increase in the minimum wage without a recession (1997), and we still see the mechanical effect very clearly; and we also have a recession (2001) without an increase in the minimum wage, and the blue line does not change much. So based on that evidence, it seems that the business cycle effect is not as visible as the simple mechanical effect of raising the minimum wage.
[For those who like econometrics: A more formal, but still very weak, test is to run a regression to see which of the two variables is more significant. The race is won by the minimum wage. Both the minimum wage and unemployment are significant when each is used in a regression by itself; when you include them together, only the minimum wage is. This regression is, of course, full of econometric problems because everything is endogenous, so still no causal inference can be made.]
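For readers who want to replicate the spirit of that horse race, here is a sketch using simulated series (NOT the actual data, just stand-ins with the rough shapes described above) and plain OLS t-statistics:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 32  # annual observations, roughly 1979-2010

# Simulated stand-ins: a declining real minimum wage, an unrelated
# unemployment rate, and a share of minimum-wage workers driven
# mainly by the minimum wage itself.
min_wage = np.linspace(7.9, 5.2, n) + rng.normal(0, 0.1, n)
unemp = 6 + rng.normal(0, 1, n)
share = 0.8 * min_wage + 0.05 * unemp + rng.normal(0, 0.3, n)

def t_stats(y, *xs):
    """OLS with a constant; returns the t-statistic of each coefficient."""
    X = np.column_stack([np.ones_like(y), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
    return beta / se

t = t_stats(share, min_wage, unemp)  # [constant, min_wage, unemp]
```

In this simulated setup the minimum wage wins the race by construction; with the real series the endogeneity caveats in the paragraph above apply in full.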
Antonio Fatás
Sunday, October 16, 2011
Thursday, October 13, 2011
Who pays for the debt overhang?
High levels of debt held by governments and households are a constraint on how fast demand can grow today. Even if the economic fundamentals (productivity, labor markets) were unaffected by the crisis, an environment where everyone wants to save cannot be conducive to growth. Production needs to be sold, and for that you need customers. Even those who are not very sympathetic to economic models where demand drives growth understand the difficulties of growing in an environment of debt overhang (here is Martin Feldstein today in the New York Times).
Default and reductions in mortgage payments are all proposals to alleviate the problem, but they come at a cost: your debt is someone else's asset. There are, of course, circumstances where debt reduction is not a zero-sum game, where it is the only way to avoid a spiral of less spending, lower income and even higher debt as a percentage of income. This is what is called the paradox of thrift. There are other circumstances where everyone can benefit from debt reduction. As Martin Feldstein argues, reducing the value of a mortgage can be beneficial to both the individual and the bank:
This plan is fair because both borrowers and creditors would make sacrifices. The bank would accept the cost of the principal write-down because the resulting loan — with its lower loan-to-value ratio and its full recourse feature — would be much less likely to result in default. The borrowers would accept full recourse to get the mortgage reduction.
Of course, the bank would love to get back the full amount of the loan but given that in many cases individuals can walk away from an underwater home, a reduction in debt is the best the bank can get. In this case we can argue that it is even in the interest of the individual bank to strike this deal, so there is no need for co-ordination.
A similar story applies to the debate about default in Greece, but the solution is less obvious. Banks do not like the idea of default; they want to be paid back, but they also understand that a compromise might be better than fighting to a point where the default is even bigger. This is the logic behind the current negotiations between the Greek government, the holders of the bonds and governments (who are behind the potential taxpayer money that could go into the deal). And this becomes a very difficult discussion where co-ordination is key. As an example, European banks are likely to find themselves under pressure to raise more capital to absorb potential losses from an orderly default on the Greek debt. But it is costly to raise capital. The negotiations start and you need to get enough public support for your position. Josef Ackermann is quoted today as saying that recapitalization is a bad idea because the cost will ultimately be paid by customers (those asking for a loan). Or worse, the capital will not come from private investors but from public sources, making the government debt problem even worse. Here is the quote from the Financial Times:
On the one hand it [the debate] sends the signal that a [debt] haircut is more likely, and on the other because the resources for recapitalisation will surely not come from private investors, but rather states would ultimately have to raise the funds themselves, thereby worsening their debt levels.
So here we are, in the middle of the negotiation phase. I am sure there is some consensus on why dealing with the debt overhang is good for the economy, now the question is who pays for it. And no one wants to pay for it, so unfortunately there is no consensus there.
Antonio Fatás
Monday, October 10, 2011
The Lost Decade
Will the US economy go through a "lost decade" like Japan did after the burst of its bubble in the early 1990s? This is a question that often gets asked when comparing the current economic environment in the US to the one in Japan in those years. There are lots of similarities: a bubble that burst, the fear (or reality) of deflation, a central bank with limited tools at its disposal, etc.
But if you want to be really pessimistic, you can also look backwards to the previous decade, a decade that along some dimensions has also been lost. Here are three variables that show a downward (or flat) trend starting about 10 years ago. There is no clear link between the three but, interestingly, a very similar pattern. The US economy started showing weaknesses 10 years ago. They were not generalized, and some measures of economic activity were doing ok, but it is interesting to see how others stopped progressing or even headed in the wrong direction starting around 2000.
US Stock Market (Dow Jones Index)
Antonio Fatás
Monday, October 3, 2011
Fear (of another recession), not uncertainty.
What is keeping growth in advanced economies from recovering at a speed similar to that after previous recessions? There are several explanations, and which one you prefer might depend on your political taste (see an example of this debate in the US here).
There is one potential explanation that I find is being overemphasized: "it is all about uncertainty". And some make it more explicit and talk about regulatory uncertainty, uncertainty about taxes, about a sovereign default in Europe, etc.
No doubt uncertainty plays a role in explaining macroeconomic fluctuations, and I am a big fan of the work of Nick Bloom, an economist at Stanford, who has provided strong evidence that uncertainty rises around some of the most recent recessionary episodes.
But what do we mean when we use the word uncertainty to describe the current environment? I believe we are mixing two things: one is that the future is more difficult to predict (and this truly matches the notion of uncertainty) but the second one is that future scenarios are simply worse than what we thought before. This is not uncertainty, this is just bad news.
Here is an example: five years ago most investors would not consider the possibility of sovereign default in Europe. Today there is a chance that it might happen. Has uncertainty increased? Yes. There are now two scenarios (default and no default) and we are not certain about which one will happen. But the real problem is that on average the future looks much worse than it used to! So all the uncertainty comes from the left side of the distribution. This is mainly bad news combined with some increase in uncertainty.
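A tiny numerical sketch of this point, with made-up payoff numbers: adding a default scenario both lowers the expected outcome and raises the spread, and all of the new dispersion comes from the left tail.

```python
import numpy as np

# Made-up payoffs: an "old" view with a single good scenario versus a
# "new" view that adds a default scenario with probability 20%.
def mean_sd(outcomes, probs):
    m = outcomes @ probs
    return m, np.sqrt(((outcomes - m) ** 2) @ probs)

m_old, s_old = mean_sd(np.array([100.0]), np.array([1.0]))
m_new, s_new = mean_sd(np.array([100.0, 40.0]), np.array([0.8, 0.2]))

# Uncertainty (the spread) has gone up, but the mean has gone down:
# the extra dispersion comes entirely from the bad scenario.
```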
The same applies to other issues where we currently use the word uncertainty: businesses face uncertain demand, but the real problem is that most of the scenarios they are considering look bad and have recently gotten worse; there is increased uncertainty about public finances, but the problem is not that we do not know how governments will resolve this challenge, the real problem is that governments have a challenge to resolve!
My preference would be to use the word fear rather than the word uncertainty to describe what we are seeing these days. What is really damaging is the possibility of a new recession, the possibility of sovereign default. This is all bad news. On average the future does not look great, and this is the real problem.
Antonio Fatás
Friday, September 30, 2011
Kerviel versus Greek government debt
It is all a matter of several billions, but as I am reading two articles from the business press today, I thought it would be nice to compare the numbers in the two articles. One is about the exposure of European banks to Greek debt. The other is about the history of recent episodes of individual traders causing massive losses at banks because of unauthorized trading. Here are the two numbers that I find interesting to compare:
1. Loss of Societe Generale as a result of unauthorized trades by Jerome Kerviel back in January 2008: 4.9 Billion Euros.
2. Exposure of Societe Generale to Greek government debt today: 2.9 Billion Euros (this is the total amount of Greek debt they hold).
This is not to minimize the risk of holding Greek government debt but it is useful to keep things in perspective. The real danger in Europe would be one of contagion and the most important task for European authorities is to avoid it. Default in Greece will be painful, but the costs could be contained if it does not spill over to other countries which are substantially larger. George Soros makes this point today in an FT article.
Antonio Fatás
Wednesday, September 28, 2011
Sovereign default: panic versus fundamentals
There is a growing concern that we are approaching a wave of sovereign defaults in Europe. And if there is default on government debt, it will have an effect on the balance sheets of European financial institutions; this is the source of the recent concerns about the solvency of some of these institutions.
Stress tests are designed to look at "pessimistic" scenarios to see whether financial institutions have enough capital to deal with them. But stress tests will always have an element of subjectivity. How pessimistic should we be in these scenarios? The IMF has recently expressed its concerns about the capital needs of some European banks because of the possibility of sovereign defaults not priced into some of the stress tests that European regulators have produced. This is a source of debate between European officials, the ECB and the IMF. But what is a good assumption about sovereign default in Europe? How do we measure the probability of default? Should we look at CDS (credit default swaps) or should we use interest rates as a measure of default probabilities?
Both of these measures capture the "market" view on default probabilities. A completely different approach is to look at the fundamentals of fiscal policy sustainability (yes, it requires more work, but it is always a productive exercise to look at the numbers and not just at how others read those numbers!).
The IMF Fiscal Monitor (the latest issue is just out) provides a very detailed analysis of the fundamentals behind fiscal policy. They look at several indicators of fiscal policy risk:
- gross debt as % of GDP (Debt)
- gross financing needs as % of GDP (GFN)
- short-term debt (as % of total)
- the cyclically adjusted primary deficit (CAPD)
- Expected increase in pension spending over the coming years
- Expected increase in healthcare spending over the coming years
- Difference between interest rates paid on debt and the growth rate of output (r-g)
All indicators are straightforward: they look at the past (debt), the present (deficit) and the future (pensions, healthcare), taking into account the cost of borrowing (interest rates) and the ability of the economy to generate growth to keep the debt-to-GDP ratio at a reasonable number. You want all these indicators to be as low as possible.
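To see why the last indicator (r-g) matters so much, here is a minimal sketch of the standard debt-dynamics identity, with illustrative numbers rather than any country's actual figures:

```python
# Standard debt-dynamics identity: d' = d * (1 + r) / (1 + g) - pb,
# where d is the debt-to-GDP ratio, r the interest rate paid on debt,
# g the growth rate of output and pb the primary balance (% of GDP).
def debt_path(d0, r, g, pb, years):
    path = [d0]
    for _ in range(years):
        path.append(path[-1] * (1 + r) / (1 + g) - pb)
    return path

# Same 90% debt ratio and a balanced primary budget; only r - g differs.
rising = debt_path(0.90, r=0.05, g=0.03, pb=0.0, years=10)   # r > g
falling = debt_path(0.90, r=0.02, g=0.03, pb=0.0, years=10)  # r < g
```

With identical starting positions, debt drifts up when r exceeds g and down when growth outpaces the interest rate, which is why losing credibility (and so facing higher rates) can by itself push a country toward unsustainability.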
I am copying below the indicators for some of the countries they analyze (the indicators are in the same order as in my list above).
Comparing France and the US, we can see that both countries look risky along several dimensions. Overall, the US seems to score worse than France on several of them: higher levels of debt and deficits and, more importantly, a larger future burden in terms of pension and healthcare spending. The only indicator where the US does better is the low interest rate that the US government faces when borrowing in financial markets.
Here are the data for Germany and Spain:
They do not look good either, although the risks are similar to or slightly lower than those of the US or France. Levels of debt are lower in Spain; the current deficit is lower in Germany. Short-term pressures are similar to those of the other countries, and long-term pressures (pensions and healthcare) look better in both Spain and Germany than in the US. The only dimension where both of these countries do worse is the difference between interest rates and growth.
In the case of Spain, the issue of credibility is key. If credibility is lost, the average interest rate paid on government debt will increase and will make it more difficult to set a sustainable path for fiscal policy (in the chart above, the indicator on the right will get higher). But the credibility of a government must be a function of the other indicators. The trust in a government's ability to repay should be a function of the level of debt, future spending, etc. Looking at the first six indicators above for Spain explains why the Spanish government insists that its fiscal position is not as bad as what "the market" believes. If you remove the last column, Spain could be seen as the strongest of the four countries.
But expectations and credibility matter, and criticizing speculators might not be enough. What is needed is clarity in the communications coming from the European authorities in order to rebuild faith in the system. And this requires a combination of not denying bad news while at the same time restoring credibility where it is needed. They need to try harder.
Antonio Fatás
Monday, September 26, 2011
Macroeconomics: Evidence or Ideology
The Wall Street Journal had a weekend interview with Robert Lucas, Nobel-winning economist and Professor of economics at the University of Chicago. He is asked about the economic situation in the US and Europe. When asked about the US he talks about the cost of uncertainty about future taxes. When he is asked about Europe, he talks about the cost of high taxes. From the interview:
For the best explanation of what happened in Europe and Japan, he points to research by fellow Nobelist Ed Prescott. In Europe, governments typically commandeer 50% of GDP. The burden to pay for all this largess falls on workers in the form of high marginal tax rates, and in particular on married women who might otherwise think of going to work as second earners in their households. "The welfare state is so expensive, it just breaks the link between work effort and what you get out of it, your living standard," says Mr. Lucas. "And it's really hurting them."

No doubt that (theoretically) high taxes could discourage effort, but is this statement empirically relevant? Below is a chart of marginal tax rates (as estimated by the OECD) and the female employment-to-population ratio for the 25-54 age range for 2010. I have chosen that particular employment-to-population ratio because it matches the statement in the quote above (the chart looks similar if we look at a different age range or at male participation rates).
Do we see more or less effort in countries with high tax rates? It is not obvious. In fact, in the sample I have selected there seems to be a positive correlation, not a negative one. Countries with strong welfare states and high taxes (both average and marginal) show higher levels of effort as measured by employment-to-population ratios. The US appears as a country with low taxes but also low levels of effort.
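For what it is worth, the correlation behind a chart like this is straightforward to compute. The numbers below are made up for illustration, NOT the OECD figures behind my chart:

```python
import numpy as np

# Hypothetical, made-up numbers (not the OECD data): marginal tax
# rates (%) and female employment-to-population ratios (%).
tax_rate = np.array([25, 30, 35, 40, 45, 50, 55])
emp_ratio = np.array([66, 68, 70, 74, 76, 79, 81])

# Pearson correlation between the two series.
corr = np.corrcoef(tax_rate, emp_ratio)[0, 1]
# With these illustrative numbers the correlation is strongly positive,
# the sign of the pattern described in the post.
```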
The chart above is, of course, not the final answer to the question of how taxes affect labor market outcomes, but at least it gives us a good argument to dispute the claim that all European problems are about high taxes.
Antonio Fatás