Comments on economics, mystery fiction, drama, and art.

Saturday, August 01, 2015

Minimum wages and the US military

I was reflecting the other day on the recent flurry of activity at the state and local level to raise the minimum wage paid to workers.  And it occurred to me that there is one place in which the Federal Government may wish to act.  That is in the pay of members of the military.

In many ways, the volunteer military makes things different--now, the various branches of the military must compete on pay and benefits with private sector jobs, something that was less of an issue when many, if not most, enlisted ranks were filled through conscription.  Not surprisingly, the pay to people in enlisted ranks is considerably higher in real terms today (about $18,400) than it was in (say) 1969 (about $9,600) (in both cases, adjusted to the 2014 CPI).  Of course, a full-time, full-year minimum wage job today would pay $15,080 (and a full-time minimum wage job in 1969, when the minimum was $1.30, paid the equivalent of $17,500 when adjusted to the 2014 CPI).  So starting pay in the military is about 20% more than a minimum wage job.  But in many cases, an entry-level military job is not a 40-hour-a-week job.  It is conceivable that someone in one of those positions is, effectively, on call 24 hours a day, 7 days a week, especially when deployed in (say) Iraq.

So I propose that the minimum base pay for members of the military be the federal minimum wage--but paid for 24 hours a day, every day of the year--8,760 hours, at $7.25, or $63,510 per year.  (Obviously, pay in higher ranks would need to be adjusted upward as well.)
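The arithmetic behind the proposal can be checked quickly; the wage and hours figures are the ones used above, and the script is just a sketch of that calculation:

```python
# Back-of-the-envelope check of the proposal's arithmetic.
FEDERAL_MINIMUM_WAGE = 7.25  # dollars per hour (2015 federal minimum)

# A standard full-time, full-year minimum wage job: 40 hours x 52 weeks.
full_time_hours = 40 * 52
full_time_pay = full_time_hours * FEDERAL_MINIMUM_WAGE
print(f"Full-time minimum wage job: {full_time_hours} hours, ${full_time_pay:,.0f}/year")

# The proposal: pay for every hour of the year, since an enlisted
# service member may effectively be on call around the clock.
on_call_hours = 24 * 365
on_call_pay = on_call_hours * FEDERAL_MINIMUM_WAGE
print(f"24/7 minimum wage pay: {on_call_hours} hours, ${on_call_pay:,.0f}/year")
```

The first figure reproduces the $15,080 full-time annual pay cited above; the second reproduces the proposed $63,510 military floor.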

Tuesday, July 28, 2015

Why People Don't Vote

Tim Taylor takes a look at the question "Why don't more people vote?" and concludes that there may not be much that we can do about low voter turnout.  Perhaps.  But I was struck by a couple of the numbers--the reasons people gave for not voting, in both Presidential elections and off-year elections:

Too busy, conflicting schedule:
Presidential elections:  18.9%
Off-year elections:  28.2%

Transportation problems:
Presidential elections:  3.3%
Off-year elections:  2.1%

Apparently, both parties do a very good job of getting people who want to vote to the polls.  But, equally apparently, there's something about the way in which we schedule elections that is a problem.

We are, so far as I know, the only major country in the world that (1) elects people to political office and (2) holds elections on a day when almost all people work.  I would suggest, just as an experiment, that we try holding some elections on, say, Saturdays or Sundays--and that we make national election days also national holidays.  You know, like civilized countries do.

Sunday, July 12, 2015

Andrew Jackson, Paper Money, and Banks

Before the U.S. Treasury announced that it would be replacing the $10 bill (picturing Alexander Hamilton), there had been a substantial discussion in the blogosphere about the desirability of replacing the $20 bill (Andrew Jackson).  The primary justification for replacing Jackson was his record as president, especially his opposition to the Second Bank of the United States and paper currency, but also his treatment of Native Americans.

Many people may conflate Jackson's opposition to paper money with an opposition to Federal government participation in the creation or production of coin and currency.  This is not correct.  In the early part of the 19th century (and, indeed, until the Civil War), the Federal Government minted coins in various denominations (some examples are shown here; note that the gold and silver coins were much smaller than contemporary coins, so don't let the fact that they are all shown the same size deceive you--a silver dollar would have been about the size of a current dime).  Jackson in fact supported coinage by the Federal government of gold and silver coins, and of "token" coins--for example, copper 1 and 5 cent pieces.


(That $1 gold coin would have been really tiny, and would weigh about 2 grams.) 

Jackson's opposition to paper currency disguises, for modern audiences, his opposition to banks.  In the early 19th century (and, indeed, even as late as the 1920s) paper money was issued exclusively (before the Civil War) or partly by private banks.  And, before the Civil War, the banking business was largely the business of issuing bank notes, which were required to be (but were not always) convertible into gold or silver at a fixed rate (one ounce of gold was about $20; one ounce of silver was about $1.35).  (Examples of bank notes are shown here.)

(I happen to have a $10 bill, issued by the Merchant's National Bank of Terre Haute in 1929, signed by the bank's Treasurer, my grandfather, Alfred J. Woolford.)
Banks today generate revenue (and profits) by accepting deposits (on which they may pay interest and may charge fees) and by making loans.  How did banks in the 1820s generate revenue and make profits?  Mostly they were not depository institutions; while savings accounts and what we would call checking accounts existed, they were not terribly common.  Banks made money by printing bank notes and making loans.  Banks were generally required to redeem their bank notes with specie (gold or silver coins).  Where, then, did banks get the resources to do this?  [Today a large (but declining) source of funds for making loans is deposits.]  When a bank organized, its owners would provide capital, generally in the form of gold or silver coin or U.S. Government bonds.  [The bonds also provided some income; interest rates on long-term US Government bonds averaged about 4.5% between 1820 and 1840.]
Banks could then make loans in one of two ways.  They could provide a borrower with bank notes issued by the bank, with repayment (including interest) as specified by the loan agreement.  The second mechanism for lending was a letter of credit, by which the bank guaranteed payment for any purchase made by the holder of the letter.  Generally, banks charged a fee for issuing these and then charged interest on any amounts advanced.  (Letters of credit could be conveyed in whole to a third party, and circulate as, essentially, a form of money.)  According to Howard Bodenhorn and Hugh Rockoff's research, interest rates on private sector loans between 1820 and 1840 were roughly the same as those on U.S. Government bonds.[1]  Banks, knowing that all the notes they issued were unlikely to be presented for redemption at once, generally lent more (sometimes much, much more) than they had on hand in coin.  This is, in effect, "fractional reserve" banking, which is how commercial banks operate today.
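The fractional-reserve mechanics can be made concrete with a small numerical sketch.  The figures here (a $100,000 specie base and a 20% reserve ratio) are hypothetical, chosen only to illustrate the difference between fractional and 100% reserve lending:

```python
# Hypothetical illustration of fractional-reserve note issue.
specie = 100_000.0     # gold and silver coin the bank holds (assumed figure)
reserve_ratio = 0.20   # specie kept against circulating notes (assumed figure)

# If note-holders are unlikely to redeem all at once, the bank can keep
# notes in circulation up to specie / reserve_ratio.
max_notes = specie / reserve_ratio
print(f"Fractional reserve: up to ${max_notes:,.0f} in notes can circulate")

# Under 100% reserve banking (the regime Jackson's position implies),
# lending is capped at the coin actually on hand.
print(f"100% reserve: lending capped at ${specie:,.0f}")
```

With these assumed numbers, a fractional-reserve bank supports five times the lending of a 100%-reserve bank on the same specie base, which is why forcing notes out of existence would have contracted credit.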
Now suppose Jackson had triumphed, and paper currency--bank notes--had been forced out of existence.  Now what would banks do?  Well, they could still exist, and they might have transitioned into depository institutions more quickly than they did.  More likely, they would have organized as before, with owners putting up capital in the form of gold and silver coin and Government bonds.  They could have made loans by lending the physical coins they had.  If this were all they did, they would be practicing "100% reserve" banking--a bank could only make a loan if it had sufficient coin to do so.  This would restrict the quantity of loans, which, by itself, would tend to drive interest rates up.  Banks, however, would have had an alternative--expanded use of letters of credit and other negotiable paper.  The bank would provide the borrower with the instrument, for which the borrower would pay a fee and then pay interest on any portion of the line of credit created.
Whether the net effect of eliminating paper money would have been a general contraction of credit is uncertain, but my own guess is that it would have been.  As a result, interest rates would have increased, borrowing would have decreased, and the US rate of economic growth would have declined as firms acquired less capital equipment and expanded less rapidly.  Not my idea of the best possible outcome.  But, then, Jackson is not my idea of the best possible president, on almost any dimension.
[1] Howard Bodenhorn and Hugh Rockoff, "Regional Interest Rates in Antebellum America," in Strategic Factors in Nineteenth Century American Economic History: A Volume to Honor Robert W. Fogel, ed. Claudia Goldin and Hugh Rockoff (University of Chicago Press, 1991), pp. 159-187.

Wednesday, June 24, 2015

The Gambler's Fallacy

And getting the analysis wrong.

I recently read (somewhere) a post which used this example:  Suppose you are playing blackjack, and you have lost 20 consecutive hands.  Are you more or less likely to win the 21st hand?  Whoever wrote that post (and I can't remember who, or where, and I haven't been able to find it) argued that your odds of winning the next hand are unchanged.

Well, maybe not.  Consider a coin flip (of a fair coin).  The odds of a head on any flip are 50%--because the flips are independent of each other.  Or suppose you are doing the other old standby--picking a marble from an urn--and you are (reliably) told that of the 1,000 marbles in the urn, 500 are red and 500 are white.  You pluck out a red marble, then toss it back in, and the urn is shaken to re-mix it.  Your picks are, again, independent of each other.

Now, suppose you are playing blackjack at a Las Vegas casino.  According to this source, the most common games use 6 or 8 decks at a time.  But the play is without replacement of the cards used in the current hand.  It's as if you drew a red marble from the urn and, instead of replacing it, you threw it away.  The odds of winning a blackjack hand are, in fact, dependent on how the play has gone so far from those 6 or 8 decks.  It is precisely the fact that blackjack hands are not independent that makes card-counting a viable strategy (and why casinos try to identify card-counters and throw them out).
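The difference between the two regimes is easy to see with the urn example above: with replacement the odds never move, while without replacement a single discarded red marble shifts them.  A minimal sketch:

```python
from fractions import Fraction

# Urn with 500 red and 500 white marbles, as in the example above.
red, white = 500, 500

# With replacement: draws are independent, so the odds of red stay fixed.
p_with_replacement = Fraction(red, red + white)

# Without replacement: draw a red marble and throw it away; the odds
# of red on the next draw change, because the composition changed.
p_after_discarding_red = Fraction(red - 1, red + white - 1)

print(p_with_replacement)       # 1/2
print(p_after_discarding_red)   # 499/999, slightly less than 1/2
```

One marble barely moves the odds; but discard a few hundred and the shift is large, which is exactly what a card-counter tracks across a 6- or 8-deck shoe.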

Any card game in which play is without replacement is a game in which your odds can change as play progresses.  You just have to know what sort of game it is.  And be willing to do the work to figure out how the odds have changed.  And counting an 8-deck shoe in a casino is hard, even if you work very hard at it.

Monday, June 15, 2015

Where's the Skills Gap?

Tyler Cowen spotlights a blog post presenting some analysis suggesting that the US economy, and particularly the goods-producing part of it, is experiencing a skills gap--firms want to hire people with skills that are not available among the unemployed, or among discouraged workers.  The author reaches this conclusion: "...the US has gutted its manufacturing base, creating a large deficit of skilled manufacturing workers. The skills gap therefore is likely to persist for years to come, creating a material drag on economic growth."  It's worth looking at the analysis, and the evidence presented there.

The evidence there is substantial, and has to be considered.  But I think there's some other evidence that should also be considered:  What's happening to average weekly hours, to overtime hours (among workers with overtime), and to real average hourly earnings.  Using BLS (monthly, not seasonally adjusted) data, here's what I find:

(Click each diagram to enlarge.)

All three show essentially the same pattern--declining weekly hours and weekly overtime during the recession; declining average hourly earnings during the recession, all followed by recovery and then plateauing around 2010 or 2011.

This is not, it seems to me, consistent with a skills shortage.  Firms would want to extend the hours of existing workers, offer more overtime, or offer higher pay.  Yet none of that seems to be (systematically) happening.  If there is a skills shortage, why is it not showing up in these data, in addition to the data in the post I've linked to above?

Friday, June 05, 2015

A comment on this: "Report: Social Security overpaid disability benefits by $17B": Context, folks, context

I don't mean to dismiss concerns like this:
Social Security overpaid disability beneficiaries by nearly $17 billion over the past decade, a government watchdog said Friday, raising alarms about the massive program just as it approaches the brink of insolvency.
Many payments went to people who earned too much money to qualify for benefits, or to those no longer disabled. Payments also went to people who had died or were in prison.  In all, nearly half of the 9 million people receiving disability payments were overpaid, according to the results of a 10-year study by the Social Security Administration's inspector general. 
Social Security was able to recoup about $8.1 billion, but it often took years to get the money back, the study said.
But let's look at it a little more closely.  After the recovery of $8.1 billion (and ignoring the administrative costs of that recovery), we're left with $8.9 billion in excess payments, to 4.5 million people.  That's about $2,000 per over-payment case.  And disability benefits tend to be paid for multiple years; if the over-payments persisted for an average of 5 years per case, that's around $400 per year per case, or about $33 per month.  So it's pretty clear that no one was getting rich off of this, or even living all that well.

For context, in the 10-year period 2004-2013 (the most recent period for which I can find data), total disability payments were, um, $961 billion.  So the unrecovered overpayments amounted to, ah, about 0.9% of the total disability payments made.*  (Note, as well, that this does not account for any under-payments that occurred.  And I'm sure there were some.)  A problem to address, to be sure.  But a significant cause of the financing problem faced by the disability system?  Probably not.

*I'd be interested in knowing the error rates for things like private health insurance plans.
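The context arithmetic above, spelled out (the dollar figures are from the report and the post; the 5-year average duration per case is the post's assumption):

```python
# Figures from the inspector general's report, as cited in the post.
overpaid_total = 17.0e9   # total overpayments over the decade, dollars
recovered = 8.1e9         # amount Social Security recouped
cases = 4.5e6             # roughly half of 9 million beneficiaries
total_payments = 961.0e9  # total disability payments, 2004-2013

unrecovered = overpaid_total - recovered   # $8.9 billion
per_case = unrecovered / cases             # about $2,000 per case
per_month = per_case / 5 / 12              # about $33, assuming 5 years per case
share = unrecovered / total_payments       # about 0.9% of all payments

print(f"Unrecovered: ${unrecovered/1e9:.1f}B, ${per_case:,.0f} per case, "
      f"${per_month:,.0f} per month, {share:.1%} of total payments")
```

Which is the point: spread across cases, years, and the program's total outlays, the headline $17 billion shrinks to roughly a dollar a day per affected beneficiary.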

Tuesday, June 02, 2015

Generating Wilderness

This is one of the most important things I have read lately, and the diagram below is one reason why--we can return land--physical space--to wilderness without necessarily sacrificing our material standard of living.

(Click to enlarge.)

(Props to The Growth Economics Blog.)