Comments on economics, mystery fiction, drama, and art.

Tuesday, April 29, 2014

"Many things that are desirable are not feasible": Bad advice for an imperfect world

A little over a week ago, this list of things Tom Sargent said in 2007 in a commencement address at Berkeley somehow got hot in the economics blogosphere.  Several chunks of it received a lot of discussion (you can find it all on your own), but I have been musing on the very first item in the list:

1. Many things that are desirable are not feasible.

 My immediate reaction was that the converse of it ("Many things that are feasible are not desirable") was both truer and more important.  Since then, I have concluded that there is something more deeply wrong with it, which may depend on what Sargent means by "feasible."  My favorite online dictionary provides these definitions:


1:  capable of being done or carried out
2:  capable of being used or dealt with successfully :  suitable
3:  reasonable, likely

Under the first definition, "not feasible" means impossible (so Sargent is saying "Many things that are desirable are not possible"); under the second, possible, but not a good idea; and under the third, well, unlikely.  I suspect Sargent's take (and since he made no comment about his meaning, I guess it's up to me) was #1:  not feasible as impossible.

I also take Sargent to be telling us that something that is desirable, but not feasible-in-the-sense-of-possible, should not be attempted.  That is, he is providing advice about which actions you might want to consider undertaking, and which you might not.

Here's the problem:  Not possible in what sense?  Logically impossible?  Well, I wouldn't lean too strongly on that.  Impossible in the world as we know it?  Impossible in the world as it is, even if we have incomplete knowledge?

You see, we might not be able to do something now, because we don't know how to do it, or even if it is possible to do it.  But being able to do it might be highly desirable, because being able to do it might be highly beneficial.  Obviously, I need examples here, so try these.

In the mid-18th century many people died and many were scarred for life by smallpox.  In the mid-20th century, many people died and many were crippled for life by polio.  In both cases, there seemed to be nothing we could do about the disease, but being able to do something was highly desirable.  What does Sargent's advice suggest we should do?  If he means that, because we can't do anything right now, we should do nothing, then we would probably still be unable to do anything about smallpox or polio.  But in both cases people (in the case of smallpox, Edward Jenner; in the case of polio, Jonas Salk and Albert Sabin) looked at what was happening in the world around them and said to themselves, we must be able to find a way to prevent this disease.  I'm afraid that taking Sargent's implicit advice could lead to people saying, we don't know what to do, so why try?

(A similar case can be made for the discovery of every effective drug-based treatment of disease.)

Or, it is the mid-19th century.  Our only sources of interior or nighttime illumination come from burning something (candles, whale oil, kerosene...), with which there is an obvious problem--fires, which in cities with only rudimentary fire-fighting organizations can easily destroy large sections of a city, or even an entire city.  (A rather elegant view of this is provided by William Manchester in A World Lit Only By Fire.)  And we don't know how to provide interior or nighttime illumination from any other source.  So, if they had taken Sargent's advice, would Edison (and Westinghouse) have worked so hard to develop an alternative lighting source (electricity, obviously)?

I may be wrong to read what Sargent said as a counsel of passivity in the face of harms that we do not currently know how to remedy.  But it seems to be a plausible reading, and one that, I think, can only be damaging to almost any type of human progress.

Friday, April 25, 2014

What it means to have a scientific attitude toward the world

"The lead-crime story fits all my political prejudices, as well as my taste for simple and surprising explanations with clear policy implications. And I’ve been a loud advocate for it. But I don’t want to believe it if it isn’t true."

This is from a blog I read (daily), and it expresses as clearly as anything I have ever read how we must approach our search for truth.  Too much of what I read in economics essentially starts from the writer having found a simple, eloquent, compelling story, and then believing it, regardless of the evidence.  (And it's a serious temptation, one you have to work to avoid.  I find myself always thinking about it, and reminding myself that falling into that trap has caused me to make some of my own mistakes.)

Tuesday, April 22, 2014

The Revival of Hofstadter's Law

In his NYT blog yesterday (April 21), Paul Krugman refers to something Brad DeLong wrote recently:
"To reuse an old line from Brad DeLong, at this point right-wing paranoia is worse than you can possibly imagine, even if you take into account the fact that it’s worse than you can possibly imagine."

It's entirely possible that neither Krugman nor DeLong is aware of it, but this is clearly derivative of Hofstadter's Law, which first saw print (I think) in Douglas Hofstadter's magnificent book, Gödel, Escher, Bach: An Eternal Golden Braid (1979):

"Hofstadter's Law: It always takes longer than you expect, even when you take into account Hofstadter's Law."

I suppose this episode will result in a corollary to Hofstadter's Law...

Monday, April 14, 2014

Gross Investment as a Percent of GDP