ECON 571
Public Sector Economics
COURSE POLICIES
Contact Information
Dr. Brian Goff/414 Grise Hall
Phone: 745-3855 / Email: brian.goff@wku.edu
Office Hours: M/W/F 9-11
(I am in my office or on campus most days for the bulk of the day)
Materials
Microeconomics of Public Policy Analysis (Friedman)
Grading
Reading Quizzes: 20%
Assignments 1-5: 20%
Midterm Exam: 30%
Empirical Studies: 30%
(A >= 90%; B = 80-89; C = 70-79; D = 60-69; F < 60)
Reading Quizzes: Brief short answer or multiple choice quizzes
(probably 5 questions) to assess whether you have completed the readings
Assignments 1-5: Short answer assignments on analytical topics
Miscellaneous
Last day to drop course with a "W" or change from credit to audit is
listed on WKU's
Academic Calendar. Any students requiring special
consideration
under the provisions of the ADA should register with the ADA Compliance
Office first and then consult with me as soon as possible. If you
are not fluent in English or are weak in your
writing abilities, you should utilize a writing "consultant" to examine
your written reports before turning them in. The WKU Writing
Center is one option. Undergraduate students willing to offer
tutorial services (for a fee or free) are another.
Attendance/Missed Assignments
The combination of students with job responsibilities and a course
which only meets once per week can present problems. The policy below
attempts to strike a balance of accommodation while maintaining
legitimate standards. Under special circumstances discussed with me in
advance, one assignment may be missed and the remaining
assignments/final weighted more heavily, contingent upon my approval of
the reason. Failure to complete more than two assignments, or absence
from more than two full sessions (or the equivalent), will result in a
student being dropped from the course regardless of the reason.
COURSE OUTLINE & LINKS
PART I
Analytical Emphasis
Week 1 (Jan 27) Administrative Matters; Basic Policy Models;
Uncertainty in Public Policy Models: The Case of Crime & Punishment
(HOPE Program)
Assignment 1 -- Public Policy & Hurricanes
Week 2 (Feb 3) Impact Analysis
Primer: RIMS II Multipliers (pp. 1-14); Kentucky Downs Impact Study
Assignment 2 -- Assessing an Impact Study
Related: I-O PPT
Week 3 (Feb 10) Cost-Benefit Analysis
Primer: Minn Fed Cost-Benefit Overview of Recycling
Assignment 3 -- Breaking Down a Cost-Benefit Study
Related Links: OMB Guidelines; Seattle Monorail; Toolbox for Regional
Policy; PPT on Basics
Week 4 (Feb 17) Externalities + Congestion Pricing: Auctions as a
Vehicle to Reduce Airport Delays
Assignment 4 -- London Congestion Pricing
Week 5 (Feb 24) Optimal Tax Concepts & Beyond: Econ Encyclopedia
Summary; Optimal Taxation Brookings Summary
Assignment 5 -- The Laffer Curve
Week 6 (Mar 3) Debt/Deficit Policy; "The Needed Quantity of Government
Debt" Prescott Minn Fed; Excel file with Prescott Model; CMA Sovereign
Risk Report; Sovereign and State Debt
Week 7 (Mar 10) SPRING BREAK
Week 8 (Mar 17) Industry Regulation, Market Definitions, Pricing
Power -- U.S. District Court Opinion in Whole Foods Case (pp. 23-67)
Natural Monopoly & Industry Regulation: Electricity (Smith Article) &
Drugs (Epstein Article)
Games & Policy; Summary of Game Theory; PowerPoint on Policy Games
Week 9 (Mar 24) Midterm Exam
PART II
Empirical Emphasis
Week 10 (Mar 31) Example of Empirical Public Economics: Role of
Presidential Advisors (forthcoming, Public Choice); ADA Scores
Week 11 (Apr 7) Student Empirical Studies: Replicating and Improving
Fair Presidential & Congressional Voting Models
Replicate Fair Model; Adjust/Amend/Improve in some way(s); turn in
i) replication table, ii) improved table, iii) bulleted summary of
improvement(s)
Fair Data Through 2008; 2008 Data and Prediction
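The replication step boils down to estimating a regression of incumbent vote share on economic conditions. As a minimal sketch of the mechanics (the data points below are made up for illustration, not Fair's actual dataset or specification):

```python
# Sketch of a two-variable vote-share regression in the spirit of the
# Fair model. Growth and vote figures are hypothetical, not real data.

def ols_slope_intercept(x, y):
    """Simple OLS: intercept and slope for y = a + b*x."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
        sum((xi - xbar) ** 2 for xi in x)
    return ybar - b * xbar, b

growth = [2.0, -1.0, 4.0, 0.5]    # hypothetical election-year GDP growth
vote = [52.0, 47.0, 56.0, 49.5]   # hypothetical incumbent vote share
a, b = ols_slope_intercept(growth, vote)
print(round(a, 2), round(b, 2))
```

The real exercise would use Fair's full dataset and additional regressors; this only shows the estimation step you would then adjust or amend.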
Week 12 (Apr 14) Example of Empirical Public Economics: Goff/Tollison
on Public Debt (Selected Data)
Week 13 (Apr 21) Student Empirical Studies II: State Budget Outcomes or
State Bond Ratings
Select a topic below; build a simple empirical model; turn in i) simple
empirical model results, ii) bulleted summary of model/results
Why are some states in better/worse fiscal condition? Fiscal Survey of
the States; Archive of Fiscal Survey of the States; State Pension
Conditions; Budget Processes in the States; Brookings-Urban Inst.
Policy Data Center; Archive of State Expenditure Reports
What explains differences in state (or national) credit standing? 2008
State Bond Ratings; CMA Sovereign Risk Report
Week 14 (Apr 28) Example of Empirical Public Economics: KTRS and
Defined Benefit Plan Issues; Anecdotes on Pension Problem Source;
KTRS Publications: Financial Report; Actuarial Report; Statistical
Report
Week 15 (May 5) Student Empirical Studies in Public Policy III; KTRS
Simulation Worksheet
i) Examine possible changes in key variables (retirement age,
retirement pop, ...) and their effects on the present values of policy
changes (retirement eligibility rules, contribution rates, benefit
rates)
ii) Examine handling shortfalls through increased public transfers
(sizes of these implied transfers; differences in timing/size of
transfers) and their effects on state tax burdens
iii) Examine the effects of economic environmental variables (wage
growth, longer lives, ...) on present values; can we "grow out" of the
problem?
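A starting point for item i): the present value of a single retiree's benefit stream, the basic building block of any pension simulation. This is only a sketch with illustrative figures, not the KTRS worksheet's actual benefit levels, discount rate, or mortality assumptions:

```python
# Sketch: present value of a level pension benefit. All numbers are
# illustrative assumptions, not KTRS figures.

def pension_pv(annual_benefit, years_retired, rate):
    """Present value of a level benefit paid for years_retired years,
    discounted at rate."""
    return sum(annual_benefit / (1 + rate) ** t
               for t in range(1, years_retired + 1))

# Raising the retirement age by 3 years shortens the payout window
# and shrinks the liability noticeably:
pv_retire_55 = pension_pv(30_000, 25, 0.06)
pv_retire_58 = pension_pv(30_000, 22, 0.06)
print(round(pv_retire_55), round(pv_retire_58))
```

The same function, looped over a covered population, is how changes in eligibility rules or benefit rates translate into changes in the total unfunded liability.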
Miscellaneous Topics
Natural Monopoly & Industry Regulation Electricity (Smith
Article) & Drugs (Epstein Article)
Games & Policy; Summary of Game Theory; PowerPoint on Policy
Games
Estimating Regressivity of Consumption Taxes (Richmond Fed, Winter 2009)
Measuring Output, Cost, & Productivity in Not-for-Profit
Settings
Rand Study on Higher Ed Productivity ;
StlFed-HigherEd Productivity
Written Assignment 1: Public Policy for Things Difficult to Insure
Read "Are Hurricanes Uninsurable?" and answer the following questions
(1-2 pages; use equations and graphics where appropriate and useful)
1. List the key variables in a simple economic model (equation) of
hurricane insurance pricing.
2. Relate the items in the simple insurance model to the discussion by
Jenkins -- that is, briefly explain how government policies and actions
have distorted key elements of the insurance pricing model.
3. What aspects do terrorism and hurricanes share in common that may
cause people to question whether they are insurable?
4. If governmental subsidy is a given, as assumed by Robert Litan, how
does Litan suggest establishing the government subsidy to better match
the insurance market?
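For question 1, the core of a simple pricing model is the expected-loss (actuarially fair) premium plus a loading. A minimal sketch, where the loss probability, loss size, and loading factor are all assumed for illustration:

```python
# Sketch: actuarially fair premium for a catastrophic risk. The
# probability, loss, and loading values are illustrative assumptions,
# not estimates from the Jenkins article.

def fair_premium(p_loss, loss_amount, loading=0.0):
    """Expected loss p * L, grossed up by a proportional loading for
    capital costs and administration."""
    return p_loss * loss_amount * (1.0 + loading)

# A coastal home with a 2% annual chance of a $300,000 loss:
premium = fair_premium(0.02, 300_000, loading=0.25)
print(premium)  # 7500.0
```

The distortions Jenkins describes (rate regulation, post-storm bailouts) show up in this framework as a premium held below this value or a perceived p_loss/L lowered by expected federal aid.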
Written Assignment 2: Impact Study Assessment
Read the Impact Study for a proposed Pittsburgh Casino, Station Square
Casino Study, and answer the following questions (1-2 pp.)
1. What is the basis of the revenue estimates used in this study, and
what are the initial impacts to which multipliers are applied?
2. What are potential leakages that might affect the size of the
initial impacts? Are these clearly identified in the paper?
3. What is the basis of the multipliers used in the study, and what
strengths/weaknesses might they have versus other possible sources?
4. What additional questions/issues arise regarding this study,
particularly with respect to changes over time that may occur?
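Questions 1-3 hinge on how an initial impact, respending, and leakages interact. A minimal Keynesian-style sketch (the local respending share and spending figure are illustrative assumptions, not RIMS II values):

```python
# Sketch: a simple regional multiplier. Each dollar of new spending is
# partly respent locally; the rest leaks out via imports, savings, and
# taxes. Numbers are illustrative, not from the casino study.

def simple_multiplier(respent_locally):
    """Closed-form multiplier 1 / (1 - s), where s is the share of each
    dollar respent inside the region."""
    return 1.0 / (1.0 - respent_locally)

m = simple_multiplier(0.4)           # 40 cents of each dollar stays local
initial_impact = 10_000_000          # hypothetical casino spending
total_impact = m * initial_impact
print(round(m, 3), round(total_impact))
```

Larger leakages (a smaller local respending share) shrink the multiplier, which is why question 2 asks whether the study identifies them clearly.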
Written Assignment 3: Breaking Down a Cost-Benefit Analysis
San Marcos C-B Study http://ecommons.txstate.edu/arp/104/
1. Identify and describe the main sources of benefit measured by the
author.
2. Identify and describe the main sources of cost measured by the
author.
3. From the standpoint of economic theory/practice, what are some
weaknesses of the author's methods?
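The arithmetic backbone of any cost-benefit study is the net present value of the benefit and cost streams. A minimal sketch with invented cash flows (not the San Marcos study's figures):

```python
# Sketch: net present value of a project. Flows and the discount rate
# are illustrative assumptions, not taken from the study.

def npv(flows, rate):
    """Present value of a list of (year, amount) cash flows."""
    return sum(amt / (1 + rate) ** yr for yr, amt in flows)

benefits = [(t, 120_000) for t in range(1, 11)]   # ten years of benefits
costs = [(0, -800_000)] + [(t, -10_000) for t in range(1, 11)]
net = npv(benefits + costs, 0.05)
print(round(net, 2))
```

Many of the "weaknesses" question 3 asks about live in these inputs: the choice of discount rate, which benefits are monetized, and how far into the future the streams are projected.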
Written Assignment 4: Pricing & Policy in London
Victoria Transport Institute (also, additional information: Jonathan
Leape, full article in Journal of Economic Perspectives, Fall 2006,
pp. 157-176)
1. List the key factual features of the London congestion pricing
system.
2. Draw a diagram(s) showing the economic basis behind the London
pricing system.
3. What economic (non-political) complications may arise from
implementing a congestion pricing system like this (refer to the
airport congestion article for ideas)?
4. How was London able to overcome these obstacles and related
political objections?
5. What difficulties might arise in trying to apply such an idea in
Los Angeles or other major American cities that may be different from
or greater than in the London case?
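For the diagram in question 2, the textbook logic is that the efficient toll equals the marginal external cost each driver imposes on everyone else. A minimal numerical sketch (the travel-time function and value of time are illustrative assumptions, not London parameters):

```python
# Sketch: the efficient congestion toll. Functional forms and numbers
# are illustrative, not calibrated to London.

def travel_time(q):
    """Average minutes per trip; rises with traffic q (thousand trips)."""
    return 20 + 0.5 * q

def marginal_external_cost(q, value_of_time=0.5, dq=1e-6):
    """One more trip raises every existing driver's travel time by the
    slope of travel_time; cost it at value_of_time dollars per minute."""
    slope = (travel_time(q + dq) - travel_time(q)) / dq
    return q * slope * value_of_time

# At 40 thousand trips, the efficient toll equals the external cost:
toll = marginal_external_cost(40)
print(round(toll, 2))  # 10.0
```

Drivers privately see only the average travel time, not the delay they add for others, which is the wedge the toll is meant to close.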
Written Assignment 5: Laffer Curve Topics
Read The (Shifty) Laffer Curve (Govt Spending influenced LC)
(Atlanta Fed, 3rd QTR 2000) and answer the following questions
1. Draw the Laffer Curve and briefly sketch (using equations from
Prescott's model or simplifications of it) an explanation of the logic
behind its shape.
2. What is the key variable(s) influencing the shape of the Laffer
Curve discussed in this article? What equations might this work through
in a Prescott-type model?
3. How might differences in wealth over time or across countries
influence the shape of the curve? (Show a graphic and explain your
reasoning through parameters in the Prescott model.)
4. Searching the economic literature, locate up-to-date estimates of
the Laffer curve peak/shape (indicate specifically the sample used to
generate these estimates).
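The logic in question 1 can be sketched numerically: revenue is the rate times a tax base that shrinks as the net-of-tax share falls. The constant-elasticity form and the elasticity value below are illustrative assumptions, not Prescott's calibration:

```python
# Sketch: a Laffer curve from a behavioral tax base. The elasticity and
# base scale are illustrative assumptions.

def revenue(t, elasticity=1.0, base0=100.0):
    """Tax revenue t * base(t), where the base shrinks with the
    net-of-tax share (1 - t) raised to a labor-supply elasticity."""
    return t * base0 * (1.0 - t) ** elasticity

# Revenue peaks at an interior rate; with elasticity 1, at t = 0.5:
rates = [i / 100 for i in range(0, 100)]
peak = max(rates, key=revenue)
print(peak, revenue(peak))  # 0.5 25.0
```

A higher elasticity (a more responsive base) shifts the peak toward lower rates, which is the channel question 2 asks you to trace through the model's equations.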
Why Do Americans Work More Than Europeans? - MinnFed
http://minneapolisfed.org/research/QR/QR2811.pdf
(Note: this reading includes extensive mathematical expressions; these
expressions really just involve algebra. My intent is not for you to be
able to replicate the mathematics but for you to be able to uncover key
arguments and logical links within the presentation.)
1. Which of Prescott's equations pertain to:
a) consumer behavior
b) producer behavior
c) market equilibria
2. What are the key "parameter" values used by Prescott?
3. Attempt to give a simple explanation for each of these parameter
values.
4. Attempt to make a flowchart-type diagram of the ways that an
increase in income tax rates works itself into the choice of labor
hours in Prescott's model.
Production Concepts in Not-for-profit Settings
StlFed-HigherEd Productivity http://www.stls.frb.org/publications/re/2006/a/pdf/higher_education.pdf
(Also confer, RAND - Higher Ed)
1. What objectives matter in higher education?
2. Using typical "outputs divided by inputs" measures, what are
the main measures by which higher education productivity is considered
low?
3. What objectives may have become more important to higher education
(undergraduate) consumers, and as a result important to administrators,
over the last 40 years?
4. What adjustments to the basic measures of productivity need to be
made to yield more accurate estimates?
Summarizing an Empirical Study in Public Policy (Worth twice as many
points as a regular assignment)
Find an article that uses statistical methods to explore a topic in
public policy. Write a 2-3 page report summarizing the
article. In the report
1. Explain the main question addressed by the article
2. Summarize the data used (type of data, time frame)
3. Explain the main statistical results (explain one table/figure
at a time). For example, "In Table 1, the author describes the
average values of ... These values indicate ... In Table 2,
the author presents a regression analysis where .... are used to
explain ... The regression model shows ...".
4. Explain the main findings of the article. Explain the
main weaknesses of the study. Explain questions raised for future
study.
Federal Reserve District Bank publications (for example, Federal
Reserve Bank of St. Louis Quarterly Review) are a good place to look
for studies. Also, any peer-reviewed journal in economics or public
policy is fine. Below, I have listed some suggested topics with
associated links.
State economic (and other) performance using matched pairs (e.g., State
Growth Comparisons with Matched Pairs)
Presidential voting models
State budget forecasting
State regulatory indexes Economic
Freedom of North America
Federal-State dependence and interactions (Explaining state relative
shares)
Debt & deficits: Sustainability of Federal Deficits; Explaining Federal
Deficits
Comparison of practices across states and their effects (line item
vetoes; non-traditional procedures; earmarking; accounting methods,
...) Chicago Fed
Basis or impacts of sports subsidies
-- State Budgets and Medicaid (Chicago Fed)
-- Approved Student Selection
Fiscal Survey of States (National Association of State Budget
Directors)
Cost-Benefit Studies
-- Portland Freeway
-- Sacramento Transport Improvements
-- San Marcos (TX) Bridge Overpass
-- Bowling Green Minor League Baseball
-- Estimating Willingness to Pay for Non-market activities (e.g. life)
-- Social Security Problems or Solutions
-- Medicare Problems or Solutions
-- Nationalized Health Care (e.g. Fraser
Institute Data)
-- Universal Health Insurance
-- Predicting Presidential Outcomes
-- Stadium Projects
-- Net Cost-Benefit of Government Spending (Track Back Via Marginal
Revolution post)
Misc Links
Budgeting (Federal Budget Process; Economic Report of the President;
U.S. Budget & Citizen's Guide)
State Budgets & Economy; Excel File with State Data
Pharmaceutical Review -- Epstein Experiment (surgical procedures)
Identifying Policy Effects on Choices
NEA Funding & Charitable Donations (pp. 1-17)
FRB STL Fed - Economics of Charitable Giving
(Note: The FRB reading involves working through a mathematical
theoretical model that is likely beyond the ability of most students in
the course to develop or fully comprehend. I do not expect you to be
able to replicate the mathematics. Nonetheless, this assignment is
intended for you to make an attempt to understand the arguments
developed. The St. Louis Fed reading provides a very readable
introduction to the logic and terms behind the more technical model.)
1. What is the key question addressed?
2. Write down the equation(s) (and equation numbers) that identify the
objectives of consumers/taxpayers and those that identify the direct
constraints on their decisions.
3. What are the ways that public policy (NEA Funding) may influence the
consumer/taxpayer constraints?
4. Can you identify the equations that take into account these
influences?
5. What are the limits of this model in examining the potential
crowding-out effects of NEA?
Written Assignment: Antitrust Policy
District Court Decision: FTC v. Whole Foods (pp. 23-64)
http://www.ftc.gov/os/caselist/0710114/0710114dcopinpub.pdf
1. List the main features of the FTC's case that the judge considers.
2. What weakness does the judge highlight in the FTC's economic
reasoning and data support?
3. Based on our discussion of the problem of "dynamics" in markets,
what longer run impacts might not have received due consideration by
the FTC?
Written Assignment Week 11: Public Provision & Public(?) Goods
(See Reading Below: "Cities Start Own Efforts to Speed Up Broadband")
1. What are the basic characteristics of a public good -- does internet
infrastructure appear to meet these characteristics?
2. Why might the private market fundamentals be different for a large
city than for a small or medium sized one like Chattanooga?
3. What are the economic incentives to construct telecomm networks?
What might the longer term implications of municipalities providing
telecomm infrastructure be for network construction?
4. Consider the earlier piece by Vernon Smith on electrical markets --
what is the effect of government regulation that does or does not
segment different aspects of the supply chain?
5. How might Coase criticize the approach taken by a local government
like Chattanooga, or offer an alternative way for the local government
to try to improve telecomm infrastructure?
Are Hurricanes Uninsurable?
by Holman Jenkins
It wasn't so long ago that insurers were pronouncing terrorism
"uninsurable." But ask any insurer: Aon's Paul Bassett recently noted
that the global war on terror had greatly reduced the threat of
megaplots on the scale of Sept. 11, 2001.
Yes, suicide bombers and car bombs remain a threat, but not to the
industry's capital base. Hurricanes are the new "uninsurable" now.
Here's a complicated story that only begins with warming ocean
temperatures thought to justify an expectation of increased hurricanes.
One risk modeler, Risk Management Solutions Inc., chucked out 100 years
of hurricane data and brought together four climate scientists who
probably wouldn't agree about much else but agreed that the next five
years would see a higher-than-average number of hurricanes. Presto, a
risk model employed by many insurers to set rates suddenly implies a
40% hike for Gulf Coast property owners.
A second factor: More hurricanes meet more people and property.
Regionally speaking, Katrina came ashore in a low-rent neighborhood.
Florida, a hurricane highway, represents an agglomeration of coastal
property worth $2 trillion. New York City and Long Island offer a
similar target. The Texas coast represents about $750 billion worth of
bowling pins.
These scenic vicinities experienced building booms in the '70s and
'80s, when hurricane activity was at low ebb. Beginning in the early
1990s, a series of devastating storms swept through, especially in
Florida. Enter factor three: The willingness of the federal government
to rush in with rebuilding aid far above even the billions in
subsidized flood insurance already provided to coastal homeowners.
Coastal development only quickened when it should have slowed.
Upping the ante were 9/11 and Katrina, which demonstrated to high
rollers the federal government was incapable of not shelling out
infinite sums to ease personal tragedies when bad things happen on a
large scale (if you lose your house or loved one in an everyday mishap,
of course, you're still on your own).
As anyone might have predicted, this dynamic is now unraveling the
insurance industry's ability to help society control its risk-taking by
properly pricing risk. Lloyd's Julian James put it this way late last
year: "It seems to me that with $400,000 per family [paid out in the
wake of the terrorist attacks and Katrina], if the government hands out
checks, do people need insurance?"
In olden times, robber barons built their seaside mansions a safe
distance from the ocean. Today's yuppies build their palaces right on
the beach: FEMA reckons that a quarter of coastal dwellings will be
destroyed in the next half century. The solution seems obvious: Restore
the incentive for yuppies to behave responsibly like the robber barons.
This assumes two things: Insurers would have to be free to charge
realistic rates, which is problematic given that rates in most states
are approved by elected or appointed insurance commissioners, most of
whom have their eyes on higher office.
It also assumes state courts will uphold insurance contracts -- a
principle being tested in Mississippi, where Attorney General Jim Hood
likens insurers to "Nazis in lockstep" and is pursuing civil and
criminal complaints against them for refusing to pay for flood damage
explicitly not covered in the policies they sold to homeowners.
The riskiest assumption of all is that property owners would be willing
to pay "actuarially sound" rates rather than just skipping insurance
and relying on a federal bailout in the wake of a big storm. Here's the
real crux of the claim by some industry watchers that hurricanes have
become effectively uninsurable.
In this camp is Robert Litan, a Brookings Institution economist who
says bailouts have become politically mandatory and property owners
expect them. So the only rational course now is to fund them in
advance, with taxes borne mainly by those who benefit. Coming to a
similar conclusion is Allstate CEO Ed Liddy, a one-man band who's been
trying to drum up enthusiasm for a federal disaster insurance program.
He was a voice alone but lately has acquired allies in the form of
State Farm and a few others. Mr. Liddy lays out an approach that, he
claims, would tap all business and property owners in Florida who
presumably benefit from coastal development even if their own property
is inland. Folks in Peoria and Dubuque wouldn't be milked, as they are
now, to subsidize the lifestyle of beach dwellers.
He also says his plan would mandate realistic insurance rates and
impose damage mitigation measures to reduce the cost of hurricanes and
discourage high-risk development.
Don't hold your breath for this part. Federal flood insurance was
instituted in 1968 with the same good intentions -- to make property
owners bear the cost of their own recurrent bailouts. Instead it became
a subsidy to increased risk-taking. That program today is $21 billion
in the hole, its shortfalls financed by taxpayers in Peoria and Dubuque.
Where's Al Gore when you need him? Mr. Liddy's plan may be a defensible
concession to political realities, i.e., the inevitability of the
federal government continuing to pay people to rebuild what storms
knock down (see New Orleans). But it also makes an undesirable peace
with over-development of coastal areas. The harder road of imposing
market insurance rates on coastal property owners and ending taxpayer
handouts would mean a lot of coastal development would come to a halt
-- as it should.
In any case, all agree the debate won't be settled in an election year,
and probably not until another hurricane forces the country to face up
to how the rest of us have been taxed again and again to subsidize the
high-risk lifestyles of those who plant themselves in the paths of
hurricanes.
Efficient Markets
The Welfare of American Investors
By HENRY G. MANNE
Behavioral finance, a developing field of academic research that
emphasizes investor irrationality (and ignorance) and the inefficiency
of markets, has been hailed by defenders of the SEC as offering a solid
economic rationalization for our vast scheme of federal securities
regulations. Even apart from the obvious implications for the
regulatory system of ignorance and irrationality on the part of
regulators, a closer examination of the logic of behavioral finance
leaves little for the pro-regulation crowd to crow about.
Initially, behavioral finance emerged as an academic antidote to a
claim of substantial market perfection in the finance field, the
well-known "efficient market" theory of stock prices. Numerous
"anomalies" or irrationalities were discovered in the market for
securities, such as various kinds of over- or under-reactions to new
information, herding behavior, endowment effects, January effects,
weekend effects, small-firm or distressed-firm effects, bubbles and
crashes -- to name a few.
Faulty Data
Most of these alleged peculiarities proved in time to be far less
anomalous than was first thought. The data on which they were based
were often faulty, or the econometric models were measuring the wrong
thing, or various kinds of relevant transactions costs were ignored.
The effects of irrational or uninformed behavior were often canceled
out by opposite forces, and much of it was simply irrelevant.
Furthermore, the behavioralists did not -- and do not -- have a general
theory that can explain why financial markets work as well as they do.
Some close approximation of the efficient market theory is still the
most accurate and useful model of the stock market that we have.
Still, some of the behavioralists' criticisms stuck, especially in
regard to crashes and bubbles, events that arguably should not occur in
perfectly efficient markets. In this connection the efficient market
theorists had no choice but to reexamine and refine their own models,
which they have now done with some success. Perhaps the most important
behavioralist contribution to economics has been their reminder that
the market-model claim of rationality often does not comport with
actual human behavior.
Economists frequently failed to qualify economic pronouncements as
being limited in application to aggregate behavior. Too many assumed
that if markets in the aggregate behave rationally, it must be because
the "marginal" participant -- the trader who has the correct
information about what a price should be -- was himself a perfectly
rational maximizer. This better-informed and rational trader would
always arbitrage away any discrepancies from efficiency that a market
displayed.
But there is a vast difference between economics and psychology, and we
can thank the behavioralists for forcing economics back into its
correct posture of dealing with aggregate behavior. We can also thank
the behavioralists for demonstrating that the marginal trader/arbitrage
theory cannot explain all price formation, since we have no way, a
priori, of knowing that this hypothetical individual will be rational.
Nor can we any longer assume that the arbitrageur (apart from a
purchaser of 100% of the securities of a given company) will have all
the information necessary to set the correct price.
That discovery left a serious gap in economic theory. The efficient
market mavens were indeed correct in their conclusions about aggregate
market behavior -- but how could they explain this near perfection of
functioning markets while irrational and less-than-fully informed
individuals (so-called "noise" traders) were known to abound?
Traditional economics did contain the start of an answer to this
question, most notably in F.A. Hayek's classic "The Use of Knowledge in
Society" (1945). There, Hayek (addressing the then-pressing problem of
countering socialist doctrine) made the astute observation that
centralized or socialist planning can never be economically efficient
because it was impossible for a central planner to accumulate all the
information needed for correct economic decisions ("correct" in the
sense of displaying efficient market allocations of goods). The
critical information, he noted, is too scattered in bits and pieces
throughout the population ever to be assembled in one person's mind (or
computer). Diffused markets, on the other hand, function well because
the totality of relevant information, even subjective preferences, can
be aggregated through the price mechanism into a correct market
valuation.
This insight of Hayek's has been a mainstay of market theory ever since
it was advanced, but it remains merely an observation and a conclusion.
It does not detail how new information gets so effectively impacted
into the prices of goods and services. In other words, how does this
"weighted averaging" get done? And why should we assume that the impact
of rational participants would dominate that of irrational ones in
markets?
Similarly, the efficient market theory was based almost entirely on
empirical observations and did not offer a theory of how the market
came to be so efficient. Subsequent literature examined the mechanisms
of market efficiency (including insider trading), but these were again
observational and descriptive works that did not even recognize the
absence of a good theory of how new information gets properly
integrated into a price. The implicit and often explicit theory of
price formation was always the "arbitrage" notion, with the marginal
trader calling the shots.
Enter now financial journalist James Surowiecki and his charming and
insightful book, "The Wisdom of Crowds" (2004). The book opens with the
story of a contest at a county fair in England in 1906 to guess the
weight of an ox on display after slaughter and dressing. There were
about 800 guesses entered in the contest both by knowledgeable people
and by those who had no expertise in such matters. We are not told what
the winning guess was, but we are told that the average of all the
guesses (1,197 pounds) was virtually identical to the actual weight
(1,198 pounds).
Similar results show up regularly in the relatively new use of
so-called "prediction" or "virtual" markets, primarily employed today
in predicting outcomes of political elections, sporting events, new
product introductions or new movies. Though there are still some
problems with the technique, these "markets" have proved in the main to
be much more accurate than traditional interview polls. And these
various illustrations of the wisdom of crowds suggest a solution to the
problem of how correct prices are formed in financial markets beset by
irrational and poorly informed traders.
* * *
Weighted-average results are similar to "correct prices," since
informed investors can be assumed to invest more money if their
confidence in the validity of their information -- or the intensity of
their desire for the product -- is higher, thus imparting a weighted
average element to each price. And while the actual weight of an ox is
a more objective measure than the "correct" price of a security, the
main difference may be between a static and a dynamic figure with the
"correct price" of a stock being a kind of moving target.
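The averaging logic above can be illustrated numerically: with enough diverse, independent guesses, individual errors cancel. A small simulation in the spirit of the ox contest (the guesses are randomly generated, not the actual 1906 entries):

```python
# Sketch of the wisdom-of-crowds argument: many noisy, independent
# guesses average out close to the truth. Simulated data only.

import random

random.seed(1906)
true_weight = 1198
# 800 independent guesses, each off by as much as +/- 300 pounds:
guesses = [true_weight + random.uniform(-300, 300) for _ in range(800)]
crowd_estimate = sum(guesses) / len(guesses)
print(round(crowd_estimate))
```

The cancellation depends on exactly the two requirements named above: the errors must be diverse and independent. If the guesses shared a common bias, averaging more of them would not remove it.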
The literature on prediction markets makes clear that the more
participants in a contest and the better informed they are, the more
likely is the weighted average of their guesses to be the correct one.
That is true, ironically, even though the additional participants have
even less knowledge than the earlier ones. The only requirements for
these markets to work well are that the various traders be diverse and
that their judgments be independent of one another. Clearly, there is
still a lot more work of a statistical and mathematical nature to be
done before the idea of the wisdom of crowds is turned into a
full-fledged theory of price formation, but at least we have identified
the problem and made a start towards a solution.
'Wisdom of Crowds'
The implications of what we already know of this "wisdom of crowds"
approach to price formation, as against the traditional marginal
pricing/arbitrage approach, are apt to be startling. We should rethink
any current policies based on a view of pricing in which we exclude the
best-informed traders and discard the wisdom of the many. For instance,
we now have a new and more powerful argument than we had in the past
for legalizing most insider or informed trading.
Since such trading clearly makes the market process work more
efficiently, it aids capital allocation decisions and informs business
executives through market-price feedback of the best predictions about
the value of new plans. Furthermore, the Supreme Court's "fraud on the
market" theory of civil liability under the federal securities laws and
Congress's ideas of correct civil damage claims for insider trading no
longer have any intellectual merit. The same is true of any other part
of our securities laws implicitly based on the notion of the marginal
trader as a rational arbitrageur of price.
The new approach would suggest that it is undesirable to have laws
discouraging stock trading by anyone who has any knowledge relevant to
the valuation of a security. Thus, assembly-line workers,
administrative assistants, office boys, accountants, lawyers,
salespeople, competitors, financial analysts and, of course, corporate
executives (government officials are another story) should all be
encouraged to buy or sell stocks based on any new information they
might have. Only those privately enjoined by contract or other legal
duty from trading should be excluded. The "wisdom of crowds" can do far
more for the welfare of American investors than all the mandated
disclosures and insider trading laws that the SEC and Congress can
think up.
Mr. Manne, a resident of Naples, Fla., is dean emeritus of George Mason
University School of Law. This is the first of a two-part series.
Power to the People
By VERNON L. SMITH
Telecom deregulation has been judged successful. Long-distance rates
have declined and innovation has dramatically improved service. The
deregulation of natural gas, airlines, trucking and the railroads
during the Carter and Reagan administrations are successful experiments
in institutional change. But restructuring electricity has a tarnished
image, and many hold that market liberalization was a mistake.
The wholesale market has been volatile when seasonal energy supplies
are tight. California served up an economic disaster when growth in
demand and hot weather coincided with low water in the Pacific
Northwest reservoirs that would have strained the old regulated regime.
Finally, there was the Midwest-Eastern blackout, two years ago this
Sunday.
Why? Is it because electricity cannot be stored for peak demand? Is
electricity inherently different from the other targets of reform, and
impervious to liberalization? Is it an aging and inadequate
transmission grid?
It is none of the above. Hotel and transportation accommodations also
cannot be stored, but competition in these industries has led firms to
discover ways to dynamically price their products to respond
efficiently to variations in daily, weekly or seasonal demand. Every
industry is different, and this requires attention to the details of
how each is restructured for governance by market property-right
rules. And finally, the grid is inadequate only if you are wedded to
the belief that it must never be bypassed by local energy sources or
conservation from peak pricing to relieve congestion.
* * *
Many foreign countries -- the U.K., Chile, Australia and New Zealand --
have managed to liberalize electricity systems. There are no regrets in
spite of mistakes, backsliding and learning bumps. Liberalization
occurred because both U.S.-style regulation and foreign nationalization
programs were judged serious failures.
From the beginning many foreign countries saw that restructuring must
honor the technical difference between the wires business and the
energy business. They severed that long enforced tie-in sale of energy
with the wires monopoly, and alternative energy suppliers were allowed
entry to compete with the distributors. In the U.S., we made the costly
error of not embracing upfront the principle that the local monopoly
wires business must be distinct and separate from the sale and
provision of energy to retail accounts. Only in this way can you hope
to see retail energy competition, and the unleashing of a
trial-and-error discovery process in which firms search for the best
means of matching dynamic pricing and monitoring technologies with
consumer preferences.
Although some countries made the right decision, the devil is in the
details, and we have all learned that implementing it successfully has
not been easy. In New Zealand, exclusive energy-supply obligations were
incrementally removed from the existing local wires companies to permit
free entry of competitors, but in practice, entry penetration was
agonizingly slow. It is not too hard to see why: The local wires
companies, still supplying energy, are not motivated to make it easy
for an entrant to compete away their customer accounts. To implement
their menu of technologies, entrants must gain access to household
wires -- historically accessed only by the distributor -- to install
the switching or metering devices preferred by individual customers,
and accounts must be transferred from the incumbent distributor to the
new merchant supplier. The distributors have incentive to resist, delay
and impede customer changeover.
At home we have attempted to deregulate the provision of energy to
retail customers by altering the regulation of local utilities to
distinguish competitive (energy) components of the rate structure from
noncompetitive (wires infrastructure) components, and to apply new
rules for allocating costs to each. But there are complaints by retail
energy suppliers that the wires companies and their regulators have
used creative accounting to shift energy costs to the regulated price
of wires in order to undercut energy competitors without sacrificing
overall profit.
The Federal Energy Regulatory Commission got it right at the wholesale
level: It moved to require generation companies to be separated from
the transmission grid. It understood that you cannot have a
competitive wholesale market if generators also own transmission.
Otherwise, energy production would be tied to the more limited
contestability of the transmission grid, unnecessarily restraining
energy competition in the wholesale market.
So why don't we just extend the FERC principle to the local wires and
energy purchased by retail customers? The political and regulatory
structure stands in the way. It would infringe states' rights: FERC has
jurisdiction over the interstate energy transmission system, but no
authority over the local wires or retail energy competition on those
wires. Each state long ago granted a franchised local monopoly to your
utility company. This legally restricted service to one set of wires,
but implicitly was interpreted to mean that each utility could tie
customer purchases of energy to the rental of the wires -- a right they
are loath to give up.
In the deregulation of telephones we had a preview of how a wires legal
monopoly can be used to impede local competition in the use of the
wires. Recall the time when no one except a serviceman from Ma Bell was
allowed in your house to service the wires, and you were not permitted
to install phones that had not been produced by Bell. The industry
argument was that the "integrity and quality of the network" needed to
be protected, but this was just an excuse for limiting competition for
products and services that were separable from the regulated activity.
This also impeded innovation, an unseen cost of limiting choice and
entry.
The failure to liberalize the provision of retail energy is the
fundamental reason that there has been so little technical innovation
in the local distribution of energy to the end-use customers. The
electronic age of switching, metering and monitoring has found little
application between the end-use customer and the energy supply system.
The dead hand of historical cost pricing is hostile to innovation.
Without the free entry/exit trial-and-error discovery process there is
no way to know how technology, pricing and differential customer
preferences can be matched.
No state has yet tried a mandate to separate electricity from the wires
monopoly to allow competition in energy sales. In the natural gas
industry, however, one state has separated the customer's commodity
purchases from the utility's delivery system. Georgia voted to separate
the local pipes business from the sale of the natural gas that comes
through the pipes. The rental rate for the pipes continues to be
regulated as a monopoly, but there are now a dozen competing companies
that supply the end-use customer with gas: Each pumps gas vapor into
the distribution pool in response to its customers' decisions to burn
gas. The gas is metered at the household and the company bills only its
own customers.
The same model applied to electricity could yield great benefits since
over half of total retail cost is the energy component and that is
likely to grow.
Since peaking energy is much more costly to produce than base-load
off-peak energy, competition would be expected to lower off-peak prices
and raise peak energy prices to reflect their differential costs. But
peak-energy pricing is only part of the story. The capacity of the grid
is determined entirely by peak-energy demand. Reduce peak consumption
and you relieve transmission congestion and increase reliability and
security. Hence, regulatory reform needs to address how we price the
wires infrastructure.
Suppose half the wires capacity cost is due to only six hours of peak
demand -- the peak capacity being idle for 18 hours. Then those
consuming power during one-quarter of the day should be charged for
half of the capital cost of the wires. Such pricing is not only "fair,"
it conveys the right incentives for capital utilization. This principle
is why hotel rates are so much higher at seasonal peaks. It's the peak
renters that have required investors to build all the extra room
capacity. Off-season renters are not the ones straining capacity and
are charged less.
Competition naturally discovers this and prices reflect the opportunity
cost of new capacity, not the irrelevant historical cost of the
infrastructure.
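The capacity-cost arithmetic above can be sketched in a few lines. This is an illustration only; the dollar figure is an assumed placeholder, while the six-hour peak and the 50% cost share come from the article's hypothetical.

```python
# Illustration of the article's peak-load pricing hypothetical: half the
# wires capacity cost is driven by a 6-hour daily peak, so peak-hour
# consumers should bear half the capital cost.

WIRES_CAPITAL_COST = 1_000_000  # assumed annual capital cost of the wires ($)
PEAK_HOURS = 6                  # hours per day at system peak (from the article)
OFF_PEAK_HOURS = 24 - PEAK_HOURS

peak_cost_share = 0.5  # share of capacity cost attributable to the peak

peak_cost = WIRES_CAPITAL_COST * peak_cost_share
off_peak_cost = WIRES_CAPITAL_COST * (1 - peak_cost_share)

# Cost per hour of service: peak users pay far more per hour served.
peak_rate_per_hour = peak_cost / PEAK_HOURS
off_peak_rate_per_hour = off_peak_cost / OFF_PEAK_HOURS

print(f"peak $/hour: {peak_rate_per_hour:,.0f}")          # 83,333
print(f"off-peak $/hour: {off_peak_rate_per_hour:,.0f}")  # 27,778
print(f"peak premium: {peak_rate_per_hour / off_peak_rate_per_hour:.1f}x")  # 3.0x
```

Under these assumptions a peak-hour user imposes three times the hourly infrastructure cost of an off-peak user, which is the "fair" pricing incentive the article describes.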
It could pay a high-rise office-building owner to install a gas
microturbine for peaking energy, or motion-sensitive light switches
in all the offices, if in addition to the energy savings he could get a
wires-charge rebate for his reduced dependence on the grid, which
would accommodate growth without new investment. The fact that he
cannot benefit tells you how regulation blocks innovation. We badly
need changes in the local regulation of the wires that reward customers
if they reduce their dependence on the grid. As for retail energy
prices, why regulate them at all? Is there a state out there willing to
mandate separation of the wires monopoly from energy provision, and
allow free entry by retail energy merchants?
Mr. Smith, a professor at George Mason and the Rasmuson Chair at the
University of Alaska, Anchorage, is a 2002 Nobel laureate in economics.
Cities Start Own Efforts To Speed Up Broadband
By CHRISTOPHER RHOADS
CHATTANOOGA, Tenn. -- Internet traffic is growing faster than at any
time since the boom of the late-1990s. Places like Chattanooga are
trying hard not to get stuck in the slow lane.
Some 60 towns and small cities, including Bristol, Va., Barnesville,
Minn., and Sallisaw, Okla., have built state-of-the-art fiber networks,
capable of speeds many times faster than most existing connections from
cable and telecom companies. An additional two dozen municipalities,
including Chattanooga, have launched or are considering similar
initiatives.
The efforts highlight a battle over Internet policy in the U.S. Once
the undisputed leader in the technological revolution, the U.S. now
lags a growing number of countries in the speed, cost and availability
of high-speed Internet. While cable and telecom companies are spending
billions to upgrade their service, they're focusing their efforts
mostly on larger U.S. cities for now.
Smaller ones such as Chattanooga say they need to fill the vacuum
themselves or risk falling further behind and losing highly paid jobs.
Chattanooga's city-owned electric utility began offering ultrafast
Internet service to downtown business customers five years ago. Now it
plans to roll out a fiber network to deliver TV, high-speed Internet
and phone service to some 170,000 customers. The city has no choice but
to foot the bill itself for a high-speed network -- expected to cost
$230 million -- if it wants to remain competitive in today's global
economy, says Harold DePriest, the utility's chief executive officer.
It's a risky bet. Some municipal Internet efforts, including wireless
projects known as Wi-Fi, have failed in recent months. EarthLink Inc.
confirmed last week it was pulling the plug on its wireless partnership
with Philadelphia. A number of towns have abandoned a municipal fiber
initiative in Utah, called Utopia, amid financial difficulties.
The latest efforts have aroused intense opposition from private-sector
providers. Cable and telecom companies have successfully lobbied 15
state legislatures to pass laws preventing municipalities from entering
the broadband business. Comcast Corp., Cox Communications Inc. and
other cable and telecom providers have also filed lawsuits against
existing projects, arguing they're an improper use of taxpayer money
and amount to unfair competition. In Chattanooga, Comcast sued the
city's utility late last month in Hamilton County Chancery Court.
"They don't know what they're getting into," says Stacey Briggs, the
director of the trade group Tennessee Cable Telecommunications
Association, of Chattanooga's plan. She says the utility has
underestimated the costs involved, among other things.
Mr. DePriest counters that the suit is just a stall tactic: "So long as
they can delay us they can hold on to their customers."
Such disputes take on greater significance as the Internet enters a new
phase of explosive growth, much of it driven by user-generated video
and images. More network and cable TV shows are also being shown
online, and Web-enabled cellphones are bringing the Internet to new
users in places like Africa.
According to a recent report by Cisco Systems Inc., total annual
Internet traffic will quadruple by 2011, reaching a size of more than
342 exabytes (one exabyte is the equivalent of one trillion books of
about 400 pages each).
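The exabyte analogy can be checked with back-of-the-envelope arithmetic; the only figure not in the article is the implied text density per page, which the calculation recovers.

```python
# Sanity check of the article's analogy: if one exabyte equals a
# trillion books of about 400 pages each, how much data is one page?

EXABYTE = 10**18      # bytes (decimal definition)
BOOKS = 10**12        # one trillion books
PAGES_PER_BOOK = 400

bytes_per_page = EXABYTE / (BOOKS * PAGES_PER_BOOK)
print(f"{bytes_per_page:.0f} bytes per page")  # 2500
```

About 2.5 KB per page is a plausible figure for plain prose, so the analogy holds up.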
Global Comparison
In the U.S., where most of the critical infrastructure that led to the
creation of the Internet originated, questions persist about how
well-positioned the country is today. South Korea, for example, now
generates about the same amount of Internet traffic as the U.S., with
just one-sixth the population.
In terms of adoption, or the percentage of households using broadband,
the U.S. ranks 10th out of the 30 leading industrialized countries that
are members of the Organization for Economic Cooperation and
Development, a Paris-based research and policy group. The U.S. was
among the leaders in this category at the beginning of the decade. The
U.S. fares only slightly better in affordability, ranking 11th most
affordable, behind countries such as Italy and Norway.
The U.S. has fallen behind in speed, too. In the same study, conducted
by the Information Technology and Innovation Foundation, a nonpartisan
think tank, the U.S. ranked 15th in the average advertised download
speed, at 4.9 megabits a second. That's slower than the 17.6 megabits a
second in France and the 63.6 megabits a second in Japan, which ranks
No. 1 in this category. In other words, it takes a little over two
minutes to download a movie on iTunes in Japan, compared with almost
half an hour in the U.S. The average U.S. download speed is even
slower, according to other estimates.
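The download-time comparison above follows directly from the advertised speeds. A short sketch, assuming a roughly 1 GB movie file (the file size is not stated in the article):

```python
# Download-time arithmetic behind the article's comparison of advertised
# broadband speeds. The ~1 GB movie size is an assumption.

MOVIE_SIZE_GB = 1.0  # assumed movie file size

def download_minutes(speed_mbps: float, size_gb: float = MOVIE_SIZE_GB) -> float:
    """Minutes to transfer size_gb gigabytes at speed_mbps megabits/sec."""
    megabits = size_gb * 8_000  # 1 GB = 8,000 megabits (decimal units)
    return megabits / speed_mbps / 60

print(f"Japan  (63.6 Mbps): {download_minutes(63.6):.1f} min")  # ~2.1
print(f"France (17.6 Mbps): {download_minutes(17.6):.1f} min")  # ~7.6
print(f"U.S.    (4.9 Mbps): {download_minutes(4.9):.1f} min")   # ~27.2
```

A 1 GB file reproduces the article's "a little over two minutes" in Japan versus "almost half an hour" in the U.S.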
Chattanooga's Mr. DePriest compares his agency's plan for high-speed
Internet to the rollout of electricity, which came to many parts of
Tennessee only in the 1930s as a result of the creation by the federal
government of the Tennessee Valley Authority. That was three decades
after many businesses and homes in major urban areas like New York were
first electrified.
The country's electricity at the time was largely provided by private
companies, which denounced any government efforts to get into the
business as "socialist" -- echoing the debate over municipal fiber
networks today. Against this opposition, many public utilities,
including Chattanooga's Electric Power Board, or EPB, were formed to
help bring electricity to their towns and surrounding countryside.
Electricity, of course, would later be used for many home appliances
that didn't exist at the time, from refrigerators and stereos to
televisions and computers. Similarly, bringing fiber to the home is
"not about what services are available now in the market, but about
things that haven't even been invented yet," says Katie Espeseth, head
of the Chattanooga fiber project.
City in Decline
The EPB views the fiber effort as central to the revival of a city long
in decline. In 1969, Walter Cronkite announced on the CBS Evening News
that Chattanooga had America's dirtiest air. The decline of passenger
rail traffic and the local iron industry was followed by massive
unemployment, the abandonment of downtown and soaring crime.
Today, after more than a billion dollars of investment, the city's
downtown is coming back to life. While some factory buildings remain
abandoned, others are being filled by high-tech start-ups, and by a
handful of restaurants, coffee shops and galleries that cater to their
young employees.
In a converted saddle factory here, Jonathan Bragdon, 38 years old,
runs a 40-person company that he says couldn't exist without a lot of
affordable Internet bandwidth. Seven of his employees live and work in
other cities, including New York and Leeds, England. His business,
called Tricycle Inc., transmits high-resolution 3-D simulations of
carpeting to interior designers.
More important than download speed for such work is upload speed. Yet,
on most connections it often takes longer to upload files to the
Internet than it does to download them from the Internet. With Comcast,
Mr. Bragdon was getting a download speed of eight megabits a second,
but an upload speed of only one megabit a second.
About two years ago, Tricycle switched to the EPB's fiber network. Mr.
Bragdon says that lowered his costs several-fold and gave him the
flexibility to upgrade to speeds as fast as 100 megabits a second.
"With the rivers and the mountains, young people want to live here,"
says Mr. Bragdon. "But you need good bandwidth to work here."
A Comcast spokeswoman says the company recently increased its speeds
for small businesses to 16 megabits a second in many markets, including
in Chattanooga, and upload speeds to two megabits.
Critics of the notion that Internet service in the U.S. is falling
behind other countries say gaps stem from cultural and political
differences. More than half the citizens of South Korea, for example,
live in multitenant buildings of at least 50 units concentrated in
large cities, making it easier and cheaper to connect people there,
according to a report this month from the Information Technology and
Innovation Foundation. In the U.S., by contrast, most people live in
single-family homes.
Other countries, such as France, have benefited from increased
competition by governments forcing their former telecom monopolies to
open their networks to new providers. In the U.S., the regional
successors to the former Ma Bell resisted such regulatory efforts,
arguing it made little sense for them to invest in their networks if
forced to share them with potential competitors.
As a result, in most markets in the U.S. there have been only two
broadband providers, one telecom and one cable company. While some
countries were aggressively trying to catch up to the U.S. Internet
lead, "not much changed in the U.S.," says Susan Crawford, a professor
of Internet governance at the Benjamin N. Cardozo School of Law in New
York.
Change is finally starting to happen, as cable and telecom companies
compete more aggressively in each other's traditional businesses. Bills
are now making their way through Congress to remove the state barriers
to municipalities offering broadband. And the Federal Communications
Commission recently revamped its definition of broadband, which had
been just 200 kilobits a second, to bring it more up-to-date. It now
includes several tiers of speeds, starting at 768 kilobits per second.
Verizon Communications Inc. is in the midst of a $23 billion project,
called FiOS, to bring fiber to the homes of more than half of its 33
million customers in 28 states by 2010. Comcast last month began
boosting speeds on its network, and estimates 20% of its customers will
have access to faster speeds by the end of the year.
Still, these ultrafast networks are destined only for certain parts of
the country, such as major urban areas, at least for the foreseeable
future. In large swaths of the U.S., particularly second- and
third-tier cities and towns with more dispersed populations, providers
consider deploying broadband less profitable.
In downtown Chattanooga, James Busch, a 37-year-old radiologist and
medical-software entrepreneur, says when he opened his business, he
couldn't find an Internet service that was fast enough. Comcast's plan
was too slow and AT&T said it would take three months to build a
dedicated higher-speed connection to his business, says Mr. Busch.
AT&T says it now offers small businesses a download speed of six
megabits a second, and upload of 512 kilobits a second.
Mr. Busch's clinic, located in a strip mall, consists of 10
radiologists who provide remote diagnoses for rural hospitals that
can't afford their own radiologists. Transmitting the high-resolution
medical imagery often requires a very fast speed, which he says the EPB
network now provides him.
Losing the Advantage
"Information technology means a smaller country with fewer people can
now do the same amount of work as a larger country," says Mr. Busch.
"If we don't become more efficient, we lose our big-country advantage."
Late last month, the EPB raised $219 million through municipal bonds,
which it says will primarily be used to upgrade its existing electrical
system. The upgrade will involve laying a fiber network to create a
so-called smart grid, which will allow the utility to remotely monitor
and control how power is distributed, says Mr. DePriest. He
acknowledges that once the fiber is laid it can be used to deliver TV,
Internet and phone service, but says that is a separate venture
altogether, and one which will require an additional $60 million to get
off the ground.
In its lawsuit, Comcast argues the grid isn't Chattanooga's primary
objective. It says the real goal of last month's bond issue was to
bring Internet and other services to residents. If the utility fails to
meet payments on the new debt, ratepayers would be stuck with the tab,
says Comcast. "We believe the plans constitute a cross subsidy
prohibited by Tennessee state law," says a Comcast spokeswoman. "Our
intention is to ensure...that Comcast be allowed to compete in a fair
environment."
Mr. DePriest remains undeterred. He expects to have most of the
smart-grid network completed within three years, serving 80% of the
city. "The issue is, does our community control our own fate," says Mr.
DePriest. "Or does someone else control it?"
New Wave of Nuclear Plants Faces High Costs
By REBECCA SMITH
A new generation of nuclear power plants is on the drawing boards in
the U.S., but the projected cost is causing some sticker shock: $5
billion to $12 billion a plant, double to quadruple earlier rough
estimates.
NRG Energy Inc. hopes to add two units to the South Texas Project
nuclear site.
Nuclear power is regaining favor as an alternative to other sources of
power generation, such as coal-fired plants, which have fallen out of
favor because they are major polluters. But the high cost could lead to
sharply higher electricity bills for consumers and inevitably reignite
debate about the nuclear industry's suitability to meet growing energy
needs.
Nuclear plants haven't been built in meaningful numbers in the U.S.
since the 1980s. Part of the cost escalation is bad luck. Plants are
being proposed in a period of skyrocketing costs for commodities such
as cement, steel and copper; amid a growing shortage of skilled labor;
and against the backdrop of a shrunken supplier network for the
industry.
The price escalation is sobering because the industry and regulators
have worked hard to make development more efficient, in hopes of
eliminating problems that in the past produced harrowing cost overruns.
The Nuclear Regulatory Commission, for example, has created a
streamlined licensing process to make timelier, more comprehensive
decisions about proposals. Nuclear vendors have developed standardized
designs for plants to reduce construction and operating costs. And
utility executives, with years of operating experience behind them, are
more astute buyers.
Now, 104 nuclear reactors are operating in the U.S. Most are highly
profitable, but that was not the case until fairly recently. For the 75
units built between 1966 and 1986, the average cost was $3 billion, or
triple early estimates, according to the Congressional Budget Office.
Many plants operate profitably now because they were sold to current
operators for less than their actual cost.
The latest projections follow months of tough negotiations between
utility companies and key suppliers, and suggest efforts to control
costs are proving elusive. Estimates released in recent weeks by
experienced nuclear operators -- NRG Energy Inc., Progress Energy Inc.,
Exelon Corp., Southern Co. and FPL Group Inc. -- "have blown by our
highest estimate" of costs computed just eight months ago, said Jim
Hempstead, a senior credit officer at Moody's Investors Service
credit-rating agency in New York.
Moody's worries that continued cost increases, even if partially offset
by billions of dollars worth of federal subsidies, could weaken
companies and expose consumers to high energy costs.
On May 7, Georgia Power Co., a unit of Atlanta-based Southern, said it
expects to spend $6.4 billion for a 45.7% interest in two new reactors
proposed for the Vogtle nuclear plant site near Augusta, Ga. Utility
officials declined to disclose total costs. A typical Georgia Power
household could expect to see its power bill go up by $144 annually to
pay for the plants after 2018, the utility said.
Bill Edge, spokesman for the Georgia Public Service Commission, said
Georgia "will look at what's best for ratepayers" and could pull
support if costs balloon to frightening heights. The existing Vogtle
plant, put into service in the late 1980s, cost more than 10 times its
original estimate, roughly $4.5 billion for each of two reactors.
FPL Group, Juno Beach, Fla., estimates it will cost $6 billion to $9
billion to build each of two reactors at its Turkey Point nuclear site
in southeast Florida. It has picked a reactor design by Westinghouse
Electric Co., a unit of Toshiba Corp., after concluding it could cost
as much as $12 billion to build plants with reactors designed by
General Electric Co. The joint venture GE Hitachi Nuclear Energy said
it hasn't seen FPL's calculations but is confident its units "are
cost-competitive compared with other nuclear designs."
Exelon, the nation's biggest nuclear operator, is considering building
two reactors on an undeveloped site in Texas, and said the cost could
be $5 billion to $6.5 billion each. The plants would be operated as
"merchant" plants and thus would not have utility customers on the hook
to pay for them, as is the case in both Florida and Georgia. Instead,
they would have to cover expenses through wholesale power sales.
Several things could derail new development plans. Excessive cost is
one. A second is the development of rival technologies that could again
make nuclear plants look like white elephants. A drop in prices for
coal and natural gas, now very expensive, also could make nuclear
plants less attractive. On the other hand, if Congress decides to tax
greenhouse-gas emissions, that could make electricity from nuclear
plants more attractive by raising costs for generators that burn fossil
fuels. Nuclear plants wouldn't have to pay the charges because they
aren't emitters.
Some states are clearing a path for nuclear-power development, even
before costs are fully known. They are inspired by a growing fear of
climate change. "The overwhelming feeling in Florida is that nuclear
power is popular and that's why it's going to go ahead," said J.R.
Kelly, head of the Office of Public Counsel in Tallahassee, which
represents consumers. "Our main concern is the tremendous cost."
In Florida, state officials are allowing utilities to collect money
from customers to cover development and construction costs. In the
past, regulators typically required utilities to bear the costs until
plants were finished.
Many utilities said they are watching with interest. Ralph Izzo, chief
executive of Public Service Enterprise Group Inc. in New Jersey, said
his company may not be big enough to build a nuclear plant, even though
it is a nuclear operator. "We're concerned by the rise in construction
costs," he said.
Markets for the Poor in Mexico
June 30, 2008
Helping the poor may be virtuous, but when the poverty industry starts
losing "clients" because the market is performing good works, watch out.
Compartamos Banco knows what it's like to have a tarnished halo. The
Mexican bank specializes in microfinancing for low-income entrepreneurs
in a country that never used to have a financial industry serving the
poor. Compartamos not only figured out how to meet the needs of this
excluded population, but also how to make money at it.
As a result, the bank has been growing fast. With an average loan size
of only $450, it now has more than 900,000 clients – 15 times as many
as it had in 2000.
This strong growth suggests that the bank's for-profit model makes both
borrowers and lenders better off. Yet the triumph is not good news for
everyone. In the economic sector that Compartamos serves – those making
about $10 a day – the international charity brigade is at risk of
becoming obsolete. Perhaps this explains why people who make their
living giving away other people's money are badmouthing Compartamos for
the vulgar practice of earning "too much" profit.
Lending to microenterprises took off some years ago as economists
recognized that the poor, just like the middle class, can make
productive use of credit. The most famous microfinancier is Muhammad
Yunus, founder of the Grameen Bank and winner of the 2006 Nobel Peace
Prize.
Compartamos got its start in southern Mexico in 1990 as a nonprofit
providing working capital to small businesspeople like food preparers,
vendors and handicraft producers. Its funds initially came from
private-sector charity and governments, and its clients were – and
still are – largely female. This group is often illiterate but it is
also entrepreneurial and, as it turns out, a very good credit risk. In
lieu of collateral, the bank typically accepts the credit of a group of
entrepreneurs who effectively co-sign for a peer.
Compartamos Banco makes money and so do its clients.
After 10 years, Compartamos was financing 60,000 microborrowers. But it
recognized that the need for its service was much greater. In 2000, to
raise new capital, it formed a for-profit company to utilize
private-sector capital as well as loans and grants from government
agencies and charities. In 2002, it issued $70 million in debt, and
four years later its client base had grown to more than 600,000.
By 2006, bankers in the developing world who had traditionally ignored
the "C" and "D" economic classes – with "A" being the wealthiest and
"E" being the poorest – began to realize that lending to lower-income
entrepreneurs is good business. One reason for the change was that
computer software advances enabled banks to handle small accounts more
efficiently.
What was once written off as an unviable market became a hot
opportunity, and Compartamos was well positioned to capitalize on it in
Mexico. Last year the company launched an initial public offering that
was oversubscribed 13 times. That's when the do-gooders stepped in to
question the company's ethics.
In a commentary published last June on the Compartamos IPO, Richard
Rosenberg, a consultant for the Consultative Group to Assist the Poor –
not part of the World Bank but housed on its premises – observes that
the demand for shares in the company was driven, in part, by
"exceptional growth and profitability." He then ruminates for some 16
pages on whether Compartamos's for-profit model is at odds with the
goal of lifting the poor. A similar, though far less rigorous,
challenge to Compartamos titled "Microloan Sharks" appears in the
summer issue of the Stanford Social Innovation Review.
In his "reflections" on "microfinance interest rates and profits," Mr.
Rosenberg writes that "overcharg[ing]" clients under a nonprofit model
is OK because it is done for the sake of future borrowers. But when
profits go to providers of capital through dividends, then there is a
"conflict between the welfare of clients and the welfare of investors."
It's not the commercialization of the lending, we're told, but the
"size" of the profits that must be scrutinized.
What seems to elude Mr. Rosenberg is the fact that there is no way for
him to know whether there is "overcharg[ing]" or by how much. That
information can be delivered only by the market, when innovative new
entrants see they can provide services at a better price. This has been
happening since for-profit microfinance began to emerge, and the result
has been greater competition. Rates have been coming down even as the
demand for and availability of services have gone up.
How much better it would have been, Mr. Rosenberg suggests, if
Compartamos had raised capital through "socially motivated investors"
like the "international financial institutions" – i.e., the World Bank
and the like. How much better indeed, for him and his poverty lobby
cohorts, but not, it seems, for Mexico's entrepreneurial poor.
Special-Interest Secret
By BRYAN CAPLAN
May 12, 2007; Page A11
Behind every policy that does more harm than good, there's a special
interest that favors it anyway. The steel tariff was bad for consumers,
steel-using industries and foreign steel producers, but the steel lobby
still pushed for it. Farm subsidies are bad for both taxpayers and
unsubsidized farmers, but in 2002 the American farm lobby got a 70%
increase in government support. The minimum wage is bad for consumers,
employers and low-skill workers who get priced out of their jobs, but
unions are hard at work to raise it again.
When special interests talk, politicians listen and the rest of us
suffer. But why do politicians listen? Social scientists' favorite
explanation is that special interests pay close attention to their pet
issues and the rest of us do not. So when politicians decide where to
stand, the safer path is to satisfy knowledgeable insiders at the
expense of the oblivious public.
This explanation is appealing, but it neglects one glaring fact.
"Special-interest" legislation is popular.
Keeping foreign products out is popular. Since 1976, the Worldviews
survey has always found that Americans who "sympathize more with those
who want to eliminate tariffs" are seriously outnumbered by "those who
think such tariffs are necessary." Handouts for farmers are popular. A
2004 PIPA-Knowledge Networks Poll found that 58% agree that "government
needs to subsidize farming to make sure there will always be a good
supply of food." In 2006, the Pew Research Center found that over 80%
of Americans want to raise the minimum wage. It is safe to assume,
then, that few people want to abolish it. These results are not
isolated. It is hard to find any "special interest" policies that most
Americans oppose.
Clearly, there is something very wrong with the view that the steel
industry, farm lobby and labor unions thwart the will of the majority.
The public does not pay close attention to politics, but that hardly
seems to be the problem. The policies that prevail are basically the
policies that the public approves.
No wonder special interests so often get their way. They do not have to
force their policies down the public's throat, or sneak them through
Congress unnoticed. To succeed, special interests only need to persuade
politicians to swim with the current of public opinion.
Why would the majority favor policies that hurt the majority? There is
a good reason. The majority favors these policies because the average
person underestimates the social benefits of the free market,
especially for international and labor markets. In a phrase, the public
suffers from anti-market bias.
Economists have spent centuries explaining how markets channel greedy
intentions into socially desirable results; how trade is mutually
beneficial both within and between countries; how using price controls
to redistribute income inflicts a lot of collateral damage. These are
the lessons of every economics textbook. Contrary to the stereotype
that they can't agree, economists across the political spectrum, from
Paul Krugman to Greg Mankiw, see eye to eye on these basic lessons.
Unfortunately, most people resist even the most basic lessons of
economics. As every introductory teacher of the subject knows, students
are not blank slates. On the first day of class, they arrive with
strong -- and usually misguided -- beliefs about economics. Convincing
students to rethink their anti-market views is no easy task.
The principles of economics are intellectually compelling; but
emotionally, they fall flat. It feels better to believe that greedy
intentions imply bad consequences, that foreigners destroy our
prosperity and that price controls are a harmless way to transfer
income. Given these economic prejudices, we should expect policies like
steel tariffs, farm subsidies and the minimum wage to be popular.
None of this means that special interests don't matter, but it does put
their activities in a new light. Special interests do not have to sneak
behind the majority's back; they just need to ask for the right favor
in the right way. The steel lobby could have demanded a big handout
from the federal government. But that would have struck many voters as
welfare for the rich; steel-makers can't expect the same treatment as
farmers, can they? Instead, the steel lobby took the crowd-pleasing
route of blaming foreigners and asking for tariffs. Tariffs were less
direct than a naked subsidy from Washington, but they enriched the
steel industry without alienating the majority.
If special-interest legislation were fundamentally unpopular, public
relations campaigns would be futile. They would serve only to warn
taxpayers about plans to pick their pockets. Since the public shares
interest groups' critique of the free market, however, there is room
for persuasion. Left to its own devices, the public is unlikely to
spontaneously fret about the plight of the steel industry. But a good
public relations campaign can -- and often does -- change the public's
mind. Once the public actively supports an interest group, even
politicians who would prefer to leave the market alone find it awkward
to block government intervention.
In many cases, though, a public relations campaign is overkill. Special
interests can make money by maneuvering around the indifference of the
majority. Even though most people are protectionists, for example, they
are fuzzy about specifics. Which industries need protection? How much?
Should we use tariffs, quotas or what? To most citizens, these are mere
details; within broad limits, they will accept whatever happens. As far
as special interests are concerned, however, these details mean the
difference between feast and famine. When it is time to determine
details, special interests have a lot of influence -- in large part
because no one else cares enough to quibble.
In a monarchy, no one likes to blame the king for bad decisions. So
instead of blaming the king himself, critics point their fingers at his
wicked, incompetent and corrupt advisers. While this is a good way to
keep your head, it is hard to take seriously. Kings often make bad
decisions; and in any case, if his advisers are hurting the country,
isn't it the king's fault for listening to them?
In a democracy, similarly, no one likes to blame the majority for bad
decisions. So instead of blaming the majority, critics point their
fingers at special interests. But this too is hard to take seriously.
The majority often makes bad decisions; and in any case, if special
interests are hurting the country, isn't it the majority's fault for
listening to them?
We often ponder special-interest politics in order to solve a mystery:
"Why aren't policies better?" Realizing how many bad policies are here
by popular demand turns this question upside down. The real mystery is
not why policies aren't better. The real mystery of politics is why
policies aren't a lot worse.
Mr. Caplan, an associate professor of economics at George Mason
University, is the author of "The Myth of the Rational Voter: Why
Democracies Choose Bad Policies" (Princeton University Press, 2007).
Lessons of a Food Fight
August 29, 2007; Page A14
Lawyers for the Federal Trade Commission apparently can't believe their
"gotcha" haul of off-color statements by Whole Foods CEO John Mackey
wasn't enough to block his merger with Wild Oats, a competing chain, in
the absence of serious antitrust evidence.
Wailed the agency to an appeals court last week: The judge who refused
our injunction request ignored the substance of our case!
He sure did. Judge Paul Friedman barely alluded to the pith of the FTC's
complaint, a private email from Mr. Mackey to his board in which the
hyperbolic CEO said the acquisition would "eliminate a competitor" and
"avoid nasty price wars."
You can find Judge Friedman's opinion at the Web site of the U.S.
District Court for the District of Columbia. He found, in essence, that
no amount of blather by Mr. Mackey in a state of competitive heat can
overcome the relevant facts: Whole Foods competes against the entire
universe of food retailers, not just Wild Oats, even if both happen to
style themselves "natural foods" supermarkets.
"The evidence before the court demonstrates that other supermarkets . .
. compete today for the food purchases of customers who shop at Whole
Foods and Wild Oats and that Whole Foods' customers already turn for
some of their food purchases to the full range of supermarkets," wrote
the Clinton appointee.
Duh. But the agency did succeed at least in its primary tactical aim,
embarrassing Mr. Mackey. Not only was it able to flaunt his unguarded
memo to his board. It disclosed his habit of unwisely posting his
anonymous thoughts on a Yahoo message board. It even managed
"inadvertently" to leak some of his company's confidential information
to the press.
It wasn't Judge Friedman's job to ask why FTC would bring such a
frivolous case in the first place. At times like these, one must
consult the work of James Buchanan, who won a Nobel Prize for applying
what economics tells us about incentives to the behavior of government
officials. To wit, they are people, and frequently behave like people.
They don't necessarily get up every day thinking, "What can I do today
to advance the general welfare?" They frequently think: "What can I do
to advance my own interests? What can I do to extort tribute from the
private sector? What can I do to blackmail politicians into increasing
my resources and privileges? What can I do to manipulate the media?"
Antitrust agencies are especially prone to these habits because,
frankly, they lack useful ways to occupy their time. So few are the
opportunities in a modern economy for businesses to create meaningful,
exploitable, durable monopolies that trustbusting agencies must employ
ingenuity to keep themselves and the Washington antitrust community
busy. In this regard, a landmark in bar-lowering was the surprise
success of the FTC's 1997 move to nix a merger of Staples and Office
Depot, two office supply chains in a world full of office supply
retailers, on grounds that they offered consumers a unique "shopping
experience."
We've been off to the races ever since. That's how we got the Whole
Foods case, in which the agency argued the chain must be regulated as a
potential monopolist because some Whole Foods shoppers (its "core
customers") might refuse to shop elsewhere even for lower prices and
better service.
The appeal of such reasoning to trustbusters is obvious: Successful
differentiation through mere marketing can be reason enough to subject
a company to antitrust regulation. By such logic, Ford might be a
monopolist if some number of customers refuse to consider anything but
a Taurus, no matter how serviceable the substitutes from Toyota, etc.
Alas, such regulatory grabs are especially common in the waning days of
a weakened administration -- see the FCC's sudden enthusiasm for
wireless "open access" or the Justice Department's play for new
regulatory authority over the porn film industry.
For better and worse, antitrust seldom rises to the level of a threat
to the general prosperity, giving politicians little reason to blow the
whistle. Our over-the-rainbow solution would be simply to cancel the
antitrust laws, and leave it to Congress to legislate singly in the
case (if it ever arises) of a true monopoly that threatens the public
good. An absurd prescription? Check out "Does Antitrust Policy Improve
Consumer Welfare?" by Brookings Institution economists Clifford Winston
and Robert Crandall. It can be found in the fall 2003 Journal of
Economic Perspectives.
We won't expect such lessons to trickle down anytime soon, but Europe
also offers a variation worth considering. Unlike their U.S.
counterparts, Europe's trustbusters can be sued for damages when found
to have abused their discretion and authority.
A seminal verdict came in a ruling last month in favor of Schneider
Electric, a French company that had been forced to unwind a merger at
great cost on the basis of sloppy antitrust analysis. People who work in government
may be acting in good faith and concerned with the public good. The
Schneider case shows that it's unwise to give them credit for doing so
in advance of, or contrary to, the evidence.
Once we get over such naive sentiments, there's much to be said for
enforcing accountability by making regulators liable for damages when
they don't act in good faith in carrying out their public mandates.