Is AIDS Having A Serious Impact on World Population Levels?

The Worldwatch Institute recently
released a report arguing that premature mortality from AIDS accounted
for about one-third of the current slowing of population growth, with the
other two-thirds due to declines in fertility. Is AIDS
having a serious impact on world population growth?

In some African nations, AIDS
has become a nightmare according to U.S. Census Bureau statistics. In
Zimbabwe, for example, life expectancy has fallen to 39 years – down from
65 years prior to the AIDS epidemic.

“AIDS results in higher
mortality rates in childhood, as well as among young adults where mortality
otherwise is low,” said Karen Stanecki who co-authored the Census
Bureau’s recent World Population Profile: 1998. “As a result,
AIDS deaths will have a larger impact on life expectancies than on some
other demographic indicators in these nations.”

According to the US Census
Bureau, by 2010 sub-Saharan Africa alone will have 71 million fewer people
than it would have without the AIDS epidemic. Some parts of Latin America
and Asia will also experience significant decreases in population growth
due to the effect of AIDS.

On the other hand, the Census
Bureau report projects that the AIDS pandemic should run its course in
Africa by 2020, at which point the number of deaths will again decline
in those parts of Africa hit hardest and life expectancies will begin
to rise. Part of the good news in the Census Bureau report is
the turnaround in Uganda, one of the nations hit hardest by the AIDS epidemic
— in some areas prevalence of the disease is as high as 30 percent of
the population. Since 1993, however, the prevalence of AIDS has been halved
in the country, thanks to the government’s willingness to admit to the
epidemic and tackle the problem head-on.

Source:

Life expectancy in Africa cut short by AIDS. CNN, March 18, 1999.

‘Living Wage’ Proposals Harm Poor, Low-Skilled Workers

       For the past several months,
left-wing activists in the city where I live — Kalamazoo, Michigan —
have been demanding that the city commission enact a so-called “living
wage” ordinance. The ordinance would require businesses contracting with
the city to pay their workers no less than $8.25/hour. This local action
is part of a nationwide push by left-wing groups to enact high minimum
wage ordinances; in the midst of the “living wage” campaign in Kalamazoo,
syndicated columnist Molly Ivins weighed in to support Sen. Edward Kennedy’s
bill calling for a national “living wage.”

       Most of the debate over the
high minimum wage proposal tends to focus on the impact, if any, a high
minimum wage would have on businesses. People end up debating
whether it would encourage or discourage investment in the area, whether
it would cause employment to rise or fall, or whether
increasing the minimum wage might have inflationary effects. Lost
in that debate, though, is the larger issue of whether simply increasing
the minimum wage would really help the low-income workers it is nominally
intended to benefit. In fact, living wage proposals would likely harm
such workers over the long term.

       As Western Michigan University
professor of economics Emily Hoffman pointed out in a position paper
opposing the Kalamazoo proposal, the wages firms pay workers are closely
tied to individuals’ productivity and skills. Workers in low-wage jobs
tend to be there precisely because they have few skills with which to
shop around for higher-paying jobs.

       The solution proposed by Cooney,
Kennedy and Ivins is simply to force some employers to pay higher wages
to such low-skilled workers. But firms are in the business of maximizing
profit, not solving social problems such as the persistence of low-skilled
workers, and they will respond rationally to living wage requirements
by hiring the higher-skilled employees that the higher wage rate will
inevitably attract, rather than the lower-skilled workers who currently
hold such jobs.

       I’ve worked at firms here in
Kalamazoo, for example, that paid $6/hour for very low-skilled
work, such as sorting and washing laundry from local hospitals. That’s
certainly not enough to live on if it is a person’s primary wage,
but most such workers weren’t the primary or only wage earners in their
families. More importantly, such jobs were often filled by people who were
unemployable at higher wage rates. The applicants I saw included
people with criminal convictions, semi-literate high school dropouts who
had trouble reading the basic employment application, and others who for
one reason or another had very few skills.

       Most of the employees didn’t
stay at these low-paying jobs for very long – in fact, the turnover rate
was tremendously high. Many stuck around for six months or a year to establish
themselves as reliable workers and develop some marketable skills, and
then left to take higher-paying jobs.

       With a living wage in place,
however, it is likely these low-skilled workers would never have gotten
that chance to improve themselves. At $6/hour a business might have to
hire a high school dropout to perform a tedious task. But at $8.25/hour
that job becomes far more likely to attract more skilled and experienced
workers. Kalamazoo, for example, is home to about 30,000 college students.
At a rally for the living wage proposal at the largest campus, Western
Michigan University, one supporter went on at length about how the living
wage would mean better-paying jobs for college students. That is true,
and it is exactly the problem.

       As wage rates went up to $8.25/hour,
college students and others would be more likely to enter the job market
and more likely to compete for jobs that would have been unattractive
to them at only $6/hour. Guess who gets hired when a college student and
a high school dropout both apply for the same job? The result is easy
to predict – fewer opportunities for low-skilled workers to gain
additional skills and move up the economic ladder.

       This is, of course, the reason
that unions are generally the biggest supporters of living wage legislation.
In 1998 Detroit voters approved a $7.70/hour living wage proposal placed
on the ballot and pushed heavily by the Metropolitan Detroit AFL-CIO.
By making non-union labor more expensive, living wage ordinances make
union labor more competitive — at the expense of low-skilled and poor
workers.

       That living wage legislation
would relegate low-skilled workers to fewer opportunities and higher unemployment
can be seen in the results of the Davis-Bacon Act, which Ivins lauded
in a recent op-ed as an example of the good that regulating wages can
do. Enacted by Congress in 1931, the Davis-Bacon Act requires construction
contractors working on government projects to pay high “prevailing wages”
to workers – essentially, all workers on government construction contracts
must be paid whatever unionized construction workers are paid.

       Davis-Bacon’s wage requirements
were enacted specifically to keep low-skilled black construction workers
and black-owned construction firms in the South from competing with white
construction workers and white-owned construction firms in the North.
And it worked. Unable to offer lower wages for lower skills, black construction
workers found it more difficult than whites to get the training and on-the-job
experience necessary to increase their skills, their productivity and, thereby,
their wages. The resulting racial disparity in construction employment is amazing
to behold. In Detroit, whose population is 80 percent minority, a mere
3 percent of construction union membership is held by minority laborers.
The unemployment rate for black construction workers has run as high as
25 percent in recent years, far higher than the white unemployment rate
in the construction field.

       Cooney and others who support
living wage ordinances say they want a more “fair” economy. Passing laws
that would put low-skilled workers on the unemployment line is a strange
way to go about achieving that objective.

Banning Genetic Tests For Insurance Poses Threat For Consumers

Some time ago my wife and I
learned one of us might have inherited a genetic disease. Our biggest
concern was for our newborn daughter — might she be afflicted? If so,
could the results of genetic testing be used to deny her or us insurance?

Governor John Engler’s promise
in the State of the State address to prevent insurance companies from
requiring patients to take genetic tests struck a special chord with us,
precisely because it is such a misguided solution to the problem.

Insurance companies are the
target of choice for politicians. As Republican State Senator (and surgeon)
John Schwarz of Battle Creek told the Detroit News recently, “There are
members of the legislature that are too cozy with insurance companies.
This ought to be a no-brainer. We don’t want insurance companies denying
people coverage based on genetic disposition.”

The problem with this view
is that it represents a fundamental misunderstanding of how insurance
companies work, and as a result it poses a long-term threat to the very existence
of private insurance.

Insurance policies for life
and health insurance first came into widespread use with the advent of
modern statistical analysis. Such methods give us important but incomplete
information about risks. Today statisticians can predict the general risk
of heart disease for a male nonsmoker in his 40s, but no one has the ability
to determine which particular men will get heart disease and which men
will remain free of heart problems.

Insurance companies distribute
risk by charging rates that adjust for this incomplete information. Those
who never suffer heart disease end up subsidizing those who do, but the
cost of insurance is spread over many individuals so the cost remains
relatively low.

Ironically, genetic testing
provides information that makes it extremely difficult to efficiently
distribute risk.

Imagine scientists discover
a gene that increases the risk of heart disease three-fold. All other
things being equal, men who test positive for this gene will load up on
health and life insurance and gravitate toward plans with the largest
benefits, while those who test negative will tend to reduce the amount
of health and life coverage they buy and gravitate toward plans with fewer
benefits and lower premiums.

As the role genes play in disease
becomes clearer, health care costs will rise dramatically for insurance
companies. People whose genetic tests reveal serious future health problems
will buy far more insurance than they normally would, while those who hit
the jackpot with relatively healthy genes will buy far less insurance than
they would have without genetic tests. The insurance industry gets squeezed
at both ends and may have trouble remaining solvent even with extremely
high premiums.
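This squeeze can be made concrete with a toy calculation. All numbers below are invented for illustration (a hypothetical gene carried by 10 percent of people that triples a 2 percent baseline risk); only the direction of the effect matters. Once testing lets high-risk people buy more coverage and low-risk people buy less, the insurer's expected claims per dollar of coverage sold rise, forcing premiums up:

```python
def pool(carrier_share, base_risk, multiplier, cov_carrier, cov_other):
    """Expected claims per person and coverage sold per person for a
    pool split into gene carriers and non-carriers (invented numbers)."""
    claims = (carrier_share * base_risk * multiplier * cov_carrier
              + (1 - carrier_share) * base_risk * cov_other)
    coverage = carrier_share * cov_carrier + (1 - carrier_share) * cov_other
    return claims, coverage

# No testing: everyone buys the same $100,000 policy.
claims0, cov0 = pool(0.10, 0.02, 3.0, 100_000, 100_000)

# With testing: carriers double their coverage, non-carriers halve it.
claims1, cov1 = pool(0.10, 0.02, 3.0, 200_000, 50_000)

print(f"claims per dollar of coverage, no testing:   {claims0 / cov0:.4f}")
print(f"claims per dollar of coverage, with testing: {claims1 / cov1:.4f}")
```

With these invented figures the claims rate rises from 2.4 cents to roughly 3.2 cents per dollar of coverage sold. Raising premiums to cover the gap drives still more healthy buyers out of the pool, which is exactly the spiral described above.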

Isn’t there anything that can
be done to protect both the insurers and the insured? Alexander Tabarrok,
an assistant professor of economics at Ball State University, suggests an
alternative solution that benefits all parties — require people receiving
genetic tests to purchase genetic insurance.

The logic behind Tabarrok’s
proposal is similar to the justification for mandatory automobile insurance
— the knowledge gleaned from genetic tests potentially imposes very large
costs on individuals and the rest of society (when individuals can’t afford
the costs of their health care). Genetic insurance would cover just those
additional costs.

Unlike the Governor’s proposal,
which risks ballooning insurance premiums, Tabarrok’s idea might actually
lower costs in the long term.

Since the cost of genetic diseases
is already included in current health care costs (people are already dying
from genetic diseases, after all, even if our ability to detect such diseases
is only in its infancy), Tabarrok’s proposal merely separates current
insurance policies into genetic and non-genetic components — the cost
of the combination would be no greater than the current cost of health
insurance.
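A back-of-the-envelope sketch shows how the pieces fit together. All figures here are invented, not Tabarrok's: a $2,000 annual premium, a test that (if positive) triples the actuarially fair rate, and a one-in-four chance of testing positive. The genetic policy, bought before the test, pays out the premium increase a positive result would cause, so the person's total cost is the same either way:

```python
base_premium = 2_000.0    # hypothetical annual health premium before testing
risk_multiplier = 3.0     # a positive test triples the actuarial risk
p_positive = 0.25         # hypothetical chance the test comes back positive

# If the test is positive, the actuarially fair health premium jumps:
repriced = base_premium * risk_multiplier
premium_jump = repriced - base_premium        # extra cost per year

# Genetic insurance, bought *before* the test, pays out that jump.
# Its actuarially fair price is the expected size of the jump:
genetic_premium = p_positive * premium_jump

# Total annual cost is identical whichever way the test turns out:
cost_if_negative = base_premium + genetic_premium
cost_if_positive = repriced + genetic_premium - premium_jump

print(f"cost if negative: ${cost_if_negative:,.0f}")
print(f"cost if positive: ${cost_if_positive:,.0f}")
```

Because the expected cost of genetic disease is already embedded in today's pooled premiums, splitting it out this way leaves the combined cost roughly where it started while letting insurers price honestly after a test.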

Since neither patients nor
insurance companies get shortchanged if policyholders test positive
for disease-causing genes, it is in the insurance companies’ interest
to encourage people to get genetic tests. People who test positive for
a heart disease gene, for example, could begin a low-fat diet and exercise
regimen early in life, thereby increasing the probability that they will
avoid heart disease altogether. This benefits both parties, potentially
extending the lives of patients and helping the insurance company reduce
the cost of treating disease.

The main defect of this system
is that it doesn’t lend itself very well to sloganeering. When Governor
Engler or some other politician says he’s going to solve a problem by
slapping another requirement on insurance companies, that’s a lot easier
to understand than Tabarrok’s somewhat counterintuitive (but effective)
scheme. Those who propose a more indirect route such as genetic insurance
risk being labeled “too cozy” with insurers.

It is quite clear, however,
that various mandates handed down to insurance companies in the past few
years are causing insurance premiums to rise dramatically, threatening
Americans’ ability to find affordable insurance. The new mandate Gov.
Engler proposes would only serve to needlessly exacerbate this trend.

U.S. Farmers Unlikely to See Turnaround Anytime Soon

Low prices for agricultural goods could continue through the year 2000, prompting
many US politicians to urge a return to broad subsidies through crop and farm
insurance. According to Keith Collins, an economist with the US Department of
Agriculture, a rebound in the Asian economies is two to four years off, and
until that recovery takes place foreign demand for US agricultural products
will be weak.

And that’s got Congress ready to jump back on the subsidy wagon. After the
1996 Freedom to Farm Act, it appeared that US subsidies for farms might be on
their way out, but now Democrats and Republicans from farm states seem willing
to resurrect the system of subsidies through the back door of crop insurance.

Crop insurance compensates farmers if commodity prices fall below a certain
level. But because crop insurance encourages farmers to plant more than they
normally would, it also tends to result in larger than average crops and as
a result a greater risk of extremely low prices. In effect, this is less a typical
insurance scheme than a roundabout way of setting a price floor on agricultural
commodities.

The US would be better off getting the government out of the crop insurance
business altogether.

Sources:

No quick fix seen for struggling farm economy. Joe Ruff, Associated Press,
Feb. 16, 1999.

Ending Water Shortages In India

All of the recent stories on coming water shortages seem to have overlooked
a key point – enormous amounts of recoverable water are wasted
every year. A more rational, market-based system for water distribution would
go a long way toward relieving water shortages by boosting efficiency and encouraging
recovery of wasted water.

India appears to be finally catching on to this. A recent Associated Press
story on India’s water storage notes that much of the country’s water reclamation
efforts are poorly managed. The Indian government spent billions of rupees setting
up 14 desalination plants in Ramanathpuram, for example. Today only one of those
plants is still operational; the rest have all failed due to poor maintenance
by government workers.

Similarly, although many parts of India receive up to 38 inches of rainfall
annually, only 10 to 20 percent of it is actually captured – the rest washes
out to sea. And although there are literally thousands of tanks and water
reservoirs dotting the landscape of southern India, they are poorly maintained.

The New Delhi-based Center for Science and Environment estimates that merely
capturing the rainwater and runoff on 2 percent of India’s land area could supply
26 gallons of water per person.

India is taking an important step in starting to maintain and rebuild its
water capture and desalination facilities, but an important complement must
be market prices that give individuals and companies incentives to spend the
time and money to capture and use water efficiently.

Source:

India’s farmers tap into demand for water. Neelesh Misra, Associated Press,
March 8, 1999.

Do Good Harvests Stop War? (Or Selection Bias 101)

Researchers working on behalf of Future Harvest recently released
an odd report linking poor agricultural practices with war. As Dr. Indra De
Soysa sums up the study’s conclusions, “this report demonstrates that
providing developing world farmers with the fruits of research, when combined
with other measures, not only helps to end hunger, but can also contribute to
ending the increasingly vicious warfare that the world has seen during the 1990s.”

The researchers point to India, for example, which has seen both agricultural
successes and a decline in war over the past few decades.

All I can say is – are these folks serious? Of course nations that avoid war
are likely to have improved agricultural success rates, but does this mean that
the latter causes the former? I think that’s a highly dubious claim.

Consider the two examples that De Soysa and his researchers compare and contrast
– India and sub-Saharan Africa. In 1960 both areas produced about 50 million
tons of food each year.

According to the Associated Press, though, by 1988 India was producing 150
million tons of food while sub-Saharan Africa was still producing only about
50 million tons of food each year. Since 1960 sub-Saharan Africa has been wracked
by numerous regional wars, while India has been relatively free of that sort
of widespread conflict (though it has had several conflicts).

What these researchers for Future Harvest are claiming is that India avoided
wars because it received food aid while sub-Saharan wars were driven by
a lack of agricultural assistance. This is nonsense. A much better explanation
of the facts is that India’s agricultural output increased precisely because
it managed to avoid widespread conflicts, while sub-Saharan Africa floundered
because it devolved into one bloody conflict after another.

De Soysa’s hypothesis might have some currency if researchers could demonstrate
that Africa’s conflicts originated in a lack of food, but this is undercut
by the evidence that India and Africa started out in the same position
when it came to food. The claim also ignores the evidence of the many African
wars themselves, few of which had their origins in a lack of food.

As it is, this study seems to get everything backwards. Peace is a prerequisite
of functional agricultural markets, not the other way around.

Sources:

New report says wars are rooted in roots. David Briscoe, Associated Press,
Feb. 16, 1999.